Abstract:
In computer vision, image segmentation is defined as the process of partitioning an image into several regions with homogeneous features. The region of interest in this thesis is the liver.
The main goal of the liver segmentation process is to divide the pixels of a medical image, according to specific criteria, into two groups: pixels that belong to the object of interest (the liver) and pixels that do not. It is an essential task in oncological therapy monitoring and radiotherapeutic treatment, where tumor information is vital for correct dosimetry calculations.
Usually, liver segmentation has been performed manually by trained clinicians, but this is time-consuming and effortful, and the results differ from one clinician to another because of inter-observer variability; an automatic liver segmentation system would therefore be a great boon for performing these tasks. Segmenting the liver from medical images is very difficult because of the complexity of liver shapes, the variability of liver sizes among patients, and the low contrast between the liver and surrounding organs such as the stomach, pancreas, kidneys, and muscles.
Before the deep learning revolution, traditional handcrafted features were used for liver segmentation; with deep learning, the features are obtained automatically. Numerous semi-automatic and fully automatic methodologies have been proposed to improve liver segmentation: some use deep learning techniques, while others use classical methods, but none of them achieves perfect accuracy.
In this thesis, we use a deep learning technique, in particular the U-Net architecture, to enhance the automatic liver segmentation process. The MICCA and 3D-IRCAD datasets are used for training and testing the model. The proposed U-Net model achieves a Dice similarity coefficient of 0.97 on the MICCA dataset and 0.96 on the 3D-IRCAD dataset.
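For reference, the Dice similarity coefficient used above measures the overlap between a predicted binary mask and the ground-truth mask, DSC = 2|A ∩ B| / (|A| + |B|). A minimal sketch of how it can be computed with NumPy is shown below; the 4×4 toy masks are hypothetical and only illustrate the calculation, not the actual thesis data.

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity coefficient between two binary masks:
    DSC = 2|A ∩ B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    # eps guards against division by zero when both masks are empty
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# Hypothetical 4x4 masks standing in for a liver prediction and its label
pred   = np.array([[0, 1, 1, 0],
                   [0, 1, 1, 0],
                   [0, 1, 1, 0],
                   [0, 0, 0, 0]])
target = np.array([[0, 1, 1, 0],
                   [0, 1, 1, 0],
                   [0, 0, 1, 0],
                   [0, 0, 0, 0]])

print(round(dice_coefficient(pred, target), 3))  # → 0.909
```

A score of 1.0 would indicate perfect overlap, so the reported values of 0.97 and 0.96 correspond to near-complete agreement with the manual ground truth.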