A novel deep learning-based technique for effective cancer detection.
Breast cancer is the most common form of cancer in women aged 20–59 years worldwide. According to data from the American Cancer Society, about 268,600 new cases of invasive breast cancer and about 62,930 new cases of in situ breast cancer were diagnosed in 2019, and nearly 41,760 women died from breast cancer. Early detection of breast cancer is therefore important for increasing patient survival rates: if diagnosed at an early stage, the survival rate may be increased to as much as 80%. The high morbidity and considerable healthcare cost associated with cancer have motivated researchers to develop more accurate models for cancer detection. In our B.Tech project we explore and study different approaches to computer-aided diagnosis (CAD) of cancer and propose some new ways to tackle this problem. We cast the task as a binary classification problem and adopt deep learning methods to solve it. First, drawing inspiration from a paper in Pattern Recognition Letters, we adopted transfer learning (see the sketch after the list below) to address the lack of a large dataset; training deep models from scratch on small datasets yields heavily overfit models that do not generalise well. In contrast to the models used in that paper, we propose new architectures (ResNeXt, XceptionNet) for better feature extraction, as well as new data augmentation techniques. The dataset used was the publicly available BreakHis dataset from UFPR, which contains 2,480 histological samples of benign tumors and 5,429 histological samples of malignant tumors. Adopting these measures, we achieved 85.2% accuracy. Transfer learning, though a very useful technique, has some demerits:
Transfer learning only works if the source and target problems are similar enough. Since we use architectures pretrained on ImageNet, an image classification dataset, we have no evidence of a positive correlation between the classes of ImageNet and those of cancerous tissue images.
For transfer learning to work properly, the distribution of the data on which the pretrained model was trained should resemble the data we will face at test time, or at least not differ from it too much. But cancer detection is an inherently imbalanced classification problem, whereas the classes of ImageNet are fairly balanced, so transfer learning is at a disadvantage here.
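The following is a minimal sketch of the transfer-learning setup described above, assuming a PyTorch/torchvision workflow with a ResNeXt-50 backbone; the specific augmentations and hyperparameters shown are illustrative placeholders, not the exact values used in the project.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms

# Data augmentation: flips, rotations, and colour jitter are reasonable choices
# for histopathology patches (stain variation, no canonical orientation).
train_transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomVerticalFlip(),
    transforms.RandomRotation(30),
    transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Load an ImageNet-pretrained ResNeXt backbone and replace its classifier
# with a single-logit head for the benign-vs-malignant decision.
model = models.resnext50_32x4d(weights=models.ResNeXt50_32X4D_Weights.DEFAULT)
for param in model.parameters():               # freeze the pretrained features
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 1)  # only the new head is trainable

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```

The same pattern applies to the XceptionNet backbone; only the pretrained model and its classifier layer change.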
Therefore, to overcome these demerits of transfer learning, we tackle the “PatchCamelyon” challenge hosted on www.grand-challenge.org. The task is cancerous tissue detection: we are given a dataset of 327,680 colour images (96 × 96 px) extracted from histopathologic scans of lymph node sections, each annotated with a binary label indicating the presence of metastatic tissue. We again use the data augmentation techniques suggested earlier, and an ensemble (soft voting) of XceptionNet, ResNeXt, and CapsuleNet for prediction. Instead of a traditional learning-rate schedule, in which the learning rate is either fixed or decayed by a fixed factor, we use cyclic learning rates with warm restarts so that the model finds flatter minima and generalises better. In addition, each model uses weight-space ensembling of its snapshots rather than model-space ensembling (also referred to as snapshot ensembling), since ensembling in weight space generalises better. A sketch of this training setup is shown below. Applying all these techniques yields an AUC (the evaluation metric for the PatchCamelyon challenge) of 0.958.
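The sketch below illustrates the PatchCamelyon training setup under the assumption of a PyTorch pipeline: cosine-annealing warm restarts for the cyclic learning rate, weight-space averaging of the snapshots taken at each restart, and soft voting across the three ensemble members. The model definitions, data loaders, epoch counts, and cycle length are hypothetical placeholders.

```python
import torch
import torch.nn as nn
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts
from torch.optim.swa_utils import AveragedModel, update_bn

def train_with_restarts(model, train_loader, epochs=30, cycle_len=10):
    """Train one backbone with cyclic warm restarts, averaging snapshots in
    weight space at the end of each cycle instead of keeping them as
    separate models."""
    criterion = nn.BCEWithLogitsLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    scheduler = CosineAnnealingWarmRestarts(optimizer, T_0=cycle_len)
    averaged = AveragedModel(model)              # running average of weights

    for epoch in range(epochs):
        model.train()
        for step, (images, labels) in enumerate(train_loader):
            optimizer.zero_grad()
            loss = criterion(model(images).squeeze(1), labels.float())
            loss.backward()
            optimizer.step()
            # fractional-epoch step drives the cosine cycle within an epoch
            scheduler.step(epoch + step / len(train_loader))
        if (epoch + 1) % cycle_len == 0:         # end of a cycle: take snapshot
            averaged.update_parameters(model)

    update_bn(train_loader, averaged)            # refresh BatchNorm statistics
    return averaged

def soft_vote(members, images):
    """Soft-voting ensemble: average the sigmoid probabilities of the
    XceptionNet, ResNeXt, and CapsuleNet members."""
    with torch.no_grad():
        probs = [torch.sigmoid(m(images).squeeze(1)) for m in members]
    return torch.stack(probs).mean(dim=0)
```

Averaging snapshots in weight space produces a single model per backbone, so inference cost stays at three forward passes (one per ensemble member) rather than one per snapshot.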