Classification Of Breast Cancer Histology Images Using ALEXNET

Wajahat Nawaz*, Sagheer Ahmed, Ali Tahir, Hassan Aqeel Khan

*Corresponding author for this work

    Research output: Chapter in Book / Conference proceeding › Conference publication


    Training a deep convolutional neural network from scratch requires a massive amount of data and significant computational power. However, collecting large amounts of data in the medical field is costly and difficult; this can be mitigated by techniques such as mirroring, rotating, and fine-tuning pre-trained neural networks. In this paper, we fine-tune a deep convolutional neural network (AlexNet) by modifying the input layer and inserting convolutional and fully connected layers. Experimental results show that our method achieves patch-wise and image-wise accuracies of 75.73% and 81.25%, respectively, on the validation set, and an image-wise accuracy of 57% on the ICIAR-2018 breast cancer challenge hidden test set.
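    The fine-tuning approach described above — augmenting histology patches by mirroring and rotation, then adapting a pre-trained AlexNet to the four ICIAR-2018 classes (normal, benign, in situ, invasive) — can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' exact configuration; the layer choices and augmentation parameters are assumptions.

    ```python
    # Hedged sketch: fine-tuning AlexNet for 4-class histology classification.
    # Layer and augmentation choices are illustrative, not the paper's exact setup.
    import torch
    import torch.nn as nn
    from torchvision import models, transforms

    NUM_CLASSES = 4  # normal, benign, in situ, invasive (ICIAR-2018)

    # Data augmentation via mirroring and rotation, as mentioned in the abstract.
    augment = transforms.Compose([
        transforms.RandomHorizontalFlip(),
        transforms.RandomRotation(degrees=90),
    ])

    # weights=None avoids a network download here; in practice one would start
    # from ImageNet-pretrained weights (weights="IMAGENET1K_V1") before fine-tuning.
    model = models.alexnet(weights=None)

    # Replace the final fully connected layer to output the 4 histology classes.
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_CLASSES)

    # Optionally freeze the early convolutional features so that only the
    # replaced layers are updated during fine-tuning.
    for p in model.features.parameters():
        p.requires_grad = False

    # Forward pass on a dummy patch batch (AlexNet expects 224x224 RGB input).
    x = torch.randn(2, 3, 224, 224)
    logits = model(x)
    print(logits.shape)  # torch.Size([2, 4])
    ```

    At inference time, patch-wise predictions over a whole slide image would be aggregated (e.g. by majority vote) to produce the image-wise label.
    
    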

    Original language: English
    Title of host publication: Image Analysis and Recognition - 15th International Conference, ICIAR 2018, Proceedings
    Editors: Bart ter Haar Romeny, Fakhri Karray, Aurelio Campilho
    Number of pages: 8
    ISBN (Print): 9783319929996
    Publication status: Published - 6 Jun 2018
    Event: 15th International Conference on Image Analysis and Recognition, ICIAR 2018 - Povoa de Varzim, Portugal
    Duration: 27 Jun 2018 - 29 Jun 2018

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 10882 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349


    Conference: 15th International Conference on Image Analysis and Recognition, ICIAR 2018
    City: Povoa de Varzim


    • Carcinoma cancer
    • Convolutional neural network
    • Deep learning
    • Pathologists
    • Transfer learning

