Face-based Gender Classification Using Deep Learning Model

Buraq Abed Ruda Hassan
Faten Abed Ali Dawood

Abstract

Gender classification is a critical task in computer vision, with substantial importance in domains such as surveillance, marketing, and human-computer interaction. In this work, the proposed face gender classification model consists of three main phases. The first phase applies the Viola-Jones algorithm to detect facial images, which involves four steps: 1) Haar-like features, 2) integral image, 3) AdaBoost learning, and 4) cascade classifier. The second phase applies four pre-processing operations: cropping, resizing, converting the image from the RGB color space to the LAB color space, and enhancing the images using histogram equalization (HE) and contrast-limited adaptive histogram equalization (CLAHE). The final phase employs transfer learning, a powerful deep learning technique, with the Alex-Net architecture for face gender classification. The proposed model was evaluated on three datasets: the LFW dataset with 1,200 facial images, the Faces94 dataset with 400 facial images, and a family dataset with 400 facial images. Transfer learning with the Alex-Net model achieved an accuracy of 98.77% on the LFW dataset.
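As a concrete illustration of the first two phases, the sketch below chains Viola-Jones face detection with the described pre-processing steps (cropping, resizing, RGB-to-LAB conversion, and CLAHE enhancement) using OpenCV in Python. It is a minimal sketch, not the authors' implementation: OpenCV's bundled frontal-face Haar cascade stands in for the trained Viola-Jones detector, and the 227x227 target size is an assumption chosen to match Alex-Net's input.

import cv2

def detect_and_preprocess(image_path, target_size=(227, 227)):
    bgr = cv2.imread(image_path)  # OpenCV loads images in BGR order
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)

    # Phase 1: Viola-Jones detection (Haar-like features + cascade classifier).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    processed = []
    for (x, y, w, h) in faces:
        # Phase 2: crop the detected face region and resize it.
        face = cv2.resize(bgr[y:y + h, x:x + w], target_size)

        # Convert to the LAB color space and enhance the lightness channel.
        lab = cv2.cvtColor(face, cv2.COLOR_BGR2LAB)
        l_ch, a_ch, b_ch = cv2.split(lab)

        # CLAHE enhancement; cv2.equalizeHist(l_ch) would give the plain HE variant.
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        enhanced = cv2.merge((clahe.apply(l_ch), a_ch, b_ch))

        processed.append(cv2.cvtColor(enhanced, cv2.COLOR_LAB2BGR))
    return processed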


Furthermore, the model attained an accuracy of 100% on both the Faces94 and family datasets. These results underline the significance of the pre-processing techniques and of transfer learning with the Alex-Net model, both of which contribute to more accurate gender classification. The results obtained with the image contrast enhancement techniques HE and CLAHE were also compared, and CLAHE achieved higher facial classification accuracy than HE.
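A hedged sketch of the third phase is given below: torchvision's pretrained Alex-Net is reused and only its final 1000-class layer is replaced with a two-class (male/female) output, the standard transfer-learning recipe. The frozen feature extractor and the SGD hyper-parameters are illustrative assumptions, not the training settings reported in the paper.

import torch
import torch.nn as nn
from torchvision import models

def build_gender_classifier(freeze_features=True):
    # Load Alex-Net with ImageNet-pretrained weights (torchvision >= 0.13 API).
    model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

    if freeze_features:
        # Keep the pretrained convolutional features fixed; train only the head.
        for param in model.features.parameters():
            param.requires_grad = False

    # Replace the 1000-class ImageNet output layer with a 2-class layer.
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, 2)
    return model

model = build_gender_classifier()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3, momentum=0.9)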

Article Details

How to Cite
“Face-based Gender Classification Using Deep Learning Model” (2024) Journal of Engineering, 30(01), pp. 106–123. doi:10.31026/j.eng.2024.01.07.
Section
Articles

