Sikumbang, Warnia (2020) [Peer Review] CNN Modelling Untuk Deteksi Wajah Berbasis Gender Menggunakan Python. https://jurnal.pcr.ac.id/index.php/jkt/article/view/3679.
LP_JKT.pdf (232kB)
Abstract
Face detection is an application of biometrics that identifies physical features of the human face. Gender recognition digitizes this process, distinguishing female faces from male faces based on extracted features. Such a system can be applied to automatic surveillance and monitoring systems, to market segmentation based on demographic trends, and to restricting access to a room. This research uses a Convolutional Neural Network (CNN), a type of neural network well suited to image data and capable of recognizing objects in an image. The dataset used has 40 attribute annotations describing the female and male images. The face detection system is built with Python and Keras, an open-source machine learning library for neural networks developed to simplify building deep learning models. The system provides an accuracy analysis of gender detection so that it can be extended in more applied research. The number of images per class must be balanced to obtain good modeling performance; each model has separate training, validation, and test data folders, and an imbalanced image count can degrade the performance of the CNN model. The model is built using transfer learning from InceptionV3 and recognizes gender with an accuracy of 92.6%.
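The transfer-learning setup the abstract describes (an InceptionV3 base with a binary female/male classification head, trained with Keras) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual code: the input size, the 128-unit dense layer, the optimizer, and the label convention are all assumptions.

```python
# Hypothetical sketch of InceptionV3 transfer learning for binary
# gender classification, as described in the abstract.
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras import layers, models

def build_gender_model(input_shape=(150, 150, 3), weights="imagenet"):
    # Pretrained convolutional base; include_top=False drops the
    # original 1000-class ImageNet classifier.
    base = InceptionV3(weights=weights, include_top=False,
                       input_shape=input_shape)
    base.trainable = False  # freeze pretrained features for transfer learning

    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(128, activation="relu"),       # size is an assumption
        layers.Dense(1, activation="sigmoid"),      # single female/male output
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

In this pattern the balanced training, validation, and test folders mentioned in the abstract would each be fed to the model via a directory-based image loader, with one subfolder per class.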
| Item Type: | Other |
|---|---|
| Subjects: | H Social Sciences > HA Statistics |
| Divisions: | Jurusan Teknologi Informasi > Program Studi Sistem Informasi |
| Depositing User: | Warnia Nengsih |
| Date Deposited: | 30 Jun 2023 09:46 |
| Last Modified: | 30 Jun 2023 09:46 |
| URI: | http://eprints.pcr.ac.id/id/eprint/131 |