TY - JOUR
T1 - Transfer learning with multiple pre-trained network for fundus classification
AU - Setiawan, Wahyudi
AU - Utoyo, Moh Imam
AU - Rulaningtyas, Riries
N1 - Publisher Copyright:
© 2019 Universitas Ahmad Dahlan.
PY - 2020
Y1 - 2020
N2 - Transfer learning (TL) is a technique for reusing and modifying a pre-trained network. It reuses the feature extraction layers of a pre-trained network, so the target domain obtains feature knowledge from the source domain, while the classification layers are modified so the target domain can perform a new task. In this article, the target domain is fundus image classification into two classes: normal and neovascularization. The data consist of 100 patches, split randomly into training and validation sets at a ratio of 70:30. The TL steps are: load a pre-trained network, replace its final layers, train the network, and assess network accuracy. First, the pre-trained network is a layer configuration of a convolutional neural network architecture; the pre-trained networks used are AlexNet, VGG16, VGG19, ResNet50, ResNet101, GoogLeNet, Inception-V3, InceptionResNetV2, and SqueezeNet. Second, replacing the final layers means replacing the last three layers, i.e., the fully connected, softmax, and output layers, with a new fully connected layer sized to the number of target classes, followed by a softmax and an output layer that match the target domain. Third, the networks are trained to produce optimal accuracy using gradient descent optimization. Fourth, network accuracy is assessed. The experimental results show a testing accuracy between 80% and 100%.
AB - Transfer learning (TL) is a technique for reusing and modifying a pre-trained network. It reuses the feature extraction layers of a pre-trained network, so the target domain obtains feature knowledge from the source domain, while the classification layers are modified so the target domain can perform a new task. In this article, the target domain is fundus image classification into two classes: normal and neovascularization. The data consist of 100 patches, split randomly into training and validation sets at a ratio of 70:30. The TL steps are: load a pre-trained network, replace its final layers, train the network, and assess network accuracy. First, the pre-trained network is a layer configuration of a convolutional neural network architecture; the pre-trained networks used are AlexNet, VGG16, VGG19, ResNet50, ResNet101, GoogLeNet, Inception-V3, InceptionResNetV2, and SqueezeNet. Second, replacing the final layers means replacing the last three layers, i.e., the fully connected, softmax, and output layers, with a new fully connected layer sized to the number of target classes, followed by a softmax and an output layer that match the target domain. Third, the networks are trained to produce optimal accuracy using gradient descent optimization. Fourth, network accuracy is assessed. The experimental results show a testing accuracy between 80% and 100%.
KW - Classification
KW - Convolutional neural network
KW - Multiple pre-trained network
KW - Neovascularization
KW - Transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85084259849&partnerID=8YFLogxK
U2 - 10.12928/TELKOMNIKA.v18i3.14868
DO - 10.12928/TELKOMNIKA.v18i3.14868
M3 - Article
AN - SCOPUS:85084259849
SN - 1693-6930
VL - 18
SP - 1382
EP - 1388
JO - Telkomnika (Telecommunication Computing Electronics and Control)
JF - Telkomnika (Telecommunication Computing Electronics and Control)
IS - 3
ER -