Recognize Flowers species using Transfer Learning

Ajinkya Jawale


Pune, Maharashtra


Retraining a classifier pretrained on the ImageNet dataset, using TensorFlow 2.0, to detect flower species.

Project status: Under Development

RealSense™, Artificial Intelligence

Groups
Student Developers for AI

Intel Technologies
Intel Python, Other


Overview / Usage

Transfer learning is a machine learning approach in which a model pretrained on a large dataset is reused for a new task. The steps are given below:

  1. Set up the input pipeline and download the dataset
  2. Rescale the images
  3. Create the base model from a pre-trained convnet
  4. Feature extraction
    1. Add a classification head
    2. Compile the model
    3. Train the model
  5. Fine-tuning
  6. Un-freeze the top layers
  7. Compile the model

The code is here:

https://github.com/ajinkyajawale14/Flower_tflite/blob/master/flowers_tf_lite.ipynb
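The feature-extraction part of the pipeline (steps 1-4) can be sketched in TensorFlow 2.x as follows. This is a minimal sketch, not the notebook itself: the image size, batch size, and validation split are assumed values.

```python
import tensorflow as tf

IMG_SIZE = 160    # assumed input size
BATCH_SIZE = 32   # assumed batch size

# 1. Set up the input pipeline: download and untar the flowers dataset.
data_dir = tf.keras.utils.get_file(
    "flower_photos",
    "https://storage.googleapis.com/download.tensorflow.org/"
    "example_images/flower_photos.tgz",
    untar=True)

# 2. Rescale the images to [0, 1] and hold out a validation split.
datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rescale=1.0 / 255, validation_split=0.2)
train_gen = datagen.flow_from_directory(
    data_dir, target_size=(IMG_SIZE, IMG_SIZE),
    batch_size=BATCH_SIZE, subset="training")

# 3. Create the base model from a pre-trained convnet
#    (MobileNet V2 trained on ImageNet), frozen for feature extraction.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(IMG_SIZE, IMG_SIZE, 3),
    include_top=False, weights="imagenet")
base_model.trainable = False

# 4. Feature extraction: add a classification head, compile, train.
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(train_gen.num_classes, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_gen, epochs=10)  # training itself is left out here
```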

Methodology / Approach

Fine tuning

In the feature extraction experiment, you were only training a few layers on top of a MobileNet V2 base model. The weights of the pre-trained network were not updated during training.

One way to increase performance even further is to train (or “fine-tune”) the weights of the top layers of the pre-trained model alongside the training of the classifier you added. The training process forces the weights to be tuned from generic feature maps to features associated specifically with your dataset.

Un-freeze the top layers of the model

All you need to do is unfreeze the base_model and set the bottom layers to be un-trainable. Then recompile the model (necessary for these changes to take effect) and resume training.

Compile the model

Compile the model using a much lower learning rate.
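The un-freezing and recompiling steps can be sketched like this. The layer cut-off of 100 and the 1e-5 learning rate are illustrative assumptions, and `weights=None` only keeps the sketch light; the real model would be the ImageNet-pretrained one coming out of the feature-extraction phase.

```python
import tensorflow as tf

# Base model as in feature extraction (weights=None here just to keep the
# sketch self-contained; the notebook uses the ImageNet-pretrained weights).
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights=None)

# Un-freeze the whole base model, then re-freeze everything below an
# (assumed) cut-off so only the top layers get fine-tuned.
base_model.trainable = True
fine_tune_at = 100
for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False

model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),  # 5 flower classes
])

# Recompile with a much lower learning rate (assumed 1e-5) so fine-tuning
# does not destroy the pretrained features, then resume model.fit(...).
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```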

Convert to TFLite

Save the model using tf.saved_model.save, then convert the saved model to a TFLite-compatible format.
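A sketch of the conversion step, using a tiny stand-in model so the example runs on its own; in the notebook the converted model is the fine-tuned MobileNet V2, and the paths here are assumptions.

```python
import tensorflow as tf

# Tiny stand-in model; in the notebook this is the fine-tuned flower model.
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(4,)),
    tf.keras.layers.Dense(5, activation="softmax"),
])

# Save with tf.saved_model.save, then convert the SavedModel to TFLite.
saved_model_dir = "saved_model"  # assumed output directory
tf.saved_model.save(model, saved_model_dir)

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()

# Write the flat buffer to disk for deployment.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```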

Let’s take a look at the learning curves of the training and validation accuracy/loss when fine-tuning the last few layers of the MobileNet V2 base model and training the classifier on top of it. The validation loss is much higher than the training loss, so there may be some overfitting, partly because the new training set is relatively small and similar to the datasets MobileNet V2 was originally trained on.

Summary:
  • Using a pre-trained model for feature extraction: When working with a small dataset, it is common to take advantage of features learned by a model trained on a larger dataset in the same domain. This is done by instantiating the pre-trained model and adding a fully-connected classifier on top. The pre-trained model is “frozen” and only the weights of the classifier get updated during training. In this case, the convolutional base extracted all the features associated with each image and you just trained a classifier that determines the image class given that set of extracted features.
  • Fine-tuning a pre-trained model: To further improve performance, one might want to repurpose the top-level layers of the pre-trained models to the new dataset via fine-tuning. In this case, you tuned your weights such that your model learned high-level features specific to the dataset. This technique is usually recommended when the training dataset is large and very similar to the original dataset that the pre-trained model was trained on.

So, what then? The TFLite model is, in effect, a compressed version of the original trained model (the file size was just 9.1 MB!). This TFLite model can then be integrated into Android, iOS, or various IoT devices such as the Raspberry Pi; web application integration can be done using TensorFlow.js.
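On-device inference with the converted model can be sketched with the TFLite interpreter. The stand-in model below is converted in memory only so the example runs on its own; on a device you would instead pass `model_path="model.tflite"` to the interpreter.

```python
import numpy as np
import tensorflow as tf

# Stand-in model converted in memory; a real deployment would load the
# 9.1 MB flower model from a .tflite file instead.
keras_model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(4,)),
    tf.keras.layers.Dense(5, activation="softmax"),
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(keras_model).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed one dummy input shaped like the model expects and read the scores.
x = np.zeros(inp["shape"], dtype=np.float32)
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])
```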

Technologies Used

Technology Stack

  1. TensorFlow 2.0

  2. Keras

  3. Python 2.7 and above

Base Model:

MobileNet V2 base model

Dataset:

TensorFlow flowers dataset (the base model is pretrained on the ImageNet dataset):

https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz

Repository

https://github.com/ajinkyajawale14/Flower_tflite/blob/master/flowers_tf_lite.ipynb
