Intel OpenVINO on Google Colab
Ujjayanta Bhaumik
Project status: Published/In Market
Internet of Things, Artificial Intelligence
Intel Technologies
OpenVINO
Overview / Usage
The OpenVINO distribution from Intel makes **AI inferencing** fast and easy. Inference is the stage where a trained model applies what it has learned to deduce information from new, unseen data. The OpenVINO toolkit makes inference and deployment across a variety of hardware architectures straightforward. This tutorial will show you how to test the power of OpenVINO on Google Colab.
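To make the training-versus-inference distinction concrete, here is a tiny illustration in plain NumPy (not OpenVINO): "training" fits the model's parameters once on known data, and "inference" applies the frozen parameters to new inputs. The data and the linear model are invented for illustration.

```python
# Minimal illustration of training vs. inference (plain NumPy, not OpenVINO).
import numpy as np

# Training: fit y = w*x + b on known data via least squares
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])            # underlying rule: y = 2x + 1
A = np.stack([x, np.ones_like(x)], axis=1)
w, b = np.linalg.lstsq(A, y, rcond=None)[0]

# Inference: the frozen model predicts for inputs it has never seen
def predict(x_new: float) -> float:
    return w * x_new + b

print(round(predict(10.0)))                   # prints 21
```

OpenVINO is concerned only with the second half of this picture: it takes an already trained model and makes the `predict` step fast and portable.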
Methodology / Approach
In general, the steps for using OpenVINO are:
- Use a pretrained model from the Intel Open Model Zoo, or train your own (build)
- Convert the model with the Model Optimizer to generate xml and bin files (optimize)
- Deploy with the Inference Engine in your preferred environment (deploy)
I will use Google Colab to show how to set up inferencing with xml and bin files generated from pretrained models in the Intel Open Model Zoo. I have also shown how to generate these xml and bin files.
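The inference step described above can be sketched as follows. This is a minimal sketch using the `openvino.runtime` Python API; the `openvino` package is assumed to be installed (e.g. `pip install openvino` in a Colab cell), and the model file name and input shape are hypothetical, standing in for whichever Open Model Zoo model you download.

```python
# Minimal sketch: run inference from IR (.xml + .bin) files with OpenVINO.
# Assumes `pip install openvino`; model name and input shape are hypothetical.
import numpy as np

try:
    from openvino.runtime import Core
except ImportError:          # openvino not installed; keep the sketch importable
    Core = None

def run_inference(xml_path: str, batch: np.ndarray) -> np.ndarray:
    """Load an IR model and run a single inference request on CPU."""
    core = Core()
    model = core.read_model(xml_path)   # the matching .bin is found automatically
    compiled = core.compile_model(model, "CPU")
    result = compiled([batch])          # results are keyed by output ports
    return result[compiled.output(0)]

# Example usage (hypothetical model and shape):
# output = run_inference("face-detection-0200.xml",
#                        np.zeros((1, 3, 256, 256), dtype=np.float32))
```

Swapping `"CPU"` for another device string (e.g. `"GPU"`) is how OpenVINO targets different architectures without changing the rest of the code; on Colab the free runtime exposes only the CPU plugin.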
Technologies Used
Google Colab
OpenVINO
Python
GitHub
Repository
https://github.com/jojo96/intel-openvino-colab