APP 03: REAL-TIME IN-BROWSER INFERENCE

The article explaining how to use a customized TensorFlow Lite model for in-browser inference is here.


IN-BROWSER ML INFERENCE

I developed this app for image classification using TFLite and JavaScript. Instead of EfficientNet, the default supported model, I used a PyTorch model. The JS code loads the quantized .tflite model in the browser from a remote repository.
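The browser-side flow can be sketched roughly as below. This is a minimal illustration, assuming the tfjs and tfjs-tflite libraries are loaded via CDN script tags (which expose the globals `tf` and `tflite`); `MODEL_URL` and the 224×224 input size are placeholders that depend on your own model:

```javascript
// Assumed setup in the HTML page (not part of this script):
//   <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs/dist/tf.min.js"></script>
//   <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-tflite/dist/tf-tflite.min.js"></script>

// Placeholder URL; point it at your own hosted quantized .tflite file.
const MODEL_URL = 'https://example.com/models/classifier.tflite';

// Pure helper: pick the k highest-scoring class indices from a flat score array.
function topK(scores, k = 3) {
  return scores
    .map((score, index) => ({ score, index }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}

async function classify(imageElement) {
  // Fetch and initialize the quantized TFLite model in the browser.
  const model = await tflite.loadTFLiteModel(MODEL_URL);

  // Preprocess: read pixels, resize to the model's expected input,
  // scale to [0, 1], and add a batch dimension.
  const input = tf.tidy(() =>
    tf.browser.fromPixels(imageElement)
      .resizeBilinear([224, 224])   // input size depends on your model
      .div(255)
      .expandDims(0));

  // Run inference and read the class scores back to plain JS.
  const output = model.predict(input);
  const scores = Array.from(await output.data());
  tf.dispose([input, output]);
  return topK(scores);
}
```

The preprocessing step (resize, scale, batch) must match whatever the model was trained with; a quantized model may also expect integer inputs, in which case the `div(255)` normalization would be dropped.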

This solution allows anyone to train a customized model in TensorFlow or PyTorch, convert it to .tflite, and deploy it in the web browser.