Handwritten Digit Recognition: Draw → Predict in the Browser
This demo lets you draw a digit on a canvas and runs inference entirely in the browser to predict which digit (0–9) you drew. It’s built around MNIST, the classic dataset of handwritten digits used as an early benchmark for image classification.
Why it matters
While MNIST is “intro ML,” it’s still a great way to demonstrate the complete loop from user input → preprocessing → model inference → UX feedback. In interviews, this is a compact artifact that shows practical engineering instincts: packaging a model, making it interactive, and shipping it.
End-to-end flow
- Canvas capture: user draws with the mouse/touch. The raw pixels are captured from the canvas.
- Preprocessing: normalize size/contrast and format the pixels the way the model expects (see the preprocessing sketch after this list).
- Inference: run the trained Keras/TensorFlow.js model client-side to produce class probabilities (see the inference sketch below).
- UX feedback: display the top prediction and (optionally) confidence distribution.
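A minimal preprocessing sketch in TypeScript, assuming TensorFlow.js (`@tensorflow/tfjs`) and an MNIST-style model that expects a `[1, 28, 28, 1]` float input with bright strokes on a dark background; the helper name `canvasToTensor` is illustrative, not part of the demo's actual code:

```ts
import * as tf from '@tensorflow/tfjs';

// Convert the drawing canvas into the [1, 28, 28, 1] float tensor an
// MNIST-style model typically expects: grayscale, 28x28, values in [0, 1],
// inverted so the stroke is bright on a dark background.
export function canvasToTensor(canvas: HTMLCanvasElement): tf.Tensor4D {
  return tf.tidy(() => {
    const gray = tf.browser.fromPixels(canvas, 1) // [H, W, 1], single channel
      .toFloat()
      .div(255);                                  // scale 0..255 -> 0..1
    const small = tf.image.resizeBilinear(gray, [28, 28]); // downsample to MNIST size
    const inverted = tf.scalar(1).sub(small);     // dark-on-light -> light-on-dark
    return inverted.expandDims(0) as tf.Tensor4D; // add batch dimension
  });
}
```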
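And an inference sketch wired to that helper. The model path `model/model.json`, the `./preprocess` import, and the `predictDigit` name are assumptions for illustration; a Keras model exported with `tensorflowjs_converter` loads via `tf.loadLayersModel`:

```ts
import * as tf from '@tensorflow/tfjs';
import { canvasToTensor } from './preprocess'; // the helper sketched above

// Hypothetical path to the converted model (model.json plus weight shards).
const MODEL_URL = 'model/model.json';

let model: tf.LayersModel | null = null;

export async function predictDigit(canvas: HTMLCanvasElement) {
  if (!model) {
    model = await tf.loadLayersModel(MODEL_URL); // loaded once, then cached
  }
  const input = canvasToTensor(canvas);
  const output = model.predict(input) as tf.Tensor; // [1, 10] class scores
  const probs = Array.from(await output.data());    // Float32Array -> number[]
  input.dispose();
  output.dispose();

  const digit = probs.indexOf(Math.max(...probs));  // argmax = predicted class
  return { digit, probs };                          // probs can drive a confidence chart
}
```

The returned `probs` array is what the UX feedback step can render as a confidence distribution alongside the top prediction.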
Try it
Open the live demo and draw a few digits. Pay attention to what fails (sloppy strokes, off-center drawings) and how the UI guides you toward a correction.