Operational Analytics Surface

Designed for judges, operators, and field teams who need anomaly signals fast.

The platform turns synthetic meter history into a presentation-ready risk report. It keeps Hugging Face access on the server side, preserves a resilient mock fallback, and now ships as a Vercel-ready Next.js product instead of a rough internal demo shell.

Public-safe · Next.js on Vercel · Runtime: live + mock fallback
Operating model
Inference chain
01 · Browser demo flow
02 · Next.js route handler
03 · HF Space or mock scorer

Credentials never reach the browser. The UI talks only to internal route handlers under /api.
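The browser side of that chain can be sketched as a small helper that only ever targets the internal route. The route path (`/api/score`) and row fields are illustrative assumptions, not the app's actual names.

```typescript
// Browser-side sketch: the UI builds a request against an internal
// /api/score route handler and attaches no credentials of any kind.
// Route path and payload shape are assumptions for illustration.
export interface ApiCall {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

export function buildScoreCall(rows: Array<Record<string, unknown>>): ApiCall {
  return {
    url: "/api/score", // internal route handler; never the HF Space directly
    method: "POST",
    headers: { "Content-Type": "application/json" }, // no Authorization header
    body: JSON.stringify({ rows }),
  };
}
```

A component would pass this straight to `fetch(call.url, call)`; any Hugging Face token stays on the server side of the route handler.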

Deployment target: Vercel production
Inference surface: server-side only
Demo continuity: mock fallback enabled
Web layer

Committee-friendly demo

Clear KPI framing, sharper product typography, and downloadable scored CSV for presentation-ready walkthroughs.
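The downloadable scored CSV could be produced by a small pure serializer like the sketch below. Column names and row fields are assumptions; the quoting rule follows standard CSV conventions.

```typescript
// Sketch of the scored-CSV export step. Column names are assumptions.
export interface ScoredRow {
  meterId: string;
  reading: number;
  riskScore: number;
}

// Quote fields so meter ids containing commas or quotes stay intact.
function csvField(value: string | number): string {
  const s = String(value);
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
}

export function toScoredCsv(rows: ScoredRow[]): string {
  const header = "meter_id,reading,risk_score";
  const lines = rows.map((r) =>
    [csvField(r.meterId), csvField(r.reading), csvField(r.riskScore.toFixed(3))].join(","),
  );
  return [header, ...lines].join("\n");
}
```

The resulting string can be handed to the browser as a `text/csv` download for the walkthrough.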

Inference security

Server-side HF access

Hugging Face Space calls happen inside Next.js route handlers. Optional tokens stay in server-only environment variables.
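A minimal sketch of the server-side proxy step, assuming env var names like `HF_SPACE_URL` and `HF_TOKEN` (both hypothetical); the token is read only on the server and attached as a bearer header when present.

```typescript
// Sketch of the upstream call a Next.js route handler would make to a
// Hugging Face Space. Env var names and payload shape are assumptions.
export interface UpstreamRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

// Build the upstream request; the optional token never leaves server memory.
export function buildHfRequest(
  rows: Array<Record<string, unknown>>,
  spaceUrl: string,
  token?: string, // e.g. process.env.HF_TOKEN inside the route handler
): UpstreamRequest {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (token) {
    headers["Authorization"] = `Bearer ${token}`;
  }
  return { url: spaceUrl, method: "POST", headers, body: JSON.stringify({ rows }) };
}
```

Inside the handler this reduces to `fetch(r.url, r)`, with the response relayed back to the browser untouched.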

Operational mode

Resilient fallback

If the live model is unavailable, the app can fall back to deterministic mock scoring so the demo does not stall.
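One way to make that fallback deterministic is to hash each row into a stable pseudo-score, so repeated demo runs show identical numbers. The field names and the FNV-1a choice below are illustrative assumptions, not the app's actual scorer.

```typescript
// Deterministic mock scorer used when the live scorer is unavailable.
// Field names (meterId, reading) are illustrative assumptions.
export interface MeterRow {
  meterId: string;
  reading: number;
}

export interface ScoredRow extends MeterRow {
  riskScore: number;
}

// FNV-1a string hash: stable across runs, so demo scores never jitter.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash >>> 0;
}

// Map the 32-bit hash onto [0, 1) so the same row always gets the same score.
export function mockScore(row: MeterRow): ScoredRow {
  return { ...row, riskScore: fnv1a(`${row.meterId}:${row.reading}`) / 0x100000000 };
}

// Prefer the live scorer; fall back to the deterministic mock on any failure.
export function scoreWithFallback(
  rows: MeterRow[],
  liveScorer?: (rows: MeterRow[]) => ScoredRow[],
): ScoredRow[] {
  try {
    if (liveScorer) return liveScorer(rows);
  } catch {
    // Live model unreachable or erroring; continue with the mock path.
  }
  return rows.map(mockScore);
}
```

Because the score depends only on the row's contents, a stalled live model never changes what the audience sees between runs.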

What stays in repo

Lean enough for deployment, rich enough for review.

  • Next.js web app as the main deploy target
  • Synthetic sample CSV for demo runs
  • Legacy Go and Python folders kept as reference, outside the Vercel runtime path
What this setup avoids

Secure by default for a public portfolio setting.

  • No real customer data
  • No browser-side Hugging Face credentials
  • No dependency on in-memory server jobs for production demos