
Feat/latency profiling gui evaluator#539

Closed
AdityaX18 wants to merge 2 commits into JdeRobot:master from AdityaX18:feat/latency-profiling-gui-evaluator
Closed


Conversation


@AdityaX18 commented Mar 30, 2026

Summary

Fixes #538

Problem

tabs/evaluator.py runs inference with no timing instrumentation.
perceptionmetrics/cli/computational_cost.py already supports latency metrics —
the GUI was the only entry point missing this functionality.

Changes

| File | Change |
| --- | --- |
| `perceptionmetrics/utils/latency_profiler.py` | Added `LatencyReport` dataclass |
| `tabs/evaluator.py` | Updated: instrumented inference and added metric rendering |

What the user sees

After evaluation, a new latency metrics row appears in the GUI:

| Mean Latency | FPS | P95 Latency | P99 Latency |
| --- | --- | --- | --- |
| 34.2 ms | 29.2 | 41.1 ms | 58.3 ms |

Implementation details

  • Uses time.perf_counter() for high-resolution timing
  • Aggregates metrics using statistics
  • No external dependencies added

Impact

  • Adds real-time inference performance visibility
  • Aligns GUI capabilities with CLI tools
  • Provides critical deployment signal (latency + FPS)

Note: I am a GSoC 2026 applicant for JdeRobot. This change directly supports
the robustness evaluation module in my proposal — latency is a key deployment
signal alongside mAP for real-world robotics systems.

Closes #ISSUE_NUMBER_HERE

perceptionmetrics/cli/computational_cost.py already provided latency
metrics for CLI users, but the GUI evaluator showed no timing data.

Changes:
- perceptionmetrics/utils/latency_profiler.py: new LatencyReport class
  recording per-image wall-clock time via time.perf_counter(). Computes
  mean, median, P95, P99, FPS using stdlib only — no new dependencies.
- tabs/evaluator.py: wrap inference call with perf_counter(), render
  4-column Streamlit metric row (mean ms, FPS, P95, P99) after evaluation.

Consistent with existing get_computational_cost() approach in the CLI.
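The wiring described above could be sketched roughly as follows. The `predict` and `images` names are placeholders for the evaluator's real model call and dataset, and in the actual Streamlit tab the formatted values would presumably be rendered with `st.columns(4)` and `st.metric` rather than returned as strings:

```python
import statistics
import time


def timed_inference(predict, images):
    """Run predict over images, timing each call with perf_counter.

    predict/images are stand-ins for the evaluator's real model call
    and dataset. Returns (predictions, per-image latencies in ms).
    """
    preds, latencies_ms = [], []
    for img in images:
        start = time.perf_counter()
        preds.append(predict(img))
        latencies_ms.append((time.perf_counter() - start) * 1000.0)
    return preds, latencies_ms


def metric_row(latencies_ms):
    """Format the four values for the GUI's metric row.

    In the Streamlit tab these would feed st.columns(4) / st.metric;
    here we just return the formatted strings.
    """
    mean_ms = statistics.mean(latencies_ms)
    q = statistics.quantiles(latencies_ms, n=100)  # needs >= 2 samples
    return {
        "Mean Latency": f"{mean_ms:.1f} ms",
        "FPS": f"{1000.0 / mean_ms:.1f}",
        "P95 Latency": f"{q[94]:.1f} ms",
        "P99 Latency": f"{q[98]:.1f} ms",
    }
```

Keeping the timing wrapper separate from the rendering mirrors the CLI's `get_computational_cost()` split between measurement and reporting.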
@dpascualhe closed this Apr 17, 2026


Development

Successfully merging this pull request may close these issues.

feat: expose inference latency and FPS metrics in Streamlit GUI evaluator tab
