UXit Documentation

Analytics

Your Analytics Dashboard

The Analytics section displays your evaluation results with three main components:

  • Top Metrics: Quick snapshot showing Latest Score %, Pass/Total count, Total Evaluations, and Trend (improving ↑ or declining ↓)
  • Category Breakdown Chart: Bar chart showing each category as a percentage of passing criteria
  • Detailed Results: List of every question with Pass/Fail status, Category tag, Question code, and Notes
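To make the arithmetic behind these numbers concrete, here is a minimal sketch of how a Latest Score % and a Category Breakdown could be derived from a list of evaluation results. The record shape and field names (`question`, `category`, `passed`) are assumptions for illustration, not the actual UXit data model.

```python
# Hypothetical evaluation records; field names are assumptions,
# not the actual UXit data model.
results = [
    {"question": "A11Y-01", "category": "Accessibility", "passed": True},
    {"question": "A11Y-02", "category": "Accessibility", "passed": False},
    {"question": "PERF-01", "category": "Performance", "passed": True},
    {"question": "USE-01", "category": "Usability", "passed": False},
]

# Top Metrics: Latest Score % and Pass/Total count
passes = sum(r["passed"] for r in results)
total = len(results)
latest_score = round(100 * passes / total)

# Category Breakdown: percentage of passing criteria per category
tallies = {}
for r in results:
    cat = tallies.setdefault(r["category"], {"pass": 0, "total": 0})
    cat["total"] += 1
    cat["pass"] += r["passed"]
breakdown = {c: round(100 * v["pass"] / v["total"]) for c, v in tallies.items()}

print(latest_score)  # 50
print(breakdown)     # {'Accessibility': 50, 'Performance': 100, 'Usability': 0}
```

The same pass/total tally drives both views: the top metric aggregates across all questions, while the chart groups the tally by category.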

Comparing Flow Versions

Analytics becomes most useful when you evaluate the same flow multiple times against consistent guidelines. Run v1, v2, and v3 evaluations and watch the Trend indicator and Category Breakdown chart evolve as you improve. The Top Metrics update with each new evaluation, showing your progress over time.

Look for patterns across versions:

  • Improving categories: Scores going up = your design fixes are working
  • Plateaued categories: Flat scores = already optimized or not a priority
  • Consistent failures: Same questions failing across versions = priority areas for redesign

Use the Detailed Results section to find which specific questions are causing low category scores, then focus your design updates there.
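Spotting consistent failures is just a set intersection over each version's failing questions. A quick sketch, using made-up question codes rather than real UXit identifiers:

```python
# Hypothetical per-version failure lists; question codes are made up.
failures_by_version = {
    "v1": {"USE-01", "USE-02", "A11Y-03"},
    "v2": {"USE-02", "PERF-04"},
    "v3": {"USE-02", "PERF-05"},
}

# Questions failing in every version are priority redesign areas.
consistent_failures = set.intersection(*failures_by_version.values())
print(consistent_failures)  # {'USE-02'}
```

A question that survives three redesigns unfixed (here, the hypothetical USE-02) is exactly the kind of signal the Detailed Results section surfaces.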

Tips & Best Practices

  • Use consistent guidelines: Multiple versions evaluated against the same guidelines = reliable trends. Changing criteria breaks the data.
  • Regular versioning: Establish v1, v2, v3 cycles. Add notes explaining what changed between versions so you can correlate design updates to score improvements.
  • Document failures: Add notes during evaluation explaining why questions failed. This context helps identify what needs fixing in the next version.
  • Review Methodology: Low category scores? Check the Methodology page to understand what that category measures, then address those specific criteria.

Worked Example: Checkout Flow Analytics

Suppose you're tracking your checkout flow over three iterations against your ecommerce guidelines:

Version 1: Starting Point (D Grade)

Overall Score: 65%
Pass: 13 / 20
Categories:
  Accessibility: 60%
  Performance: 70%
  Security: 75%
  Usability: 50% ← Problem area

Insight: Usability is your biggest weakness at 50%. You're strong in Security (75%) and Performance (70%).

Version 2: Usability Focus (B Grade)

After redesigning the checkout form for clarity and adding better error messages, plus improving color contrast and ARIA labels:

Overall Score: 78%
Pass: 16 / 20
Categories:
  Accessibility: 80% ↑ +20
  Performance: 70%
  Security: 75%
  Usability: 85% ↑ +35

Insight: Your targeted fixes worked—Usability jumped 35 points and Accessibility jumped 20 points. You moved from D to B grade. Performance and Security stayed stable, confirming they don't need work yet.

Version 3: Performance Focus (B+ Grade)

Focused on optimizing images, reducing JavaScript, and implementing lazy loading:

Overall Score: 88%
Pass: 18 / 20
Categories:
  Accessibility: 80%
  Performance: 95% ↑ +25
  Security: 75%
  Usability: 85%

Insight: Performance jumped 25 points. Overall score is now 88% (B+ grade). The two remaining failures are edge cases. Your focused approach proves each iteration is moving the needle.

Reading This Data

  • Track impact: See which design changes actually improved scores (Usability focus worked; Performance focus worked)
  • Prioritize next: Only Security is below 80%. Is this a risk worth fixing, or acceptable?
  • Report progress: Show stakeholders your 65% → 88% improvement across three iterations
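The readings above boil down to per-category deltas between versions. A small sketch using the worked-example scores (the `deltas` helper is illustrative, not a UXit feature):

```python
# Category scores from the three checkout-flow evaluations above.
versions = {
    "v1": {"Accessibility": 60, "Performance": 70, "Security": 75, "Usability": 50},
    "v2": {"Accessibility": 80, "Performance": 70, "Security": 75, "Usability": 85},
    "v3": {"Accessibility": 80, "Performance": 95, "Security": 75, "Usability": 85},
}

def deltas(old, new):
    """Point change per category between two evaluations."""
    return {c: new[c] - old[c] for c in new}

print(deltas(versions["v1"], versions["v2"]))
# {'Accessibility': 20, 'Performance': 0, 'Security': 0, 'Usability': 35}

# Plateaued categories: no movement across all three versions.
flat = [c for c, d in deltas(versions["v1"], versions["v3"]).items() if d == 0]
print(flat)  # ['Security']: the only plateaued category, and the only one below 80%
```

Positive deltas confirm which focus areas paid off; a zero delta across every version (Security here) flags the prioritization question from the list above.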
