
Optimal Insights

Last updated on Nov 18, 2025

Get a complete view of how your engineering team ships – from delivery speed to AI-assisted development.
Insights connects directly to GitHub, analyzes your real activity data, and turns it into actionable metrics so you can spot friction, measure progress, and ship faster.


🚀 PR Cycle Time

Measure how fast your team moves from first commit to merge.

The PR Cycle Time dashboard shows where time is spent in your pull request workflow – from creation to review to merge. It helps you identify slowdowns and track improvements across teams, repos, or individual developers.

Key Metrics

  • Average Cycle Time – Total time from first commit to merge.

    • Excellent badge appears when under 24 hours.
  • Time to Open – From first commit to PR creation.

  • Time in Review – Duration the PR waits for review and approval.

  • Time to Merge – From PR open to merge into the target branch.

  • Merged to Staging – How many PRs reached staging environments.

Each metric updates automatically and compares against the previous period (↑ or ↓ indicators).
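As a rough mental model, these stages partition a PR's lifetime. The sketch below is illustrative only – the timestamps and field names are ours, not the product's API:

```python
from datetime import datetime, timedelta

def cycle_time_metrics(first_commit_at, opened_at, merged_at):
    """Split a PR's lifetime into the stages shown on the dashboard."""
    return {
        "time_to_open": opened_at - first_commit_at,   # first commit -> PR created
        "time_to_merge": merged_at - opened_at,        # PR open -> merge
        "cycle_time": merged_at - first_commit_at,     # end-to-end
        # "Excellent" badge threshold: cycle time under 24 hours
        "excellent": (merged_at - first_commit_at) < timedelta(hours=24),
    }

m = cycle_time_metrics(
    first_commit_at=datetime(2025, 11, 17, 9, 0),
    opened_at=datetime(2025, 11, 17, 14, 30),
    merged_at=datetime(2025, 11, 18, 8, 0),
)
# Cycle time here is 23 hours, so this PR would earn the Excellent badge.
```

Time in Review sits inside Time to Merge; the dashboard breaks it out separately so you can tell waiting-for-review apart from post-approval delays.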


Filters and Views

  • Teams vs. Individuals – Switch between team-wide or contributor-level insights.

  • Repositories – Focus on one or multiple repos (e.g. insights-app-be).

  • Date Range – Analyze over 7-, 14-, 30-, or 90-day windows to track trends.

The dashboard refreshes every few hours so you're always seeing current data.


Average Cycle Time Graph

The line chart visualizes how your team's cycle time changes over time:

  • The green line tracks daily performance.

  • The dotted line shows the benchmark from the previous period.

Use it to spot delivery spikes, bottlenecks, or gradual process improvements.


Pull Request Table

Each row shows a PR with details like:

  • Author

  • Reworks %

  • Check Failure Rate

  • PR Size (lines changed)

  • Time to Open / Merge / Review

Quick filters highlight what needs attention:

  • Longest Review Time – Stuck reviews.

  • Most Discussions – High comment volume.

  • Most Check Failures – CI/CD instability.

Column controls (⋮) let you toggle data points like Reviewer, Assignee, or Jira issues.


⚡ AI Insights

Instantly understand what changed – and why.

The AI Insights side panel automatically analyzes the page you're on (like PR Cycle Time or Activity) and provides an executive summary.

What It Shows

  • TL;DR Summary – Explains what shifted (e.g., longer open times but faster merges).

  • Activity Trends – Week-over-week % changes in each metric.

  • Notable Contributors – Who or what drove those changes (e.g., "optibot-dev[bot] active in reviews").

This helps engineering leads skip raw data analysis and jump straight to decision-making.

Pro tip: open it weekly before sprint reviews – it's like having a built-in engineering analyst.


🕓 Activity

Visualize your team's engineering rhythm.

The Activity view shows commits, reviews, merges, and comments across a timeline – giving you a visual heartbeat of your team's workflow.

How It Works

Each developer appears as a row; each bubble represents activity for that day.
The bigger the bubble, the more events occurred.

Color codes:

  • ⚪ Commit – Code pushed to the repository

  • 🔵 PR Open – New pull request created

  • 🟡 PR Review – Pull request reviewed

  • 🟢 Merge Commit – Pull request merged into the main branch

  • 🔴 Comment – Review comment added on a pull request

Hovering on any bubble reveals exact counts:

PRs reviewed: 1 • PRs opened: 2 • PRs merged: 1 • Comments: 1 • Commits: 2

Switch between Week and Month views or zoom to see team-level vs. individual activity.
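Conceptually, each bubble is just a per-developer, per-day event count – something like this sketch (the event records are made up for illustration):

```python
from collections import Counter, defaultdict

# Hypothetical raw events: (developer, day, kind)
events = [
    ("ana", "2025-11-17", "commit"),
    ("ana", "2025-11-17", "commit"),
    ("ana", "2025-11-17", "pr_open"),
    ("ben", "2025-11-17", "pr_review"),
]

# One bubble per (developer, day). Its size is the total event count;
# the per-kind counts drive the color breakdown and the hover tooltip.
bubbles = defaultdict(Counter)
for dev, day, kind in events:
    bubbles[(dev, day)][kind] += 1

ana_bubble = bubbles[("ana", "2025-11-17")]
bubble_size = sum(ana_bubble.values())  # 3 events -> a larger bubble
```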

Why It Matters

  • Spot midweek review peaks and Friday slowdowns.

  • Balance review load across engineers.

  • Detect burnout or under-utilization patterns early.


🤖 AI Adoption

Measure how AI tools impact your engineering workflow.

The AI Adoption dashboard tracks real usage and engagement with AI coding tools like GitHub Copilot and Claude 3.7.

Core Metrics

  • Overall Code Acceptance Rate – % of AI-suggested code merged.

  • Average Chat Interactions per Day – How often engineers engage AI assistants.

  • Top Performing Model by Acceptance – The AI model producing the most accepted code.

  • Highest Acceptance by Language – Which language benefits most (e.g., Python).

  • Average Daily Engagement Rate – % of developers actively using AI tools.

Below, the Copilot User Engagement chart shows daily active vs. engaged AI users over time.
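The exact formulas aren't published here, but a reasonable reading of the two rate metrics is the following (function names and inputs are our assumptions, not the product's API):

```python
def acceptance_rate(accepted_lines: int, suggested_lines: int) -> float:
    """Overall Code Acceptance Rate: share of AI-suggested code that was kept."""
    return 0.0 if suggested_lines == 0 else accepted_lines / suggested_lines

def engagement_rate(engaged_users: int, total_developers: int) -> float:
    """Average Daily Engagement Rate: share of developers actively using AI tools."""
    return 0.0 if total_developers == 0 else engaged_users / total_developers

# e.g. 30 of 120 suggested lines merged -> 25% acceptance
rate = acceptance_rate(30, 120)
```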

Why It Matters

This dashboard connects productivity metrics to AI usage – revealing whether tools like Copilot are actually helping teams ship faster or just increasing noise.


💡 Pro Tips

  • Enable AI Insights on every major dashboard for context-aware summaries.

  • Standardize GitHub labels to improve Distributions and Allocations data.

  • Compare time windows (7-day vs 30-day) to measure process improvements.

  • Encourage reviewers to spread the feedback load – imbalances are visible in the Activity view.

  • Track AI Adoption alongside PR Cycle Time to measure real ROI of Copilot.


🧭 Troubleshooting

  • No data yet? Wait a few hours after connecting GitHub; Insights backfills automatically.

  • Missing PRs or repos? Verify permissions include read + metadata access.

  • Empty AI Adoption view? Ensure AI telemetry is enabled in connected IDEs or extensions.