Blog – Future Processing

Measuring what really works: DORA metrics in the age of AI-driven delivery

Read about how the widespread use of AI tools is reshaping DevOps performance and prompting a fresh look at the metrics that matter.

Over 90% of tech professionals now use AI tools in their daily work, according to the latest State of AI-Assisted Software Development report. This marks a turning point for DevOps: the same research programme that once defined the four famous DORA metrics has now expanded its focus to reflect how AI is transforming software delivery performance.

Key takeaways

  • The 2025 State of AI-Assisted Software Development report marks a shift in focus from traditional DevOps performance to AI-driven delivery practices.
  • The DORA framework has evolved beyond the four classic DevOps metrics: it now includes a new metric, Rework Rate, and a quasi-metric, Reliability. Mean Time to Recovery has now been redefined as Failed Deployment Recovery Time and moved from stability to throughput, changing how recovery speed is interpreted.
  • AI is now seen as an amplifier of performance, accelerating and optimising processes on a large scale.
  • Despite these changes, the core goal remains the same — to balance delivery speed, reliability, and quality using measurable data.

From four classic indicators to a more complete view

For years, teams relied on four key DORA metrics to measure DevOps performance – Deployment Frequency, Lead Time for Changes, Change Failure Rate, and Mean Time to Recovery. Together, they offered a practical way to evaluate delivery speed and system stability.

These metrics still form the foundation of DevOps benchmarking, but the framework has evolved. In 2021, Reliability was added as a quasi-metric, highlighting a team’s ability to meet performance and availability targets defined through Service Level Objectives and Indicators. Unlike the original four, it’s not a single number but a broader reflection of service consistency from a user perspective.
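Because Reliability is assessed against SLOs and SLIs rather than as a single number, it can be sketched as a measured indicator checked against a target. The sketch below is illustrative only: the availability SLI, the 99.9% SLO target, and the request counts are assumptions, not figures from the DORA research.

```python
# Minimal illustrative sketch of a reliability check: an availability
# SLI (Service Level Indicator) measured against an SLO target.
# All numbers and names here are assumed for the example.

def availability_sli(successful_requests: int, total_requests: int) -> float:
    """SLI: fraction of requests served successfully."""
    if total_requests == 0:
        return 1.0
    return successful_requests / total_requests

def meets_slo(sli: float, slo_target: float = 0.999) -> bool:
    """SLO check: does the measured indicator hit the agreed target?"""
    return sli >= slo_target

sli = availability_sli(successful_requests=999_412, total_requests=1_000_000)
print(f"SLI: {sli:.4%}, meets 99.9% SLO: {meets_slo(sli)}")
```

The point of the quasi-metric is exactly this comparison: the user-facing indicator versus the objective the team has committed to, rather than a single pipeline number.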

Rework rate and a shift in classification

The 2024 Accelerate State of DevOps report introduced another change – a new metric called Rework Rate. It measures the proportion of unplanned deployments made to fix user-visible issues. The research showed a strong correlation between Rework Rate and Change Failure Rate, suggesting that both stability and rework are critical indicators of quality.

At the same time, DORA reclassified some of its long-standing categories. The old Mean Time to Recovery has been replaced with Failed Deployment Recovery Time and moved from the stability to the throughput group. The reasoning is simple: fast recovery after a failed deployment supports delivery flow, helping teams deploy again sooner. This subtle shift changes how teams interpret performance – from merely ‘fixing failures’ to improving operational momentum.

Decreasing the lead time for changes from 2 months to 1 day and saving 50% of the client’s cloud costs

The client expected significant growth and needed a far more flexible system and faster product innovation. Their software required modernisation of both its architecture and underlying technology.

Thanks to our work, we reduced the lead time for changes from 2 months to 1 day, improved the change failure rate from over 30% to below 10%, and cut the client's cloud costs by 50%.

The impact of AI on DevOps performance

The 2025 report goes further, recognising AI as a core part of modern software delivery. Rather than replacing human work, AI acts as an amplifier: it strengthens what already works in high-performing teams and exposes weaknesses where foundations are missing.

The DORA researchers have also introduced the AI Capabilities Model, outlining seven foundational practices that support effective AI adoption. These include technical, cultural, and process dimensions – from automation and platform engineering to experimentation and risk management. The former “low-to-elite” performance tiers have been replaced with seven team archetypes that better reflect the diversity of modern engineering setups.

What the current DORA set looks like

As of 2025, the DORA framework includes six measurable dimensions – five formal metrics and one quasi-metric (Reliability) – grouped into two main categories.

  • Throughput covers Deployment Frequency, Lead Time for Changes, and Failed Deployment Recovery Time.
  • Stability includes Change Failure Rate and Rework Rate.

Reliability remains a separate category, assessed through SLOs and SLIs that reflect end-user experience.
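The throughput metrics in particular reduce to straightforward arithmetic over commit and deploy timestamps. A minimal sketch, assuming illustrative data and a hypothetical record layout:

```python
# Illustrative sketch: Deployment Frequency and Lead Time for Changes
# derived from commit and deploy timestamps. Data and field names are
# assumptions made for the example.
from datetime import datetime
from statistics import median

deploys = [
    {"committed": datetime(2025, 3, 1, 9, 0),  "deployed": datetime(2025, 3, 1, 15, 0)},
    {"committed": datetime(2025, 3, 2, 10, 0), "deployed": datetime(2025, 3, 3, 10, 0)},
    {"committed": datetime(2025, 3, 4, 8, 0),  "deployed": datetime(2025, 3, 4, 20, 0)},
]

window_days = 7
# Deployment Frequency: deployments per day over the observed window.
deployment_frequency = len(deploys) / window_days
# Lead Time for Changes: time from commit to running in production.
lead_times = [d["deployed"] - d["committed"] for d in deploys]
median_lead_time = median(lead_times)

print(f"Deployment Frequency: {deployment_frequency:.2f}/day")
print(f"Median Lead Time for Changes: {median_lead_time}")
```

A median is used here rather than a mean so that one unusually slow change does not dominate the figure; either aggregation is a team-level choice.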

While the terminology has changed, the purpose remains the same: to help teams measure how effectively they deliver high-quality software at speed, and how resilient their systems are when something goes wrong.

Measuring the new DORA metrics in practice

Collecting accurate data still depends on automation across your CI/CD pipelines, version control, monitoring, and incident-management tools. Each metric has a clear data source – deployment logs, commit history, incident records, recovery timestamps, and reliability dashboards.
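For recovery timestamps specifically, Failed Deployment Recovery Time is the gap between a failed deployment being detected and service being restored. A hedged sketch over hypothetical incident records (the `failed_deploy_at` and `restored_at` field names are assumptions):

```python
# Illustrative sketch: Failed Deployment Recovery Time computed from
# incident records. Field names and timestamps are assumptions.
from datetime import datetime
from statistics import median

incidents = [
    {"failed_deploy_at": datetime(2025, 4, 1, 10, 0),
     "restored_at":      datetime(2025, 4, 1, 10, 45)},
    {"failed_deploy_at": datetime(2025, 4, 5, 14, 0),
     "restored_at":      datetime(2025, 4, 5, 16, 0)},
]

# Recovery time per incident: detection of the failed deploy -> restoration.
recovery_times = [i["restored_at"] - i["failed_deploy_at"] for i in incidents]
print(f"Median Failed Deployment Recovery Time: {median(recovery_times)}")
```

In practice these timestamps would come from the incident-management and monitoring tools mentioned above rather than a hand-written list.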

As AI becomes more integrated into software workflows, measuring its influence will require additional layers of observability. The goal is not just to see how fast code moves through the pipeline, but also how AI-assisted coding or testing affects quality, lead time, and overall reliability.

Why these metrics still matter

The evolution of DORA doesn’t replace the fundamentals – it refines them. Reliable measurement remains the basis for meaningful improvement. Teams that understand how to interpret their metrics in context can balance delivery speed with stability and use data to guide investments in automation, testing, and platform capabilities.

Stronger DORA metrics translate directly into business outcomes: faster response to market needs, higher customer satisfaction, and reduced operational risk. In the AI-assisted era, they also provide a way to separate genuine performance improvement from productivity illusions created by new tools.

Summary

The DORA framework continues to evolve with the industry it measures. The shift from four to six indicators – and from State of DevOps to State of AI-Assisted Software Development – signals more than just a new report title. It reflects how software delivery has entered a new phase, where metrics, automation, and AI work together to define what high performance truly means.

Why introduce DevOps in your company?

DevOps breaks down silos between development and operations, enabling faster releases, higher quality software, and more agile response to change. It boosts collaboration, automates workflows, and accelerates innovation.

Ready to deliver faster and smarter? Let’s talk about DevOps.
