DORA Capabilities Reference

All 20 capabilities — maturity levels for each
Last updated: 2026-03-10
DIAL assesses 20 DORA research capabilities organized across 4 categories. Each capability is rated at one of three maturity levels: Emerging, Advancing, or Leading. Teams score each capability collaboratively during an assessment session, and Claude Sonnet generates tailored recommendations for each one.
At a glance: 20 total capabilities · 4 categories · 3 maturity levels · 60 maturity descriptors
Maturity Levels:
Emerging — Ad hoc practices, reactive, limited consistency
Advancing — Defined processes, growing adoption, measurable improvement
Leading — Optimized, automated, continuously improving
⚙️ Technical (8 capabilities)
Code Maintainability
Emerging
Code quality varies significantly. Limited documentation and inconsistent standards make maintenance difficult. Technical debt accumulates without systematic management.
Advancing
Code standards are defined and partially enforced. Documentation practices are established. Technical debt is tracked and addressed in some areas.
Leading
High code quality through automated enforcement and continuous refactoring. Comprehensive documentation. Technical debt is proactively managed as a first-class concern.
Continuous Delivery
Emerging
Manual deployment processes are common. Releases are infrequent and risky. Limited automation in the delivery pipeline creates bottlenecks.
Advancing
Deployment automation is in place for most steps. Releases are more frequent with reduced manual intervention. Pipeline failures are caught early.
Leading
Fully automated delivery pipeline from commit to production. Teams can deploy on demand with confidence. Deployment is a low-risk, routine event.
Continuous Integration
Emerging
Developers integrate infrequently, causing painful merge conflicts. CI builds run inconsistently. Broken builds are not always fixed promptly.
Advancing
CI is configured and developers integrate at least daily. Most builds include automated testing. Teams respond to build failures within hours.
Leading
Trunk-based development with multiple daily integrations. Fast, comprehensive automated testing. Broken builds are fixed within minutes as a team priority.
Database Change Management
Emerging
Database changes are applied manually and inconsistently. Schema migrations are risky and often require downtime. Change history is poorly tracked.
Advancing
Database migrations are scripted and version-controlled. Most changes are applied through automated pipelines. Rollback procedures are documented.
Leading
All database changes are automated, tested, and deployed through the standard pipeline. Zero-downtime migrations are the norm. Full audit trail maintained.
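The Leading pattern here (versioned, automated migrations with a full audit trail) can be sketched in a few lines of Python. This is an illustrative toy; real teams would reach for a tool such as Flyway, Liquibase, or Alembic, and the table and column names below are invented:

```python
import sqlite3

# Ordered, version-controlled migrations (invented examples).
MIGRATIONS = [
    ("001_create_users", "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"),
    ("002_add_email", "ALTER TABLE users ADD COLUMN email TEXT"),
]

def migrate(conn):
    """Apply any migration not yet recorded, in order, and log it."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_migrations (version TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_migrations")}
    for version, sql in MIGRATIONS:
        if version not in applied:
            conn.execute(sql)
            conn.execute("INSERT INTO schema_migrations (version) VALUES (?)", (version,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)
migrate(conn)  # safe to re-run: already-applied versions are skipped
```

Because every change flows through the same recorded path, the `schema_migrations` table doubles as the audit trail the Leading level calls for.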
Deployment Automation
Emerging
Deployments rely on manual steps and runbooks. Environment configuration is inconsistent. Production deployments require significant coordination and are error-prone.
Advancing
Core deployment steps are scripted. Most environments are provisioned consistently. Deployment frequency has increased but some manual gates remain.
Leading
Fully automated, self-service deployments across all environments. Infrastructure as code. Deployments are triggered by pipeline completion with no manual intervention.
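As a rough sketch of the fully automated pipeline described above, here is a toy Python runner in which a failure at any stage halts the run, so a green run means production was reached with no manual intervention. Stage names are illustrative:

```python
# Toy pipeline runner: stages execute in order; any failure stops the run.
def run_pipeline(stages):
    log = []
    for name, step in stages:
        ok = step()
        log.append((name, ok))
        if not ok:
            return "failed", log  # halt: nothing past a red stage deploys
    return "deployed", log

stages = [
    ("build", lambda: True),
    ("test", lambda: True),
    ("deploy-staging", lambda: True),
    ("deploy-prod", lambda: True),
]
status, log = run_pipeline(stages)  # status == "deployed" when all stages pass
```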
Test Automation
Emerging
Testing is primarily manual. Automated tests are sparse and unreliable. Quality gates are weak, leading to defects discovered late in the cycle.
Advancing
Automated unit and integration tests exist and run in CI. Test coverage is growing. Flaky tests are tracked and addressed. E2E testing is partially automated.
Leading
Comprehensive automated test suite at all levels (unit, integration, E2E). High confidence, fast-running tests. Testing is integral to the development process, not an afterthought.
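For readers who want a concrete picture, a fast unit-level test at the base of that suite might look like this in Python (the function under test is invented for illustration):

```python
# Function under test (invented example).
def apply_discount(total, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be within 0-100")
    return round(total * (1 - percent / 100), 2)

# Fast, deterministic tests like these run on every commit in CI.
def test_happy_path():
    assert apply_discount(200.0, 25) == 150.0

def test_rejects_bad_input():
    try:
        apply_discount(100.0, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass

test_happy_path()
test_rejects_bad_input()
```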
Trunk-Based Development
Emerging
Long-lived feature branches are common. Merges to main are infrequent and painful. Integration conflicts cause significant rework and delays.
Advancing
Branches are shorter-lived (days, not weeks). Teams integrate to main at least weekly and are starting to use feature flags to decouple deployment from release.
Leading
Developers commit to trunk daily or multiple times per day. Feature flags enable continuous integration. Branches, if used, are short-lived and merged within hours.
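The feature flags mentioned above let unfinished work merge to trunk daily while staying dark in production. A minimal sketch, with invented flag and environment names:

```python
# In-memory flag store (a real system would use a flag service or config).
FLAGS = {"new_checkout": {"staging": True, "prod": False}}

def is_enabled(flag, env):
    return FLAGS.get(flag, {}).get(env, False)

def checkout_total(cart_total, env):
    if is_enabled("new_checkout", env):
        return round(cart_total * 0.9, 2)  # new flow, live only in staging
    return cart_total                      # stable flow everyone else sees
```

The new code path ships with every trunk commit, but flipping the flag (not a deployment) is what releases it.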
Version Control
Emerging
Version control is used for application code but inconsistently for configuration, scripts, or infrastructure. Branching strategies are informal or absent.
Advancing
All application code and most configuration is version-controlled. A defined branching strategy is followed by most team members. Code reviews are standard practice.
Leading
Everything — code, config, infrastructure, docs — is version-controlled. Branching strategy enables rapid, safe delivery. Automated policy enforces standards consistently.
🔄 Process (5 capabilities)
Streamlining Change Approval
Emerging
Change approval is heavy and slow, often requiring multi-layer sign-offs. Changes are batched to reduce approval overhead, increasing deployment risk.
Advancing
Approval processes are defined and tiered by risk level. Standard changes have lighter oversight. Review cycles are shorter and mostly automated.
Leading
Automated pre-approval for low-risk changes. Peer review plus CI gates replace manual approval boards. High-risk changes have streamlined, risk-appropriate oversight.
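One way to picture the Leading level is a router that sends each change down a risk-appropriate path. The fields and tiers below are invented for illustration:

```python
# Hypothetical risk-tier router for change approval.
def approval_route(change):
    # High-risk changes keep human oversight, but nothing heavier.
    if change.get("touches_prod_data") or change.get("security_sensitive"):
        return "manual-review"
    # Low-risk changes are pre-approved by automated gates alone.
    if change.get("tests_pass") and change.get("peer_reviewed"):
        return "auto-approved"
    return "blocked"
```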
Test Data Management
Emerging
Test data is ad hoc and often shared between environments. Tests depend on production data or fragile static fixtures. Data management is a persistent bottleneck.
Advancing
Test data is managed in version control or generated programmatically. Environments are isolated. Most tests run with reliable, repeatable data sets.
Leading
Fully automated test data provisioning. Synthetic data generation removes dependency on production data. Data is isolated, reproducible, and refreshed automatically for every test run.
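Synthetic, reproducible data of the kind described above can be as simple as a seeded generator; the record shape here is invented:

```python
import random

def synthetic_users(n, seed=42):
    """Generate n fake user records; a fixed seed makes every run identical."""
    rng = random.Random(seed)
    return [
        {"id": i, "name": f"user{i}", "age": rng.randint(18, 90)}
        for i in range(n)
    ]

# No production data involved, and every test run sees the same records.
assert synthetic_users(5) == synthetic_users(5)
```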
Working in Small Batches
Emerging
Large batches of work are the norm. Features are developed over weeks or months before delivery. Feedback loops are slow and course-correction is costly.
Advancing
Work is decomposed into smaller units delivered over days rather than weeks. Teams are building the habit of incremental delivery and using it for most new work.
Leading
Small, frequent deliverables are the default. Work is broken down to hours or days. Feedback is continuous, enabling rapid learning and course-correction.
Documentation Quality
Emerging
Documentation is sparse, outdated, or siloed in individual contributors' heads. Onboarding new team members is slow and relies on tribal knowledge.
Advancing
Core documentation exists and is mostly current. Runbooks and architecture docs cover key areas. Documentation is part of the definition of done for most features.
Leading
Living documentation is integrated into the development workflow. Docs-as-code practices ensure currency. Self-service documentation enables fast onboarding and reduces knowledge bottlenecks.
Monitoring and Observability
Emerging
Monitoring is reactive — teams discover problems from user reports. Logging is inconsistent. There is limited visibility into system health or performance.
Advancing
Core metrics and logs are collected and dashboarded. Alerts exist for critical failures. Teams can investigate most incidents without log-diving into production directly.
Leading
Comprehensive observability across logs, metrics, and traces. SLOs defined and actively managed. Proactive alerting detects anomalies before users are impacted. On-call is driven by data, not user complaints.
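The "SLOs defined and actively managed" point has concrete arithmetic behind it: an availability target implies an error budget, and alerting can fire on how fast that budget burns. A small sketch (the numbers are illustrative):

```python
def error_budget_remaining(slo_target, total_requests, failed_requests):
    """Fraction of the error budget left for the current window."""
    allowed_failures = (1 - slo_target) * total_requests
    if allowed_failures == 0:
        return 0.0
    return 1 - failed_requests / allowed_failures

# A 99.9% SLO over 1M requests permits ~1,000 failures;
# 400 failures so far leaves about 60% of the budget.
remaining = error_budget_remaining(0.999, 1_000_000, 400)
```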
🌱 Cultural (5 capabilities)
Empowering Teams to Choose Tools
Emerging
Tool choices are mandated top-down with little team input. Teams work around tools that don't fit their needs. Tooling friction slows delivery.
Advancing
Teams have some autonomy in tool selection within guardrails. A process exists for requesting new tools. Most teams use tools suited to their work.
Leading
Teams own their toolchain decisions within a lightweight governance framework. Tool choices are driven by team productivity and craft, not organizational inertia.
Generative Organizational Culture
Emerging
Information is siloed and flows are restricted. Failure is blamed on individuals. Novelty is discouraged and cross-functional collaboration is rare.
Advancing
Information is shared more openly. Post-mortems are blameless in most cases. Cross-team collaboration is growing. Leadership is actively modeling generative behaviors.
Leading
High information flow and radical transparency. Failure is treated as a learning opportunity. Experimentation and cross-functional collaboration are deeply embedded in how work gets done.
Job Satisfaction
Emerging
Work is frequently interrupted by fire-fighting and rework. Team members feel little ownership over their work. Burnout and attrition are concerns.
Advancing
Teams have meaningful work with some autonomy. Interruptions are decreasing. Satisfaction is improving, though technical debt and manual toil still weigh on the team.
Leading
High autonomy, mastery, and purpose. Team members feel their work matters and have the tools and skills to do it well. Toil is minimized. Engagement and retention are strong.
Loosely Coupled Teams
Emerging
Teams have many dependencies on other teams to ship. Cross-team coordination overhead is high. Deployments require synchronization across multiple groups.
Advancing
Teams can complete most work independently. APIs and contracts are defined. Some coordination is still required for cross-cutting changes but it's becoming less frequent.
Leading
Teams own their full service lifecycle and can deploy independently. Clear API boundaries and contracts enable autonomous delivery. Cross-team work is coordinated via interfaces, not meetings.
Well-Being
Emerging
Overwork is normalized. On-call burden is unmanaged and unequally distributed. Sustainable pace is an aspiration, not a practice. Stress and burnout are widespread.
Advancing
Sustainable pace is actively discussed and partially achieved. On-call rotations are fair and supported. Leadership is aware of well-being as a performance driver.
Leading
Team well-being is a first-class engineering concern. On-call is manageable and incidents are rare. Sustainable pace, recovery time, and mental health are actively supported and measured.
📊 Measurement (2 capabilities)
Flexible Infrastructure
Emerging
Infrastructure is provisioned manually and changes slowly. Scaling requires significant lead time. Environments are often inconsistent and hard to reproduce.
Advancing
Infrastructure is partially defined as code. Provisioning is faster. Some environments are standardized and auto-scaled. Teams are reducing manual infrastructure toil.
Leading
All infrastructure is defined as code, version-controlled, and provisioned on demand. Environments are identical and ephemeral. Scaling is automatic and instant.
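The infrastructure-as-code idea above boils down to declaring environments as data and converging on that declaration. A toy sketch (service names and fields are invented; real systems use tools such as Terraform, Pulumi, or Kubernetes controllers):

```python
# Desired state, declared as data and kept in version control.
DESIRED = {"web": {"replicas": 3, "image": "app:1.4"}, "db": {"replicas": 1}}

def plan(current, desired):
    """List the changes needed to make current match the declaration."""
    return [(name, spec) for name, spec in desired.items() if current.get(name) != spec]

def apply(current, desired):
    for name, spec in plan(current, desired):
        current[name] = dict(spec)  # stand-in for the real provisioning call
    return current

state = apply({}, DESIRED)
# Once converged, re-planning shows no drift: the plan is empty.
```

Because provisioning is a pure function of the declaration, identical and ephemeral environments fall out naturally: delete the state and re-apply.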
Pervasive Security
Emerging
Security is a late-stage gate, not integrated into development. Security reviews are infrequent and reactive. Developers have limited security training or tooling.
Advancing
Security scanning is integrated into CI. Common vulnerabilities are caught automatically. Security requirements are defined early in the feature lifecycle.
Leading
Security is embedded at every stage — design, development, CI, and operations. Threat modeling is routine. Developers are security-aware and security tooling is a first-class citizen in the pipeline.
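A CI-integrated security gate of the kind the Advancing and Leading levels describe can be pictured as a check against known-vulnerable versions. Everything here (package names and versions) is invented; real pipelines use scanners such as pip-audit or Dependabot:

```python
# Invented advisory data; real gates pull from a vulnerability database.
KNOWN_VULNERABLE = {("libfoo", "1.2.0"), ("libbar", "0.9.1")}

def security_gate(dependencies):
    """Fail the build if any (name, version) pair matches an advisory."""
    hits = [dep for dep in dependencies if dep in KNOWN_VULNERABLE]
    return ("fail", hits) if hits else ("pass", [])
```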