What's holding back AI's productivity boost?

It’s not the model—it’s your system. GAINS™ reveals why
April 2, 2025

Software teams have long chased the holy grail of developer productivity. Is it the number of lines of code produced? The velocity of completed story points? For years, many have tried to boil productivity down to simple metrics, only to find that it’s not so simple. Recent research paints a clearer (and sometimes surprising) picture of what makes developers truly productive – and it’s much more about developer experience than brute output. 

In this post, we’ll dive into some of the most influential recent studies on developer productivity and highlight what they found.

Beyond Lines of Code: Productivity is Multi-Dimensional (SPACE Framework)

Research Summary

Back in 2021, a group of researchers led by Dr. Nicole Forsgren (creator of the DORA metrics) along with Dr. Margaret-Anne Storey and colleagues at GitHub and Microsoft introduced the SPACE framework. This framework was a wake-up call: it argued that developer productivity isn’t one-dimensional at all. In fact, SPACE spans five dimensions:

  • Satisfaction and well-being – How happy and fulfilled developers are in their work
  • Performance – The outcomes (quality, impact) of their work
  • Activity – The volume of output or actions (commits, pull requests, etc.)
  • Communication and Collaboration – How well developers work together and share knowledge
  • Efficiency and Flow – How effectively work is done with minimal interruptions (think “in the zone” coding time)

Forsgren et al. emphasized that you can’t capture productivity with a single metric – no “one metric to rule them all.” Focusing only on, say, lines of code or number of commits can be misleading. 

For example, a senior engineer might produce fewer commits yet deliver more value through code reviews, mentoring, and architectural decisions. The SPACE paper busted common myths, like the idea that productivity is just about developer activity or tools. In reality, human factors like a supportive culture and healthy environment matter just as much. Work that often goes unseen – mentoring, knowledge sharing, reducing technical debt – can be critical to a team’s overall productivity even if it doesn’t immediately show up in activity metrics.

One striking insight was that while good tools and efficient processes are important, they aren’t the whole story; organizational culture and developer well-being have substantial impact on productivity too. For instance, an engineering team might speed up their CI pipeline (tools) but if the team culture is blame-oriented or developers are burnt out, overall productivity won’t improve much. 

SPACE gave leaders and teams a vocabulary to discuss productivity more constructively. True productivity comes from a balanced environment where developers are happy, collaborative, and able to maintain flow, in addition to delivering working software. This multidimensional stance has now become almost common wisdom, but it was a necessary course-correction.

The Faros AI Take

At Faros AI, we see the SPACE framework as more than a measurement model—it’s a foundation for better conversations and smarter decisions. By elevating dimensions like satisfaction, collaboration, and flow, SPACE equips leaders with both the vocabulary and evidence to advocate for investments that don’t always show up on the roadmap but are essential for long-term outcomes. That might mean improving onboarding, refactoring neglected services, or carving out focus time—all efforts that typically get overlooked in favor of feature delivery.

We also appreciate how SPACE naturally discourages gaming. When you track just one dimension—say, PR count—it’s easy for developers to optimize toward the metric rather than the mission. But when you’re balancing activity, performance, satisfaction, and flow, it’s harder to fake impact and easier to have an honest conversation about tradeoffs. This built-in tension fosters more trustworthy data and better decision-making at every level.

Finally, SPACE moves the conversation beyond anecdotal performance assessments. Whether you're comparing two teams working in different domains or evaluating how mentorship and code review affect team outcomes, the framework enables more nuanced and equitable analysis. It supports a shift from reactive evaluations to proactive organizational insight.

Developer Experience (DevEx): The Developer-Centric Approach

If SPACE outlined what to measure, the next question became: how do we improve those dimensions?

Research Summary

Enter the concept of Developer Experience (DevEx) – basically, the idea that by improving the day-to-day experience of developers, you inherently boost their productivity. In 2023, Abi Noda, Dr. Margaret-Anne Storey, Dr. Nicole Forsgren, and Dr. Michaela Greiler published “DevEx: What Actually Drives Productivity,” which doubled down on making productivity developer-centric. 

Instead of viewing productivity as just an output to be measured, they approached it from the perspective of developers’ lived experience: what friction do developers encounter, and how does removing that friction help them get more done?

The research argues that improving DevEx is the key to improving productivity. They identified three core dimensions of DevEx: Feedback loops, Cognitive load, and Flow state. 

  • Feedback loops: How quickly and effectively developers get feedback from their tools and team. For example, how long do you wait for CI builds and tests? Are code reviews prompt or do PRs sit idle for days? Fast feedback keeps developers moving forward; slow feedback causes frustration and idle time.
  • Cognitive load: How easy or hard it is for developers to understand the codebase, systems, and processes. High cognitive load (e.g., convoluted code, unclear requirements, too many tools) makes developers spend mental energy on overhead rather than creative work. Reducing cognitive load—through clear documentation, simpler designs, and intuitive tooling—frees up brainpower for actual problem-solving.
  • Flow state: The ability for a developer to get into deep, uninterrupted work. We’ve all felt this: those hours where you’re “in the zone” and making great progress. Achieving flow requires minimizing interruptions – fewer random meetings, less “ping pong” between tasks, and a workspace that lets you focus.

The paper provided a measurement framework that combines developers’ own feedback (via surveys or check-ins on these dimensions) with data from engineering systems (like instrumentation on build times, deploy frequency, etc.). By marrying subjective and objective data, leaders can pinpoint where the biggest friction lies. For example, developers might report that “code review wait times” are a major pain (subjective feedback), and the system data might show that the average PR indeed sits for 2 days awaiting review. That’s a clear area to improve.
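To make that concrete, here is a minimal sketch (in Python with pandas) of how survey sentiment and PR telemetry might be joined per team. The column names, the 1–5 pain scale, and the sample values are illustrative assumptions, not a prescribed schema.

```python
import pandas as pd

# Hypothetical survey export: one row per developer response.
# "review_pain" is an assumed 1-5 scale (5 = biggest pain point).
surveys = pd.DataFrame({
    "team": ["payments", "payments", "search", "search"],
    "review_pain": [5, 4, 2, 1],
})

# Hypothetical PR telemetry: when review was requested vs. when the first review landed.
prs = pd.DataFrame({
    "team": ["payments", "payments", "search"],
    "review_requested_at": pd.to_datetime(
        ["2025-03-01 09:00", "2025-03-02 10:00", "2025-03-01 11:00"]),
    "first_review_at": pd.to_datetime(
        ["2025-03-03 09:00", "2025-03-04 12:00", "2025-03-01 15:00"]),
})
prs["review_wait_hours"] = (
    prs["first_review_at"] - prs["review_requested_at"]
).dt.total_seconds() / 3600

# Join sentiment with telemetry per team: does reported pain line up with
# measured wait? Large values in both columns flag a real friction point.
summary = (
    surveys.groupby("team")["review_pain"].mean().to_frame()
    .join(prs.groupby("team")["review_wait_hours"].mean())
)
print(summary)
```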

The DevEx approach is partly a response to the contrarian view that “productivity is just about output, and developer happiness is a nice-to-have.” Some skeptics might ask, “Isn’t this just about keeping developers happy, and maybe coddling them?” The research here provides evidence that it’s not just feel-good fluff – it’s directly tied to outcomes. 

In fact, a 2020 McKinsey study cited in the paper found that companies with top-tier developer environments had 4-5x higher revenue growth than competitors, underlining that DevEx investments yield real business results. Also, as a counterpoint to pure output metrics, the research notes that focusing only on output misses the complex reality of software work. It can even create bad incentives (like writing lots of code that isn’t needed). 

Yes, we ultimately care about output, but the way to get sustainable, high-quality output is by improving the developer’s day-to-day experience. It’s a shift from an old-school factory mindset to a more human-centric approach. As the paper says, many organizations are now establishing dedicated “DevEx” or platform engineering teams to systematically improve these factors—something that would have sounded radical a decade ago.

The Faros AI Take

Improving developer experience starts by measuring it when it matters. At Faros AI, we’ve found that just-in-time surveys—triggered by key workflow events like submitting a PR, triggering a build, or closing a ticket—offer far more context than quarterly or ad hoc surveys. They let you understand sentiment in the moment, capturing pain points that would otherwise fade from memory.
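As a rough illustration of the idea (not Faros AI’s actual API), an event-triggered pulse survey can be as simple as a webhook handler that fires a one-question check-in the moment a PR is merged. The payload fields below assume a GitHub-style pull_request event; the survey question and delivery mechanism are placeholders.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical one-question pulse survey, sent moments after a PR merges,
# while the experience is still fresh in the developer's mind.
QUESTION = "How smooth was getting this change merged? (1 = painful, 5 = effortless)"

def send_pulse_survey(author: str, pr_url: str) -> None:
    # Placeholder: in practice this would go out via your chat or survey
    # tool's API. Printed here to keep the sketch self-contained.
    print(f"[pulse] to {author}: {QUESTION} (re: {pr_url})")

@app.route("/webhooks/pull_request", methods=["POST"])
def on_pull_request():
    event = request.get_json(force=True)
    # Field names assume a GitHub-style "pull_request" webhook payload.
    pr = event.get("pull_request", {})
    if event.get("action") == "closed" and pr.get("merged"):
        send_pulse_survey(
            author=pr["user"]["login"],
            pr_url=pr["html_url"],
        )
    return jsonify(ok=True)

if __name__ == "__main__":
    app.run(port=8080)
```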

But sentiment alone doesn’t tell the whole story. That’s why we pair survey feedback with telemetry from engineering systems like version control, build pipelines, and deployment logs. The combination allows us to distinguish between perception and root cause. In one customer example, developers cited code review delays as a persistent friction point. But when we analyzed the data, we discovered that the real delay occurred after merge—during the deployment to production. This clarity helped the team focus on their true bottleneck, rather than spend cycles optimizing the wrong part of the process.
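Here is a sketch of the kind of breakdown that surfaces this: split each change’s end-to-end time into review, merge, and merge-to-deploy segments and compare medians. The timestamps and field names are illustrative, not the customer’s actual data.

```python
import pandas as pd

# Illustrative per-PR timestamps (in practice pulled from your VCS and CD systems).
prs = pd.DataFrame({
    "opened_at":   pd.to_datetime(["2025-03-01 09:00", "2025-03-02 14:00"]),
    "approved_at": pd.to_datetime(["2025-03-01 15:00", "2025-03-03 10:00"]),
    "merged_at":   pd.to_datetime(["2025-03-01 16:00", "2025-03-03 11:00"]),
    "deployed_at": pd.to_datetime(["2025-03-04 09:00", "2025-03-06 17:00"]),
})

def hours(delta):
    return delta.dt.total_seconds() / 3600

segments = pd.DataFrame({
    "review_hours":          hours(prs["approved_at"] - prs["opened_at"]),
    "merge_hours":           hours(prs["merged_at"] - prs["approved_at"]),
    "merge_to_deploy_hours": hours(prs["deployed_at"] - prs["merged_at"]),
})

# Median per segment shows where wall-clock time actually goes. In this
# made-up example, the post-merge deployment wait dwarfs the review wait.
print(segments.median())
```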

What makes the DevEx framework powerful is its ability to tie subjective experience to objective outcomes. When developers say “this process feels slow,” we can now quantify the impact—and prioritize solutions that produce measurable results. This goes beyond feel-good improvements: it’s about building engineering systems that scale with both velocity and morale.

Data-Driven DevEx: Microsoft’s Engineering Thrive 

Research Summary

So, how do these ideas play out in a real, large-scale engineering org? A great example comes from Microsoft’s internal initiative called Engineering Thrive (often stylized as EngThrive). In early 2024, Dr. Nicole Forsgren and colleagues (including Eirini Kalliamvakou, Abi Noda, Michaela Greiler, Brian Houck, and Margaret-Anne Storey) published “DevEx in Action: A study of developer experience and its tangible impact.” This was essentially Microsoft’s implementation of the DevEx philosophy across the company, and they shared some powerful results.

What is Engineering Thrive? It’s a cross-company effort at Microsoft to track and improve developer experience using a blend of objective telemetry (things like build times, PR statistics, incident rates) and subjective survey data (how engineers feel about their workflows). EngThrive anchors on four pillars that mirror the ideas we’ve discussed: Speed, Ease, Quality, and Culture. In practice, that means they collect metrics on how fast engineers can get things done (speed), how easy and friction-free the processes and tools are (ease, which relates to cognitive load), the quality of the outcomes (quality, which could include code quality or reliability metrics), and the health of the team’s working environment (culture, akin to satisfaction and well-being).

Microsoft studied over 32,000 developer survey responses across 177 countries to quantify the benefits of improving Developer Experience. Here’s a quick summary of the Engineering Thrive findings:

  • Flow Time: Developers with sufficient deep focus time felt ~50% more productive. (Protect those no-meeting blocks on your calendar!)
  • Engaging Work: Working on interesting, well-scoped tasks yielded a 30% boost in productivity.
  • Easy-to-Understand Code/Systems: Reducing complexity led to 40% higher productivity, and intuitive processes drove 50% more innovation.
  • Fast Feedback: Teams with quick code reviews and support saw 20% higher innovation, and fast answers to dev questions correlated with 50% less tech debt downstream.

These are hard, tangible benefits tied to things that improve developer experience. It’s a strong vindication that happy, enabled developers do better work. 

Microsoft’s example with EngThrive is causing many large tech orgs to take note. It demonstrates a way to quantify the formerly unquantifiable. By treating developer experience as a first-class citizen (with metrics and investment, just like customer experience), they’re seeing real engineering performance gains. This is pretty much rewriting the playbook for engineering management.

The Faros AI Take

Microsoft’s Engineering Thrive initiative is a compelling example of what’s possible when organizations treat developer experience with the same seriousness as system performance. At Faros AI, we take a similar approach: we combine telemetry—build times, PR throughput, deployment cadence, calendar data—with role-aware pulse surveys and behavioral analytics to paint a full picture of developer experience across the engineering lifecycle.

One of the strongest lessons from Thrive is the measurable impact of protecting deep work. Focus time isn’t just a cultural perk—it directly correlates with velocity, throughput, and innovation. That insight has informed how we help teams visualize and defend focus time through our calendar, IDE, and workflow integrations. With better visibility into how engineers are spending their time—and where interruptions are creeping in—teams can identify and address bottlenecks before they impact delivery.
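One simple way to make focus time visible is to compute the uninterrupted gaps between meetings from calendar data. Below is a minimal sketch, assuming meetings arrive as (start, end) pairs for a single working day; the two-hour threshold and the sample calendar are assumptions, not a recommendation.

```python
from datetime import datetime, timedelta

def focus_blocks(meetings, day_start, day_end, min_hours=2):
    """Return gaps between meetings long enough to count as deep-work time."""
    blocks, cursor = [], day_start
    for start, end in sorted(meetings):
        if start - cursor >= timedelta(hours=min_hours):
            blocks.append((cursor, start))
        cursor = max(cursor, end)
    if day_end - cursor >= timedelta(hours=min_hours):
        blocks.append((cursor, day_end))
    return blocks

# Illustrative calendar for one day (times are made up).
day = datetime(2025, 4, 2)
meetings = [
    (day.replace(hour=10), day.replace(hour=10, minute=30)),  # standup
    (day.replace(hour=14), day.replace(hour=15)),             # design review
]
for start, end in focus_blocks(meetings, day.replace(hour=9), day.replace(hour=17)):
    print(f"focus block: {start:%H:%M}-{end:%H:%M}")
```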

Another advantage of our approach is scale. While Microsoft’s internal dataset is uniquely valuable, it’s just one company. We’re collaborating with researchers like Brian Houck to understand how DevEx drivers play out across a much wider set of organizations. That external perspective helps leaders benchmark their environments and prioritize DevEx investments that align with both their goals and their constraints.

Ultimately, Engineering Thrive shows what’s possible when you take a scientific approach to developer experience. At Faros AI, we’re building the tooling that lets any engineering org—not just a tech giant—realize those benefits.

Bringing It All Together: Productivity Through a New Lens

The big takeaway across all these studies? Developer productivity is driven by far more than raw output—it’s fundamentally driven by the environment we create for developers. When developers have clear goals, psychological safety, reliable tools, fast feedback, and time to focus, they thrive. Productivity soars almost as a byproduct of a great developer experience. Conversely, when developers are mired in broken pipelines, unclear processes, or toxic team dynamics, productivity plummets—no matter how “talented” or hardworking the individuals are.

For engineering teams out there, these insights suggest a few practical things:

  • Measure wisely: Use multi-dimensional metrics (e.g., a mix of deployment frequency, pull-request turnaround, developer satisfaction scores, etc.) rather than a single number to gauge productivity (see the sketch after this list).
  • Foster a good developer experience: Treat internal developer platforms and tooling as products; aim for fast builds, clear documentation, and low-friction processes. Equally, cultivate a supportive team culture that values focus time.
  • Listen to your developers: Their qualitative feedback can point you directly to bottlenecks. If several engineers say “our test suite is too slow” or “I spend too much time fighting build scripts,” that’s gold—we now know fixing those will likely yield measurable gains. Marry that feedback with system telemetry so you know precisely what to address and can chart and prove the positive impact once you fix it.
  • Balance output with well-being: Don’t celebrate Herculean coding sprints without checking if the team is burning out. High activity with low morale is a red flag (as seen during the pandemic, when spikes in remote-work activity masked developer struggles). Aim for sustainable productivity, not short spurts followed by crashes.
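As a sketch of what “measure wisely” can look like, the snippet below assembles a small multi-dimensional scorecard and checks each dimension against a team-chosen target instead of collapsing everything into one number. The metric names, values, and thresholds are placeholders.

```python
# A toy scorecard spanning several dimensions rather than one "productivity score".
# All values are placeholders; in practice each would come from telemetry or surveys.
scorecard = {
    "deployment_frequency_per_week": 12,    # activity / performance
    "pr_review_wait_hours_p50": 18,         # efficiency and flow
    "change_failure_rate_pct": 9,           # quality
    "developer_satisfaction_1_to_5": 3.6,   # satisfaction and well-being
}

# Team-chosen targets: "min" means higher is better, "max" means lower is better.
targets = {
    "deployment_frequency_per_week": ("min", 10),
    "pr_review_wait_hours_p50": ("max", 24),
    "change_failure_rate_pct": ("max", 15),
    "developer_satisfaction_1_to_5": ("min", 4.0),
}

for metric, (direction, threshold) in targets.items():
    value = scorecard[metric]
    ok = value >= threshold if direction == "min" else value <= threshold
    print(f"{metric}: {value} ({'ok' if ok else 'needs attention'})")
```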

Contact us today to learn more about how Faros AI can help you optimize your teams' productivity.

Ron Meldiner

Ron is an experienced engineering leader and developer productivity specialist. Prior to his current role as Field CTO at Faros AI, Ron led developer infrastructure at Dropbox.
