In the world of digital development, it can be tempting to measure success based on numbers alone. We celebrate user counts, high transaction volumes, and adoption rates. These are all important metrics – but if we take a step back and really consider the bigger picture, we must ask ourselves: are these numbers truly reflecting the impact on the people we’re trying to serve?
Dashboards and analytics might show impressive growth, but they don’t reveal the human experience behind the technology. Numbers cannot capture the frustrated parent trying to access their child’s health records, the small business owner struggling to navigate a complicated digital permit application, or the elderly person who would rather forgo a complicated digital experience and risk an inconvenient physical one. These are the real stories that matter – the ones that can be overshadowed by quantitative data.
The Principles of Digital Development call for “using evidence to improve outcomes” to help us make digital initiatives truly work for people. Beyond simple data collection, getting feedback and integrating learnings can be a difficult process. But it is the only way we can use technology to create meaningful change. Below, we offer five pieces of advice to help digital development practitioners put this principle into practice and use evidence in a more thoughtful, human-centered way:
1. Shift the focus from metrics to impact
Metrics such as active users and transaction volume are easy to track, and they can provide a snapshot of digital adoption. But these numbers only tell part of the story. For instance, how do we know if users are finding the services they need? Are they successfully able to complete transactions without frustration or confusion? Is anyone being left behind?
The key is to focus on outcomes over outputs. Success should be measured by how well digital tools address real user goals and pain points, not just by how much traffic they attract. We can redefine what success looks like by zeroing in on the impact of a service on people’s lives.
How can this work in practice? An organization building a digital service for public records might traditionally report total user registrations as a success metric. Shifting to tracking the percentage of users who successfully find the records they need without frustration would lead to a greater understanding of the quality and usefulness of the service.
2. Collect the right data, not just more data
The temptation is to collect and report everything, but more data doesn’t always lead to better insights. Many teams struggle with data overload – too much data without a clear strategy on how to use it.
A good way to avoid this trap is to prioritize qualitative feedback through user surveys, focus groups, and accessibility testing. The point is to collect strategic and specific insights that add depth to an understanding of impact. The most effective digital initiatives today use multiple streams of evidence to understand how people are using a service and how it’s affecting their lives.
How can this work in practice? Imagine a health care initiative that rigorously tracks engagement with pages and screens. Supplementing that data with user testing sessions can give the team meaningful insights into barriers to navigation, leading to targeted improvements.
3. Create feedback loops that drive action
Collecting evidence is only useful if we actually do something with it. Too often, feedback from users is collected, but the lessons learned are not fed back into the project lifecycle. Failure to act on feedback not only wastes time and valuable insights, but also erodes trust with users. Feedback loops need to be clear, actionable, and responsive.
Good design includes offering multiple channels for users to provide input, whether through surveys, chatbots, or issue reporting. Make sure the process includes acknowledging user feedback, analyzing what users are saying, making appropriate adjustments, and communicating those changes. When users see that their feedback leads to change, it strengthens their trust and engagement with the service.
How can this work in practice? A digital government service for business permits can introduce a visible, prominent feedback button, inviting users to report issues or suggest improvements the moment they run into problems.
4. Ensure continuous evidence gathering
Gathering evidence is not a one-time task. Metrics serve many purposes, including upstream reporting. But to truly improve digital services, we need continuous user feedback so we can course correct when things aren’t working, instead of waiting for the next reporting cycle to make changes.
Good design includes ways for users to provide feedback regularly when interacting with the digital tool. This will help keep your finger on the pulse of how your digital service is performing.
How can this work in practice? An AI tool for agriculture advisories can add a quick, one-click survey at the end of each session, asking users if the information was helpful.
5. Focus on holistic user experience
The full context of the user experience includes the emotional journey and the challenges users face in the real world. For example, a digital health service might collect data on how often users access health information. Yet to fully understand the user journey, digital services need to account for the issues a user might face in low-connectivity areas, with literacy or language barriers, or when accessing the app while actively facing a health issue.
Collecting deeper insights reveals not just how many people use the service, but how and why they use it, and what their digital experiences mean in their non-digital lives.
How can this work in practice? A digital education platform for remote areas can track enrollment numbers, connectivity issues, device access, and language preferences in an effort to make the platform more accessible for underserved communities.
Collecting the right evidence can be a difficult task, but it’s critical to driving positive impact for people.
Hard data – and the insights that can come from it – is important. It can drive innovation, motivate us to problem solve, and show progress on our goals. But, as we consider the future pathways for digital development, the evidence we gather must reflect the real experiences of users, not just the outputs of a system.
Ready to strengthen your evidence-based approach? Join the Principles community to share experiences and access practical resources for measuring what matters in digital development.