
Four Lessons Learned During Our Market Research with Users

Earlier this year, the research team at the Digital Impact Alliance conducted market research on how to incorporate marketplace-type features into the Catalog of Digital Solutions. The Catalog was first developed to address a key need in the digital ecosystem: the discoverability of digital public goods – tools, services, and projects being deployed and implemented to support digital transformation in the development context. However, we received frequent feedback from users asking us to connect demand- and supply-side actors by making the procurement of digital solutions more streamlined and accessible. We therefore set out on a research project to gain in-depth insight and feedback from diverse stakeholders on their needs and challenges when procuring digital solutions. Below are the four lessons we learned in our efforts to incorporate a user-centered approach into our marketplace research design process. 

Lesson 1: Start by conducting preliminary consultations with existing users on the challenges they face in the implementation and deployment of digital solutions

For certain research projects, it is difficult to know where to start and whether the research work is truly needed. Following multiple brainstorming sessions on how to approach this project, we decided to first conduct preliminary consultations to find out whether any gaps and pain points were still not being addressed by the Catalog. 

Thus, prior to launching the market research, we reached out to existing users of the Catalog for general feedback on the digital ecosystem – through surveys, workshops, and user interviews. Listening to the feedback and questions raised by this stakeholder group during the preliminary phase proved instrumental. 

For example, users on the demand side of the Catalog wanted to know which tools could be used for their specific use cases, how they could deploy a tool themselves or solicit the services of an implementer to do so, how they could build the capacity of their teams to use the tool, and how they could access funding to support their digitalization efforts. 

Users from the supply side – namely product owners and system integrators – wanted to know how they could showcase their tools or services to consumers and how they could grow, scale, and access reliable revenue to support their business objectives. 

However, the common question we heard from both demand- and supply-side stakeholder groups was, “I have found a product on the Catalog that I want to use – now what?” 

This justified a need to: (1) conduct more research to find out whether, and to what extent, the Catalog can support the procurement of these products; and (2) if it can, identify a solution to effectively connect those seeking digital solutions and services with those who have the expertise and experience to supply them. 

Lesson 2: Outline research objectives and goals in a learning agenda with a research methodology and timeline 

Based on the data collected and synthesized in the preliminary consultations, we saw the value in capturing the problem statement, research scope, and general questions we wished to answer in a live document. This document became the learning agenda, which included a section on the research methodology and timeline. 

In the learning agenda, we outlined questions we wished to explore to decipher stakeholders’ needs, understand short-term and long-term variables needed in the digital ecosystem, and unpack technical and operational considerations around incorporating marketplace-type features into the Catalog.  

Finally, we learned that it is crucial to have multiple feedback loops in the process of finalizing the learning agenda and research methodology document. Thus, this document was shared with key experts in the ecosystem to gather feedback and input before it was finalized. Together, these steps helped our research team design this project with the target users’ input and establish a methodology to keep track of our research deliverables and outputs. 

Lesson 3: Set up a reference group 

When it comes to ensuring a user-centered research process, the most instrumental lesson we learned a few months into the work was the importance of setting up a reference group prior to finalizing the research scope and implementing the research activities – specifically, a reference group made up of key experts across different stakeholder groups with relevant expertise on the research topic. 

We saw firsthand the exceptional value of setting up a reference group with a diverse set of 20 leaders and experts in the Digital Public Goods (DPGs) sphere. The research team facilitated a total of three reference group meetings this year. In each meeting, the members were consulted for feedback and asked to vet and review the research data our team had collated through interviews, surveys, and focus group/visioning sessions. 

In summary, these key experts have been instrumental in providing ongoing advisory and technical support for the Catalog/marketplace research. They have collectively assessed the research findings presented by the research team and provided input on the value, feasibility, and opportunity of specific proposed marketplace-type features for the Catalog.  

Lesson 4: Conduct user-centric research activities – interviews, focus group sessions, and a survey  

The final lesson learned was to implement a diverse range of research activities with the target users of the Catalog and marketplace components. We first categorized each stakeholder group based on user personas and created a live document to list and track the individuals consulted, the stakeholder group they represented, and their organization. 

To consult as many stakeholders as possible, we learned that it was important to reach out to colleagues within the Digital Impact Alliance and the reference group for recommendations and introductions to individuals relevant to this research project. This helped us build a more well-rounded and inclusive stakeholder list with representation across roles, geographies, sectors, and gender. 

As there was a specific timeline allocated for this research project, it was not feasible to organize individual interviews with all the stakeholders identified. Thus, our research team invited individuals from the same stakeholder group to participate in interactive two-hour focus group sessions. 

To promote interactive exchanges and free-flowing conversation among participants, we found that we received more tangible input by posing a maximum of five pre-prepared questions to the group and then opening the floor for responses. A member of our research team then facilitated the conversation with follow-up questions, encouraging all participants to share their input or build on points raised by fellow participants. 

Finally, a survey was created and disseminated widely to reach individuals and stakeholder groups not consulted in the focus groups or individual interviews. 

If you are interested in learning more about the user-centric research activities and the output of the Digital Impact Alliance marketplace research – specifically the three marketplace-type features being prioritized in the Catalog – all this information can be found in this research report. 

How to stay in touch with us  

As the platform continues to grow and expand, we want to hear from you. How have you found the Catalog useful? We invite you to share your story, provide feedback or suggestions, or ask us questions. Get in touch with our team at info@solutions.dial.community. 

Sign up to be on our mailing list for our quarterly newsletter. 
