What’s Your (Learning) Agenda?

Tony Senanayake
11 min read · Apr 1, 2022

Co-authored with Gabrielle Posner

It is 9am, you have just completed the daily Wordle (3/6 not bad…), and you have a call with a potential partner. They are a large social enterprise working in the climate and sustainability space. You will be speaking with Suzy Tanable, the Head of Strategic Initiatives, and Evan Dense, the Monitoring and Evaluation (M&E) lead.

After discussing the weather, Suzy Tanable dives in and asks, “So, we have just received a huge pool of funding from a generous philanthropist. They are the former partner of a billionaire, and their partner made their money by creating a seamless platform for purchasing goods online. Anyway, we have an exciting strategic vision for our organisation and how to catalyse economic development for millions through access to clean energy. We were hoping we may be able to work with you to create a strategy for how data and evidence can help answer all the big and small questions that we have.”

You open Slack and message the group, “I know a framework for this. Let me introduce you to a Learning Agenda.”

1. So, what is this Learning Agenda thing?

Leaders tend not to think in terms of Theory of Change models, intermediate outcomes, causal effects and evaluation design (as much as we may prefer that they did at times). Rather, they tend to focus on questions of business strategy, 5-year plans, and the operational delivery of their programs. Leaders may have questions such as:

  • Are certain core assumptions underlying our program valid?
  • Should we be devoting resources to X, Y or Z to drive a certain outcome?
  • What intervention is most appropriate for driving a certain outcome for a certain type of beneficiary?
  • What barriers and/or opportunities exist to drive a certain outcome?
  • Which beneficiaries are being reached by our program, and are they the same beneficiaries we are targeting?
  • To what extent is our intervention changing the welfare of our beneficiaries?

As you read these questions, you may be thinking, “Well, I can think of the right M&E tool for this question,” which is exactly the point. M&E specialists tend to think in tools, in terms of experimental designs and nifty statistical techniques, but most people do not. A Learning Agenda aims to bring these two types of thinking together: in the spirit of being demand-driven, it meets leaders where they are, and in a language they speak.

At its core, a Learning Agenda is a strategic roadmap to evidence generation that sets out a structured list of key questions that decision-makers have and a path for answering these questions by using monitoring, evaluation and learning activities. See examples from Teaching at the Right Level — Africa, USAID’s Self-Reliance Learning Agenda, and UNRISD’s Strategy.

The Learning Agenda’s target audience is generally key decision-makers in a team or organisation, not the M&E team. A Learning Agenda can be designed for a specific project in rural Bihar or for how to run the Presidency of the United States of America.

A Learning Agenda can be delivered as a written document or presentation; however, irrespective of format, it is structured around key questions. These high-level questions tend to have related or subsidiary questions. The Learning Agenda then provides a summary of monitoring, evaluation and learning techniques (at a high level and in plain language) to answer these questions, along with a plan for how this evidence will inform key decisions. Multiple partners may be required to answer the key learning questions that arise from the Learning Agenda, particularly where specific expertise is necessary.

You may ask: “How is this different from how we generally think about developing an M&E strategy for a partner?” and you would be right to ask. A Learning Agenda is one way to outline an M&E strategy. However, a Learning Agenda is not focused on M&E tools; it is focused on learning questions, structured and drafted in words that resonate with decision-makers. The tools are simply a mechanism for answering those questions, so they are described in plain language and given minimal emphasis.

2. Who else is using Learning Agendas?

Learning Agendas are popping up everywhere!

Learning Agendas have been institutionalised across the United States Government through the Foundations for Evidence-Based Policymaking Act (2018) and are used by USAID and its contractors. Furthermore, other funding agencies, philanthropies, NGOs and think-tanks are increasingly adopting them. We expect the Learning Agenda will become an increasingly important tool for translating the value of data and evidence to decision-makers.

3. How should I go about preparing a Learning Agenda?

It’s day two, and you’ve convinced both your client and your project team that you know how to create a Learning Agenda. Have no fear; just follow these four easy steps (adapted to your project context), and you will craft a rigorous Learning Agenda in no time.

a) Identify the Learning Questions

There are two main methods for extracting learning questions from leaders: the top-down and bottom-up approaches. These learning questions will eventually form the foundation of the Learning Agenda.

The top-down approach starts at the decision level. High-level stakeholders will propose strategy questions that are on their minds, for example: should we invest in product A or B or neither? Then, you map these questions to the Theory of Change and frame them as learning questions: “Which investment — A, B, or neither — will drive a certain outcome which will lead to a certain impact?” You may derive multiple learning questions from one strategic question.

Strategic Question: Is distributed renewable energy the most efficient way to electrify last-mile communities?

Learning Questions:

  • Does providing mini-grid access to rural communities increase electricity consumption for productive use?
  • Will constructing mini-grids in rural communities electrify more, fewer, or the same number of people as grid-based and micro-grid sources?

The bottom-up approach starts within the Theory of Change model, particularly within the links connecting each node of the model. The underlying assumptions of a program’s model lie within these connecting links; therefore, so do the questions used to interrogate these assumptions. The goal is to identify where decision-makers need to rely on assumptions and fill those gaps with evidence.

So, which approach should you use? Use the top-down approach if stakeholders have clear strategic questions and have prioritised which are most important. If they have already engaged in a thorough decision-identification process, this is a good indicator that the top-down approach is more likely to be effective. By directly engaging decision-makers in the learning process, you increase the likelihood that the learning agenda will lead to evidence-based decision making.

However, in our experience, this is rarely the case. We generally recommend the bottom-up approach: it is a more rigorous, holistic way of unearthing learning questions that may not be immediately obvious to leaders.

You now have a meeting with Evan Dense, the M&E lead, to kick off the Learning Agenda design process. You present these two approaches to him, and he suggests using the bottom-up approach because the enterprise’s stakeholders are too busy to brainstorm learning questions. You make sure your notifications are turned off and present the Theory of Change on your screen. “Can everyone see my screen?” you ask, despite knowing the answer is always yes.

Taking one “link” at a time, you focus on the first logic chain. You zoom in on the link between the input and the output and follow the framework suggested by your brilliant colleagues.

Evan Dense is ecstatic about this approach. You take the rest of the work offline and repeat this process for every other link in the Theory of Change. See the “helpful materials” section below for example templates to guide this process.
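To make this concrete, here is a minimal sketch in Python of the bottom-up traversal, assuming the Theory of Change is represented as a list of links, each carrying the assumptions that underlie it. The node names, assumptions, and question template are illustrative placeholders, not drawn from a real programme.

```python
# A minimal sketch of the bottom-up approach: walk the links of a
# Theory of Change and turn each underlying assumption into a draft
# learning question. All names here are illustrative placeholders.
from dataclasses import dataclass, field


@dataclass
class Link:
    source: str                  # upstream node, e.g. an input or output
    target: str                  # downstream node, e.g. an outcome
    assumptions: list[str] = field(default_factory=list)


theory_of_change = [
    Link("Mini-grids constructed", "Households connected",
         ["Households can afford connection fees"]),
    Link("Households connected", "Electricity used productively",
         ["Households own or can acquire productive-use appliances"]),
]


def draft_learning_questions(links: list[Link]) -> list[str]:
    """Interrogate the assumptions on each link as candidate questions."""
    questions = []
    for link in links:
        for assumption in link.assumptions:
            questions.append(
                f"Does '{assumption}' hold on the path from "
                f"'{link.source}' to '{link.target}'?"
            )
    return questions


for question in draft_learning_questions(theory_of_change):
    print(question)
```

In practice you would run this exercise in a workshop or spreadsheet rather than in code, but the structure is the same: every link, every assumption, one candidate question.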

This process will reveal many questions, but not all of them are empirical research questions. Decision-makers are likely to have many strategy-level questions as well. Be sure to discern between the two and communicate that your role as an M&E advisor is to select the empirical questions; the strategy questions can be deferred to a relevant strategy consultant.

b) Prioritise the Learning Questions

M&E teams face time and resource constraints that prevent them from deploying M&E activities to answer every learning question they may have. The following framework can help M&E teams prioritise their learning questions and discover which ones are most ripe for evidence generation. Our team used the Eisenhower Matrix, which scores questions against “urgent” and “important” criteria (figure 1).

Important: How many key project-level decisions rely on filling this evidence gap? Will the answer to this question change or reinforce key project strategic priorities?

Urgent: Will the presence of this evidence gap prevent the project’s progress? How soon will decisions relevant to this question need to be made?

Using your knowledge of the priorities in the climate and sustainability project, you can take a first pass at scoring the learning questions, providing a short justification for each score. Then, ask Evan Dense and his team to validate the scores. Once the scores for “urgent” and “important” are complete, you may want to add a third criterion: feasibility. How likely is it that the proposed methodology will produce valid and reliable insights? Can an evaluation partner answer this question with sufficient time for results to influence decision making? To ease communication with stakeholders, we suggest assigning cut-offs along the numerical scale for high, medium, and low priority questions, as in the sketch below.
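As a rough illustration of this scoring step, here is a minimal sketch in Python. The 1-to-5 scale, the cut-offs, and the example questions are hypothetical assumptions; adapt them to your project’s criteria.

```python
# A minimal sketch of the prioritisation step: score each question on
# "important", "urgent", and "feasible", then bucket it with simple
# cut-offs. The scale and cut-offs are illustrative, not a standard.
from dataclasses import dataclass


@dataclass
class LearningQuestion:
    text: str
    important: int  # 1 (low) to 5 (high): how many key decisions rely on it?
    urgent: int     # 1 (low) to 5 (high): how soon must those decisions be made?
    feasible: int   # 1 (low) to 5 (high): can it be answered in time?

    @property
    def score(self) -> int:
        return self.important + self.urgent + self.feasible  # ranges 3 to 15


def priority(question: LearningQuestion) -> str:
    """Map a total score to a high/medium/low priority bucket."""
    if question.score >= 12:
        return "high"
    if question.score >= 8:
        return "medium"
    return "low"


questions = [
    LearningQuestion("Does mini-grid access increase productive use?", 5, 4, 4),
    LearningQuestion("Which subsidies reduce developers' costs?", 4, 2, 3),
]

for q in sorted(questions, key=lambda q: q.score, reverse=True):
    print(f"[{priority(q)}] ({q.score}/15) {q.text}")
```

Keep a short written justification next to each score; the numbers are only a device for forcing a structured conversation with stakeholders.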

c) Pilot and Get Feedback on the Prioritisation from Decision-Makers

At this stage, we recommend socialising the learning agenda with stakeholders familiar with key decisions, strategic priorities, timelines, and levels of uncertainty. They may be able to identify new evidence gaps or needs and offer clarity on how to tie the learning agenda to strategic questions.

Your conversations with decision-makers will also allow you to gauge the impact potential of a particular evaluation. For a given empirical question, try to get a clearer idea of whether the relevant decision-makers would actually consult the evidence, rather than intuition, established processes, or firmly-held assumptions, when making the decision it informs. We suggest prioritising questions tied strongly to evidence-dependent decisions and deprioritising the others.

You can use this input to rescore some questions and share the “final” list of high-priority learning questions. However, remember to encourage the partner to repeat this process regularly. The hierarchy of questions should adapt as the strategic priorities of the organisation change.

d) Match the Learning Questions to Monitoring and Evaluation Tools

Now, it’s time to dig around your M&E toolbox to find the most appropriate tool to answer each question. One evaluation can answer many questions at once; conversely, evaluation partners may need to employ multiple tools to answer one question adequately. Depending on the audience of your Learning Agenda, you may want to write short descriptions of the M&E tools for a non-specialist audience, with more details in an appendix. Remember, the crux of the Learning Agenda is the learning questions, not the M&E tools used to answer them.

Learning Question: Which subsidies, if any, reduce the cost of production for energy developers and to what extent?

M&E Tool:

  • A qualitative process evaluation will track the flow of money through stakeholders, from government policies (subsidies) to the cost structure of energy production.
  • A quantitative willingness-to-pay evaluation will measure the price needed for energy developers to change their investment and production decisions.

If you choose to present the M&E tools in stakeholder consultations, we advise that you focus on the questions and decisions and introduce the M&E tools simply as a mechanism for answering those questions and informing those decisions.
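For illustration, the matching can be thought of as a simple many-to-many mapping between questions and tools. A minimal sketch, with hypothetical entries:

```python
# A minimal sketch of the many-to-many matching described above: one
# question may need several tools, and one tool may serve several
# questions. Entries are illustrative placeholders.
matches: dict[str, list[str]] = {
    "Which subsidies, if any, reduce production costs for energy developers?": [
        "Qualitative process evaluation (tracing subsidy flows)",
        "Quantitative willingness-to-pay evaluation",
    ],
    "Does mini-grid access increase productive electricity use?": [
        "Quantitative impact evaluation (household survey panel)",
    ],
}

for question, tools in matches.items():
    print(question)
    for tool in tools:
        print(f"  - {tool}")
```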

4. The Do’s and Don’ts for Preparing a Learning Agenda

Do

  • Live at the question and decision level: Centre the questions as the focus of the agenda, where the answer to each question is linked to a decision or set of decisions for the program.
  • Use the Theory of Change as a framework: Extract questions from the ToC assumptions. These questions will ensure that you anticipate as many evidence gaps as possible.
  • Focus on questions: The question development, selection, and prioritisation processes ensure that each question is tied to programmatic priorities, bringing the evidence closer to decision making.
  • Have a clear prioritisation process for questions: Define criteria clearly and score questions against them.
  • Work with a range of internal stakeholders and iterate constantly: Build flexibility and feedback loops into the process. There will be many iterations as the agenda is socialised with stakeholders. Have a clear strategy for identifying stakeholders and for when and how to engage them. You may go back and forth between steps over the rounds of iteration.
  • Socialise the Learning Agenda (if possible): Seek external consultation. We are limited in knowing the full range of questions that need to be asked. Sector experts can help reveal them.

Don’t

  • Live at the monitoring indicator level: Learning questions are meant to be deep, strategic, empirical questions. More often than not, they will interrogate the causal links in the ToC or try to discover the best way to get from one stage to the next.
  • Use business strategy as a framework: Business strategies are often in flux. By anchoring in one, you risk chasing a moving target. Incorporate flexibility into the process so the Learning Agenda can quickly reflect ToC updates, emerging evidence gaps, and reprioritisation.
  • Focus on M&E tools: For example, don’t ask, “what impact evaluation can we do?” This approach is far removed from the key decisions that need evidence. The tools are only valuable if used to generate evidence that informs decisions.
  • Rely on intuition for question prioritisation: Intuition is not a rigorous tool, especially when you are removed from the decision-making process.
  • Use stakeholder workshops only to present and validate the Learning Agenda: Learning Agendas are never final documents. They are always meant to change with the evolution of the program. Even from its development phase, the Learning Agenda should adapt to best respond to the programmatic evidence needs of a given time.
  • Disregard humility: While we have vast expertise, many of us are not experts on the many niche sectors that our projects fall into.

Other Helpful Links
