Measuring the impact of adult education policies

Shermaine Barrett
University of Technology




Abstract

This article explores factors, issues and recommendations that must be considered when measuring the impact of adult education policies. It is based on a study that polled the perspectives of adult education practitioners, academics and policy decision-makers on the critical principles and appropriate measures necessary to determine the impact of adult education policies.

Adult education is rich in regional, national and international policy papers that outline aims and set targets. Among these are the Belém Framework for Action, the Education 2030 Framework for Action, and the various country-level adult education and/or lifelong learning policies. However, as Heck (2004) pointed out, “policies may set directions and provide a framework for change, but they do not determine the outcome directly”. Therefore, a critical aspect in the policy implementation process is the policy evaluation stage. Studying the impact of adult education policies is about evaluating policies, through a robust and systematic process, to determine the extent to which policies have achieved their intended outcomes, to identify notable shortcomings, and to assess whether the policies can be sustained.

This article is based on reflections from eight adult education specialists from Europe, Africa, Latin America and the Caribbean on how to assess impact.

Definition, scope and purpose of adult education

The Belém Framework for Action (2010) defines adult education as “the entire body of ongoing learning processes, formal or otherwise, whereby people regarded as adults by the society to which they belong, develop their abilities, enrich their knowledge, and improve their technical or professional qualifications or turn them in a new direction to meet their own needs and those of their society” (p.1). This definition reveals the breadth and scope of adult education, and therefore its complexity. Consequently, there is no single conceptual framework, set of basic assumptions or body of principles from which all adult educators view the field. UNESCO and many civil society organisations such as the International Council for Adult Education view adult education from a rights perspective, while other entities, including many international funding agencies, take a more economic view of adult education.

A wide range of thoughts and ideas characterise the purpose of adult education. As countries seek to address the increasing demands they face in meeting a number of often conflicting goals driven by the needs of individuals, businesses and society at large, tensions arise around the purpose of adult education (Alfred, Robinson & Alfred 2011). Three concerns contend in debates about the purpose of adult education: i) economic concerns for equipping adults with appropriate workplace competences and skills that allow nations to compete in global markets; ii) civic concerns for an education agenda that prepares adults to participate more fully in public life; and iii) individual concerns for self-development through lifelong education (Kubow & Fossum 2007). Despite these varied concerns, most agree that education [including adult education] creates improved citizens and as such aids in the improvement of the general standard of living in a society (Olaniyan & Okemakinde 2008; McGrath 2010).

Policies: their role and function

According to a UNESCO (2018) online posting, “solid coherent policies and plans are the bedrock on which to build sustainable education systems, achieve educational development goals and contribute effectively to lifelong learning.” Fitzpatrick, Sanders and Worthen (2012) define policy as the broader act of a public organisation or a branch of government that is designed to achieve some outcome or change. A policy may be articulated as a law or a regulation that provides the guidelines to enable the change.

Why should we assess impact?

Impact assessment is the collection, analysis and interpretation of information that sheds light on the effects of a policy change on a set of outcomes of interest, including unintended outcomes. Impact assessment is important to demonstrate the relevance, efficiency, quality and effectiveness of the policy. One respondent noted that: “Policies are established to create an enabling environment to address a need, especially of under-served populations. Therefore, assessing adult education policies becomes important to measure the extent to which the: i) needs of the population have been met; ii) intended goals, objectives, and outcomes of the policy were achieved; iii) fidelity of the policy was met; iv) adequate resources were in place; and v) adjustments that must be made to the policy” (Res.1).

At another level, policy impact assessment is important because – as governments struggle with scarce resources and in many instances budget deficits amidst competing social issues – they need information about the relative effectiveness of programmes and policies in order to make intelligent choices and decisions (Fitzpatrick et al. 2012). One respondent noted that “impact is important for the legitimation of public spending” (Res.7). It is through policy impact studies that governments can determine which policies are working and which are not. Questions such as: i) What is working well? ii) What is failing? and iii) What can be done to improve the situation? are key drivers of such assessment. This further speaks to the issue of accountability, which those polled in this survey felt was an essential reason for conducting adult education impact studies.

Another important but less discussed reason for assessing policy that arose in the study is that the results can be used as an advocacy tool. One respondent noted that “the proof of impact is crucial for advocacy, especially as ALE faces the challenges that many decision-makers think about it as a ‘nice to have’, with a limited impact compared to schooling.” (Res.7)

“Three themes influencing policy outcomes emerged from the practitioners surveyed for this article. They are context, type of policy and the goal of the policy.”

How should we assess impact?

Another key question of this article is: How should impact assessment be conducted? This includes: i) the types of outcomes to be evaluated, ii) how to infer impact, iii) data collection methods, iv) types of evidence to be used, and v) how the evaluation should be financed.

Three themes influencing policy outcomes emerged from the practitioners surveyed for this article. They are context, type of policy and the goal of the policy. In terms of context, the idea is that social policy, such as an adult education policy, cannot be divorced from its social context. One respondent noted that “types of outcome would be context dependent. A generic one-size-fits-all may not do” (Res.3). In terms of type of policy, the general direction is that what is evaluated is dependent on whether the policy was sector wide or specific to one segment of the educational sector. One respondent said that the outcomes to be assessed depend a lot on the type of policy and how broad the policy “intervention” is: Does it refer to all adult education sectors, or only some of its segments? Is it about certain target groups, or certain issues, or a set of principles that need to be established? (Res.2). Another respondent said that impact “will clearly depend on the objectives established for the policy. Goals should generate impacts.” (Res.6)

Taking into consideration the three elements discussed above, outcomes identified include educational outcomes, economic outcomes, well-being and social outcomes. One respondent noted that outcomes should include the following:

  • Educational outcomes – What percentage of the adult population has attained basic literacy skills such as reading, writing, numeracy and computer skills? What percentage of adults have upgraded their educational level?
  • Economic outcomes – How many adult learners have gained employment with higher wages and income due to participation in adult education programmes? Attention should be placed on the attainment of improved standards of living.
  • Well-being outcomes – Have adults gained improvements in self-confidence, communication skills, emotional and physical health, and soft skills?
  • Social outcomes – How have adult learners acquired improved civic attitudes? Have they become active citizens as a result of their engagement in adult learning programmes?
  • Environmental outcomes – To what extent [has] adult learners’ knowledge of the environment and its impacts increased? (Res.8)

Turning to how we infer impact, one respondent argued that “impact would be a clear indication of the measures that indicate change in the problem based on what obtained at the beginning and later on at the point of measuring” (Res.4). This suggests that to be able to infer impact there must be a set of agreed indicators of success or lack thereof. Outcome indicators are the quantitative or qualitative variables that provide a simple and reliable means to measure achievement and to reflect the changes connected to an intervention (Kusek & Rist 2004).

“It is very important to be aware of the paradigm behind our assessment approach [...] since it will determine the results.”

Respondent 2

Measuring the right things

In terms of measures that indicate change, the Belém Framework for Action (2010) provides a set of reliable performance indicators: i) enabling policy environment and adequately resourced policy established; ii) good governance structures established; iii) quality provisions established; iv) increased participation, inclusion and equity for the under-served population; and v) adequate financing in place. However, use of these indicators is dependent on one’s views on the purpose of adult education. This is because one’s beliefs about the purpose, role and function of adult education will undoubtedly impact the perception of what indicates impact. One respondent summarised the importance of perspective on decisions about indicators very well by noting that: “It is very important to be aware of the paradigm behind our assessment approach (which includes methodology, indicators...), since it will determine the results. Exclusive use of one approach (for example – the human capital approach) may reduce our perception of the effects and impact and even create a false image of the policy. Critical discourse analysis is helpful in preparing the impact evaluation, and in collecting data we should be aware of the goals, methodologies and indicators based on certain paradigms (positivist, interpretative, transformative or critical-emancipatory). This may also shed light on different kinds of biases that may influence the analysis”. (Res.2)

In terms of the type of evidence that would demonstrate impact, the respondents felt that this included both statistical measures and qualitative data. One respondent put it this way: “Evidence collected from both quantitative data (statistics in education but also in other sectors, depending on the scope of evaluation), various statistical measures and various qualitative data, as indicators should inform about the positive changes caused by the policy intervention and prove that they happened because of it ... Indicators should be chosen before the implementation, but some new, not-planned indicators and unintended consequences can be included.” (Res.2)

Another participant cautioned that “it is important to stay mindful that impact can be favourable and unfavourable” (Res.3). In keeping with this thought another noted that “evidence on all aspects of both positive and negative feedback would be necessary to demonstrate impact.” (Res.8)

When to conduct impact assessment

When to conduct policy impact assessment is an important question to be considered when thinking about how to measure policy impact. It is the agreed position of the respondents that assessment should be done during and after the life of the policy. In other words, it should be continuous. It was noted by one respondent that “results-based monitoring and evaluation of policy recommends that ‘impact’ assessment be done at the end of the policy implementation cycle; however, a process of formative evaluation should be employed during the implementation of the policy” (Res.1). The idea of continuous assessment is supported by Heck (2004), who recommended that studies of policy effectiveness use a longitudinal approach, collecting data at several points during the implementation and effect stages.

Who should sit at the assessment table?

How we collect the impact assessment data and from whom are other concerns that must be addressed when planning. Public policies usually target citizens, or groups of citizens, depending on the scope of the policy. Therefore, any evaluation of such policies should include the targets of those policies. Learners, who are the targets of adult education policies, must be included. They are the persons who are either directly or indirectly impacted by the policy interventions.


Additionally, the various stakeholders “such as governments/education ministries/departments funding agencies, institutes that promote and deliver adult education programmes, programme administrators and facilitators ....” (Res.8) should be included. Further, depending on the nature of the policy, the stakeholder group may include the private sector and various civil society entities.

Collecting data from such a wide cross-section of persons will undoubtedly call for methods beyond numbers and statistics as agreed by the respondents in this study. A mixed method approach to data collection was advanced by all. In support of this point one respondent said “I think quantitative (e.g. surveys, online and other), as well as more qualitative type data (e.g. interviews), along with unobtrusive measures, are needed to afford a comprehensive analysis” (Res.3). In a more detailed response, another respondent said: “Different approaches can be used: observation, interviews, questionnaires, surveys, comparison (between the groups affected by the policy intervention and those not affected; multiple data points – same groups between pre- and post-intervention; between the regions, sectors ...), experiments, testing, report analysis, focus groups, study visits, targeted meetings, narrative data, personal stories, case studies etc.” (Res.2)
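The comparison strategies this respondent lists – groups affected by the policy versus groups not affected, measured before and after the intervention – amount to a difference-in-differences style of inference. A minimal sketch of that calculation is shown below; the literacy-rate figures are entirely hypothetical and for illustration only, not data from the study.

```python
# Minimal difference-in-differences sketch. All figures are
# hypothetical literacy rates (%); they are not data from any
# real adult education evaluation.

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Change observed in the treated group minus the change
    observed in the comparison group over the same period."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical regions: one covered by the policy, one not.
policy_region = {"pre": 62.0, "post": 71.0}
comparison_region = {"pre": 61.0, "post": 64.0}

impact = diff_in_diff(policy_region["pre"], policy_region["post"],
                      comparison_region["pre"], comparison_region["post"])
print(f"Estimated policy impact: {impact:+.1f} percentage points")
# → Estimated policy impact: +6.0 percentage points
```

Subtracting the comparison group’s change removes trends that would have occurred anyway, which is why the respondent pairs “affected vs. not affected” with “pre- and post-intervention” data points rather than relying on either comparison alone.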

These points are supported by the views of Streatfield (2009), who advocated for both quantitative and qualitative methods: qualitative data is useful for assessing effectiveness, while quantitative data is necessary to assess efficiency.

“To the extent possible, it would be helpful to stay mindful that all is not amenable to measurement.”

Respondent 3

The issue of financing impact assessment is vital, as it has the potential to influence who is engaged, how the study is conducted, which areas of the policy are targeted and, importantly, how the results are used. In addressing this aspect there was agreement among the respondents that governments should be a key source of funding since they should have a vested interest in the results. One respondent noted that “a study about the impact of adult education policies is very important to the development of any country. Thus ... it should be financed by the Government/Ministry of Education.” (Res.8)

In addition to governments, it was also felt that the private sector and international development partners could be other sources of funding and that the scope and type of policy would influence the source of funding. In this regard, one respondent said that: “it depends on the type and scope of policy intervention – it could be some governmental body (as it is probably the main actor in policy planning and implementation) that will bear the costs, but for certain aspects of impact the groups involved or affected could participate in evaluation (for example the private sector that takes part in VET [vocational education and training] policy creation and implementation). ODA [official development assistance] might [also] be one of the financial sources.” (Res.2)

Other thoughts included the idea that “in an ideal world, when policy is promulgated, funds should be set aside by whoever is promulgating the policy … .” (Res.5). Similarly another respondent noted that the policy assessment should be “a percentage of the policy budget [and] it should be budgeted into the original project.” (Res.6)


One respondent highlighted the point that: “Measuring the impact of policies in general, and particularly in adult edu­cation, can be a messy process. To the extent possible, it would be helpful to stay mindful that all is not amenable to measurement. For example, insights may be equally important to guide understanding. Such information may come in the form of change stories from key stakeholders. Other relevant considerations include political influences, how the policy was understood, communicated, and implemented, etc. Identifying key knowledgeable informants who can provide insights into potential unseen dynamics such as attitudes or relationships that could have influenced impact is also relevant.” (Res.3)

Another respondent alerted us to a challenge that may be more noticeable in some jurisdictions than others: “researchers may encounter difficulties as there may be no specific written policies with regard to adult learning and education, and there may be a lack of official records of work actually done in adult learning and education.” (Res.8)

Two things to keep in mind

In conclusion, I would like to take us back to two points made earlier in this article. Firstly, “solid coherent policies and plans are the bedrock on which to build sustainable education systems, achieve educational development goals and contribute effectively to lifelong learning” (UNESCO 2018). However, policies may set directions and provide a framework for change, but they do not determine the outcomes directly (Heck 2004). Therefore, the periodic study of the effectiveness of such policies is vital to ensuring that the policies are accomplishing what they set out to do. In conducting such studies, however, careful attention must be given to how and when they are conducted, and who should participate. We should carefully articulate our objectives, consider the impact indicators, decide on the evidence to be collected, and then decide how to use that evidence to establish the impact and, ultimately, the value of the policy.


Alfred, M. V., Robinson, P. E. and Alfred, M. C. (2011): Adult education and lifelong learning in the Caribbean and Latin America.

Barrett, S. (2014): Adult education, social change and development in Post-Colonial Jamaica: Investing in human capital. Germany: Lambert Academic Publishing.

Fitzpatrick, J. L., Sanders, J. R. and Worthen, B. R. (2012): Program evaluation: Alternative approaches and practical guidelines (4th ed.). New Jersey: Pearson Education.

Heck, R. H. (2004): Studying educational and social policy: Theoretical concepts and research methods. London: Lawrence Erlbaum Associates.

Kubow, P. and Fossum, P. (2007): Comparative and international education. Exploring issues in international context. New Jersey: Pearson Education, Inc.

Kusek, J. Z. and Rist, R. C. (2004): Ten steps to a results-based monitoring and evaluation system. Washington, DC: World Bank.

McGrath, S. (2010): The role of education in development: An educationalist’s response to some recent work in development economics. Comparative Education, 46(2), 237-253.

Olaniyan, D. A. and Okemakinde, T. (2008): Human capital theory: Implications for educational development. European Journal of Scientific Research, 24(2), 157-162.

Streatfield, D. (2009): What is impact assessment and why is it important?

UNESCO (2018): Education policy review.

UNESCO Institute for Lifelong Learning (2010): Belém framework for action.

About the author

Shermaine Barrett (PhD) is a Senior Lecturer at the University of Technology, Jamaica. She currently serves as Vice President for the Caribbean Region of the International Council for Adult Education (ICAE) and President of the Jamaican Council for Adult Education (JACAE). Her research interests include adult teaching and learning, workforce education and teacher professional development.

