Monitoring competencies and skills for life and work


Aaron Benavot, Director
Global Education Monitoring Report
UNESCO

Alasdair McWilliam
Global Education Monitoring Report
UNESCO 


Abstract – In September 2015, the UN Member States adopted the 2030 Agenda for Sustainable Development, and with it SDG 4, the global education agenda succeeding Education for All (EFA). Coverage of adult education and training, including skills development, under three of the ten SDG 4 targets is welcome. However, as this article discusses, each of these targets presents particular conceptual and measurement challenges. While important steps have been taken since the adoption of the SDGs, many practical challenges in collecting comparable data across countries have yet to be tackled.


In September 2015, the UN Member States adopted the 2030 Agenda for Sustainable Development with 17 goals and 169 specific targets, which address a broad set of economic, social and environmental challenges. The new universal agenda includes a standalone goal on quality education and lifelong learning (SDG 4), with 10 targets, which effectively succeeds and, in many ways, expands on the unfinished Education for All (EFA) and MDG agendas. In November 2015, more than 100 ministers of education adopted the Education 2030 Framework for Action, which reflects the ambition and principles of SDG 4, as well as the issues related to its implementation. SDG 4-Education 2030 specifically commits countries to one year of free and compulsory pre-primary education, universal primary and secondary education, equal opportunity to post-basic education, attention to equity, as well as inclusive, effective and relevant learning outcomes.

SDG 4 includes several targets to increase the skills and competencies of the adult population, with direct measures of skills an explicit part of the monitoring framework. However, much work remains to ensure that the necessary data is made available within a reasonable timeframe (say, 5–7 years).

More ambition, wider scope, and skills added

In the EFA agenda, education was viewed as a fundamental human right. Likewise, the SDG agenda, including the fourth global goal on education, is rooted in a rights-based approach that ensures universal access to basic education within a lifelong learning perspective. While the two global education agendas are similar in spirit, SDG 4 is considerably more comprehensive and ambitious, both in terms of the scope of the education targets and the extent to which progress towards them will be explicitly informed by data. Notable differences include targets beyond basic education (e.g. upper secondary education, higher education, TVET, skills for employment), a greater emphasis on disaggregated data for monitoring targets, and an orientation to reporting on learning outcomes and skills in addition to measures of access and completion1.


The new education agenda also differs somewhat from EFA in how skills and competencies are addressed. Under EFA, objectives to improve adult skills were included under goal 3 (Promote learning and life skills for young people and adults) and goal 4 (Increase adult literacy by 50 percent). In contrast, adult skills are included in three of the SDG targets. Under SDG 4, adult skills are primarily covered under target 4.4, to provide “relevant” skills for decent work and employment (removing the reference to life skills), and target 4.6, to achieve literacy and numeracy among a substantial proportion of adults. Target 4.7, on the knowledge and skills needed to promote sustainable development, in theory covers a range of outcomes related to “life skills”, although many of these would more accurately be described as attitudes and beliefs rather than skills and competencies.

Under the new agenda the shift towards quantitatively monitoring outcomes is clear, especially in relation to children, youth and adults. The EFA goal on skills had little explicit basis for measurement. In practice, the monitoring of adult skills largely took place under EFA goal 4 on adult literacy, drawing on literacy data from national censuses and select household surveys. In comparison, building country agreement on global and thematic indicators has been a central part of the SDG process and implementation framework, with indicators being established for each of the SDG targets. However, the actual monitoring of the SDGs presents significant challenges in generating new data that accurately capture the target objectives.

Three main levels proposed

Monitoring of the 169 SDG targets is to take place on three main levels, each associated with a distinct set of indicators:

• Global: a limited set of globally comparable indicators formulated by the Inter-Agency and Expert Group on Sustainable Development Goal Indicators (IAEG-SDGs) to monitor the 169 targets (with at least one indicator per target). Countries will be obliged to report on these indicators, with the results presented in an annual progress report.

• Thematic: a wider set of globally comparable indicators that better reflect the specific objectives of each target. Here the Technical Advisory Group (TAG), coordinated by the UNESCO Institute for Statistics, proposed a broad set of 43 indicators as part of the Education 2030 Framework for Action.

• National and regional: Under the SDG agenda, national governments and regional entities are expected to: a) selectively draw on the proposed TAG indicators and b) consider developing other indicators, more closely aligned with local contexts and regional priorities.

In regard to adult education and skills under SDG 4, Table 1 lists the global and thematic indicators for the three relevant targets. These indicators have yet to be officially adopted, and are expected to be improved and refined over time. Regardless of the final indicator framework, however, priority will be given to nationally representative data that is comparable and standardised. Given that most countries do not currently collect data for many indicators, efforts will be made to strengthen national statistical capacities and define common international statistical standards and tools. These challenges are compounded by the need for data disaggregated by gender, wealth, location and other relevant social groups, as specified by target 4.5 and the UNSG Synthesis Report2 (Table 1).

Each of the listed targets presents particular conceptualisation and measurement challenges. The efforts made to establish a concise and measurable set of indicators at the global and thematic level are ambitious. However, the practical challenges of comprehensively collecting data across countries have yet to be tackled. Questions also remain over what benchmarks are appropriate across countries, and whether the proposed indicators are adequately aligned with the objectives of the targets (and if not, what indicators at the regional/national level might be more appropriate). Let us have a closer look at these concerns.

Target 4.4: Skills for employment, decent jobs and entrepreneurship

Both the global and thematic levels propose measures of ICT skills/digital literacy as an indicator of whether adults and youth have relevant skills for decent work. The current proposal for the global indicator is a set of nine self-assessment questions relating to an array of technology-related tasks (e.g. whether the respondent has recently copied or moved a file, used basic arithmetic formulae in a spreadsheet, or written a computer program using a specialised programming language). The advantage of such an approach is that data can be relatively easily captured by national statistical agencies through existing traditional household surveys. On the downside, the indicator does not determine whether the activities were undertaken effectively. Since the reference period for the activities is the previous three months, it can also be argued that the indicator better captures whether the respondent is employed in a white-collar occupation (where use of ICT technology is part of the job description) than whether the respondent has the skills to access “decent work” more broadly.
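
To make this computation concrete, the short sketch below shows how such a self-reported indicator could be tabulated from household-survey microdata, including the disaggregation by sex called for under target 4.5. The column names and data are hypothetical, and the code illustrates only the general logic of the proposed measure, not an official UIS methodology.

# Illustrative sketch (not an official method): share of respondents who
# report having carried out at least one of the listed ICT activities in
# the previous three months. Column names and data are hypothetical.
import pandas as pd

# Hypothetical household-survey extract: one row per respondent,
# one 0/1 column per self-reported ICT activity.
survey = pd.DataFrame({
    "sex":           ["F", "M", "F", "M", "F"],
    "copied_file":   [1, 0, 1, 0, 0],
    "spreadsheet":   [0, 0, 1, 0, 0],
    "wrote_program": [0, 0, 0, 0, 0],
})

activity_cols = ["copied_file", "spreadsheet", "wrote_program"]

# A respondent counts towards the indicator if any activity was reported.
survey["any_ict_activity"] = survey[activity_cols].any(axis=1)

# Overall proportion, and disaggregation by sex as required by target 4.5
# (wealth, location and other groups would follow the same pattern).
overall = survey["any_ict_activity"].mean()
by_sex = survey.groupby("sex")["any_ict_activity"].mean()

print(f"Overall: {overall:.0%}")
print(by_sex)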

A more detailed direct measurement of ICT skills at the thematic, national or regional levels may therefore prove valuable. Existing cross-country examples of such measurements include the Assessment and Teaching of 21st Century Skills and the IEA International Computer and Information Literacy Study (ICILS), both covering students, and the Programme for the International Assessment of Adult Competencies (PIAAC), which measures problem-solving skills among adults in OECD countries. These assessments share common outcomes with the measurement proposed for global monitoring, but are assessed on a scale, demand higher-level cognitive abilities and follow a sequence of increasing complexity4.

However, given uncertainty over exactly which ICT skills are most relevant for employment outcomes, governments may wish, at the country level, to draw upon broader measures of cognitive skills that have been shown to affect a range of employment outcomes (Hanushek et al., 2013; Valerio et al., 2015). In this respect, adult skills surveys such as PIAAC and the World Bank’s Skills Toward Employment and Productivity (STEP) survey provide examples of measuring literacy and numeracy in ways that can be comparably applied across countries5. Aside from capturing a broader range of cognitive abilities necessary for decent and productive employment, assessments of literacy and numeracy may also be more relevant to the labour markets of lower income economies.


Ultimately, however, what is considered decent work, as well as the attributes required to access this work, will vary by country and over time. Formulating a precise benchmark of relevant skills for decent employment that remains strongly predictive across countries over the next 15 years is therefore an extremely difficult task. In addition to reporting on the global and thematic indicators for target 4.4, governments need to establish national benchmarks across the skill measures deemed most relevant to their economic contexts.

Governments should also seek to continue and expand the collection of more traditional survey measures of educational attainment, as specified in the proposed thematic indicators. The range of occupations that meet standards of decent work is wide and varied, as are the skills and knowledge required to perform in such employment. Broad measures of skill, whether literacy, numeracy or ICT, can only capture a portion of the “relevant” worker attributes.

Target 4.6: Literacy and numeracy

As in the EFA goals, adult literacy retains a separate target under SDG 4. Yet the monitoring of adult literacy under the new agenda entails notable developments. The first is an explicit targeting of numeracy, which was not formally monitored under EFA. The second is a tacit shift towards assessing skill proficiency on a scale, moving away from the binary measurements of literacy employed as part of the EFA agenda6, and allowing a more accurate and nuanced understanding of adult capabilities. Ensuring that such data becomes widely available in the coming years requires that operational definitions of literacy and numeracy are agreed, developed into a scale valid across languages and cultures, and applied cost-effectively within surveys.

Among high-income countries, the most comprehensive source on adult literacy and numeracy proficiency is the PIAAC assessment and its predecessors, the International Adult Literacy Survey (IALS) and the Adult Literacy and Life Skills Survey (ALL). Within low and middle-income countries, the prominent surveys are the UIS Literacy Assessment and Monitoring Programme (LAMP) and STEP. The latter uses a methodology comparable to PIAAC in estimating literacy. On the whole, however, few countries outside the OECD have participated in such surveys.

Recently, OECD, UIL and UIS have proposed developing a short standardised literacy assessment drawing on PIAAC items, which could be implemented across a wider range of countries. This would assess respondents along a scale, divided into six proficiency levels. For example, individuals at level 2 literacy “can integrate two or more pieces of information based on criteria, compare and contrast or reason about information and make low-level inferences”.
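
To illustrate what reporting on a proficiency scale (rather than a binary literate/illiterate measure) involves, the sketch below maps hypothetical scale scores to six proficiency bands. The cut scores follow the published PIAAC literacy levels, but a short SDG assessment could define its own bands; the code is a sketch of the logic only.

# Illustrative sketch: assigning literacy scale scores (0-500) to six
# PIAAC-style proficiency bands. Cut scores mirror the published PIAAC
# literacy levels and are shown for illustration only; respondent scores
# below are hypothetical.
import bisect

CUTS = [176, 226, 276, 326, 376]   # lower bounds of Levels 1-5
LEVELS = ["Below Level 1", "Level 1", "Level 2",
          "Level 3", "Level 4", "Level 5"]

def proficiency_level(score: float) -> str:
    """Map a scale score to its proficiency band."""
    return LEVELS[bisect.bisect_right(CUTS, score)]

scores = [150, 240, 268, 310, 390]  # hypothetical respondent scores
counts = {lvl: 0 for lvl in LEVELS}
for s in scores:
    counts[proficiency_level(s)] += 1

# Report the share of respondents at each proficiency level.
for lvl, n in counts.items():
    print(f"{lvl}: {n / len(scores):.0%}")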

Implementing such an assessment across countries presents several practical and technical challenges. A primary concern is agreeing on a common measurement of literacy/numeracy and ensuring that the assessment is comparable across language groups. This applies not only across countries, but also within ethnically diverse states where many may not speak the official language.


An additional, related challenge is designing a survey which national statistical agencies – often understaffed and underfunded – are able to conduct affordably. The complex sample and psychometric design of the PIAAC assessment, together with its duration (on average 50 minutes for the cognitive assessment7), make it technically difficult and costly to implement. An alternative is a shorter and simpler assessment module drawing on a limited common pool of items. This would have the advantage of simplicity and reduced costs, but would compromise the depth and validity of the assessment.

Target 4.7: Knowledge and skills needed to promote sustainable development and global citizenship

Incorporating key objectives of sustainable development, target 4.7 can be considered a bold, and arguably necessary, endeavour to align global education with the post-2015 sustainability agenda. Yet monitoring the target presents significant challenges. None of the proposed global or thematic indicators explicitly addresses adult skills and competencies. This sub-section therefore focusses on possible measures of adult skills at the national or regional level.

Developing indicators related to global citizenship and equality that are meaningful across a large spectrum of socio-economic conditions, political systems and religious beliefs presents significant challenges. One good example is the 2009 International Civic and Citizenship Education Study (ICCS) assessment, conducted among eighth-grade students. The assessment, based on a 79-item test administered across 38 countries in Europe, Asia and Latin America, measures conceptual knowledge, understanding and attitudes related to citizenship. The survey includes globally relevant items on attitudes, but also regional modules to capture issues more pertinent to local contexts8.

For 2018, participating PISA countries are developing an assessment of “global competence”. Global competence is defined as “the capability and disposition to act and interact appropriately and effectively, both individually and collaboratively, when participating in an interconnected, interdependent and diverse world”. Key dimensions of the assessment include communication and behaviour to interact appropriately and effectively with others holding diverse perspectives, and knowledge of and interest in global developments, challenges and trends.

Considering adults’ greater influence on political outcomes and their more solidified views, comprehensively measuring knowledge and attitudes among adults is highly relevant to the SDG agenda. However, the funding and organisation necessary to implement detailed surveys are likely to be less forthcoming. With this in mind, opinion polls could provide a useful measure of relevant attitudes to sustainable development issues. For example, international opinion surveys, such as the regional barometer surveys and the World Values Survey, include some questions on attitudes, values and behaviour patterns, such as tolerance and equal opportunity, social and interpersonal trust, identity, and the environment and global warming. While the data gained from such surveys is less detailed, including a limited set of items in an existing instrument can prove both informative and cost-effective.


Notes

1 / While EFA goal 6 specified that literacy, numeracy and life skills should be improved through formal education, this largely did not result in the generation of comparable data on learning outcomes across countries, particularly beyond secondary school.

2 / The UN Secretary General’s Synthesis Report specifies that targets can only be considered achieved if they have been met for all relevant income and social groups, requiring clear levels of disaggregation for SDG indicators. In regard to SDG 4, the UIS, UNICEF, the World Bank and the OECD launched an inter-agency group on disaggregated indicators in education on 5 April 2015 to ensure harmonisation of standards and methodologies for equity.

3 / Each of the global indicators are included in the thematic monitoring framework.

4 / For example, in the problem-solving component of the PIAAC survey, one question asks respondents to access and evaluate information in the context of a simulated job search, finding one or more sites that do not require users to register or pay a fee.

5 / STEP does not assess numeracy.

6 / During the EFA agenda, assessment of literacy was commonly based on whether respondents in the Demographic and Health Surveys (DHS) and the Multiple Indicator Cluster Surveys (MICS) were able to read a simple sentence aloud.

7 / Covering literacy, numeracy and problem solving in technology-rich environments.

8 / The 2016 ICCS assessment will incorporate more items relevant to sustainable development. Students will be asked to rate the seriousness of a broad range of threats such as the extent of poverty, living standards, human dignity, economic well-being, and environmental health (Schulz et al., 2016).


References

Hanushek, E. A., Schwerdt, G., Wiederhold, S. and Woessmann, L. (2013): Returns to Skills Around the World: Evidence from PIAAC. Paris: OECD. (OECD Education Working Paper, 101.)

Valerio, A., Puerta, M. L. S., Tognatta, N. and Monroy-Taborda, S. (2015): Are There Skills Payoffs in Low and Middle Income Countries? Empirical Evidence Using STEP Data. Conference paper for 10th IZA/World Bank Conference on Employment and Development: Technological Change and Jobs, Bonn, Germany, Institute for the Study of Labor (IZA). 


About the authors

Aaron Benavot is the Director of the UNESCO Global Education Monitoring Report. Prior to this post, he served as Professor in the School of Education at the University at Albany-State University of New York, USA, and consulted for UNESCO, its Institutes, and UNICEF.

Contact
a.benavot@unesco.org 

Alasdair McWilliam joined the UNESCO Global Education Monitoring Report Team in December 2011. Previously he was a research consultant with the Centre for Aid and Public Expenditure at the Overseas Development Institute in London, working on aid effectiveness and public expenditure, alongside broader public policy issues in low and middle income countries.

Contact
a.mc-william@unesco.org 
