Appendix 1: Definitions of 21st-century skills
|Skill|Definition|
|---|---|
|Literacy|Ability to read, understand and use written language|
|Numeracy|Ability to use numbers and other symbols to understand and express quantitative relationships|
|Scientific literacy|Ability to use scientific knowledge and principles to understand one's environment and test hypotheses|
|ICT literacy|Ability to use and create technology-based content, including finding and sharing information, answering questions, interacting with other people and computer programming|
|Financial literacy|Ability to understand and apply conceptual and numerical aspects of finance in practice|
|Cultural and civic literacy|Ability to understand, appreciate, analyse and apply knowledge of the humanities|
|Critical thinking/problem-solving|Ability to identify, analyse and evaluate situations, ideas and information to formulate responses and solutions|
|Creativity|Ability to imagine and devise new, innovative ways of addressing problems, answering questions or expressing meaning through the application, synthesis or repurposing of knowledge|
|Communication|Ability to listen to, understand, convey and contextualize information through verbal, nonverbal, visual and written means|
|Collaboration|Ability to work in a team towards a common goal, including the ability to prevent and manage conflict|
|Curiosity|Ability and desire to ask questions and to demonstrate open-mindedness and inquisitiveness|
|Initiative|Ability and desire to proactively undertake a new task or goal|
|Persistence/grit|Ability to sustain interest and effort and to persevere to accomplish a task or goal|
|Adaptability|Ability to change plans, methods, opinions or goals in light of new information|
|Leadership|Ability to effectively direct, guide and inspire others to accomplish a common goal|
|Social and cultural awareness|Ability to interact with other people in a socially, culturally and ethically appropriate way|
Sources: ESCO Skills Hierarchy for Transversal Skills; Partnership for 21st Century Skills. "Framework for 21st Century Learning." Washington, DC, 2001; Burkhardt, Gina. "enGauge 21st Century Skills: Literacy in the Digital Age." North Central Regional Educational Laboratory and The Metiri Group, 2003; Learning Metrics Task Force. "Towards Universal Learning: What Every Child Should Learn." Center for Universal Education at the Brookings Institution and UNESCO Institute for Statistics: Washington, DC, 2013; The Economist Intelligence Unit. "The Learning Curve: Education and Skills for Life." Pearson: London, 2014. Other sources considered but not included: AT21CS, WorldSkills, Iowa Dept. of Education's 21st Century Skills, and Tony Wagner's Seven Survival Skills.
Appendix 2: The measurement challenge
Measuring 21st-century skills presents numerous obstacles. Researchers have access to only limited direct metrics to assess performance on the full range of skills. In addition, the coverage of these metrics is often confined to the developed world.
The majority of tests measuring 21st-century skills focus on foundational literacies. Beyond the indicators we used in our methodology – the Programme for International Student Assessment (PISA), the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) and the Latin American Laboratory for Assessment of the Quality of Education (LLECE) – other tests that measure literacy, numeracy and scientific literacy include the Progress in International Reading Literacy Study (PIRLS), the Early Grade Reading Assessment (EGRA), the Program for the Analysis of Education Systems (PASEC) and the Trends in International Mathematics and Science Study (TIMSS).
The three other literacies – financial, ICT and cultural and civic – have not traditionally been a focus of international assessments, so fewer data and assessments are available to draw on. The only test currently available for financial literacy is PISA, but that test covers only 16 countries. For civic and cultural literacy, we evaluated two direct measurements, the International Civic and Citizenship Education Study (ICCS) and the Civic Education Study (CivEd), and picked ICCS for its wider coverage. Finally, we used PISA's digital literacy assessment, which is a valuable assessment but has limited global coverage.
We found large gaps in coverage in the measurement of many core skills. When we combined existing metrics for literacy and numeracy, for example, we were able to cover fewer than half of the countries in the world.
Measurement challenges are amplified when it comes to competencies and character qualities. PISA has pioneered the assessment of problem-solving, a key competency. This assessment still covers only approximately 44 countries. For creativity, communication and collaboration, no direct measurements exist to date. For creativity, we used a proxy from one of the sub-scores in PISA’s mathematics assessment. We encountered difficulties finding metrics that measure character qualities, with the exception of curiosity. For that metric, we used PISA’s problem-solving subscale.
Note that PISA is in the middle of promising work to extend its 2015 and 2018 assessments. It plans to add collaborative problem-solving and global competencies, measuring skills such as intercultural understanding, empathy and perspective taking.
It is critical that countries support and facilitate research to improve both the direct measurement of 21st-century skills and their global coverage. Only then will countries be able to create an accurate baseline from which to measure progress in the future.
Appendix 3: Indicators considered and used in the report
|Skill|Indicators considered|
|---|---|
|Cultural and civic literacy| |
|Social and cultural awareness| |

Factors influencing indicator selection include broad country coverage, direct measurement of the skill and independent assessment.

Bold denotes the indicator selected to measure skill performance.
Indicators considered and used to estimate educational factors holding countries back:

|Category|Focus|
|---|---|
|Policy Enablers|Standards that govern K-12 education|
|Human Capital|Teacher quality, training and expertise|
|Financial Resources|Importance of education in public budgets|
|Technological Infrastructure|Access to new digital tools and content via the internet|

Ideal quality metric would have included:

Sources: OECD, UNESCO, American Educational Research Association.

Other factors, such as socioeconomic status and conflict, also present significant challenges to educational attainment.

Bold denotes the indicator selected to measure skill performance.
Appendix 4: Countries with available skill data included in the report
|High-income OECD|High-income non-OECD|Upper-middle income|Lower-middle income|Low-income|
|---|---|---|---|---|
|Iceland|Trinidad and Tobago|Kazakhstan| | |
|Ireland|United Arab Emirates|Malaysia| | |
|Slovak Republic| |South Africa| | |
Sources: World Bank; project team analysis.
Appendix 5: A comparison of performance data across tests
Many countries use widely varying measures to assess similar skills, making comparisons between countries’ absolute scores on comparable tests difficult. To increase the number of countries with comparable data for literacy, numeracy and scientific literacy, we conducted a crosswalk analysis, which allows researchers to compare results on tests of comparable skills that use widely different scales.
For countries in Africa and Latin America that had taken only the SACMEQ or LLECE tests and not the PISA test, we devised a way to convert those region-specific test scores into the equivalent of PISA scores. We looked at the handful of countries in those areas that had taken both PISA and either of the SACMEQ or LLECE tests in 2009 or 2012, in order to calculate an average conversion factor from each of the two regional tests into PISA. We then applied that conversion factor to SACMEQ and LLECE scores to translate them into PISA scores. Since we did not have access to the raw data, we assumed that the statistical distribution of the converted scores corresponded to the distribution of the original scores. The methodology allowed only for a ranking-based comparison of the countries studied, not an absolute score assessment. We therefore have not provided the converted scores for comparison but rather used percentile ranks. This approach draws on the more advanced methodology demonstrated by Altinok and Murseli, as well as Hanushek, Peterson and Woessmann, and it is intended to provide an indicative comparison among countries rather than a rigorous assessment of relative performance.
As a result of the analysis, we increased the sample size from 72 to 91 countries. In particular, coverage for the lower-middle-income cluster increased from two to 12 countries and coverage for the low-income cluster increased from zero to six countries.
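The crosswalk described above can be sketched as follows. This is a simplified illustration of the approach, not the report's actual procedure or data: the country names, scores, and the exact form of the conversion factor (an average PISA-to-regional-score ratio) are assumptions for demonstration.

```python
# Sketch of the crosswalk: estimate a regional-test-to-PISA conversion factor
# from countries that took both tests, apply it to regional-only countries,
# then compare countries by percentile rank rather than absolute score.
# All countries and scores below are illustrative placeholders.

def conversion_factor(dual_scores):
    """Average ratio of PISA to regional-test scores across the
    countries that took both assessments."""
    ratios = [pisa / regional for pisa, regional in dual_scores]
    return sum(ratios) / len(ratios)

def percentile_ranks(scores):
    """Map each country to the share of countries scoring at or below it."""
    ordered = sorted(scores.values())
    n = len(ordered)
    return {c: sum(s >= o for o in ordered) / n for c, s in scores.items()}

# Hypothetical countries that sat both PISA and a regional test: (PISA, regional)
dual = [(469, 512), (384, 476)]
factor = conversion_factor(dual)

# Hypothetical regional-only countries, converted into PISA-equivalents
regional_only = {"Country A": 495, "Country B": 521}
converted = {c: s * factor for c, s in regional_only.items()}

# Pool with countries that took PISA directly, then rank everyone
pisa_direct = {"Country C": 452, "Country D": 501}
ranks = percentile_ranks({**converted, **pisa_direct})
```

Because only percentile ranks are reported, the converted scores themselves never need to be interpreted as true PISA scores, consistent with the ranking-only caveat above.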
- ^ PISA 2012 mathematics subscale: "For individuals to use their mathematical knowledge and skills to solve a problem, they often first need to translate the problem into a form that is amenable to mathematical treatment. The framework refers to this process as one of formulating situations mathematically. In the PISA assessment, students may need to recognize or introduce simplifying assumptions that would help make the given mathematics item amenable to analysis. They have to identify which aspects of the problem are relevant to the solution and which might safely be ignored. They must recognize words, images, relationships or other features of the problem that can be given a mathematical form; and they need to express the relevant information in an appropriate way, for example in the form of a numeric calculation or as an algebraic expression."
- ^ PISA 2012 Creative Problem Solving, acquisition of knowledge subscale: "In knowledge-acquisition tasks, the goal is for students to develop or refine their mental representation of the problem space. Students need to generate and manipulate the information in a mental representation. The movement is from concrete to abstract, from information to knowledge. In the context of the PISA assessment of problem solving, knowledge-acquisition tasks may be classified either as “exploring and understanding” tasks or as “representing and formulating” tasks."
- ^ Calculated using UIS data: Enrollment in primary education [number] / (teachers in primary education [number] x teachers in primary education who are trained [%]). Calculated using Teacher Quality Opportunity Gap and National Achievement data for countries without UIS data for primary teacher education rates.
- ^ Shanghai is grouped in high-income non-OECD due to its income level. (PISA reports China data for Shanghai only.)
- ^ Altinok, Nadir and Hatidje Murseli. "International Database on Human Capital Quality." Economics Letters 96, no. 2 (2007); Altinok, Nadir. "A New International Database on the Distribution of Student Achievement." UNESCO, 2011.
- ^ Hanushek, Eric A., Paul E. Peterson and Ludger Woessmann. "Achievement Growth: International and U.S. State Trends in Student Performance." Harvard Kennedy School of Government. 2012.
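The teacher-quality footnote above divides primary enrollment by the number of trained teachers. A minimal sketch of that calculation, with illustrative figures rather than actual UIS data:

```python
# Pupils per trained teacher, per the footnoted formula:
#   enrollment / (teachers * share of teachers who are trained)
# The figures below are illustrative, not actual UIS data.

def pupils_per_trained_teacher(enrollment, teachers, trained_share):
    """Ratio of enrolled primary pupils to trained primary teachers."""
    trained_teachers = teachers * trained_share
    return enrollment / trained_teachers

ratio = pupils_per_trained_teacher(
    enrollment=1_200_000,   # pupils enrolled in primary education
    teachers=40_000,        # primary-school teachers
    trained_share=0.75,     # fraction of teachers who are trained
)
# 1,200,000 / (40,000 * 0.75) = 40 pupils per trained teacher
```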