Frequently Asked Questions
Table of Contents
2. Country coverage and indicator selection
4. Interpreting the results and limitations
Q: What are the Sustainable Development Goals (SDGs)?
A: The 17 SDGs are universal goals adopted by all member states of the United Nations in 2015 to guide international collaboration towards sustainable development. They succeed the Millennium Development Goals and aim to end poverty, tackle inequality, protect the planet, promote peace, and ensure prosperity for all. Each goal has specific targets to be achieved by 2030. See the UN website for more information about the SDGs.
Q: Why develop an SDG Index and Dashboards?
A: The SDG Index aggregates available data on all SDGs into a composite index to provide countries with a quick assessment of how they are performing relative to their peers. In this way the SDG Index can help draw attention to the SDGs and their role as a tool for guiding national policies and long-term strategies for sustainable development. Its purpose is not to compare countries with vastly different development status, but to allow countries to benchmark themselves using a single holistic measure that encompasses all SDGs and treats each goal equally. Like the SDG Dashboards, the SDG Index is designed to support national discussions on operationalizing the SDGs, not to monitor progress towards achieving the goals.
The SDG Index shows that rich countries, particularly in Northern Europe, perform best. Yet this does not mean that Sweden and other highly ranked countries have achieved all the SDGs. As the SDG Dashboards make clear, all countries score “red” on at least one SDG and “orange” or “yellow” on many others. It is clear that the SDGs require further action by all countries.
Q: Do the SDG Index and Dashboards replace or compete with official SDG monitoring?
A: No. The SDG Index and Dashboards are preliminary analytical tools to help governments and other stakeholders take stock of where they currently stand with regard to achieving the SDGs and to identify priorities for early action. As new data become available they will be included in the SDG Index and Dashboards, which will be published on an annual basis for the next three years. Simultaneously, countries will need to develop a full suite of monitoring systems to track the SDG metrics recommended by the UN Statistical Commission. This will require major investments in statistical capacity development, particularly in poorer countries or those with low statistical capacity. Over time every country should be able to track critical SDG variables to monitor progress towards achieving the goals.
Q: How and by whom were the SDG Index and Dashboards developed?
A: The SDG Index and Dashboards have been developed jointly by the Bertelsmann Stiftung and the Sustainable Development Solutions Network (SDSN), led by scientific co-directors Guido Schmidt-Traub and Christian Kroll. The authors have drawn extensively on the SDG Indicators proposed by the UN Statistical Commission and consulted widely on methodology and appropriate data with experts around the world. The SDG Index and Dashboards also drew on an earlier prototype SDG Index for OECD countries developed by the Bertelsmann Stiftung and a report on SDG indicators prepared by the SDSN. All data and methodological assumptions are presented in Annex 2 and are available online.
2. Country coverage and indicator selection
Q: How have indicators been selected for the Index and Dashboards? Why are they not identical to the recently proposed official SDG Indicators?
A: In early 2016, the UN Statistical Commission proposed a set of indicators for measuring progress towards the SDGs; by April 2017 this had grown into a list of 232 indicators, which was endorsed by the UN Statistical Commission. Yet for most countries data remain unavailable for the vast majority of these proposed indicators. It will take time and investments in statistical capacity to build up national data systems so that every country can monitor progress against the official indicators (see also recommendations by the Expert Group on SDG Indicators). Meanwhile, countries need to start the process of operationalizing and implementing the SDGs using data available today. As a result, the SDG Index and Dashboards are based on published, currently available data. No new data were collected. Instead, we used a combination of the proposed indicators for which data were available and of unofficial but similar indicators that passed the selection criteria.
The SDG Index and Dashboards use appropriate indicators for which data are available today for at least 80% of the 154 countries with a population greater than 1 million, i.e. at least 124 countries. To identify appropriate indicators, all recently proposed official SDG Indicators were reviewed for data availability and suitability for inclusion in an SDG Index and Dashboards. Major gaps were filled with other metrics from official or other reputable sources. Indicators that meet the standards for inclusion have been incorporated into the SDG Index and Dashboards. Countries with a population smaller than 1 million are included in the Index and Dashboards if sufficient data are available, bringing the total number of countries included to 157. Decisions on indicator selection are described in the methodology section and the online metadata.
Q: Why develop a separate SDG Index and Dashboard for OECD countries?
A: The report proposes an Augmented SDG Index and Dashboards for OECD countries. Both augment the global Index and Dashboards with 15 additional variables to provide a richer assessment of the SDG challenges faced by OECD countries. The inclusion of additional variables holds OECD countries to a higher standard, which is justified since they have larger resources available to achieve the SDGs. The Augmented SDG Index and Dashboards might also help identify priorities for statistical capacity development and for generating new SDG data in non-OECD countries.
Q: Why are some countries not included in the SDG Index?
A: A country is included in the SDG Index if it has data for at least 80% of the indicators. Some countries with a population of less than one million have sufficient data and are therefore included in the SDG Index, despite the indicator selection criteria of 80% data availability in countries with a population above one million. The fact that some countries lack sufficient data for inclusion in the SDG Index underscores the need for greater investments in statistical capacity building. A new feature of the 2018 report is that all countries are presented in the dashboards regardless of whether they have enough data for inclusion into the index. This allows for seeing in which areas data gaps are particularly severe.
Q: Where do the data for the SDG Index and Dashboards come from?
A: To the greatest extent possible, the SDG Index and Dashboards rely on internationally comparable official statistics. In order to fill in some gaps in the official data, notably to address the issue of “spillover effects”, non-official metrics from other reputable sources have been used, as described in the online metadata. Data for each indicator have been rigorously selected and reviewed for quality, timeliness and verifiability.
Q: How do the Index and Dashboards compare performance across different indicators?
A: To ensure comparability, we normalize the data for each indicator by transforming it linearly onto a scale from 0 to 100. A value of 100 denotes the technical optimum, while a value of 0 denotes the 2.5th percentile of the distribution. For clarity and ease of interpretation, we transform some indicators so that a higher score on the normalized indicator always corresponds to better performance.
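This min-max rescaling can be sketched as follows; the function name and the censoring of out-of-range values are illustrative assumptions, not the report's actual code:

```python
def normalize(value, lower, optimum):
    """Linearly rescale a raw indicator value to a 0-100 score.

    lower   -- worst bound: the 2.5th percentile of the distribution (maps to 0)
    optimum -- the technical optimum for the indicator (maps to 100)
    Values outside the bounds are censored at 0 or 100.
    """
    score = 100.0 * (value - lower) / (optimum - lower)
    return max(0.0, min(100.0, score))

# "More is worse" indicators (e.g. a poverty rate) can be handled by an
# optimum below the lower bound, which inverts the direction of the scale:
normalize(30.0, lower=60.0, optimum=0.0)  # a lower rate yields a higher score
```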
Q: How are the SDGs and the indicators weighted?
A: Each SDG has the same weight in the Index and Dashboards, which is in line with the spirit of the SDGs adopted in September 2015. This implies that countries need to pursue all 17 goals through integrated strategies. Within each goal every indicator is equally weighted, which implies that every indicator is weighted inversely to the number of indicators available for that particular SDG. An advantage of this approach is that as more and better data become available, new variables can easily be added to individual SDGs without changing the relative weighting of the goals. In this way the SDG Index and Dashboards can evolve over time as each epistemic community generates new and better data.
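A minimal sketch of this two-stage equal weighting, assuming a hypothetical data structure that maps each goal number to its list of normalized indicator scores:

```python
from statistics import mean

def sdg_index(goal_indicators):
    """Average indicators within each goal, then average the goal scores.

    Each indicator is implicitly weighted 1/(number of indicators under
    its goal), so adding indicators to one goal never changes the
    relative weight of the goals.
    """
    goal_scores = {goal: mean(scores) for goal, scores in goal_indicators.items()}
    return mean(goal_scores.values())

# A goal with three indicators counts no more than a goal with one:
sdg_index({1: [80.0, 60.0, 70.0], 2: [90.0]})  # goal scores 70 and 90 -> index 80
```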
Q: How should the SDG Dashboards be interpreted?
A: Some other indices use relative performance across countries to define thresholds. We believe that absolute thresholds are more suitable since most SDGs require absolute benchmarks to be achieved. To assess a country’s progress on a particular indicator, such absolute quantitative thresholds are introduced to differentiate between situations where an SDG threshold has been met (green), where significant challenges remain (yellow & orange), and where major challenges must be overcome if the country is to meet the goal (red). Where possible, these thresholds are derived from the SDGs, their targets, or other official sources. All thresholds are specified in the online metadata.
Q: How are the SDG Index and Dashboards scores calculated and what aggregation method is used?
A: As described in Part 2, the choice of aggregation formula can have important implications for the results of both the SDG Index and Dashboards. Taking a simple average of indicator values (arithmetic aggregation) implies that the indicators are perfectly substitutable: progress on one variable can offset lack of progress on another. This approach is reasonable for indicators within the same goal that tend to complement one another, so we use arithmetic means to aggregate indicators within each SDG for the Index and Dashboards.
However, major trade-offs may occur across SDGs. Progress on one goal (e.g. higher economic growth) cannot fully offset lack of progress on another (e.g. rising inequality or environmental degradation). For this reason countries need to make progress towards every goal. In other words, one must assume limited substitutability across goals, which is commonly done by using the geometric mean. As a result, one could argue for using the geometric average of the scores for each SDG to compute the overall SDG Index.
Nevertheless, the two methods of aggregation yield almost the same rankings and nearly the same scores for most countries. For simplicity, we therefore use arithmetic aggregation even though geometric aggregation is conceptually attractive. This permits a natural interpretation of the national SDG Index score: an SDG Index value of, say, 70 means that the country is on average 70% of the way from the worst to the best score across the 17 SDGs.
A third method for aggregating indicator scores is the Leontief minimum function, which ascribes the value of the indicator on which the country performs worst as the score for the SDG. This aggregation is helpful for identifying the areas within each goal where a country needs to make the greatest progress but is too “tough” an approach to allow comparison of countries.
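The three aggregation options discussed above can be compared in a few lines; the goal scores below are hypothetical, and `geometric_mean` requires Python 3.8+:

```python
from statistics import mean, geometric_mean

goal_scores = [80.0, 60.0, 40.0, 90.0]  # hypothetical normalized goal scores

arithmetic = mean(goal_scores)            # perfect substitutability across goals
geometric = geometric_mean(goal_scores)   # limited substitutability: uneven profiles score lower
leontief = min(goal_scores)               # score equals the worst-performing goal

# The geometric mean always lies between the minimum and the arithmetic mean,
# so it penalizes the weak goal without reducing the score to it entirely.
```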
For the SDG Dashboards, we use the average of the two worst-performing indicators to assign colors to SDGs. To score “red”, both of the two worst-performing indicators must be “red”. To achieve “green”, all indicators under the goal must be “green”. If the average rating falls in the “caution lane”, the SDG is assigned “yellow” or “orange”, depending on how far along the country is on the path from “red” to “green”.
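These rules can be sketched as follows; the ordinal encoding (red=0 through green=3) and the orange/yellow cut-off of 1.5 are our own illustrative assumptions, not the report's published rule:

```python
from statistics import mean

SCALE = {"red": 0, "orange": 1, "yellow": 2, "green": 3}

def goal_color(indicator_colors):
    """Assign a dashboard color to an SDG from its indicator ratings.

    - "green" only if every indicator under the goal is green
    - "red" only if both of the two worst indicators are red
    - otherwise "orange" or "yellow", depending on the average of the
      two worst indicators (the 1.5 cut-off is an assumption)
    """
    if all(color == "green" for color in indicator_colors):
        return "green"
    worst_two = sorted(indicator_colors, key=SCALE.get)[:2]
    if worst_two == ["red", "red"]:
        return "red"
    return "orange" if mean(SCALE[c] for c in worst_two) < 1.5 else "yellow"

goal_color(["red", "red", "green"])     # both worst indicators red -> "red"
goal_color(["red", "yellow", "green"])  # average of worst two = 1.0 -> "orange"
```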
Q: How do the SDG Index and Dashboards deal with missing data?
A: The SDG Index and Dashboards do not model or extrapolate data at the indicator level to fill gaps because such extrapolations are prone to errors. However, for the purposes of calculating countries’ total index scores, we impute average regional goal scores for those countries that have no data under a goal. This applies primarily to goal 10 (Reduced Inequalities) and goal 14 (Life Below Water). Still, they are presented as missing data in the country profiles. At this stage in the implementation of the SDGs we also want to highlight data gaps so as to encourage governments and the international system to fill them. Part 2 describes a few exceptions where data were imputed for entire groups of countries.
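A sketch of this goal-level imputation, using hypothetical data structures in which `None` marks a goal with no data:

```python
def impute_goal_scores(country_scores, regional_means):
    """Substitute the regional average goal score where a country has no
    data under a goal.  Used only to compute the total index score; the
    country profile still reports the underlying data as missing.
    """
    return {
        goal: regional_means[goal] if score is None else score
        for goal, score in country_scores.items()
    }

impute_goal_scores({10: None, 14: 55.0}, {10: 62.0, 14: 48.0})
# -> {10: 62.0, 14: 55.0}
```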
Q: How do you estimate trends?
A: Using historical data, we estimate how fast a country has been progressing towards an SDG and determine whether – if continued into the future – this pace will be sufficient to achieve the SDG by 2030. For each indicator, SDG achievement is defined by the green threshold set for the SDG Dashboards. The difference in percentage points between the green threshold and the normalized country score denotes the gap that must be closed to meet that goal. To estimate SDG trends, we calculated the linear annual growth rates (i.e. annual percentage-point improvements) needed to achieve the goal by 2030 (i.e. over 2010-2030), which we compared to the average annual growth rate over the most recent period (usually 2010-2015). Progress towards goal achievement on a particular indicator is described using a 5-arrow system.
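The comparison of the required versus the observed pace can be sketched as follows; the function names and default years are illustrative assumptions:

```python
def required_annual_gain(score, green_threshold, base_year=2015, target_year=2030):
    """Percentage points per year needed to close the gap to the green
    threshold by 2030, assuming linear growth."""
    return max(0.0, green_threshold - score) / (target_year - base_year)

def observed_annual_gain(earlier_score, latest_score, n_years=5):
    """Average annual change in the normalized score over the most
    recent period (e.g. 2010-2015)."""
    return (latest_score - earlier_score) / n_years

# A country at 60 with a green threshold of 90 needs 2 points per year;
# gaining only 1 point per year over 2010-2015 leaves it off track:
required_annual_gain(60.0, 90.0)   # -> 2.0
observed_annual_gain(55.0, 60.0)   # -> 1.0
```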
To estimate overall trends for an SDG, each indicator trend for that SDG was re-normalized on a linear scale from 0 to 4. The trend for an SDG was calculated as the arithmetic average of all trend indicators for that goal. An average between 0 and 1 corresponds to a “decreasing” goal trend, between 1 and 2 to a “stagnating” goal trend, between 2 and 3 to a “moderately improving” goal trend, and between 3 and 4 to an “on track” goal trend. Maintaining SDG achievement corresponds to a normalized score of exactly 3. Trends are reported at the SDG level only if trend data were available for at least 75% of the SDG Dashboards indicators under that goal. SDG Trends are based on data points that precede the adoption of the SDGs, because data are reported with long lags at the international level due to lengthy validation processes. Over time, we will update the data to use 2015 as the baseline year for SDG Trends.
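Mapping the 0-4 average onto the trend categories can be sketched as below; the band edges follow the description above, but whether each boundary is open or closed is our assumption:

```python
from statistics import mean

def goal_trend(indicator_trends):
    """Average per-indicator trend scores (each re-normalized to 0-4)
    into a goal-level trend category."""
    avg = mean(indicator_trends)
    if avg < 1.0:
        return "decreasing"
    if avg < 2.0:
        return "stagnating"
    if avg < 3.0:
        return "moderately improving"
    return "on track"  # a score of exactly 3 corresponds to maintained achievement

goal_trend([2.5, 3.5])  # average 3.0 -> "on track"
```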
Trends indicators were selected from among the indicators included in the SDG Dashboards based on the availability of trend data. When the value for one year was not available we used the closest available value with a maximum of one-year difference. Table 14 provides the list of trend indicators and the period over which the trend was calculated.
Several other calculation methods were considered. For instance, we tested the sensitivity of the results when using technical optima (a score of 100) as “goal achievement” and calculating the distance to those optima. This approach yielded harsher results and is not consistent with our conceptual assumption that the lower green thresholds correspond to goal achievement. We also considered using compound annual growth rates (CAGR) instead of linear growth rates. The two approaches yield rather similar results, and we could not identify a strong argument for using the more sophisticated CAGR method. Finally, while the dashboards are based only on the two worst-performing indicators, trends are generated using all indicators under the goal. This is because the dashboards aim to highlight goals where particular attention is required due to very poor performance on some of the underlying indicators, whereas trends aim to reflect the overall evolution of the goal across all indicators.
Q: How does the trend analysis deal with countries that have already met an SDG target?
A: Our methodology ignores movements (both positive and negative) above goal achievement (the green threshold). Conceptually, our objective is to show how much countries are progressing towards reaching the goals. Therefore, a country that has maintained its performance above the green threshold is considered to have maintained goal achievement. At the goal level, this arrow is only given when all of a country’s trend indicators have maintained performance above their respective green thresholds.
Read the 2018 SDG Index and Dashboards Methodological Paper
4. Interpreting the results and limitations
Q: Sweden is ranked number 1 in the SDG Index. Does this mean the country has achieved all the SDGs?
A: Absolutely not. While Sweden performs best on average based on the data we were able to mobilize for the SDG Index, the SDG Dashboards make clear that every country faces major challenges in achieving the SDGs. This applies equally to Sweden and other top-ranking countries.
Q: The SDGs define a universal agenda. So why do rich countries perform relatively well in the SDG Index?
A: Some observers have expressed surprise that the ranking of countries in the SDG Index resembles the rankings of narrower indices that focus on income per capita and other measures of human development, such as educational attainment and health. Their concern is that the SDG Index may omit important variables on which rich countries perform worse than others and may therefore produce biased results.
To this end, the 2017 SDG Index and Dashboards were augmented with indicators measuring international spillovers. As discussed in the report, the additional indicators affect the ranking of some countries, but they do not fundamentally alter countries’ overall performance.
On balance, an equal weighting of all SDGs will lead higher-income countries to perform better on average. These countries tend to perform better on most economic and social SDG priorities. They also perform better on some “local” environmental priorities, including access to wastewater treatment, deforestation rates, and rates of biodiversity loss. Rich countries perform worse on greenhouse gas emissions and some metrics for sustainable consumption and production, but these represent a modest share of SDG priorities.
Q: How does the SDG Index relate to other development indices for the SDGs?
A: Many other composite development indices exist, but we are not aware of one tracking all 17 SDGs at the country level. In 2015, the Bertelsmann Stiftung issued a report, which was the first to develop an index for OECD countries to track SDG achievement and determine priorities for implementation in each country. Another significant effort has been undertaken by the Overseas Development Institute, which presents a regional SDG Scorecard, projecting trends across key dimensions of the SDGs to determine areas in which the fastest acceleration of progress will be required. The SDG Index and Dashboards, however, provides a comprehensive global index to track the implementation of the SDGs.
Q: How can I access the data for my country or region?
A: Country profiles are available in Part 3 of the report. The entire dataset is publicly available on the website http://www.sdgindex.org/download/. The data will be updated each year.
Q: What are the major data limitations?
A: As explained in the report, the lack of data in some areas leaves significant gaps in the analysis. The major data gaps are listed in Table 9. In addition, the SDG Dashboards do not capture important regional challenges that are less relevant at the global level, such as neglected tropical diseases, malaria, or inequality in education outcomes. Similarly, no globally available data could be found to track the impact a country might have on SDG achievement in another country (e.g. by sourcing natural resources from abroad). These challenges require careful analysis and will be addressed in later versions of the SDG Index and Dashboards.
Q: When will the SDG Index and Dashboards be updated?
A: The SDG Index and Dashboards will be updated annually to include new indicators as they become available, update the data, and incorporate suggestions on how to make the tools more useful for countries and other stakeholders. The website will be continuously improved to facilitate the real-time use of the data and comparisons across countries.
Q: To whom can I address my comments on the SDG Index and Dashboards?
A: We welcome comments and suggestions for improving the SDG Index and Dashboards. Please address your comments and suggestions to firstname.lastname@example.org or to email@example.com.