Samir KC is a researcher in the IIASA World Population Program. He worked on the population projections that form the “human core” of the Shared Socioeconomic Pathways (SSPs), a set of scenarios designed for climate change research, but increasingly being applied more broadly to research in sustainability and environmental change.
What are the SSPs? The Shared Socioeconomic Pathways are about the future: what it could look like under different sets of conditions. Whenever we want to talk about the future, or need to think about it, we have to do some kind of projection. Whatever the topic is, even in our personal lives, we can use scenarios to map out how things might develop, creating different pathways, which allows us to better understand how our choices could affect those pathways.
Socioeconomic refers to the major social and economic factors that can affect future changes on our planet—demographic, social, and economic. But within this broad umbrella there are multiple disciplines, each working on its own topics with its own methods and data. If they want to work together, they have to match up with each other so that the output of one group can serve as the input to another. That’s why the word shared is there.
The SSPs were developed for the Intergovernmental Panel on Climate Change (IPCC). Why were they needed? For one thing, we simply needed to update the data from the earlier generation of emissions scenarios and define new scenarios. But secondly, the focus changed somewhat between the IPCC’s previous report and the most recent one, released in 2014. In the new scenarios, the focus is more on the challenges of adapting to and mitigating climate change. These dimensions are harder to incorporate because they depend on a lot of socioeconomic factors.
You worked specifically on the population projections for the SSPs, which were published in 2014. How did this process work? The first thing that we did was to define narratives for each of the SSPs, essentially a story about what the world would look like in the future. This first part is very important. These narratives were based on the current state of scientific knowledge about how the variables are related and interact.
Then for each of the pathways, we had to start defining variables like population, urbanization, technological change, and the economy. Since population is one of the first variables you need in order to calculate other socioeconomic variables, it was the first thing we looked at when turning the narratives into quantitative projections. Population is needed as a multiplier to calculate future demand—for example, how much energy will be required, how much water, and many other things. At the same time, when there are adverse effects of climate change, population determines how many people are impacted, as well as who and where. For example, researchers studying air pollution need population data to see how air pollution will affect people. So population is an important variable.
It was an iterative process—there were lots of calls, sometimes involving 10 or 15 people from many different fields. Whenever we had something to share or something to decide, it was done in this big group. It was a lot of talking and listening to others. That was very educational for me, because I learned a lot about how people are using population data. It was a very good dialogue—people had sometimes very simple questions, but sometimes very interesting questions about population, fertility, mortality, and those kinds of things.
How did your population projections differ from previous demographic data used for climate research? In most climate research, until recently, population was used as a total number. Populations were assumed to be homogeneous—everybody the same, with the average representing everyone. We argued that this is not the case, and that you need to consider population heterogeneity, not only by age and sex, but also by education level. There is a growing body of research showing that these details make a difference.
Still, not everybody is using it, but people working on GDP have, for example, and hopefully more and more will use these factors in the future. We have shown in the past that knowing the education level of the population can help us make better projections. A more educated population affects many other socioeconomic measures. For example, more educated societies have higher levels of productivity. Education level has also been used to calculate the speed of technological change: in societies with highly educated people, technological change advances faster than elsewhere. And these factors are key to understanding humanity’s vulnerability to climate change, our ability to adapt, and our chances of solving the problem.
A lot of your work focuses on what might happen in the future. How do you explain to people the difference between scenarios or projections and predictions? When we make projections about the future, we don’t use the word “prediction.” The chances that such a projection will be wrong are 100%. We can never say exactly what will happen in the future.
It’s important to understand how the narratives were defined, how we defined the scenarios. We cannot guarantee the future or the results, but we can guarantee the quality of what can be done—what we can say now, today, about the future. And then there is the idea of uncertainty: we have said something about the future, but we haven’t reported any kind of uncertainty beyond the range of scenarios. This is a big area for future work. It’s difficult to do, and it would be difficult to interpret, but it’s important to consider.
References
KC S & Lutz W (2014). The human core of the shared socioeconomic pathways: Population scenarios by age, sex and level of education for all countries to 2100. Global Environmental Change. http://pure.iiasa.ac.at/10759/
Shinichiro Fujimori, guest researcher in the IIASA Energy Program, discusses the implications of a recent paper with IIASA Science writer and Editor Daisy Brickhill.
The climate mitigation costs of the Paris Agreement are fairly distributed between countries, but they are not fair to future generations, a new IIASA study has found. This suggests that the relative differences between countries’ climate commitments can be kept the same, but to ensure equity for our descendants they must all be raised.
The Paris Agreement allows each country to set its own climate commitments (known as the Intended Nationally Determined Contributions, or INDCs), and while this autonomy encourages more states to enter into the agreement, it may result in some countries freeloading by not making their fair share of cuts. There is also a trade-off between the mitigation investments we make now, and how much we leave for our descendants to deal with. The study by Fujimori and colleagues examines the issue of equity from different angles.
How did you measure the equity of the climate commitments?
We designed four scenarios: there was the baseline, which has no climate policy, and therefore no emission constraints at all. Then there was a scenario with a carbon price that is the same all over the world, set high enough to put us on course to meet targets to keep warming well below 2°C by 2100. The third scenario allowed different countries to have different carbon prices, meaning that they followed their current INDCs until 2030, but at that point a global carbon price was again put in place to ensure that we reach 2°C targets by the end of the century. Finally, we created a scenario where all emissions reduction targets were 20% higher than the INDCs until 2030. Again, after that a global carbon price was set. For all of the scenarios we also varied what is known in economics as the “discount rate.”
What is a discount rate?
People tend to devalue the future. So, for example, given the choice of €100 now or €150 in five years, many people would choose the €100 now. This is known as a time preference. You can add to this an “inequality aversion.” This is the amount that a wealthy person is willing to reduce their consumption by in order to increase the amount a poor person can consume. Together they make the discount rate.
We used different values of discount rate to see what might happen if people cared a lot about future generations, or poorer countries, or if they did not.
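For intuition, the two components can be combined in the spirit of the Ramsey rule, r = ρ + η·g, where ρ is the pure rate of time preference, η the inequality aversion, and g the growth rate of consumption. The sketch below is purely illustrative: the Ramsey formulation and the parameter values are my assumptions for exposition, not the exact setup of the study.

```python
def ramsey_discount_rate(time_pref, inequality_aversion, growth):
    """Ramsey-style discount rate: r = rho + eta * g.
    All parameter values used here are illustrative, not those of the study."""
    return time_pref + inequality_aversion * growth

def present_value(future_amount, rate, years):
    """Discount an amount received `years` from now back to today."""
    return future_amount / (1.0 + rate) ** years

# With a 1% pure time preference, inequality aversion of 1.5,
# and 2% consumption growth, the discount rate is 4%...
r = ramsey_discount_rate(0.01, 1.5, 0.02)
# ...so 150 euros in five years is worth less than 125 euros today,
# echoing the "100 euros now vs. 150 euros in five years" choice above.
pv = present_value(150.0, r, 5)
```

A higher η or ρ shrinks the present value of future costs, which is why the choice of discount rate matters so much for intergenerational equity.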
And, are the INDCs fair?
We found that delaying emissions reductions will push the costs onto future generations. In all our scenarios, regardless of the discount rate, there was inequality between the generations. The best scenario for equity between current and future generations was the second scenario with high, globally uniform carbon taxes that start immediately.
The inequity between generations was not unexpected, but what was surprising was that under the Paris Agreement the equity between countries was good. The third scenario, which followed the INDCs until 2030, had much better equity between regions until the global carbon price began in 2030. This is because low-income countries tended to set lower carbon prices, while more developed countries had higher carbon prices.
That means that the last scenario is the ideal. We can keep the relative differences between the INDCs the same but raise them all so that we can meet the targets. That would give us both inter-regional equity and inter-generation equity.
What about the costs of the impacts of climate change? The Paris Agreement mentioned the need for a mechanism to support the victims of climate-related loss and damage. Might that not create a completely different picture of equity?
That is not something we covered in the study, but it is very important. We need many more studies in that area. We need flood teams, agricultural teams, and others, all collaborating across disciplines. Very much how IIASA works, in fact. Fortunately, the model we constructed for this study can incorporate all of these aspects, as they become available, and translate them into a comprehensive economic assessment.
By David Leclère, IIASA Ecosystems Services and Management Program
August was the warmest month ever recorded globally, as has been every single month since October 2015. It will not take long for these records to become the norm, and this will tremendously challenge food provision for everyone on the planet. Each additional degree Celsius in global mean temperature will reduce wheat yield by about 5%. While we struggle to take action to limit global warming to 2°C above preindustrial levels by the end of the century, business-as-usual scenarios point closer to +5°C.
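As a back-of-the-envelope illustration of that roughly 5%-per-degree figure, here is a minimal sketch; the linear extrapolation is a simplification of my own, since the real yield response is unlikely to stay linear all the way to +5°C.

```python
def wheat_yield_loss(warming_degrees, sensitivity=0.05):
    """Approximate fractional wheat yield loss for a given warming in degrees C,
    assuming a linear ~5% loss per degree (an illustrative simplification)."""
    return sensitivity * warming_degrees

# Warming consistent with the Paris target vs. business as usual:
loss_2c = wheat_yield_loss(2.0)  # 0.10, i.e. a ~10% yield reduction
loss_5c = wheat_yield_loss(5.0)  # 0.25, i.e. a ~25% yield reduction
```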
However, we lack good, actionable knowledge about this perfect storm in the making. Despite the heat, world wheat production should hit a new record high in 2016, but EU production is expected to be 10% lower than last year. In France, the drop should be around 25-30%, and one has to go back to 1983 to find yields equally low. Explanations now indeed point to weather as a large contributor, but the underlying mechanisms were poorly anticipated by forecasts and are poorly addressed in climate change impacts research.
There are also many blind spots. For example, livestock accounts for a tremendous share of the carbon footprint of agriculture, but also has high nutritional and cultural value. Yet livestock was not mentioned even once in the summary for policymakers of the last IPCC report dedicated to impacts and adaptation. Heat stress reduces animal production and increases greenhouse gas emissions per unit of product. In addition, a lower share of animal products in our diets could dramatically reduce pollution and food insecurity. However, we do not understand consumers’ preferences in that respect well, nor how they can be translated into actionable policies.
How can we generate adequate knowledge in time while climate is changing? To be able to forecast yields and prevent dramatic price swings like the 2008 food crisis? To avoid bad surprises due to large missing knowledge, like the livestock question?
In short: it will take far more research to answer these questions—and that means a major increase in funding.
I recently presented two studies by our team at a scientific conference in Germany, organized by MACSUR, a European network of agricultural research scientists. One was a literature review on how to estimate the consequences of heat stress on livestock at a global scale. The other presented scenarios on future food security in Europe, generated in a way that delivers useful knowledge for stakeholders. The MACSUR network was funded as a knowledge hub to foster interactions between research institutes of European countries. In many countries, the funding covered travel and workshops, not new research. Of course, nowadays researchers have to compete for funding to do actual research.
So let’s play the game. The MACSUR network is now aiming at a ‘Future and Emerging Technologies Flagship’, the biggest type of EU funding: €1 billion over 10 years for hundreds of researchers. Recent examples include the Human Brain Project, the Graphene Flagship, and the Quantum Technology Flagship. We are trying to get one on modeling food security under climate change.
Such a project could transform our ability to deal with climate change, a major societal challenge confronting Europe (one of the two requirements for FET Flagship funding). The other requirement gave us a hard time at first sight: generating technological innovation, growth, and jobs in Europe. But one just needs the right lens. First, agriculture already sustains about 44 million jobs in the EU, and this will increase if we are serious about reducing the carbon content of our economy. Second, data now flows at an unprecedented speed (in other words, big data). Think about the amount of data acquired with Pokémon Go, and imagine harnessing that concept for science through crowdsourcing and citizen science. With such data, agricultural forecasts would perform much better. Similarly, light drones and connected devices will likely open a new era for farm management. Third, we need models that translate big data into knowledge, and not only for the agricultural sector. Models can also be powerful tools for confronting different views, and could trigger large-scale social innovation.
To get this funding, we need support from a lot of people. The Graphene project claimed support from more than 3,500 actors, from citizens to industrial players across Europe. We have until the end of November to reach at least 3,500 votes. If you think the EU should give food security under climate change the same importance as improving our understanding of the human brain or developing quantum computers, we need you. This will simply never happen without you! Please help us out with two simple actions:
Go to the proposal and vote for or comment on it (see the instructions; please highlight the potential for concrete innovations)!
Spread the word – share this post with your friends, your family, and your colleagues!
Note: This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
It is that time of the year again – in late summer and early fall the media is covering the Arctic sea ice extent. Whether it is another record-breaking low like 2005, 2007, or 2012, or in second place, like this year (see for example New York Times, Guardian), the news is not good.
The minimum Arctic sea ice extent this year tied for second-lowest. Credit: National Snow and Ice Data Center
And again, we hear many speculations on when we will start to experience an ice-free Arctic Ocean during summertime. Will it be 2030, 2050?
Are we stuck keeping track and recording, observing the change and how fast or slow it is from one year to another? Or is something different this year?
I believe that yes, there is a bit of a difference – and a bit more hope. We are in the post-Paris climate agreement (COP21) and UN Sustainable Development Goals (SDG) world.
Today, 48% of 196 nations have formally bound their governments to the Paris agreement, and it is anticipated that by the end of the year, the required 55 nations responsible for 55% of emissions globally will have formally committed to the Paris agreement. This is when the agreement takes legal force, although implementation is another issue and a new story.
I attend scientific meetings, as well as meetings gathering science, policy, and business stakeholders. Far too often at these meetings, participants state that we must do this and we must do that, but they are not prepared to offer concrete help or concrete suggestions. They do not talk about the possibility of committing themselves to anything beyond stating the need, or supervising the statement of needs, happily leaving the planning of implementation and the search for resources to some unnamed others.
The Arctic today is in the spotlight not just in the sense that the world’s attention is briefly focused there: it is melting fast under the effect of a variety of physical forces that concentrate warming in the Arctic region. What could we do to help cool the Arctic more quickly?
Melting sea ice in the Arctic, during a 2011 research cruise. Credit: NASA Goddard Space Flight Center
Reducing greenhouse gas emissions through agreements and voluntary implementation by nations, ramping up the use of renewable energy sources and developing new technology, and then waiting for greenhouse gas concentrations in the atmosphere to decrease—all this will take a long time. And it will be much longer still before we experience the impacts of the emissions reductions. But in parallel to these slow but indispensable developments, there are faster ways of helping the Arctic in particular. And as a co-benefit, we can clean the air, improve our health, and help the rest of the world as well.
About 25% of the current warming of the Arctic is attributed to black carbon, that is, soot coming from incomplete combustion of fossil fuels.
The main culprit for man-made black carbon in the Arctic surface atmosphere is gas flaring, the wasteful burning of gas in the oil and gas industry. Gas flaring has been found to contribute 42% of the annual mean black carbon surface concentrations in the Arctic, making it the dominant source of black carbon emissions north of 66°N.
A large part of the warming experienced in the Arctic is due to black carbon emissions from the eight Arctic nations and the region north of approximately 40°N, including the European Union, Russia, Ukraine, China, Canada, and part of the USA.
The USA and Canada have agreed to end routine gas flaring by 2030. My hope is that the IIASA Arctic Futures Initiative could bring together science, policy, and business stakeholders from the Arctic nations to tackle this problem, along with other concerned parties and with countries not yet involved in the discussions.
Reference
Stohl A, Aamaas B, Amann M, et al. (2015). Evaluating the climate and air quality impacts of short-lived pollutants. Atmospheric Chemistry and Physics 15: 10529–10566. doi:10.5194/acp-15-10529-2015
People have been playing games for fun for many thousands of years. But recently some games have been designed not to escape from reality, but to improve it. As the world becomes more and more complex, and the future more and more uncertain, serious games can be used as innovative tools for learning, decision making, effective collaboration, and developing strategies for success. With games, we can communicate complex realities and learn from our mistakes at no real-world cost.
Systems thinking is required to tackle the challenge of managing both flood risk and development: to live in harmony with floods. Games provide the perfect avenue for exploring these challenges. Games that engage participants have been shown to be very successful and powerful dissemination instruments, with broader outreach than traditional reports. In a team comprising myself, Piotr Magnuszewski from the Water Program, Adam French from the Advanced Systems Analysis and Risk and Resilience Programs, and collaborators from the Zurich Flood Resilience Alliance, we have been developing a game that can help build flood resilience in developing countries.
Because games are experienced as something that feels real, more information is retained, learning is faster, and players gain an intuition for making real decisions. Critically, the IIASA Flood Resilience Game is designed to help participants—such as NGO staff working on flood-focused programs—identify novel policies and strategies that improve flood resilience. In its current form it is a board game played by at least eight players, each of whom takes on a role as a member of a flood-prone community. The direct interactions between players create a rich experience that can be discussed and analyzed, and that leads to concrete conclusions and actions. This allows players—citizens, local authorities, and NGOs together—to explore vulnerabilities and capacities, leading to an advanced understanding of interdependencies and the potential for working together.
The game draws on IIASA research on the deep-seated challenges in the typical approach to flood risk management. It allows players to experience, explore, and learn about the flood risk and resilience of communities in river valleys. It lets them experience the effects on resilience of investments in different types of “capital”—such as financial, human, social, physical, and natural. The impacts of flood damage on housing and infrastructure are also an important part of the game, as well as indirect effects on livelihoods, markets, and quality of life.
Playing the game can also improve understanding of how preparedness, response, and reconstruction influence flood resilience. Importantly, it demonstrates the benefits of investing in risk reduction before a flood strikes, for example through land-use planning and flood-proofing homes. The effects of institutional arrangements, such as communication among citizens and with government, also become clearer over the course of the game.
Finally, participants can explore the complex outcomes on the economy, society and the environment from long-term development pathways. This highlights the types of decisions needed to avoid creating more flood risk in the future, incentivizing action before a flood through enhancing participatory decision-making. All these complex ideas are experienced with simple, concrete game elements that participants can connect with their daily realities.
From a researcher’s perspective, observing game play deepens our understanding of stakeholder motivations in relation to flood resilience. The game also contributes to better understanding and use of IIASA research via the Zurich flood resilience measurement tool, a ground-breaking approach to resilience measurement.
How did your scientific career evolve into climate change and ecosystem ecology? I studied environmental science in Spain and then I went to Australia, where I started working on free-air CO2 enrichment, or FACE, experiments. These are very fancy experiments where you fumigate a forest with CO2 to see if the trees grow faster. In 2014 I moved to London for my PhD project. There, instead of focusing on one single FACE experiment, I collected data from all of them. This allowed me to draw general conclusions on a global scale rather than about a single forest.
You recently published a paper in Science magazine. Could you summarize the main findings? We found that we can predict how much CO2 plants transfer into growth through the CO2 fertilization effect, based on two variables—nitrogen availability and the type of mycorrhizal, or fungal, association that the plants have. The impact of the type of mycorrhizae has never been tested on a global scale—and we found that it is huge. I think it’s fascinating that such tiny organisms play such a big role at a global scale on something as important as the terrestrial capacity of CO2 uptake.
How did you come up with the idea? One random day in the shower? Long story short, researchers used to think that plants would grow faster and take up a lot of the CO2 we emit. They assumed this in most of their models as well. But plants need other elements to grow besides CO2. In particular, they need nitrogen. So scientists started to question whether the modeled predictions overestimated the CO2 fertilization effect, because the models did not consider nitrogen limitation. To find out, I analyzed all the FACE experiments, and indeed I saw that, in general, plants were not able to grow faster under elevated CO2 when nitrogen was limited. However, in some cases plants were able to take advantage of elevated CO2 even under nitrogen limitation. I grouped together the experiments where plants could grow under nitrogen limitation, and after a lot of reading I saw what they had in common: the type of fungi! It turned out that one type of mycorrhizae is really good at transferring large quantities of nitrogen to the plant and the other type is not.
How did that feel? Awesome! When I saw the graph, I knew: this is going to be important. Of course, after this, my coauthors helped me to polish the story. Without them, the conclusions would not be as robust and clear.
So how does this process work? Where do the fungi get the nitrogen from? Particular soils might have a lot of nitrogen, but the amount available for plants to absorb might be low. Also, plants have to compete with non-fungal microorganisms for nitrogen. So if there is not much there, the microorganisms take it all. It’s called immobilization. Instead of mineralizing nitrogen, they immobilize it so that plants cannot take it up, at least not in the short term. Some types of fungi are much more efficient in accessing nitrogen, and associated with roots they allow plants to overcome limitations.
What is the impact of your findings? Plants currently take up 25-30% of the CO2 we emit, but the question is whether they will be able to continue to do so in the long term. Our findings bring good and bad news. On the one hand, the CO2 fertilization effect will not be limited entirely by nitrogen, because some of the plants will be able to overcome nitrogen limitation through their root fungi. But on the other hand, some plant species will not be able to overcome nitrogen limitation.
There was a big debate about this. One group of scientists believed that plants will continue to take up CO2 and the other group said that plants will be limited by nitrogen availability. These were two very contrasting hypotheses. We discovered that neither of the hypotheses was completely right, but both were partly true, depending on the type of fungi. Our results could bring closure to this debate. We can now make more accurate predictions about global warming.
What will you do at IIASA and how will you link it to your PhD? I want to upscale and quantify how much carbon plants will take up in the future. If we are to predict the capacity of plants to absorb CO2, we need to quantify mycorrhizal distribution and nitrogen availability on a global scale. We are updating mycorrhizal distribution maps according to distribution of plant species. We know for instance that pines are associated with ectomycorrhizal fungi and always will be. To quantify nitrogen availability we use maps of different soil parameters that are available on a rough global scale.