Adil Najam is the inaugural dean of the Pardee School of Global Studies at Boston University and former vice chancellor of the Lahore University of Management Sciences, Pakistan. He talks to Science Communication Fellow Parul Tewari about his time as a participant in the IIASA Young Scientists Summer Program (YSSP) and the global challenge of adaptation to climate change.
How has your experience as a YSSP fellow at IIASA impacted your career?
The most important thing my YSSP experience gave me was a real and deep appreciation for interdisciplinarity: the realization that the great challenges of our time lie at the intersection of multiple disciplines, and that without real respect for those disciplines we will simply not be able to act effectively on them.
Prof. Adil Najam speaking at the Deutsche Welle Building in Bonn, Germany in 2010 © Erich Habich I en.wikipedia
Recently, at the 40th anniversary of the YSSP, you spoke about ‘The Age of Adaptation’. Globally, there is still much more focus on mitigation. Why is this?
Living in the “Age of Adaptation” does not mean that mitigation is no longer important. It is as important as ever, and more. But now we also have to contend with adaptation. Adaptation, after all, is the failure of mitigation. We got to the age of adaptation because we failed to mitigate enough, or in time. The less we mitigate now and in the future, the more we will have to adapt, possibly at levels where adaptation may no longer even be possible. Adaptation is nearly always more difficult than mitigation, and will ultimately be far more expensive. At some level it could become impossible.
How do you think adaptation can be brought into the mainstream of environmental and climate change discourse?
Climate discussions are primarily held in the language of carbon. However, adaptation requires us to think outside “carbon management.” The “currency” of adaptation is multivaried: it is disease, it is poverty, it is food, it is ecosystems, and, maybe most importantly, it is water. In fact, I have argued that water is to adaptation what carbon is to mitigation.
To honestly think about adaptation we will have to confront the fact that adaptation is fundamentally about development. This is unfamiliar—and sometimes uncomfortable—territory for many climate analysts. I do not believe that there is any way that we can honestly deal with the issue of climate adaptation without putting development, especially including issues of climate justice, squarely at the center of the climate debate.
COP 22 (Conference of the Parties) was termed the “COP of Action,” with financing one of the critical aspects of both mitigation and adaptation. However, there has not been much progress. Why is this?
Unfortunately, the climate negotiation exercise has become routine. While there are occasional moments of excitement, such as at Paris, the general negotiation process has become entirely predictable, even boring. We come together every year to repeat the same arguments to the same people and then arrive at the same conclusions. We make the same promises each year, knowing that we have little or no intention of keeping them. Maybe I am being too cynical. But I am convinced that if there is to be any ‘action,’ it will come from outside the COPs. From citizen action. From business innovation. From municipalities. And most importantly from future generations who are now condemned to live with the consequences of our decision not to act in time.
© Piyaset I Shutterstock
What is your greatest fear for our planet, in the near future, if we remain as indecisive in the climate negotiations as we are today?
My biggest fear is that we will—or maybe already have—become parochial in our approach to this global challenge. That by choosing not to act in time or at the scale needed, we have condemned some of the poorest communities in the world—the already marginalized and vulnerable—to pay for the sins of our climatic excess. The fear used to be that those who have contributed the least to the problem will end up facing the worst climatic impacts. That, unfortunately, is now the reality.
What message would you like to give to the current generation of YSSPers?
Be bold in the questions you ask and the answers you seek. Never allow yourself—or anyone else—to rein in your intellectual ambition. Now is the time to think big. Because the challenges we face are gigantic.
Note: This article gives the views of the interviewee, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
By Piera Patrizio, IIASA Ecosystems Services and Management Program
Biogas, a renewable fuel that can be produced from a variety of natural materials including manure, food waste, plant matter, and other organic materials, has the potential to address several environmental challenges simultaneously: it can reduce emissions of greenhouse gases such as methane (for example, from manure storage); it is the only mature form of renewable energy that can be used directly in the electric power, heat, and transport sectors; and it reduces the pollution caused by waste disposal.
Biogas can be produced from crops like maize as well as waste and other organic materials © Giuliano Del Moretto | Shutterstock
However, biogas is not without impacts of its own. The environmental benefit of using agricultural biogas in particular may be smaller than previously thought, because of the farming activities required for the production of suitable biogas feedstock (such as maize, wheat and triticale), which in turn generates local airborne pollution. Such factors are not adequately reflected in current energy measures.
In other words, the policy instruments adopted so far in Europe do not reflect the environmental impact associated with the production of certain biofuels, because they do not account for other relevant environmental burdens generated along the supply chain.
This is especially the case for biogas, whose production contributes to several environmental burdens such as land use, traffic, and local emissions from the intensive use of fertilizers.
To overcome this issue, my colleagues and I have proposed a monetization procedure through which these so-called external costs are incorporated into energy wholesale prices. This method allows a cost to be allocated to the environmental damage associated with emissions of a wide range of pollutants, which can then be incorporated into any economic optimization model.
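The core of such a monetization procedure can be sketched in a few lines. This is an illustration of the general idea only, not the authors' actual model, and every number below (damage costs, emission factors, internal costs) is a hypothetical placeholder:

```python
# Illustrative sketch of external-cost monetization, not the authors' model:
# each pollutant emitted along the supply chain is assigned a damage cost,
# and the resulting external cost is added to the internal cost of the
# energy pathway. All figures below are hypothetical.

DAMAGE_COST_EUR_PER_KG = {  # hypothetical damage costs per kg of pollutant
    "NOx": 10.6,
    "SO2": 9.1,
    "PM": 33.0,
}

def external_cost(emissions_kg_per_mwh):
    """Monetize supply-chain emissions (kg/MWh) into EUR/MWh."""
    return sum(DAMAGE_COST_EUR_PER_KG[p] * kg
               for p, kg in emissions_kg_per_mwh.items())

def total_cost(internal_eur_per_mwh, emissions_kg_per_mwh):
    """Internal production cost plus monetized external cost, in EUR/MWh."""
    return internal_eur_per_mwh + external_cost(emissions_kg_per_mwh)

# Example: a maize-based biogas pathway with invented emission factors
maize_pathway = {"NOx": 0.8, "SO2": 0.1, "PM": 0.05}
print(round(total_cost(70.0, maize_pathway), 2))  # prints 81.04
```

An optimization model can then minimize this combined cost instead of the internal cost alone, so that pathways with heavy supply-chain pollution become less attractive.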
Biogas production plant, Italy © Roberto Lo Savio | Shutterstock
In a new study, which I conducted with Sylvain Leduc and Florian Kraxner, we took a look at the biogas situation in my home country, Italy. We incorporated the total internal and external costs of different biogas utilization pathways in the BeWhere model, a model used for optimizing renewable energy systems, and compared the results with the performance of the current Italian energy mix.
We found that, although each type of biogas leads to reduced CO2 emissions compared to fossil fuels, these environmental benefits are sharply reduced when we take other pollutant emissions into account.
In particular, farming activities generate high non-carbon emissions such as nitrogen oxides (NOx), sulfur dioxide, and particulate matter. Most of this pollution comes from chemical fertilizers and from diesel combustion in farm machinery, and these emissions correspond to almost 6% of the energy content of the raw biogas produced.
The second source of external costs is transportation of the biomass, which mainly produces local emissions of NOx. Local concerns about this issue are a main source of opposition to new plants, and based on our study, these concerns appear reasonable.
Our results suggest that carbon emission mitigation alone is not always a satisfactory measure to evaluate the sustainability of biogas technologies in order to define energy policies. Other environmental burdens need to be considered when we discuss the environmental sustainability of energy production processes.
Patrizio P, Leduc S, Chinese D, & Kraxner F (2017). Internalizing the external costs of biogas supply chains in the Italian energy sector. Energy 125: 85–96
This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
By Linda See, IIASA Ecosystems Services and Management Program
One of the biggest questions when it comes to citizen science is the quality of the data. Scientists worry that citizens are not as rigorous in their data collection as professionals might be, which calls into question the reliability of the data. At a meeting this month in Brussels on using citizen science to track invasive species, we grappled with the question of what it will take to trust this data source, particularly if it is going to be used to alert authorities to the presence of an invasive species in a timely manner.
This discussion got me thinking about other types of data supplied by citizens that authorities simply trust, for example, when a citizen calls the emergency services to report an incident such as a fire. The veracity of such an alert is not questioned; instead, the authorities are obliged to investigate the report.
Yet the statistics show that false alarms do occur. For example, in 2015, there were more than 2.5 million false fire alarms in the United States, of which just under a third were due to system malfunctions. The remaining calls were unintentional, malicious, or other types of false alarms, such as a bomb scare. Statistics for calls to the emergency services more generally show similar trends in different European countries, where the percentage of false reports ranges from 40% in Latvia up to 75% in Lithuania and Norway. So why is it that we inherently trust this data source, despite the false alarm rate, and not data from citizen scientists? Is it because life is threatened, because fires are easier to spot than invasive species, or simply because emergency services are mandated with the requirement to investigate?
Volunteers monitor butterflies in Mount Rainier National Park, as part of the Cascade Butterfly Project, a citizen science effort organized by the US National Park Service © Kevin Bacher | US National Park Service
A recent encouraging development for citizen science was the signing of an executive order by President Obama on 6 January 2017, which gave federal agencies the authority to use citizen science and crowdsourced data in their operations. Do we need something similar in the EU or at the level of member states? And what will it really take for authorities to trust scientific data from citizens?
To move from the current situation of general distrust in citizen science data to one in which the data are viewed as a potentially useful source of information, we need further action. First, we need to showcase examples of where data collected by citizens are already being used for monitoring. At the meeting in Brussels, Kyle Copas of the Global Biodiversity Information Facility (GBIF) noted that up to 40% of the data records in GBIF are supplied by citizens, which surprised many of the meeting participants. Data from GBIF are used for national and international monitoring of biodiversity. Second, we need to quantify the value of information coming from citizen scientists. For example, how much money could have been saved if reports on invasive species from citizens had been acted upon? Third, we need to forge partnerships with government agencies to institutionally embed citizen science data streams into everyday operations. The LandSense citizen observatory, a new project, aims to do exactly this. We are working with the National Mapping Agency in France to use citizen science data to update their maps, and many other similar collaborations with local and national agencies will be tested over the next 3.5 years.
Finally, we need to develop quality assurance systems that can be easily plugged into the infrastructure of existing organizations. The EU-funded COBWEB project began building such a citizen science-based quality assurance system, which we are continuing to develop in LandSense as a service. Providing out-of-the-box tools may be one solution to help organizations to begin working with citizen science data more seriously at an institutional level.
IIASA researchers test the Fotoquest app, a citizen science game developed at IIASA. ©Katherine Leitzell | IIASA
These measures will clearly take time to implement so I don’t expect that the discussion on the quality of the data will be removed from any agenda for some time to come. However, I look forward to the day when the main issue revolves around how we can possibly handle the masses of big data coming from citizens, a situation that many of us would like to be in.
More Information about the meeting: https://ec.europa.eu/jrc/en/event/workshop/citizen-science-open-data-model-invasive-alien-species-europe
This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
Samir KC is a researcher in the IIASA World Population Program. He worked on the population projections that form the “human core” of the Shared Socioeconomic Pathways (SSPs), a set of scenarios designed for climate change research, but increasingly being applied more broadly to research in sustainability and environmental change.
What are the SSPs?
The Shared Socioeconomic Pathways are about the future: how the future could look under different sets of conditions. When we want to talk about the future, or need to think about it, we always have to do some kind of projection. Whatever the topic, even in our personal lives, we can use scenarios to map out how things might develop, creating different pathways, which allow us to better understand how our choices could affect those pathways.
Socioeconomic refers to the major demographic, social, and economic factors that can affect future changes on our planet. But within this broad umbrella there are multiple disciplines that work on their own topics and have their own methods and data. If they want to work together, they have to match with each other so that the output of one group can be the input to another. That’s why the word shared is there.
The SSPs were developed for the Intergovernmental Panel on Climate Change (IPCC). Why were they needed?
For one thing, we just needed to update the data from the earlier generation of emissions scenarios, and define new scenarios. But secondly, the focus changed a bit between the IPCC’s last report and the most recent one, released in 2014. In the new scenarios, the focus is more on challenges to adaptation and mitigation of climate change. These dimensions are harder to incorporate because they depend on a lot of socioeconomic factors.
Researchers use scenarios to map out a range of possible future developments in the socioeconomic factors that influence climate change. © Salvatore Vastano via Flickr
You worked specifically on the population projections for the SSPs, which were published in 2014. How did this process work?
The first thing that we did was to define narratives for each of the SSPs, essentially a story about how the world would look in the future. This first part is very important. These narratives were based on current scientific knowledge of how the variables are related and how they interact.
Then for each of the pathways, we had to start defining the variables like population, urbanization, technological change, and economy. Since population is one of the first variables you need in order to calculate other socioeconomic variables, it was the first thing we looked at when turning the narratives into a quantitative projection. Population is needed as a multiplier to calculate future demand, for example how much energy will be required in the future, how much water, and many other things. At the same time, when there are adverse effects of climate change, the population determines how many people are impacted, as well as who and where. For example, the air pollution group needs population data to see how air pollution will affect people. So population is an important variable.
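The multiplier role of population can be made concrete with a toy calculation. This is an illustration only, not the SSP methodology, and all groups and figures below are invented; it also previews why heterogeneity matters, since groups with different per-capita use contribute differently to the total:

```python
# Toy illustration of population as a multiplier for future demand
# (not the SSP methodology itself; all numbers are invented).
# Millions of people times MWh per person gives TWh directly.

population_millions = {   # hypothetical population by education group
    "no_education": 10.0,
    "primary": 25.0,
    "secondary_plus": 40.0,
}
per_capita_mwh = {        # hypothetical energy demand per person, MWh/year
    "no_education": 1.5,
    "primary": 2.5,
    "secondary_plus": 4.0,
}

def total_demand_twh(pop, use):
    """Total energy demand in TWh/year, summed over population groups."""
    return sum(pop[g] * use[g] for g in pop)

print(total_demand_twh(population_millions, per_capita_mwh))  # prints 237.5
```

Shifting people between groups in a scenario (say, toward higher education) changes the total even if the headcount stays fixed, which is exactly why disaggregated projections matter.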
It was an iterative process—there were lots of calls, involving sometimes 10 or 15 people from many different fields. Whenever we had something to share or something to decide, it was done in this big group. It was a lot of talking and listening to others. That was very educational for me, because I learned a lot about how people are using population data. It was a very good dialogue—people had sometimes very simple questions but sometimes very interesting questions about population, fertility, mortality, and those kinds of things.
How did your population projections differ from previous demographic data used for climate research?
In most climate research, until recently, population was used as a total number. Populations were assumed to be homogeneous: everybody the same, with the average representing everyone. We argued that that is not the case, that you need to consider population heterogeneity, not only age and sex, but also education levels. There is a growing body of research showing that these details make a difference.
Not everybody is using these data yet, but people working on GDP, for example, have used them, and hopefully more will in the future. We have shown in the past that knowing the education level of the population can help us make better projections. A more educated population affects many other socioeconomic measures; for example, more educated societies have higher levels of productivity. Education level has also been used to calculate the speed of technological change: in societies with highly educated people, technological change comes faster. And these factors are key to understanding humanity’s vulnerability to climate change, our ability to adapt, and our chances of solving the problem.
Schoolchildren in Indonesia: Population variables like education have big impacts on greenhouse gas emissions and vulnerability to climate change. © Asian Development Bank
A lot of your work focuses on what might happen in the future. How do you explain to people the difference between scenarios or projections and predictions?

When we make projections about the future, we don’t use the word “prediction.” The chances that such a projection will be wrong are 100%. We can never say exactly what will happen in the future.
It’s important to understand how the narratives were defined, how we defined the scenarios. We cannot guarantee the future results, but we can guarantee the quality of what can be said now, today, about the future. And then there is the idea of uncertainty: we have said something about the future, but we haven’t reported any kind of uncertainty beyond ranges of scenarios. This is a big area for future work. It’s difficult to do, and it would be difficult to interpret, but it’s important to consider.
KC S, Lutz W (2014). The human core of the shared socioeconomic pathways: Population scenarios by age, sex and level of education for all countries to 2100. Global Environmental Change http://pure.iiasa.ac.at/10759/
Riahi K, van Vuuren DP, Kriegler E, et al. (2016). The Shared Socioeconomic Pathways and their energy, land use, and greenhouse gas emissions implications: An overview. Global Environmental Change. http://pure.iiasa.ac.at/13280/
This article gives the views of the interviewee, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
By Anneke Brand, IIASA science communication intern 2016.
Accidents, lane closures, and congestion all affect the flow of road traffic and harmful emissions from vehicles. Live traffic data allow congestion to be detected more accurately and provide a more precise overview of vehicle emissions at different times and places. In his project for the Young Scientists Summer Program (YSSP), Fabian Heidegger investigates how road traffic affects air pollution in cities, using Vienna and surrounding areas as a case study.
Air pollution is a major problem in Europe and globally. Health impacts of air pollution include a range of respiratory and cardiovascular diseases. “10-20% of Europe’s urban population is exposed to excessive levels of nitrogen dioxide (NO2), along with several other air pollutants. NO2 pollution is highest along busy roads. Technical measures have so far often been circumvented, so cities are looking for other measures to reduce the pollution load. Traffic management has therefore gained interest as a way to reduce air pollution,” says Jens Borken-Kleefeld, Heidegger’s study leader at IIASA.
To calculate the amount of air pollution that cars and other vehicles release into the air, researchers use models that apply various sets of data: traffic networks, where and how far people drive, and emission factors of different vehicle categories. Input data for the model may include how many people live in a certain area, how many of them use cars, where they normally drive, and how many grams of pollutants (such as nitric oxide and NO2 gases) their type of cars emit per kilometer.
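The bottom-up logic described here, traffic counts times distance times per-kilometer emission factors, can be sketched briefly. This is an illustration of the general approach, not the actual model used in the project, and all vehicle categories and emission factors below are hypothetical:

```python
# Minimal sketch of a bottom-up road emission calculation (not the actual
# model used in the project): emissions on a road segment equal the vehicle
# count per category times the segment length times that category's
# emission factor. All emission factors here are hypothetical.

NOX_G_PER_VEH_KM = {  # hypothetical NOx emission factors, g per vehicle-km
    "petrol_car": 0.06,
    "diesel_car": 0.50,
    "truck": 2.00,
}

def segment_nox_grams(vehicle_counts, segment_km):
    """NOx emitted (grams) on one segment for the given traffic counts."""
    return sum(NOX_G_PER_VEH_KM[v] * n * segment_km
               for v, n in vehicle_counts.items())

# Example: one counting interval on a 2 km segment
counts = {"petrol_car": 300, "diesel_car": 200, "truck": 20}
print(segment_nox_grams(counts, 2.0))  # prints 316.0
```

Summing such segment totals over the whole network, and over time intervals, yields the city-wide emission estimate; real-time counts simply replace the daily averages in the same formula.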
Inner city Vienna. © Radub85 | Dreamstime.com
Most of these models rely on average daily traffic data. For Heidegger’s YSSP project, which is related to his PhD work at the University of Natural Resources and Life Sciences in Vienna, he is incorporating real-time data, measured every five minutes, into a traffic simulation model developed by Intelligent Transport Systems Vienna Region. A set of detectors in and around the city record the number and speed of vehicles. In addition, location data from the taxi fleet is incorporated into the traffic simulation. Heidegger can therefore immediately identify adverse traffic conditions like stop-and-go traffic, which has a high impact on emissions. This allows for a more accurate calculation and can help design traffic interventions for improving both traffic flow and air quality.
“In the case of a road closure, local emissions will obviously be lower at the specific road but total emissions for the area could be higher than before when drivers use alternative, longer routes or end up in stop-and-go traffic,” says Heidegger.
To understand how these diversions and the displacement of pollutants can affect overall emissions, Heidegger will first determine the emissions per street section, and second, the effects of diversions from day-to-day traffic patterns. Together with researchers from the Air Quality and Greenhouse Gases Program at IIASA, Heidegger plans to assess the impact of different intervention scenarios, for example an environmental zone in the city, where only modern cars would be allowed to enter. In a second scenario he will look at the effect of people commuting to Vienna, and a third scenario will explore the consequences of expanding pedestrian zones. The researchers hope that this study will improve their understanding of the potential of traffic management to reduce air pollution.
Air Pollution Policy Review 2011-2013
Note: This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.