By Linda See, IIASA Ecosystems Services and Management Program
One of the biggest questions when it comes to citizen science is the quality of the data. Scientists worry that citizens are not as rigorous in their data collection as professionals might be, which calls into question the reliability of the data. At a meeting this month in Brussels on using citizen science to track invasive species, we grappled with the question of what it will take to trust this data source, particularly if it is going to be used to alert authorities to the presence of an invasive species in a timely manner.
This discussion got me thinking about other types of data supplied by citizens that authorities simply trust, for example when a citizen calls the emergency services to report an incident such as a fire. The veracity of such an alert is not questioned; instead, authorities are obliged to investigate it.
Yet the statistics show that false alarms do occur. For example, in 2015 there were more than 2.5 million false fire alarms in the United States, of which just under a third were due to system malfunctions. The remaining calls were unintentional, malicious, or other types of false alarms, such as bomb scares. Statistics for calls to the emergency services more generally show similar trends across European countries, where the percentage of false reports ranges from 40% in Latvia up to 75% in Lithuania and Norway. So why is it that we inherently trust this data source, despite the false alarm rate, but not data from citizen scientists? Is it because life is threatened, because fires are easier to spot than invasive species, or simply because emergency services are mandated to investigate?
Volunteers monitor butterflies in Mount Rainier National Park, as part of the Cascade Butterfly Project, a citizen science effort organized by the US National Park Service © Kevin Bacher | US National Park Service
A recent encouraging development for citizen science was the legislation signed by President Obama on 6 January 2017, which gave US federal agencies the authority to use citizen science and crowdsourced data in their operations. Do we need something similar in the EU or at the level of member states? And what will it really take for authorities to trust scientific data from citizens?
To move from the current situation of general distrust in citizen science data to one in which the data are viewed as a potentially useful source of information, we need further action. First, we need to showcase examples where data collected by citizens are already being used for monitoring. At the meeting in Brussels, Kyle Copas of the Global Biodiversity Information Facility (GBIF) noted that up to 40% of the data records in GBIF are supplied by citizens, which surprised many of the meeting participants. Data from GBIF are used for national and international monitoring of biodiversity. Second, we need to quantify the value of information coming from citizen scientists. For example, how much money could have been saved if reports on invasive species from citizens had been acted upon? Third, we need to forge partnerships with government agencies to embed citizen science data streams institutionally into everyday operations. For example, the new LandSense citizen observatory project aims to do exactly this. We are working with the National Mapping Agency in France to use citizen science data to update their maps, and many other similar collaborations with local and national agencies will be tested over the next 3.5 years.
Finally, we need to develop quality assurance systems that can be easily plugged into the infrastructure of existing organizations. The EU-funded COBWEB project began building such a citizen science-based quality assurance system, which we are continuing to develop in LandSense as a service. Providing out-of-the-box tools may be one solution to help organizations to begin working with citizen science data more seriously at an institutional level.
IIASA researchers test the Fotoquest app, a citizen science game developed at IIASA. ©Katherine Leitzell | IIASA
These measures will clearly take time to implement so I don’t expect that the discussion on the quality of the data will be removed from any agenda for some time to come. However, I look forward to the day when the main issue revolves around how we can possibly handle the masses of big data coming from citizens, a situation that many of us would like to be in.
More Information about the meeting: https://ec.europa.eu/jrc/en/event/workshop/citizen-science-open-data-model-invasive-alien-species-europe
This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
Samir KC is a researcher in the IIASA World Population Program. He worked on the population projections that form the “human core” of the Shared Socioeconomic Pathways (SSPs), a set of scenarios designed for climate change research, but increasingly being applied more broadly to research in sustainability and environmental change.
What are the SSPs?
The Shared Socioeconomic Pathways are about the future: how it could look under different sets of conditions. When we want to talk or think about the future, we always have to do some kind of projection. Whatever the topic, even in our personal lives, we can use scenarios to map out how things might develop, creating different pathways, which can allow us to better understand how our choices could affect those pathways.
Socioeconomic refers to the major social and economic factors that can affect future changes on our planet—demographic, social, and economic. But within this broad umbrella, there are multiple disciplines that work on their own topics with their own methods and data. If they want to work together, their work has to match up, so that the output of one group can serve as the input to another. That’s why the word shared is there.
The SSPs were developed for the Intergovernmental Panel on Climate Change (IPCC). Why were they needed?
For one thing, we just needed to update the data from the earlier generation of emissions scenarios, and define new scenarios. But secondly, the focus changed a bit between the IPCC’s last report and the most recent one, released in 2014. In the new scenarios, the focus is more on challenges to adaptation and mitigation of climate change. These dimensions are harder to incorporate because they depend on a lot of socioeconomic factors.
Researchers use scenarios to map out a range of possible future developments in the socioeconomic factors that influence climate change. © Salvatore Vastano via Flickr
You worked specifically on the population projections for the SSPs, which were published in 2014. How did this process work?
The first thing we did was to define narratives for each of the SSPs, essentially a story about how the world might look in the future. This first part is very important. These narratives were based on current scientific knowledge of how the variables are related and interact.
Then for each of the pathways, we had to start defining variables like population, urbanization, technological change, and the economy. Since population is one of the first variables you need in order to calculate other socioeconomic variables, it was the first thing we looked at when turning the narratives into quantitative projections. Population is needed as a multiplier to calculate future demand, for example how much energy will be required, how much water, and many other things. At the same time, when there are adverse effects of climate change, the population determines how many people are impacted, as well as who and where. For example, researchers studying air pollution need population data to see how air pollution will affect people. So population is an important variable.
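The multiplier role of population can be shown with a toy calculation; the numbers and scenario labels below are invented for illustration and are not actual SSP values:

```python
# Toy illustration of population as a multiplier for future demand.
# All numbers are hypothetical, not actual SSP quantities.

def energy_demand_ej(population_billions: float, per_capita_gj: float) -> float:
    """Total demand = population x per-capita use, in exajoules (1 EJ = 1e9 GJ)."""
    return population_billions * 1e9 * per_capita_gj / 1e9

# Two hypothetical 2050 scenarios: the same formula gives very different
# totals once multiplied by different population and consumption levels.
low = energy_demand_ej(8.5, 55.0)    # slower population growth
high = energy_demand_ej(10.0, 70.0)  # faster growth, higher per-capita use
```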
It was an iterative process—there were lots of calls, involving sometimes 10 or 15 people from many different fields. Whenever we had something to share or something to decide, it was done in this big group. It was a lot of talking and listening to others. That was very educational for me, because I learned a lot about how people are using population data. It was a very good dialogue—people had sometimes very simple questions but sometimes very interesting questions about population, fertility, mortality, and those kinds of things.
How did your population projections differ from previous demographic data used for climate research?
In most climate research, until recently, population was used as a total number. Populations were assumed to be homogenous—everybody the same, the average will represent everyone. We argued that that is not the case, that you need to consider population heterogeneity, not only age and sex, but also education levels. There is a growing body of research showing that these details make a difference.
Still, not everybody is using these data, but people working on GDP, for example, have used them, and hopefully more and more will use these factors in the future. We have shown in the past that knowing the education level of the population can help us make better projections. Having a more educated population has effects on many other socioeconomic measures. For example, more educated societies have higher levels of productivity. Education level has also been used to calculate the speed of technological change: in societies with highly educated people, technological advances come faster. And these factors are key to understanding humanity’s vulnerability to climate change, our ability to adapt, and our chances of solving the problem.
Schoolchildren in Indonesia: Population variables like education have big impacts on greenhouse gas emissions and vulnerability to climate change. © Asian Development Bank
A lot of your work focuses on what might happen in the future. How do you explain to people the difference between scenarios or projections and predictions?

When we make projections about the future, we don’t use the word “prediction.” The chances that such a projection will be wrong are 100%. We can never say exactly what will happen in the future.
It’s important to understand how the narratives were defined, how we defined the scenarios. We cannot guarantee the future or the results, but we can guarantee the quality of what can be done and said today about the future. And then there is the idea of uncertainty – we have said something about the future, but we haven’t reported any kind of uncertainty beyond the range of scenarios. This is a big area for future work. It’s difficult to do, and it would be difficult to interpret, but it’s important to consider.
KC S, Lutz W (2014). The human core of the shared socioeconomic pathways: Population scenarios by age, sex and level of education for all countries to 2100. Global Environmental Change http://pure.iiasa.ac.at/10759/
Riahi K, van Vuuren DP, Kriegler E, et al. (2016). The Shared Socioeconomic Pathways and their energy, land use, and greenhouse gas emissions implications: An overview. Global Environmental Change. http://pure.iiasa.ac.at/13280/
This article gives the views of the interviewee, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
By Anneke Brand, IIASA science communication intern 2016.
Accidents, lane closures, and congestion all affect the flow of road traffic and harmful emissions from vehicles. Live traffic data allow congestion to be detected more accurately and provide a more precise overview of vehicle emissions at different times and places. In his project for the Young Scientists Summer Program (YSSP), Fabian Heidegger investigates how road traffic affects air pollution in cities, using Vienna and surrounding areas as a case study.
Air pollution is a major problem in Europe and globally. Health impacts of air pollution include a range of respiratory and cardiovascular diseases. “10-20% of Europe’s urban population is exposed to excessive levels of nitrogen dioxide (NO2), along with several other air pollutants. NO2 pollution is highest along busy roads. Technical measures have so far often been circumvented, so cities are looking for other measures to reduce the pollution load. Traffic management has therefore gained interest as a way to reduce air pollution,” says Jens Borken-Kleefeld, Heidegger’s study leader at IIASA.
To calculate the amount of air pollution that cars and other vehicles release into the air, researchers use models that apply various sets of data: traffic networks, where and how far people drive, and emission factors of different vehicle categories. Input data for the model may include how many people live in a certain area, how many of them use cars, where they normally drive, and how many grams of pollutants (such as nitric oxide and NO2 gases) their type of cars emit per kilometer.
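The bookkeeping described here can be sketched in a few lines of code; the vehicle categories, counts, and emission factors below are invented for illustration and are not the model's actual inputs:

```python
# Bottom-up emission bookkeeping for one area. The categories, counts,
# and emission factors are illustrative assumptions only.
fleet = {
    # category: (vehicles, mean km driven per day, g NOx emitted per km)
    "petrol_car": (50_000, 20.0, 0.06),
    "diesel_car": (30_000, 25.0, 0.50),
    "truck":      (2_000, 120.0, 2.00),
}

def daily_nox_kg(fleet):
    """Sum vehicles * km/day * emission factor over all categories (kg/day)."""
    grams = sum(n * km * ef for n, km, ef in fleet.values())
    return grams / 1000.0

total = daily_nox_kg(fleet)  # kg of NOx emitted per day in this area
```

Real models refine each factor, for instance by splitting emission factors by vehicle age and speed, but the multiplicative structure is the same.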
Inner city Vienna. © Radub85 | Dreamstime.com
Most of these models rely on average daily traffic data. For Heidegger’s YSSP project, which is related to his PhD work at the University of Natural Resources and Life Sciences in Vienna, he is incorporating real-time data, measured every five minutes, into a traffic simulation model developed by Intelligent Transport Systems Vienna Region. A set of detectors in and around the city record the number and speed of vehicles. In addition, location data from the taxi fleet is incorporated into the traffic simulation. Heidegger can therefore immediately identify adverse traffic conditions like stop-and-go traffic, which has a high impact on emissions. This allows for a more accurate calculation and can help design traffic interventions for improving both traffic flow and air quality.
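One simple way such 5-minute interval data can flag stop-and-go conditions is a rule combining vehicle counts and mean speed; the thresholds below are assumptions for illustration, not those of the Vienna model:

```python
# A simple rule for flagging stop-and-go traffic in a 5-minute detector
# interval: many vehicles moving at a very low mean speed.
# The threshold values are illustrative assumptions.
def is_stop_and_go(vehicle_count, mean_speed_kmh,
                   min_count=50, max_speed_kmh=15.0):
    return vehicle_count >= min_count and mean_speed_kmh <= max_speed_kmh

intervals = [
    (120, 45.0),  # busy but free-flowing
    (140, 10.0),  # many vehicles crawling: stop-and-go
    (5, 8.0),     # quiet side street: slow, but nearly empty
]
flags = [is_stop_and_go(n, v) for n, v in intervals]
```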
“In the case of a road closure, local emissions will obviously be lower at the specific road but total emissions for the area could be higher than before when drivers use alternative, longer routes or end up in stop-and-go traffic,” says Heidegger.
To understand how these diversions and the displacement of pollutants can affect overall emissions, Heidegger will first determine the emissions per street section, and second, the effects of diversions from day-to-day traffic patterns. Together with researchers from the Air Quality and Greenhouse Gases Program at IIASA, Heidegger plans to assess the impact of different intervention scenarios, for example an environmental zone in the city, where only modern cars would be allowed to enter. In a second scenario he will look at the effect of people commuting to Vienna, and a third scenario will explore the consequences of expanding pedestrian zones. The researchers hope that this study will improve their understanding of the potential of traffic management to reduce air pollution.
Air Pollution Policy Review 2011-2013
Note: This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
César Terrer, participant in the IIASA 2016 Young Scientists Summer Program, and PhD student at Imperial College London, recently made a groundbreaking contribution to the way scientists think about climate change and the CO2 fertilization effect. In this interview he discusses his research, his first publication in Science, and his summer project at IIASA.
Conducted and edited by Anneke Brand, IIASA science communication intern 2016.
César Terrer ©Vilma Sandström
How did your scientific career evolve into climate change and ecosystem ecology?
I studied environmental science in Spain and then I went to Australia, where I started working on free-air CO2 enrichment, or FACE experiments. These are very fancy experiments where you fumigate a forest with CO2 to see if the trees grow faster. In 2014 I moved to London for my PhD project. There, instead of focusing on one single FACE experiment, I collected data from all of them. This allowed me to make general conclusions on a global scale rather than a single forest.
You recently published a paper in Science magazine. Could you summarize the main findings?
We found that we can predict how much CO2 plants transfer into growth through the CO2 fertilization effect, based on two variables—nitrogen availability and the type of mycorrhizal, or fungal, association that the plants have. The impact of the type of mycorrhizae has never been tested on a global scale—and we found that it is huge. I think it’s fascinating that such tiny organisms play such a big role at a global scale on something as important as the terrestrial capacity of CO2 uptake.
How did you come up with the idea? One random day in the shower?
Long story short, researchers used to think that plants would grow faster and take up a lot of the CO2 we emit, and they assumed this in most of their models as well. But plants need other elements besides CO2 to grow. In particular, they need nitrogen. So scientists started to question whether the modeled predictions overestimated the CO2 fertilization effect, because the models did not consider nitrogen limitation. To find out, I analyzed all the FACE experiments, and indeed I saw that in general plants were not able to grow faster under elevated CO2 when nitrogen was limited. However, in some cases plants were able to take advantage of elevated CO2 even under nitrogen limitation. I grouped together the experiments where plants could grow under nitrogen limitation, and after a lot of reading I saw what they had in common: the type of fungi! It turned out that one type of mycorrhizae is really good at transferring large quantities of nitrogen to the plant and the other type is not.
How did that feel?
Awesome! When I saw the graph, I knew: this is going to be important. Of course, after this, my coauthors helped me to polish the story. Without them, the conclusions would not be as robust and clear.
So how does this process work? Where do the fungi get the nitrogen from?
A soil might have a lot of nitrogen, but the amount available for plants to absorb might be low. Also, plants have to compete with non-fungal microorganisms for nitrogen, so if there is not much there, the microorganisms take it all. It’s called immobilization: instead of mineralizing nitrogen, they immobilize it so that plants cannot take it up, at least not in the short term. Some types of fungi are much more efficient at accessing nitrogen, and in association with plant roots they allow plants to overcome this limitation.
Nitrogen mobilization abilities of different types of fungi. Growth of plants associated with fungi not beneficial for nitrogen uptake (illustrated as grass roots on the left) could be limited by low nitrogen availability in soil. Other plants have the advantage of increased nitrogen uptake due to their beneficial association with certain types of fungi (illustrated as yellow mushrooms connected to the roots of the tree on the right). ©Victor O. Leshyk.
What is the impact of your findings?
Plants currently take up 25-30% of the CO2 we emit, but the question is whether they will be able to continue to do so in the long term. Our findings bring good and bad news. On the one hand, the CO2 fertilization effect will not be limited entirely by nitrogen, because some of the plants will be able to overcome nitrogen limitation through their root fungi. But on the other hand, some plant species will not be able to overcome nitrogen limitation.
There was a big debate about this. One group of scientists believed that plants will continue to take up CO2 and the other group said that plants will be limited by nitrogen availability. These were two very contrasting hypotheses. We discovered that neither of the hypotheses was completely right, but both were partly true, depending on the type of fungi. Our results could bring closure to this debate. We can now make more accurate predictions about global warming.
What will you do at IIASA and how will you link it to your PhD?
I want to upscale and quantify how much carbon plants will take up in the future. If we are to predict the capacity of plants to absorb CO2, we need to quantify mycorrhizal distribution and nitrogen availability on a global scale. We are updating mycorrhizal distribution maps according to distribution of plant species. We know for instance that pines are associated with ectomycorrhizal fungi and always will be. To quantify nitrogen availability we use maps of different soil parameters that are available on a rough global scale.
© Adam Edwards | Dreamstime.com
About César Terrer
Prior to his PhD, Terrer studied at the University of Murcia in Spain and the University of Western Sydney in Australia.
Currently he is a member of the Department of Life Sciences at Imperial College London, UK. For this study he collaborated with researchers from the University of Antwerp, Northern Arizona University, Indiana University and Macquarie University.
In the IIASA Young Scientists Summer Program, Terrer works together with Oskar Franklin from the Ecosystem Services and Management Program and Christina Kaiser from the Evolution and Ecology Program.
Note: This article gives the views of the interviewee, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.
By Andrey Krasovskii, IIASA Ecosystems Services and Management Program
By 2090, the area burned by forest fires in the European Union could increase by 200% because of climate change. However, preventive fires could keep that increase to below 50%. Improved firefighting response could provide additional protection against forest fires. These findings were the result of modeling work we did for the EU Mediation project on projecting future burned areas and adaptation options in Europe. When we talk about these results, people often want to know more about how our model works, what assumptions it makes, and how reliable it is.
Figure 1. The WildFire cLimate impacts and Adaptation Model (FLAM) schematic – estimation of expected burned area.
The model is complex: every link in the schematic shown above represents a specific mathematical formula. These formulas have been developed by many researchers who studied how wildfire occurrence is related to climate, population, and the biomass available for burning. Their results have been aggregated into mathematical relations and functions that attempt to replicate real processes. The model code runs through the scheme with daily weather inputs to calculate the potential for fire ignition, spread, and burned areas. The model transforms spatial and intertemporal inputs into expected burned areas for 25 km squares across the entirety of Europe. These squares can be aggregated into geographic regions, e.g. countries, and burned areas can likewise be aggregated over a given time period, e.g. 10 years.
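The grid-and-aggregate step can be sketched with a few lines of array code; the grid size and burned-area values below are made up for illustration and are not FLAM output:

```python
import numpy as np

# Hypothetical daily expected burned area (hectares) on a small grid;
# axes are (day, row, col). Real FLAM output covers Europe at 25 km cells.
days, rows, cols = 365, 4, 5
rng = np.random.default_rng(0)
burned = rng.uniform(0.0, 2.0, size=(days, rows, cols))

# A region (e.g., one country) is a set of grid cells, here a boolean mask.
region_mask = np.zeros((rows, cols), dtype=bool)
region_mask[1:3, 1:4] = True  # 6 cells belong to this region

# Aggregate: sum over the region's cells, then over a time window.
daily_region_total = burned[:, region_mask].sum(axis=1)  # one value per day
annual_region_total = daily_region_total.sum()           # whole year
```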
It took days for our colleague Mirco Migliavacca to run the model during his work at the Joint Research Centre of the European Commission. In fact, the scheme depicted in Figure 1 shows only a small piece of a larger picture reflecting the Community Land Model with the integrated fire module (CLM-AB), which he used. CLM-AB calculates all inputs in the indicated fire module, based on modeling processes in the global vegetation system. To speed up the running times for the case study focused on wildfires in Europe, my colleague Nikolay Khabarov developed a standalone version of the fire model by decoupling the fire module from CLM-AB. When I joined the study, we also found alternatives for input data, e.g. IIASA’s Global Forest Database, and implemented additional procedures to create our wildfire climate impacts and adaptation model (FLAM).
We used historical data from satellite observations to validate the modeling results. At the beginning, many numerical experiments in CLM and FLAM did not give satisfactory results: modeled burned areas were either over- or underestimated compared to those reported in available datasets. One day we had a purely mathematical insight. We realized that in the fire algorithm implemented in FLAM, there is a parameter that can be factorized, mathematically speaking. This parameter, the probability of extinguishing a fire in a pixel in one day, was constant for Europe and set to 0.5. It became obvious that this parameter should vary by region. Factorizing it made it possible to avoid routine recalculations and to use it for calibrating the model over a historical period. This can be done analytically by solving a corresponding polynomial equation. These analytical findings allowed us to introduce an effective calibration procedure and, at the same time, to estimate firefighting efficiency at the country level. Further, following the advice of our colleagues Anatoly Shvidenko and Dmitry Schepaschenko, we introduced adaptation options in the model, for example prescribed burnings, which firefighters use to reduce fuel availability and, consequently, the potential for a major fire.
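The factorization idea can be illustrated with a toy sketch: suppose the modeled burned area is a polynomial A(q) in the daily non-extinction probability q = 1 - p, with coefficients fixed by the rest of the model; calibrating p against observations then reduces to finding a root of A(q) = observed area. The polynomial form and all numbers below are assumptions for illustration, not FLAM's actual equations:

```python
import numpy as np

def calibrate_extinguishing_probability(coeffs, observed_area):
    """Solve A(q) = observed_area for q = 1 - p, where
    A(q) = sum_n coeffs[n] * q**n is a (hypothetical) polynomial giving
    modeled burned area in terms of the daily probability q that a
    burning pixel is NOT extinguished."""
    poly = np.polynomial.Polynomial(coeffs) - observed_area
    roots = poly.roots()
    # Keep real roots corresponding to a valid probability q in (0, 1).
    valid = [r.real for r in roots if abs(r.imag) < 1e-9 and 0.0 < r.real < 1.0]
    if not valid:
        raise ValueError("no admissible root; check inputs")
    return 1.0 - min(valid)  # extinguishing probability p

# With the default p = 0.5, this toy polynomial predicts a burned area of
# 77.5, larger than the "observed" 60.0, so calibration returns p > 0.5,
# i.e., more effective firefighting than the default assumption.
coeffs = [0.0, 100.0, 80.0, 60.0]
p = calibrate_extinguishing_probability(coeffs, observed_area=60.0)
```

Repeating this per country, rather than rerunning the full model, is the kind of shortcut the factorization enables.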
Prescribed burnings are one tool that can help prevent major wildfires. (cc) US Bureau of Land Management via Flickr
Once we had calibrated the model so that it adequately performed on the historical period (using historical climate data), we used climate scenarios to produce future projections. Currently, we are working on further improvements in modeling accuracy in annual burned areas by introducing additional regionally specific factors in the model. In the recent study published in the International Journal of Wildland Fire, we suggested improving the original model by modifying the fire probability function reflecting fuel moisture. This modification allows for a dramatic improvement of accuracy in modelled burned areas for a range of European countries.
Despite some success in modeling annual burned areas in Europe, we still have difficulties in predicting extreme fires, in particular in some of the more arid and hence vulnerable regions, such as Spain. However, we accept the challenge, because credible modeling of burned areas provides important information for assessing the economic damages and CO2 emissions caused by climate and human activities. Our research has the potential to help society recognize these risks and undertake preventive measures. It also delivers additional scientific value, because fire risks must be included in forest management models.
I would like to thank all the study co-authors for their valuable contributions and efficient collaboration.
Krasovskii, A., Khabarov, N., Migliavacca, M., Kraxner, F. and Obersteiner, M. (2016) Regional aspects of modelling burned areas in Europe. International Journal of Wildland Fire. http://dx.doi.org/10.1071/WF15012
Note: This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.