Do smokers know what they are doing to their life expectancy?

By Valeria Bordone, University of Munich Department of Sociology and IIASA World Population Program

Everyone, consciously or unconsciously, formulates in their own mind a subjective survival probability – that is, an estimate of how long they are going to live. This estimate affects decisions in different spheres of later life: retirement, investments, and health behaviors. Moreover, previous research has found that subjective survival probability is a good predictor of mortality. In fact, on average, people gauge the effect that their characteristics and behavior have on life expectancy better than standard health measures do. It is plausible, however, to expect differences within the population not only in survival itself, but also in people's ability to predict their own survival.

(cc) roujo | Flickr

In a recent publication with Bruno Arpino from Pompeu Fabra University and Sergei Scherbov from the Wittgenstein Centre (IIASA, VID/ÖAW, WU), we presented for the first time joint analyses of the effect of smoking behavior and education on subjective survival probabilities and on the ability of survey respondents to predict their real survival, using longitudinal data on people aged 50-89 in the USA drawn from the Health and Retirement Study.

We found that, consistent with actual mortality, smokers report the lowest subjective survival probabilities. Similarly, less educated people report lower subjective survival probabilities than more educated people. This is in line with the well-known positive correlation between education and life expectancy. However, despite being aware of their lower life expectancy compared to non-smokers and past smokers, current smokers tended to overestimate their survival probabilities. This holds especially for less educated people.

This graph shows the probability of correctly estimating one's own survival probability, with 95% confidence intervals, by smoking behavior and educational attainment. © Arpino B, Bordone V, & Scherbov S (2017)

Our study suggests that education in fact also plays an important role in shaping people's ability to estimate their own survival probability. Whether or not they smoke, more highly educated people are more likely to correctly predict their survival probabilities.

In view of the high proportion of the American population made up of current or past smokers (a percentage that reached 77% in some male cohorts), our findings emphasize the need to disseminate more information about the risks of smoking, specifically targeting people with less education.

By showing that smoking and education interact in determining how well people can assess their own survival prospects, this study extends our understanding of the variability of subjective survival probabilities within a population. The fact that sub-groups within the population incorporate the effects of smoking differently into their assessment of survival probabilities may have important consequences, for example on when people exit the labor market or whether they buy life insurance, as individuals are likely to base such decisions partly on their longevity expectations.

Policymakers can therefore draw some relevant conclusions from our study to design policies concerned with health and survivorship in later life. Despite the various anti-smoking campaigns and smoking restrictions, smokers may not be fully aware of the risks of smoking. In particular, educational groups seem to be differently exposed to the information that is disseminated to the public. Our study suggests that there is a need to target such information to less educated people, who are the most likely to underestimate the risks of smoking. Providing information on how survival probabilities vary by smoking behavior may not only reduce smoking but it may also increase individuals’ ability to assess their own survival.

(cc) Quinn Dombrowski | Flickr

Arpino B, Bordone V, & Scherbov S (2017). Smoking, Education and the Ability to Predict Own Survival Probabilities: An Observational Study on US Data. IIASA Working Paper WP-17-012. IIASA, Laxenburg, Austria.

This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.

Disappearing Act: Bolivia’s second largest lake dries up

By Parul Tewari, IIASA Science Communication Fellow 2017

In 2016, Bolivia saw its worst drought in nearly 30 years. While the city of La Paz faced an acute water shortage, with no piped water in some parts, the agricultural sector was hit hardest. According to the Agricultural Chamber of the East, the region lost almost 50% of its total produce. Animal carcasses lay scattered in plain sight in the valleys, where the animals had died searching for watering holes.

Lake Poopo (Bolivia) before it dried up © David Almeida I Flickr

One of the most dramatic results of this catastrophic drought was that Lake Poopo (pronounced po-po), Bolivia's second-largest lake, was drained of every drop of water. Located at an altitude of approximately 3,700 meters and covering an area of 1,000 square kilometers, what remains of it now resembles a desert more than a lake. This event forced the fishing community of Uru Uru, which depended on the lake, to either migrate to other lakes or look for alternative livelihoods.

Lake Poopo is located in the central South American Altiplano, one of the largest high plateaus in the world (Bolivia's largest lake, Titicaca, lies in the north of the region). Due to its unique topography, the highland faces extreme climatic conditions, which contribute to harsh living conditions and widespread poverty among the people who live there.

While Titicaca is over 100 meters deep, Poopo had a depth of less than three meters. Combined with a high rate of evapotranspiration, erratic rainfall, and the limited flow of the Desaguadero River, this left Poopo in a precarious position even in the best of times. What little water flows in from the river is further depleted by intensive irrigation south of Lake Titicaca before it makes its way down to Poopo.

Satellite images of Lake Poopo

Changes in water levels of Lake Poopo over 30 years © U.S. Geological Survey, Associated Press

The lake’s existence had been threatened several times in the past, but the 2016 drought was among the most devastating. According to the Defense Ministry of Bolivia, early this year the lake started recovering after several days of heavy rain, which restored as much as 70% of its water. However, since the lake is part of a very fragile ecosystem, there have been some irreversible changes to its flora and fauna, in addition to the losses suffered by the fishing communities living around it.

Charting a better future

Claudia Canedo, a participant in the 2017 Young Scientists Summer Program (YSSP) at IIASA, is exploring the impact of droughts and the risk they pose to agricultural production in the light of this event, after which Bolivia declared a state of water emergency. Canedo was born and raised in the city of La Paz and experienced water shortages while growing up close to the Altiplano. This motivated her to investigate a sustainable solution for water availability in the region. With the results of her study, she hopes to ensure that such a situation does not arise again in the Altiplano, and that other communities directly dependent on ecosystem services, like that of Lake Poopo, do not have to lose everything because of an extreme weather event.

For a region where more than half the population is dependent on agriculture for their livelihoods, droughts serve as a major setback to the national economy. “It is not just one factor that led to the drought, though. There were different factors that contributed to the drying up of the lake and also contribute to the agricultural distress,” she says.

“The southern Altiplano lies in an arid zone and receives low precipitation due to its proximity to the Atacama Desert. Poor soil quality (high saline content and lack of nutrients) makes it unsuitable for most crops, except quinoa and potato in some areas,” adds Canedo. Residents also lack the knowledge and the monetary resources to invest in newer technology, which could possibly lead to better water management.

A woman from one of the drought affected communities in Bolivia © EU – Photo credits: EC/ECHO/Laurence Bardon I Flickr

One of the most critical factors in the recent drought was the El Niño-Southern Oscillation, a warming of sea surface temperatures in the Pacific Ocean that carries warmer oceanic winds and lowers precipitation in the highland, leading to increased evapotranspiration. In 2015 and 2016, the losses due to this phenomenon were devastating for agriculture in the Altiplano, says Canedo.

In her quest to find solutions, the biggest challenge is the lack of recorded data from local weather stations for past years. Although satellite data are available, they are too coarse for local analysis. Combining ground and satellite data could therefore enhance present knowledge and provide consistent results on climate and vegetation variability. If successful, Canedo hopes to identify a correlation between precipitation and vegetation. With this information, she can improve climate forecasting and help local people adapt to droughts powerful enough to turn their lives upside down.
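The precipitation-vegetation relationship Canedo is looking for is often summarized with a simple correlation coefficient. As a rough illustration only (the monthly series below are invented for the example, not her data), a Pearson correlation between rainfall and a satellite vegetation index could be computed like this:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical monthly precipitation (mm) and a vegetation index
# (e.g., NDVI) for the same months -- illustrative values only.
precip = [10, 30, 80, 120, 90, 40, 15, 5]
ndvi   = [0.12, 0.18, 0.35, 0.52, 0.48, 0.30, 0.16, 0.10]

r = pearson(precip, ndvi)  # strongly positive: vegetation tracks rainfall
```

A value of `r` close to +1 would support the kind of link between rainfall and greenness that makes vegetation-based drought monitoring feasible.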

With weather forecasts and early warning systems for extreme weather events like droughts, farmers would know what to expect and would be able to plant resilient crop varieties. These might not earn them the same profits as in a normal year, but would not result in a failed harvest. Canedo aims to develop a drought index useful for drought monitoring and early warning, which will integrate short-term and long-term meteorological predictions.
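To illustrate the general idea behind such an index (this is not Canedo's method, just a minimal sketch): widely used indices like the Standardized Precipitation Index compare each period's rainfall to the long-term climatology. A crude version can be built from simple z-scores, with strongly negative values flagging drought-like months; the operational SPI instead fits a gamma distribution to the precipitation record, but the comparison-to-climatology idea is the same:

```python
from statistics import mean, stdev

def drought_index(monthly_precip):
    """Crude drought indicator: the standardized anomaly (z-score) of
    each month's precipitation relative to the whole record. Negative
    values indicate drier-than-normal conditions."""
    mu = mean(monthly_precip)
    sigma = stdev(monthly_precip)
    return [(p - mu) / sigma for p in monthly_precip]

# Hypothetical precipitation totals (mm), for illustration only.
precip = [60, 55, 70, 65, 40, 20, 15, 62, 58, 66]
index = drought_index(precip)

# Months more than one standard deviation below normal are
# flagged as drought-like.
drought_months = [i for i, z in enumerate(index) if z < -1]
```

On this made-up series, only the two driest months fall below the -1 threshold, which is the kind of signal an early warning system would act on.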

Perhaps in the future, with this newfound knowledge, the price of extreme weather events won't be paid in lost ecosystems like that of Lake Poopo, robbing people of their lives and livelihoods.

About the Researcher

Claudia Canedo is a participant in the 2017 IIASA YSSP. She is pursuing a doctoral program in water resources engineering at Lund University, Sweden. She is interested in studying the hydrological and climatological conditions over small basins in the South American highlands. The aim of her research is to define water resources availability and find strategies for sustainable water management in the semi-arid region.

This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.




Interview: Living in the age of adaptation

Adil Najam is the inaugural dean of the Pardee School of Global Studies at Boston University and former vice chancellor of Lahore University of Management Sciences, Pakistan. He talks to Science Communication Fellow Parul Tewari about his time as a participant of the IIASA Young Scientists Summer Program (YSSP) and the global challenge of adaptation to climate change.  

How has your experience as a YSSP fellow at IIASA impacted your career?
The most important thing my YSSP experience gave me was a real and deep appreciation for interdisciplinarity: the realization that the great challenges of our time lie at the intersection of multiple disciplines, and that without a real respect for multiple disciplines we will simply not be able to act effectively on them.

Prof. Adil Najam speaking at the Deutsche Welle Building in Bonn, Germany in 2010 © Erich Habich I en.wikipedia

Recently, at the 40th anniversary of the YSSP, you spoke about ‘The age of adaptation’. Globally there is still a lot more focus on mitigation. Why is this?
Living in the “Age of Adaptation” does not mean that mitigation is no longer important. It is as important as ever, and more. But now we also have to contend with adaptation. Adaptation, after all, is the failure of mitigation. We got to the age of adaptation because we failed to mitigate enough, or in time. The less we mitigate now and in the future, the more we will have to adapt, possibly at levels where adaptation may no longer even be possible. Adaptation is nearly always more difficult than mitigation, and will ultimately be far more expensive. And at some level it could become impossible.

How do you think adaptation can be brought into the mainstream of environmental and climate change discourse?
Climate discussions are primarily held in the language of carbon. However, adaptation requires us to think outside “carbon management.” The “currency” of adaptation is multivaried: it's disease, it's poverty, it's food, it's ecosystems, and maybe most importantly, it's water. In fact, I have argued that water is to adaptation what carbon is to mitigation.
To honestly think about adaptation we will have to confront the fact that adaptation is fundamentally about development. This is unfamiliar—and sometimes uncomfortable—territory for many climate analysts. I do not believe that there is any way that we can honestly deal with the issue of climate adaptation without putting development, especially including issues of climate justice, squarely at the center of the climate debate.

COP 22 (Conference of the Parties) was termed the “COP of Action,” where “financing” was one of the critical aspects of both mitigation and adaptation. However, there has not been much progress. Why is this?
Unfortunately, the climate negotiation exercise has become routine. While there are occasional moments of excitement, such as at Paris, the general negotiation process has become entirely predictable, even boring. We come together every year to repeat the same arguments to the same people and then arrive at the same conclusions. We make the same promises each year, knowing that we have little or no intention of keeping them. Maybe I am being too cynical. But I am convinced that if there is to be any ‘action,’ it will come from outside the COPs. From citizen action. From business innovation. From municipalities. And most importantly from future generations who are now condemned to live with the consequences of our decision not to act in time.

© Piyaset I Shutterstock

What is your greatest fear for our planet, in the near future, if we remain as indecisive in the climate negotiations as we are today?
My biggest fear is that we will—or maybe already have—become parochial in our approach to this global challenge. That by choosing not to act in time or at the scale needed, we have condemned some of the poorest communities in the world—the already marginalized and vulnerable—to pay for the sins of our climatic excess. The fear used to be that those who have contributed the least to the problem will end up facing the worst climatic impacts. That, unfortunately, is now the reality.

What message would you like to give to the current generation of YSSPers?
Be bold in the questions you ask and the answers you seek. Never allow yourself—or anyone else—to rein in your intellectual ambition. Now is the time to think big. Because the challenges we face are gigantic.

Note: This article gives the views of the interviewee, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.

Is open science the way to go?

By Luke Kirwan, IIASA open access manager

At this year’s European Geosciences Union General Assembly, a panel of experts convened to debate the benefits of open science. Open science means making as much of the scientific output and process as possible publicly visible and accessible, including publications, models, and data sets.

Open science includes not just open access to research findings, but the idea of sharing data, methods, and processes. ©PongMoji | Shutterstock

In terms of the benefits of open science, the panelists—who included representatives from academia, government, and academic publishing—generally agreed that openness favors increased collaboration and the development of large networks, especially in terms of geoscience data, which improves precision in the interpretation of results. There is evidence that sharing data and linking them to publications increases both readership and citations. A growing number of funding bodies and journals are also requiring researchers to make the data underlying a publication as publicly available as possible. In the context of Horizon 2020, researchers are instructed to make their data ‘as open as possible, as closed as necessary.’

This statement was intentionally left vague, because the European Research Council (ERC) realized that a one-size-fits-all approach would not be able to cover the entirety of research practices across the scientific community, said Jean-Paul Bourguignon, president of the ERC.

Barbara Romanowicz from the Collège de France and the Institut de Physique du Globe de Paris also pointed to the need for disciplines to develop metadata standards and a community ethic to facilitate interoperability. She also noted that the requirements for making raw data openly accessible are quite different from those for making models accessible. Addressing these problems adequately will require increased resources.


Playing devil’s advocate, Helen Glaves from the British Geological Survey pointed to several areas of potential concern. She questioned whether the costs involved in providing long-term preservation of and access to data are the most efficient use of taxpayers’ money, and suggested that charging for access could generate revenue to fund future research. Perhaps the most salient concern for researchers that she raised, however, was the fear among scientists that making their data and research available in good faith could allow their hard work to be passed off by another researcher as their own.

Many of these issues were raised by audience members during the question-and-answer session. Scientists pointed out that research data takes a lot of hard work to collate, that they had concerns about inappropriate secondary reuse, and that jobs and research grants are highly competitive. However, the view was also expressed that paying for access to research fundamentally amounts to ‘double taxation’ if the research has been funded by public money, and that even restrictive sharing is better than not sharing at all. It was also argued that incentivizing sharing through increased citations and visibility would both encourage researchers to make their research more open and aid them in the pursuit of grants or research positions. Bringing about these changes in research practices will involve investing in training the next generation of scientists in these new processes.

Here at IIASA we are fully committed to open access, and in the library we assist our researchers with any queries or issues they may have with sharing their research widely. As well as improving the visibility of research publications through Pure, our institutional repository, we can also assist with making research data discoverable and citable.

A video of the discussion is available on YouTube.

This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.

Bringing satellite data down to Earth

By Linda See, IIASA Ecosystems Services and Management Program

Satellites have changed the way that we see the world. For more than 40 years, we have had regular images of the Earth’s surface, which have allowed us to monitor deforestation, visualize dramatic changes in urbanization, and comprehensively map the Earth’s surface. Without satellites, our understanding of the impacts that humans are having on the terrestrial ecosystem would be much diminished.

The Sentinel-2 satellite provides high-resolution land-cover data. © ESA/ATG medialab

Over the past decade, many more satellites have been launched, with improvements in how much detail we can see and the frequency at which locations are revisited. This means that we can monitor changes in the landscape more effectively, particularly in areas where optical imagery is used and cloud cover is frequent. Yet perhaps even more important than these technological innovations, one of the most pivotal changes in satellite remote sensing was when NASA opened up free access to Landsat imagery in 2008. As a result, there has been a rapid uptake in the use of the data, and researchers and organizations have produced many new global products based on these data, such as Matt Hansen’s forest cover maps, JRC’s water and global human settlement layers, and global land cover maps (FROM-GLC and GlobeLand30) produced by different groups in China.

Complementing Landsat, the European Space Agency’s (ESA) Sentinel-2 satellites provide even higher spatial and temporal resolution; once fully operational, they will provide coverage of the Earth every five days. Like NASA, ESA has made the data freely available. However, the volume of data is much higher, on the order of 1.6 terabytes per day. These data volumes, as well as the need to pre-process the imagery, can pose real problems for new users. Pre-processing can also lead to incredible duplication of effort if done independently by many different organizations around the world. For example, I attended a recent World Cover conference hosted by ESA, and there were many impressive presentations of new applications and products that use these openly available data streams. But most had one thing in common: they all downloaded and processed the imagery before it was used. For large map producers, control over the pre-processing of the imagery might be desirable, but this is a daunting task for novice users wanting to really exploit the data.


In order to remove these barriers, we need new ways of providing access to the data that don’t involve downloading and pre-processing every new scene. In some respects this could be similar to the way Google and Bing provide seamless access to very high-resolution satellite imagery. But it’s not just about visualization, or Google and Bing would suffice for most user needs; it’s about being able to use the underlying spectral information to create derived products on the fly. Google Earth Engine provides some of these capabilities, but the learning curve is pretty steep and some programming knowledge is required.

Instead, what we need is an even simpler system, like that produced by Sinergise in Slovenia. In collaboration with Amazon Web Services, the Sentinel Hub provides access to all Sentinel-2 data in one place, with many different ways to view the imagery, including derived products such as vegetation status and on-the-fly creation of user-defined indices. Such a system opens up new possibilities for environmental monitoring without the need for remote sensing expertise, programming ability, or in-house processing power. An exemplary web application using Sentinel Hub services, the Sentinel Playground, allows users to browse the full global multi-spectral Sentinel-2 archive in a matter of seconds.
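To give a concrete sense of what an on-the-fly, user-defined index involves: the classic example is the Normalized Difference Vegetation Index (NDVI), computed per pixel from the red and near-infrared reflectances (bands 4 and 8 on Sentinel-2). A minimal sketch follows; the reflectance values are illustrative, not real Sentinel-2 samples:

```python
def ndvi(red, nir, eps=1e-10):
    """Normalized Difference Vegetation Index for one pixel:
    NDVI = (NIR - Red) / (NIR + Red). Values near +1 indicate dense,
    healthy vegetation; values near 0 bare soil; negative values
    typically water or clouds. The small eps guards against division
    by zero on dark pixels."""
    return (nir - red) / (nir + red + eps)

# Illustrative reflectance values (not real Sentinel-2 samples):
# healthy vegetation absorbs red light and reflects strongly in the
# near-infrared, so NDVI comes out high.
vegetated = ndvi(red=0.05, nir=0.45)
bare_soil = ndvi(red=0.30, nir=0.35)
```

A service computing this server-side, per pixel and on demand, is exactly what spares users from downloading and pre-processing whole scenes themselves.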

This is why we have chosen Sentinel Hub to provide data for our LandSense Citizen Observatory, an initiative to harness remote sensing data for land cover monitoring by citizens. We will access a range of services, from vegetation monitoring through to land cover change detection, and place the power of remote sensing within the grasp of the crowd.

Without these types of innovations, exploitation of the huge volumes of satellite data from Sentinel-2 and other newly emerging sources will remain the domain of a small group of experts, creating a barrier that restricts many potential applications of the data. Instead, we must encourage developments like Sentinel Hub to ensure that satellite remote sensing becomes truly usable by the masses, in ways that benefit everyone.

This article gives the views of the author, and not the position of the Nexus blog, nor of the International Institute for Applied Systems Analysis.