You’ll know by now that six scientists and a government official have been found guilty of manslaughter and sentenced to six years in prison for how they assessed and communicated risk prior to the L’Aquila earthquake that killed 309 people in 2009.
So what can we, in earthquake science, take from these convictions? If we choose to educate the public during an earthquake sequence – as I did ahead of last year’s Christchurch earthquake – how responsible are we for any apparent surprises that eventuate?
Do scientists have a right to be wrong? Does state-of-the-art science really enable us to speak confidently about “low-probability” events? And is there a way to better communicate risk to the general public?
The Italian verdict was not due to the scientists’ failure to “predict” the earthquake, which most scientists generally agree is not possible with our current knowledge of precursory phenomena (such as foreshocks, radon gas release, and other seismic activity).
Rather, the prosecutor reasoned that “inadequate” risk assessment and scientifically incorrect messages were given in public statements prior to the earthquake.
After months of small (magnitude 3.5 to 4.1) earthquakes leading up to the L’Aquila earthquake, one of the individuals indicted, Bernardo De Bernardinis, stated: “the scientific community tells me there is no danger because there is an ongoing discharge of energy …”. The inference was that these small earthquakes were reducing the possibility of a major quake.
By contrast, a statement from one member of the group at the time of the L’Aquila quake – Enzo Boschi, then-president of Italy’s National Institute of Geophysics and Volcanology (INGV) in Rome – was both well-balanced and informative:
It is unlikely that an earthquake like the one in 1703 [a devastating earthquake that previously hit L’Aquila] could occur in the short-term, but the possibility cannot be totally excluded.
Many seismologists consider that an increase in the frequency of small- to medium-sized earthquakes increases the chances of a large earthquake based on long-established fundamental relationships in seismology.
But this effect is relatively small in terms of an absolute probability and does not improve earthquake prediction.
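One such long-established relationship is the Gutenberg–Richter frequency–magnitude law. A minimal sketch (with illustrative, hypothetical values for the constants a and b) shows why a burst of small quakes only modestly shifts expectations for large ones:

```python
# Gutenberg-Richter law: log10(N) = a - b*M, where N is the number of
# earthquakes of magnitude >= M. With a typical b of ~1, each unit increase
# in magnitude corresponds to roughly ten times fewer events.
def expected_count(M, a=5.0, b=1.0):
    """Expected number of quakes of magnitude >= M per year (illustrative a, b)."""
    return 10 ** (a - b * M)

# A swarm of small quakes raises the rate (effectively the "a" value) at all
# magnitudes, so the chance of a large quake rises too -- but from a very low
# base, which is why the absolute probability stays small.
print(expected_count(4.0))  # events of M>=4 per year under these assumed constants
print(expected_count(6.0))  # events of M>=6 per year under these assumed constants
```

The point of the sketch is the scaling, not the particular numbers: the constants here are placeholders, and real regional values must be fitted to local catalogues.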
In the past 60 years in Italy, only six of 26 major earthquakes have been preceded by foreshocks and many earthquake swarms have occurred without subsequent large earthquakes.
Italian scientists concluded that a medium-sized shock in a swarm forecasts a major event within several days only about 2% of the time.
If they had issued a specific warning that a major earthquake was coming in L’Aquila prior to the event, they would have had a 98% chance of being wrong.
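The base-rate arithmetic behind that 98% figure can be made explicit. This sketch simply restates the numbers quoted above; it is not a forecast model:

```python
# Back-of-the-envelope numbers from the Italian record (as quoted in the text).
major_quakes = 26           # major Italian earthquakes in the past 60 years
preceded_by_foreshocks = 6  # of those, how many were preceded by foreshocks

# Fraction of major quakes that announced themselves with foreshocks
p_foreshock_given_major = preceded_by_foreshocks / major_quakes  # ~23%

# Italian scientists' estimate: a medium-sized shock in a swarm is followed
# by a major event within several days only ~2% of the time
p_major_given_swarm = 0.02

# So a specific warning issued on every such swarm would be a false alarm:
false_alarm_rate = 1 - p_major_given_swarm  # 98% of the time

print(f"{p_foreshock_given_major:.0%} of major quakes had foreshocks")
print(f"A swarm-triggered warning would be wrong {false_alarm_rate:.0%} of the time")
```

Note the asymmetry: most major quakes lack foreshocks, and most swarms lack major quakes, which is precisely why swarm activity cannot support a confident specific warning.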
The L’Aquila earthquake occurred on a previously identified and well-monitored fault zone in an area of elevated historical seismicity that was recognised as one of Italy’s most seismically dangerous regions.
From that perspective, the L’Aquila earthquake was no surprise, and the possibility of an earthquake of its magnitude occurring in this region following months of seismicity should not have been publicly dismissed.
A clear lesson here is that the general public should be made aware of all possible scenarios within an earthquake sequence, regardless of how small the absolute probability of certain scenarios may seem.
The prison sentence for the Italian scientists will seem overly harsh to most of us, given the highly stressful and complex scientific, societal, emotional, and political environment that develops during an earthquake sequence.
One wonders whether appropriate building codes were applied and enforced, and whether these scientists would have been off the hook if buildings had better withstood the seismic shaking of L’Aquila’s quake.
It takes considerable courage for scientists to speak openly about low-probability scenarios, particularly if these comments are used to accuse scientists of scaremongering, and/or have detrimental impacts on earthquake recovery, such as decreasing investor and re-insurer confidence during the rebuild phase and increasing stress levels of local residents.
Once the sorrow of last February’s Christchurch earthquake subsided to a level where I could refocus, I conducted several media interviews and public talks.
After the September 2010 Darfield earthquake in Canterbury, scientists throughout New Zealand made the public aware that a magnitude 6 aftershock was possible.
The public was also aware that potential scenarios included a shallow earthquake in the region east of the Greendale Fault, and that aftershocks beneath Banks Peninsula suggested elevated crustal stresses in that area were being partially accommodated by slip on northeast-oriented faults.
In an article written for the New Zealand Herald on September 8 2010, I stated that:
My optimistic guess is that we are unlikely to get an aftershock as big as a Mw 6 based on aftershock data from what I felt were similar earthquake sequences in Haiti and Mexico […] [but] we could get a bigger one months from now.
In retrospect I would have liked to have the former statement back, but in truth this was an example of locally based optimism at a time of heightened public anxiety. I do feel scientists have a right to voice well-grounded hypotheses, just as they have a “right-to-be-wrong”, provided the justification for said hypotheses and the range of possibilities are publicly presented.
The last decade has thrown up many seismic surprises, not least the magnitude 9.0 Tohoku earthquake in Japan that was preceded by a magnitude 7.2 foreshock, affirming that we still have much to learn about earthquake behaviour on our planet.
Having been through a catastrophic earthquake sequence beneath one of its major population centres, the New Zealand earthquake science community is better placed than before to answer the needs of the public.
Sharing data with the general public as quickly as possible, via all media avenues, is increasingly recognised as an obligation by many practising scientists, but significant barriers to this process remain.
Science does not move at the pace of the media, and science that requires substantial peer review may be less interesting to the broader public by the time it has undergone a lengthy review process.
In order to make money from expensive journal subscriptions, many publishers do not allow the authors of scientific articles to disseminate their original work publicly.
In my view, this is inappropriate in a post-disaster environment, where the affected public deserve the right to freely scrutinise the raw data that has so often been obtained with public funding.
Improvements can be made in the way we communicate earthquake science. Published, publicly available statements of “absolute probabilities” such as “there is a 9% chance of a magnitude 6 to 6.4 earthquake occurring in the Canterbury aftershock region in the next 12 months” should be contextualised against the “probability increase relative to pre-mainshock probabilities”.
This could be done with statements such as: “this probability is 100-times greater than the annual probability of a magnitude 6 to 6.4 earthquake occurring in this region prior to the Darfield earthquake”.
In this way, we might communicate that, while the absolute chance of a major earthquake is low, it is relatively high compared to the way it was before.
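The two framings can be put side by side in a short sketch. The 9% figure is quoted above; the background probability here is a hypothetical value chosen only so the ratio works out to the ~100-times gain mentioned in the text:

```python
# Absolute probability vs. probability gain -- two framings of the same forecast.
p_aftershock_12mo = 0.09      # quoted: 9% chance of M6.0-6.4 in next 12 months
p_background_annual = 0.0009  # assumed pre-mainshock annual probability (illustrative)

# "Probability gain": how much more likely than the pre-mainshock background
probability_gain = p_aftershock_12mo / p_background_annual

print(f"Absolute probability: {p_aftershock_12mo:.0%} over 12 months")
print(f"Gain over pre-mainshock background: ~{probability_gain:.0f} times")
```

Presenting both numbers together conveys that the hazard is low in absolute terms yet sharply elevated relative to normal conditions, which is the message each framing alone fails to carry.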
Better integration of fault geometries (or maps) and stress modelling will allow earthquake forecast models to be improved. Better understanding of how seismic energy is attenuated as it travels through the crust will provide important information for building codes and for assessing potential damage in future earthquakes.
When forces as great and unpredictable as earthquakes are involved there is no way – at present – to give all the answers ahead of time. But continuing enhancements to the way data is collected, analysed and delivered to the public will make things better.