TERRA SEISMIC
Using Big Data To Predict Earthquakes

Background

Terra Seismic are a Jersey-based company established in 2012 with the aim of improving the early detection of natural disasters caused by seismic activity, such as earthquakes and tsunamis. Their mission is “to reduce the risk and damage of earthquakes”. They carry out Big Data analysis of environmental factors and historical data to forecast the likelihood of quakes, and make the data available to the public through their Web portal at www.quakehunters.com.

What Problem Is Big Data Helping To Solve?

Earthquakes and the associated problems they cause – such as tsunamis, aftershocks and public health emergencies – take a tremendous toll on human life. In 2014, 16,674 people lost their lives to earthquakes and their after-effects. The rate of fatalities has increased gradually over time, despite advances in medical science and emergency response, owing to growing population density in areas affected by seismic activity. There is also a huge commercial cost in terms of infrastructure damage and emergency response work: almost a quarter of a million people were made homeless by the 2011 Tōhoku, Japan earthquake alone. On average, this financial cost is thought to amount to around $13 billion per year.

Developing nations are often hardest hit by these calamities, and the cost of emergency response and rebuilding infrastructure puts further strain on their economies, spreading hardship still further. Despite huge amounts of research over many years, until recently most geologists and other academics believed that earthquakes were largely impossible to predict.

How Is Big Data Used In Practice?

Terra Seismic have developed technology that they refer to as “Satellite Big Data” which, they say, can predict earthquakes anywhere in the world with 90% accuracy. To do this, their algorithms monitor live streaming data from satellite images and atmospheric sensors, and analyse it alongside historical data from previous quakes. Atmospheric conditions can reveal telltale patterns of energy release, and even unusual cloud formations can give clues as to when a quake will occur. When predictive modelling techniques are applied to this amalgamated data, far more accurate predictions can be made.
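
To make the general idea concrete, the sketch below shows how atmospheric-anomaly features and historical quake records might be combined in a predictive model. This is a minimal illustration only: the feature names, the synthetic data, the thresholds and the choice of a gradient-boosted classifier are assumptions for demonstration, not Terra Seismic’s proprietary method.

```python
# Illustrative sketch only: combine hypothetical precursor features with
# historical quake labels in a predictive model. All data here is synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 5000  # one row per region-day in a hypothetical monitoring window

# Hypothetical features derived from satellite imagery and ground sensors:
# thermal infrared anomaly, ionospheric (TEC) deviation, a cloud-pattern
# score, and days since the last significant quake in the region.
X = np.column_stack([
    rng.normal(0, 1, n),      # thermal_anomaly
    rng.normal(0, 1, n),      # tec_deviation
    rng.uniform(0, 1, n),     # cloud_pattern_score
    rng.exponential(200, n),  # days_since_last_quake
])

# Synthetic label: "major quake within the next 30 days" (demonstration only).
risk = 0.8 * X[:, 0] + 0.6 * X[:, 1] + 0.4 * X[:, 2] - 0.002 * X[:, 3]
y = (risk + rng.normal(0, 1, n) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

model = GradientBoostingClassifier().fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```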

The predictions made by Terra Seismic are used by insurance companies to accurately assess risks of coverage in areas prone to seismic activity. Hedge funds and traders also use them as part of their analysis of how natural disasters affect financial markets, and multinational companies use them to assess their own exposure to risk. In addition, all of the information on forthcoming quakes is made available to anyone who wants it, for no cost, through the Web portal. Government agencies, charities and disaster relief coordinators can all access and make use of it from there.

What Were The Results?

Terra Seismic say that since they began testing their technology in 2004 it has predicted 90% of major earthquakes. Most quakes with a magnitude of 6-plus on the Richter scale can be accurately predicted between 1 and 30 days in advance. When I spoke to CEO Oleg Elshin, he told me that recent successes had included the prediction of the magnitude 6.4 quake which hit Indonesia on 3 March 2015. Major earthquakes accurately predicted the previous year included the magnitude 8.1 “megaquake” that hit the Chilean Tarapacá region and the magnitude 7.2 quake in Guerrero, Mexico.

What Data Was Used?

Data from environmental monitoring stations on the ground in key areas of seismic activity, live streaming satellite images and historical seismic activity data are all captured and monitored.

What Are The Technical Details?

In order to extract insights into the probability of earthquakes striking at particular locations, Terra Seismic have created open-source custom algorithms using Python. These algorithms process large volumes of live satellite data every day from regions where seismic activity is either ongoing or expected. Data is stored and distributed from Terra Seismic’s in-house Apache servers.
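
A minimal sketch of what such a daily processing job could look like is given below, assuming a batch run over a watch list of regions whose satellite imagery is reduced to a simple anomaly index and written out for serving. The region names, the fetch_latest_image() helper and the anomaly measure are hypothetical placeholders, not the company’s actual pipeline.

```python
# A minimal sketch, assuming a daily batch job over monitored regions.
# Everything below is illustrative; the real system is proprietary.
import json
import datetime
import numpy as np

REGIONS = ["sumatra", "tarapaca", "guerrero"]  # illustrative watch list

def fetch_latest_image(region: str) -> np.ndarray:
    """Stand-in for a satellite-data download; returns a fake thermal grid."""
    rng = np.random.default_rng(abs(hash(region)) % 2**32)
    return rng.normal(290.0, 2.0, size=(512, 512))  # brightness temps in K

def anomaly_index(image: np.ndarray, baseline: float = 290.0) -> float:
    """Fraction of pixels more than 3 K above the regional baseline."""
    return float(np.mean(image > baseline + 3.0))

def run_daily_job(out_path: str = "daily_risk.json") -> None:
    """Compute today's anomaly index per region and write it to a JSON file."""
    today = datetime.date.today().isoformat()
    results = {
        r: {"date": today, "anomaly_index": anomaly_index(fetch_latest_image(r))}
        for r in REGIONS
    }
    with open(out_path, "w") as f:
        json.dump(results, f, indent=2)  # later served via the web portal

if __name__ == "__main__":
    run_daily_job()
```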

Any Challenges That Had To Be Overcome?

Historically, earthquakes have struck without warning, and academics and experts have put forward the argument that they are essentially impossible to predict.1 This is largely because of the huge number of factors which are thought to contribute to causing them, and many of these are not properly understood. Although throughout history signs have been documented which could be considered warnings (such as snakes pouring from the ground before the 1975 Haicheng, China earthquake), no scientifically valid method of reliable prediction had been developed.2 Terra Seismic’s challenge is to show that Big Data analysis can provide the reliable, accurate and repeatable predictions needed to properly implement disaster relief, management and rebuilding.

What Are The Key Learning Points And Takeaways?

Don’t believe anything can’t be done until you have tried to do it yourself! Predictive modelling and statistical analysis, backed by large amounts of real-time, unstructured data, are showing us that many things can be accomplished which were previously considered impossible.

Real-time analysis of unstructured data (in this case satellite images) can produce unexpected results. Humans may not recognize that a certain pattern of activity in the data correlates with a particular likelihood of an event taking place. But if a genuine correlation exists in the data, a computer can often find it.
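
The toy example below illustrates this point: a weak but real correlation between a precursor signal and rare subsequent events can be invisible to the eye, yet straightforward for software to quantify. The data is entirely synthetic and the variable names are invented for illustration.

```python
# Toy illustration: quantify a subtle correlation between a daily precursor
# signal and a rare binary event. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_days = 10_000

precursor = rng.normal(0.0, 1.0, n_days)                  # e.g. daily anomaly score
event = (0.3 * precursor + rng.normal(0, 1, n_days)) > 2  # rare binary outcome

# Point-biserial-style correlation between the continuous signal and the flag.
r = np.corrcoef(precursor, event.astype(float))[0, 1]
print(f"events: {event.sum()} of {n_days} days, correlation r = {r:.3f}")
```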

REFERENCES AND FURTHER READING

  1. Alden, A. (2016) Earthquake prediction: Mission impossible, http://geology.about.com/od/eq_prediction/a/aa_EQprediction.htm, accessed 5 January 2016.
  2. Shou, Z. (1999) The Haicheng earthquake and its prediction, http://www.earthquakesignals.com/zhonghao296/A010720.html, accessed 5 January 2016.

Here is the site where the data and predictions about earthquakes are made available:

  1. http://quakehunters.com/