Hurricane Irma was one of the most powerful Atlantic storms of the last decade and the fifth-costliest hurricane on record in the USA, causing around $50 billion of damage after hitting the south-west coast of Florida and moving north through Georgia, including Atlanta. A 3 m storm surge in the Florida Keys damaged approximately 90% of all buildings and collapsed 25%. Disaster response teams struggled for weeks to clear debris and distribute aid. A key requirement for dealing with the destruction was access to timely, accurate information about ground conditions after the storm.
Satellite-derived Earth Observation (EO) data can fill this need and has become an indispensable part of disaster response and management. The International Charter "Space and Major Disasters", which provides EO data in the event of a disaster, has been activated six times in 2018 already. When Hurricane Irma hit in 2017, the Charter was activated, and a variety of EO analysis initiatives were started by groups including M.I.T. and Oxford's Machine Learning Group. A mix of crowd-sourced and machine learning methods was used to guide the rescue efforts of "Rescue Global", a disaster risk reduction and response charity. The crowd-derived results were analysed using machine learning algorithms to generate heat maps of the areas where aid was needed most. Such a "Planetary Response Network" is an excellent example of how remotely sensed data can be an invaluable tool in disaster response. However, it still relies on user input. What if new data could identify damage automatically and create data products for disaster response teams without manual interpretation?
Research has attempted to address this issue, but a fundamental problem of damage identification is that, in many cases, the damage cannot be seen from above. Collapsed roofs, for instance, are notoriously difficult to identify because, without height information, collapsed and intact roofs can look very similar. If a digital surface model (DSM) could be generated from one satellite pass shortly before a disaster and another shortly after, the change in height between the two could be used to identify building collapse. This would require stereo data acquisition from a constellation with rapid revisit times, together with an automated system for generating DSMs, a combination that is not currently viable.
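The core of such a comparison is simple: subtract the post-event DSM from the pre-event DSM and flag cells whose height has dropped by more than some tolerance. The sketch below illustrates the idea on two tiny, made-up height grids; the arrays, the 2 m threshold, and the variable names are all illustrative assumptions, not data from any real DSM product.

```python
import numpy as np

# Hypothetical example: two small co-registered digital surface models
# (heights in metres) covering the same area, acquired before and after
# a storm. Real DSMs would come from photogrammetric processing.
dsm_before = np.array([
    [12.0, 12.1, 3.0],
    [11.9, 12.0, 3.1],
    [ 3.0,  3.1, 3.0],
])
dsm_after = np.array([
    [12.0,  8.2, 3.0],  # part of one roof has dropped by roughly 4 m
    [11.8,  8.1, 3.1],
    [ 3.0,  3.1, 3.0],
])

# A height loss beyond this threshold (metres) is flagged as possible
# collapse; the value is illustrative and would be tuned to the vertical
# accuracy of the DSMs in practice.
COLLAPSE_THRESHOLD = 2.0

height_change = dsm_before - dsm_after
collapsed = height_change > COLLAPSE_THRESHOLD

print(collapsed)
```

In practice the hard part is not this subtraction but producing two accurate, co-registered DSMs quickly enough, which is exactly the capability rapid-revisit video constellations aim to provide.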
“Video from Space” could change this. Multiple frames from a video sequence can be used to derive accurate DSMs through photogrammetry. Earth-i’s new Vivid-i constellation, with revisit rates of up to four times a day, will be able to generate accurate DSMs several times a day. In disaster-hit cities this information could be an invaluable tool for identifying damage and planning the response. Even without high-level processing, timely video feeds could show traffic movement and other local ground activity vital for disaster management, and the level of human activity visible in a video sequence over time would greatly aid response efforts.
Earth-i is looking to maximise the utilisation of EO data. We are redefining the limits of what Earth Observation can do with new technology and innovation. Our work will provide unique solutions to the global challenges we face today. For an exclusive ‘first look’ preview of the first video imagery captured by VividX2, please enter your email address below.