Mission with USGS iCoast
GISCorps Crowdsource Project Documents Coastal Changes Due To Hurricane Sandy

By David Litke, GISCorps volunteer

Summary

In September 2014, GISCorps received a request from the U.S. Geological Survey (USGS) for a group of volunteers to take part in a pilot study of the crowdsourcing project “iCoast - Did the Coast Change?” This web application asks volunteers to identify changes to the coast by comparing and tagging aerial photographs taken before and after storms. These data will help USGS scientists improve the accuracy of USGS coastal change prediction models and vulnerability assessments that support pre-storm planning and post-storm rescue, recovery, and mitigation efforts. During five weeks in the fall of 2014, 111 GISCorps volunteers classified 7,634 photos taken before and after Hurricane Sandy using the tools built into the iCoast website.

Introduction

The USGS Coastal and Marine Geology Program (CMGP) supports research to understand coastal and marine environments. One important theme of this research is understanding catastrophic coastal change caused by tsunamis, hurricanes, and extreme storms. To this end, computer models have been developed and calibrated using data from historical storms to predict the magnitude and geographic distribution of dune erosion, overwash, and inundation in response to future storm events.

The CMGP describes the need for crowdsourced data as follows:

“Since 1995, the U.S. Geological Survey (USGS) has collected over 140,000 aerial photographs of the Atlantic and Gulf coasts before and after 24 extreme storms in order to assess coastal damages. The USGS has not been able to use these images to the fullest extent due to a lack of the information processing capacity and personnel needed to analyze the thousands of images they collect after each storm.”

In 2014 the USGS developed a new crowdsourcing application called iCoast (figure 1) to test the efficacy of online processing of coastal photographs by citizen volunteers. As a test case, 7,941 photographs taken after Hurricane Sandy along the coast from North Carolina to Massachusetts were selected for processing. Hurricane Sandy, which hit the Atlantic coast of the United States on October 29, 2012, caused 117 deaths and damaged 200,000 homes, at a cost of more than $50 billion.

The iCoast website is an open-source application written by the USGS in PHP. The website was designed to be self-contained and easy to use, even by those with limited knowledge of coastal processes: extensive help is available through a Frequently Asked Questions page and through pop-up windows that display text-and-photo descriptions of each technical term, such as “Barrier Island”, “Dune Scarp”, and “Overwash”.

Figure 1. The iCoast Website Home page

Processing Images

Processing of the oblique aerial images occurs in five steps. The first step is to select a pre-storm photo that closely matches the geographic extent of the post-storm photo (figure 2).

Figure 2. The first step in processing an image is to match the post-storm photo with a pre-storm photo.
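In iCoast the volunteer makes this match by eye, but the idea of finding the pre-storm photo with the closest geographic extent can be sketched programmatically. The function below is purely illustrative (the photo IDs, coordinates, and matching heuristic are assumptions, not part of iCoast); it picks the pre-storm photo whose center point lies nearest the post-storm photo's center.

```python
import math

def closest_prestorm_match(post_photo, prestorm_photos):
    """Pick the pre-storm photo whose center lies nearest the post-storm
    photo's center. Photos are (id, lat, lon) tuples. This is an
    illustrative heuristic only; in iCoast the volunteer chooses the
    best match visually from a set of candidates."""
    _, lat, lon = post_photo
    def dist(p):
        # Equirectangular approximation: adequate over short coastal distances.
        dlat = p[1] - lat
        dlon = (p[2] - lon) * math.cos(math.radians(lat))
        return math.hypot(dlat, dlon)
    return min(prestorm_photos, key=dist)

# Hypothetical photos near the New Jersey coast:
post = ("P1042", 39.35, -74.43)
pre = [("B887", 39.36, -74.44), ("B901", 39.70, -74.20)]
print(closest_prestorm_match(post, pre)[0])  # -> B887
```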

Steps two through five involve assigning tags to the photo; that is, various technical keywords are assigned to each photo by clicking on a button. For example, step two asks the volunteer to identify and tag the type of coastal landscape in the photo and the level of development (figure 3).

Figure 3. Step two: tagging the photo for type of coastal landscape and the level of development.

In steps three through five, photos are tagged for impacts to coastal infrastructure, changes to coastal landforms, and other relevant changes (for example, changes in sand and vegetation quantity, visible debris, and construction activity). Processing a photo usually takes from two to ten minutes depending on the complexity of the photo and the experience of the volunteer. 
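The five-step workflow above amounts to building one record per classification: a matched pre-storm photo plus groups of tags for landscape, infrastructure, landforms, and other changes. A minimal sketch of such a record follows; all field names and tag vocabularies here are illustrative guesses, not iCoast's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Classification:
    """One volunteer's classification of one post-storm photo
    (illustrative structure, not the iCoast database schema)."""
    post_storm_photo_id: str                               # photo being processed
    pre_storm_photo_id: str                                # step 1: matched pre-storm photo
    landscape_tags: set = field(default_factory=set)       # step 2: landscape and development
    infrastructure_tags: set = field(default_factory=set)  # step 3: impacts to infrastructure
    landform_tags: set = field(default_factory=set)        # step 4: changes to landforms
    other_tags: set = field(default_factory=set)           # step 5: other relevant changes

c = Classification(
    post_storm_photo_id="P1042",
    pre_storm_photo_id="B887",
    landscape_tags={"barrier island", "developed"},
    landform_tags={"overwash", "dune scarp"},
    other_tags={"visible debris"},
)
print(sorted(c.landform_tags))  # -> ['dune scarp', 'overwash']
```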

For quality control, the USGS designed the iCoast procedure to allow a photo to be processed (classified) by multiple individuals; some photos are processed as many as 20 times. The USGS will examine these results to determine the uniformity of tagging among multiple individuals. Each volunteer fills out a profile page that includes their level of coastal expertise, so tagging by experienced individuals can be compared with tagging by less experienced individuals.
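One simple way to gauge tagging uniformity when several volunteers classify the same photo is to keep the tags that a majority of them applied. This is only a sketch of the idea, not the USGS's actual analysis method:

```python
from collections import Counter

def majority_tags(classifications, threshold=0.5):
    """Return the tags applied by more than `threshold` of the volunteers
    who classified a given photo. `classifications` is a list of tag
    sets, one set per volunteer."""
    n = len(classifications)
    counts = Counter(tag for tags in classifications for tag in tags)
    return {tag for tag, c in counts.items() if c / n > threshold}

# Three volunteers tag the same post-storm photo:
votes = [
    {"overwash", "dune scarp"},
    {"overwash"},
    {"overwash", "visible debris"},
]
print(sorted(majority_tags(votes)))  # -> ['overwash']
```

A real analysis would also weight agreement by each volunteer's self-reported expertise, as the profile data described above makes possible.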

GISCorps Recruitment

The iCoast website was launched in July 2014. Between July and September, more than 344 citizen volunteers made 3,504 classifications on 2,530 images. In September, the USGS asked GISCorps to provide volunteers to work on the project. GISCorps sent a recruitment email to its 2,400 members in the United States and within a week received 135 replies. Of these, 111 became active participants in the GISCorps/iCoast project, which was designed to run from September 24 to October 29, 2014 (the two-year anniversary of the hurricane). The GISCorps project manager was David Litke, and the USGS iCoast project manager was Sophia Liu. The names of the 111 GISCorps volunteers are listed at the end of this report.

Project Management

Because the USGS had already developed a citizen-friendly website, GISCorps volunteers could begin immediately and work independently. A Google Sites website (figure 4) was set up to provide instructions to GISCorps volunteers.

Figure 4. GISCorps/iCoast Project website.

This project website contained an introductory page (“GISCorps/iCoast Project”), a page listing steps for getting started in the project (“Getting Started”), and a page for asking and answering questions (“Q&A”). The Getting Started page included a link to a Google Sheet where volunteers could enter and verify their personal information, including their preferred email address and the username of their iCoast account. The Q&A page had links to sub-pages for pre-defined question categories (for example, “Questions on Getting Started” and “Selecting a photo”); each of these sub-pages was in turn a Google Sites Announcement-type page that allowed posting and replying to posts.

Results

Although the official end date of the project was October 29, several volunteers expressed an interest in continuing their work, and the project unofficially ended around the end of November, when all 7,941 photos had been processed. By this time, the 111 GISCorps volunteers had made 25,987 classifications on 7,634 photos; thus GISCorps volunteers made 60 percent of all iCoast classifications and worked on 93 percent of all of the photos. These results show how a rapidly deployed and motivated group of volunteers can quickly achieve a goal.

Gaining and sustaining motivation can be problems in a crowdsource project, but in this case volunteers began already motivated (GISCorps membership itself suggests a volunteer is motivated to contribute toward a common good) and stayed motivated, partly due to the iCoast website's “My iCoast” page (figure 5), which informs each volunteer of their progress and rank relative to all other volunteers. GISCorps volunteers may also be motivated because their volunteer hours can count toward professional GISP certification. As is typical for crowdsource projects, a few very motivated volunteers contributed a substantial proportion of the work: the top 10 GISCorps volunteers contributed almost 60 percent of the GISCorps classifications.

Project Evaluation

At the end of the project, an online questionnaire was sent to all participants asking for their thoughts about the project; 74 volunteers responded. Volunteers reported spending from 3 to 400 hours on the project, for a total of 1,850 hours. Volunteers agreed that the iCoast website was designed very well and performed well, although at times there were connection problems, and working on a large screen was much easier than on a small screen or a tablet. Volunteers felt the on-screen help worked well, but that some of the technical terms needed more complete definitions. Many felt that extracting information from the photos was sometimes difficult due to: 1) poor matches between pre-storm and post-storm photos, 2) poor resolution of photos relative to the information being sought, 3) pre- and post-storm photos taken in different seasons, and 4) a lack of comparable scales and reference points (for example, zero tide stage) between photos. Volunteers felt that project management and communication were good, although not much of either was needed since working independently was easy; some volunteers wished for more frequent status emails during the course of the project. Most volunteers enjoyed the opportunity to help out on a worthwhile cause and felt they learned a lot while working on the project. As one volunteer put it:

“Seeing is believing. Through participation in [the] iCoast project we learn the power of nature and how it impacted our coastlines; and more importantly we can change the world one step at a time by volunteering in such projects."

Figure 5. iCoast website My iCoast page summarizing a volunteer’s accomplishments.

Project Participants

Many thanks to the GISCorps volunteers who made this project possible: Alex Woldemichael (Maryland), Alice Pence (Idaho), Amanda Grimm (Michigan), Andy Priest (California), Ang Gray (Maryland), Angelina Wainhouse (Washington), Anna Pavlenko (Colorado), Arindam Majumdar (Georgia), Bill Raymond (Massachusetts), Bill Schell (Florida), Brandon Elliott (Texas), Brenda Rahal (Colorado), Brian Kranick (Washington), Bridget Kelly (New York), Brooke Elliott (Texas), Brynn Lacabanne (Idaho), Carlie Hulbert (Florida), Cequyna Moore (New York), Chandreyee Lahiri (Massachusetts), Chelsea Wu (California), Chris Bischak (Virginia), Chris Jung (Texas), Chris Nelson (California), Christian David (California), Chuck Perino (Oregon), Cinde Morris (Minnesota), Dan Teodor (Michigan), David Hansen (California), David Litke (Colorado), David Lok (California), Deanna Burke (Washington), Elisabeth Smith (Nebraska), Eliza Vermillion (Texas), Eric Gustavus (Ohio), Eric Peña (Texas), Erik Hubl (Nebraska), Esther Mandeno (California), Esther Olson-Murphy (New Hampshire), Eva Lipiec (Oregon), Fekadu Wondem (California), Felix Lopez (Missouri), Gary Miller (Massachusetts), Gergely Puskas (Nevada), Giovanni Marrero (Georgia), Gouri Vishnubatla (Georgia), Grace Lee (New York), Ilan Segal (Florida), Ivy Lee (Florida), James Smith (Michigan), Janet Vaughn (Washington), Janice Casil (California), Jeffrey Pires (Massachusetts), Jeffrey Stevens (Florida), Jeniffer Velez (Florida), Jennifer Austin (Maine), Jennifer Horsman (Colorado), Jennifer Oblinger (Georgia), Jenora D'Acosta (Arizona), Jonathan Ellinger (Washington), Joshua Howell, Katherine Loving (Virginia), Kathryn Butler (South Carolina), Keith McCrary (Maryland), Kristen Lok (California), Larry Mcarthur (New York), Laura Horton (California), Leeroy Cotton (Illinois), Liwei Fu (Utah), Liz Ducey (Maryland), Maria Josefson (Massachusetts), Marina Dimitrova (Colorado), Marion Noble (Texas), Mark Avery (New York), Matt Downing (Illinois), Matthew Pare (North Carolina), Megan Hicks (Pennsylvania), Melanie Feliciano (Virginia), Michele Jett (South Carolina), Michelle Boivin (Louisiana), Michelle Morawski (Virginia), Miguel Castrence (Hawaii), Mike Inman, Mike Liska (Iowa), Mohammed Abdul Kaleem, Mouhamad Diabate, Nadeem Kazmi (Oregon), Nicole Ceranek (Texas), Pamela Ordung (Massachusetts), Paxton Neubert (Colorado), Rachel Stevenson (Colorado), Ramachandra Sivakumar (Georgia), Rebecca Chen, Rob Bohon (Utah), Salem Beyene (Virginia), Sandi Mitchell (Colorado), Sarah Finne (Kentucky), Sebastian Dudek (Washington), Shannon Gonzales (California), Shirley Perez (Texas), Shuk Wai So, Soumya Dharmavaram (Pennsylvania), Tarig Ahmed (Texas), Thomas Young (Florida), Tiffany Lok (California), Todd Patterson (Pennsylvania), Tom Carlson (Washington), Traian Dragomir (Virginia), Veronica Tangiri (Virginia), Zack Robison (Wisconsin).