Web Powers Disaster Recovery Swarm
By Editors | Thursday, October 6, 2005
It’s widely acknowledged that when Hurricane Katrina made landfall the morning of Monday, August 29, recovery forces on the ground were not prepared. According to Jeff Jarvis, a blogger prominent enough to appear on cable news behind the label “Blog Daddy,” the Web was not ready either. Jarvis, a veteran exec of both old and new media, opined on his BuzzMachine blog that if the Web community “[had] truly learned the lessons of the tsunami and even 9/11, there was more we could have done to be ready to help.”
Putting those words into action, Jarvis and others in the Web tech community are set to converge this evening for what they’re calling Recovery 2.0, a sub-conference of sorts taking place during San Francisco’s larger Web 2.0 Conference. Their mission: to initiate discussion about how developers, relief organizations and media can respond more efficiently to disasters in the future.
Doc Searls, another prominent tech pundit and blogger who signed on to the Recovery 2.0 effort, has declared Katrina the start of a “War on Error,” waged against those who failed to predict and prepare for tragedies in their jurisdiction. “When the blaming stops and the fixing truly begins,” Searls wrote in his blog, “we'll need more than our government organizations to step forward. As citizens…we will need to do what government simply can't do.”
Good Intentions Lead to Inefficiencies
It’s not that the Web hasn’t been used by thousands in Katrina’s aftermath – efforts at Web-assisted recovery have been legion. Craigslist’s New Orleans Lost & Found site, which before the storm hosted two or three missing pet notices a week, alone logged 600 posts on August 30 seeking missing friends and family; it logged 1,500 more on August 31.
Nola.com, the “Everything New Orleans” website operating in association with The New Orleans Times-Picayune, became for a time the largest center of activity, generating lists of the missing and hosting the vital reportage of the newspaper, whose presses had been flooded.
Across the Web, people-location boards for survivors mushroomed: the American Red Cross ran its Family Linking site in conjunction with Microsoft; Yahoo and Google each had one; news networks, universities, non-profit organizations, individuals, and ad-hoc evacuee groups all ran their own find-your-loved-ones-here sites.
However, the enormity of the Web’s response was exactly its problem. As David Geilhufe, the lone man behind the nonprofit, non-governmental software operation Social Source Software, pointed out on his blog, “Refugees can search 20 websites for lost relatives and still miss an entry on the 21st.” There was seemingly no way to aggregate missing persons data spread out over dozens of sites in a variety of formats into a single, searchable database.
In the midst of developing a post-Katrina community site which was to host its own missing persons message board, Geilhufe realized the last thing evacuees needed was another list to scan. The solution, as Geilhufe noted in the blog, was “a data standard with the right fields so that all these bulletin boards and online databases could interoperate.”
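The interoperability idea can be illustrated with a short sketch. The field names and the two site export shapes below are invented for illustration – they are not the actual PeopleFinder Interchange Format – but they show how a shared schema lets entries from differently structured bulletin boards land in one searchable pool:

```python
from dataclasses import dataclass

# Hypothetical common schema -- field names are illustrative,
# not the real PeopleFinder Interchange Format specification.
@dataclass
class PersonRecord:
    source_site: str   # which bulletin board the entry came from
    full_name: str
    home_city: str
    status: str        # e.g. "missing" or "found"

def from_site_a(row: dict) -> PersonRecord:
    # Imagine Site A exports {"name": ..., "city": ..., "found": bool}
    return PersonRecord(
        source_site="site-a.example",
        full_name=row["name"],
        home_city=row["city"],
        status="found" if row["found"] else "missing",
    )

def from_site_b(row: dict) -> PersonRecord:
    # Imagine Site B exports {"first": ..., "last": ..., "location": ...}
    return PersonRecord(
        source_site="site-b.example",
        full_name=f"{row['first']} {row['last']}",
        home_city=row["location"],
        status=row.get("status", "missing"),
    )

# Once normalized, entries from both boards live in one searchable list.
records = [
    from_site_a({"name": "Jane Doe", "city": "New Orleans", "found": False}),
    from_site_b({"first": "John", "last": "Roe", "location": "Biloxi"}),
]
matches = [r for r in records if "doe" in r.full_name.lower()]
```

One query now covers every participating site, which is precisely what a lone evacuee scanning 21 separate boards could not do.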
So, Geilhufe and his crew put together a list of some 50 missing persons sites, recruited software company Salesforce.com to build the database’s back-end, and contacted key members of the blogosphere to recruit volunteers. The result was The Katrina People Finder Project, a spontaneous effort resulting in a more comprehensive evacuee search engine at katrinalist.net.
Experts in the Web community began tag-teaming onto the project – folks from classified listings site Craigslist wrote code, affiliates of Harvard University provided data entry expertise and coordination, and the online community site Omidyar.net provided a temporary home for the project.
“Very interestingly – though it is a problem that can be solved by technology – we solved it with 2000 volunteers doing data entry,” Geilhufe told PDF.
The Swarm Makes Do
Working with Jon Lebkowsky, CEO of information management firm Polycot Consulting, Ethan Zuckerman, a research fellow at the Berkman Center for Internet and Society at Harvard Law School, had about 2,000 volunteers ready to start work on the afternoon of September 3. Volunteers with varied tech skills signed up online and were each given a set of posts to go through, “scraping” pertinent data. Those data were then fed into the standard PeopleFinder Interchange Format, developed for the project virtually overnight by Kieren Lal of open-source tech outfit CivicSpace, Jonathan Plax of Salesforce.com, and Berkeley student Ka-Ping Yee. Two days later, 10,000 entries had been compiled.
As of October 2, 3,000 volunteers had entered more than 649,000 missing and found persons records into the online database, which had been searched over one million times since launching September 6. One dynamic Geilhufe credits for the accomplishment is “heterogeneous action by uncoordinated people.” Jarvis calls it the “swarm ethic.”
“One thing the internet does well is bring people together around shared interests, needs, functions…. We swarm around standards and make them standard,” Jarvis wrote on BuzzMachine.com. “It’s efficient. It’s good citizenship….Swarming is the way we capture not just the wisdom but also the work of the crowd.”
“Efficient,” of course, is a relative term. What might be efficient for finding a good mp3 source can be woefully inadequate in an emergency. As Zuckerman told PDF, “There was one team entering data into the database, another team porting data from the Red Cross’ database, there was another team trying to make the data searchable, and those teams didn’t always manage to work together.” As a result, data that had been picked and pruned remained inaccessible to those who needed it. Zuckerman and Geilhufe both believe that a set of new tools – like the PeopleFinder Interchange Format – will be developed in the aftermath of the Project.
Carrying Through with the Recovery
Though comparisons between the speed of the project’s response and the speed of the government’s are inevitable, Zuckerman is not quick to judge. Though he believes “a lot of people can learn from what we did well,” he said, “In fairness…we weren’t exactly pulling people’s bodies out of toxic waste. We were entering data.” Plus, Zuckerman added, “We didn’t have anyone to answer to – and that’s an enormous difference.”
The role of organizations like the government and the Red Cross, as Geilhufe sees it, is command and control. But he wonders, “how do you allow organizations like the Red Cross or government to be command and control, but still harness the power of this distributed effort?” He thinks data standardization is the answer.
There is evidence that command and control operations have at least taken notice of the swarm’s potential. On September 21, Geilhufe reported on his blog that the data gathered by the project was “being processed by IBM and the San Diego Supercomputer Center to form part of a central database of evacuees for the Red Cross and Microsoft.” This evacuee locator database, operated by the American Red Cross and Microsoft, is accessible online at Katrinasafe.com.
It appears that Katrina evacuees and their loved ones aren’t accessing the People Finder database as much as the better-known “command and control” effort at Katrinasafe.com. According to website traffic measurement company Alexa, which measures a website’s average Internet reach over a three-month period, the Katrina People Finder Project’s Katrinalist.net site has reached an average of 4.25 percent of Internet users since its launch in early September. In comparison, since its launch around the same time, Katrinasafe.com has reached 11 percent of Internet users.
Perhaps a lack of promotion is to blame for Katrinalist.net’s narrower reach. “By the time the site was up and populated,” Geilhufe told PDF, “most of the main organizers were burnt out and no one stepped up to do the type of outreach campaign required to make every evacuee aware of Katrinalist.net. One of the weaknesses of an all-volunteer effort… if no one steps up, it doesn’t get done.”
In the aftermath of Hurricane Rita, and with the threat of more high-level hurricanes to come, members of the swarm attending Recovery 2.0, according to their wiki, hope to facilitate relief coordination and information sharing and to connect people to such efforts.