California is suffering from a terrible drought, but it’s not the first, and it likely won’t be the last. While the Golden State has seen its share of bad droughts, the solutions for the water problems of today and tomorrow will likely be very different from those of the past. One of the biggest differences is in the use of data.
In a blog post, writer Peter Hartlaub remembers the California droughts of 1977 and 1991. In preparing the post, Hartlaub drew on his own childhood memories of putting bricks in toilet tanks and combed the SFGate archives for photos illustrating the many ways in which people tried to conserve water and mitigate the effects of past droughts. One photo shows a young boy taking a “shortened bath,” with a box-like device taking up about half the space in the tub. Another shows a water-guzzling scofflaw watching as workers slap a “water restrictor” on her meter.
It may not hurt Californians to use some good old-fashioned bricks in their toilets or bath shorteners in their tubs, but today there is a much more sophisticated tool to ensure that the most effective measures are being put into place to conserve water.
“Back in the ’70s when we experienced that other big drought, there wasn’t the concept of big data,” said Fresno, Calif., CIO Carolyn Hogg in an interview with TechWire. “Today, we understand big data.”
Using big data, utility companies, government agencies and technology vendors are able to correlate sometimes seemingly disconnected information to visualize the root causes of problems and the pathways to potential solutions.
“Big data is not only about the high volume of data being collected at a rapid pace, but also the variety of data being brought to bear to understand and solve a problem,” says Elke A. Rundensteiner, director of the data science program at Worcester Polytechnic Institute, in Worcester, Mass.
“For instance, knowledge about community resources could range from connecting people that have special skills in crop production or that have available time with those in need,” she adds. “Or, knowledge about resources that may be available in one community but in short supply in another community could become critical in times of crisis.”
The East Bay Municipal Utility District, which serves Alameda and parts of Contra Costa counties in California, is using data to issue usage reports to its customers. During a year-long pilot program, the water company leveraged a reporting system developed by WaterSmart Software to track and disclose the water usage of 10,000 of its 650,000 customers.
“We use customers’ own data compared with what we know about what a typical water-conserving home might use,” says district spokesperson Andrea Pook. “The purpose of the program is to provide that background information to let them know whether they are being efficient or not.”
For example, two-person households that used more than the 127-gallon-per-day average received a statement with a worried-looking emoticon on it. The statement encouraged these customers to take certain actions to reduce their water consumption. The combination of big data (and perhaps a little bit of guilt) seems to have worked: The data-based usage reports helped to reduce water consumption by 5 percent. The utility said it believes the reporting system could eventually help the state meet its goal of reducing per-capita water usage by 20 percent.
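The benchmark comparison the district describes can be sketched in a few lines. This is only an illustration of the idea, not WaterSmart’s actual system: the 127-gallon figure for a two-person household comes from the article, but the function name, the report wording, and the emoticon logic are invented here.

```python
# Hypothetical sketch: compare a household's daily water use against a
# benchmark for an efficient home of the same size. Only the two-person
# benchmark (127 gal/day) is taken from the article; everything else is
# illustrative.

EFFICIENT_BENCHMARK_GPD = {2: 127}  # gallons per day, keyed by household size


def usage_report(household_size: int, gallons_per_day: float) -> str:
    """Return a one-line report flagging above-benchmark usage."""
    benchmark = EFFICIENT_BENCHMARK_GPD[household_size]
    if gallons_per_day > benchmark:
        pct_over = 100 * (gallons_per_day - benchmark) / benchmark
        return f":( using {pct_over:.0f}% more than an efficient home"
    return ":) at or below an efficient home's usage"


print(usage_report(2, 160))  # above the 127 gal/day benchmark
print(usage_report(2, 110))  # below the benchmark
```

The real system presumably also folds in meter-read history and seasonal patterns; the point here is just the comparison of a customer’s own data against a typical water-conserving home.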
Pook says that the utility plans to add 25,000 to 50,000 customers to the program this year, and 100,000 customers by 2017. She added that the program was underway before the drought in California. “We’re finding that it’s a good thing to have in our pocket as we face a difficult year in terms of our water supply,” she says.
Of course, California isn’t the only region weathering climate- and weather-related problems, and the federal government is looking to big data — and the private sector — to help solve those problems.
Climate.gov, part of the Climate Data Initiative, gives the public access to the government’s climate data and analysis of that data. Further, President Obama has encouraged the private sector to use the data to come up with solutions that will help the world deal with climate change.
The big picture
Fresno’s Hogg noted in her interview with TechWire that the use of big data can help mitigate issues relating to the current drought, but that those issues have a huge ripple effect that must be considered.
A task force including the USDA, Emergency Management System and City of Fresno is convening to see what lessons were learned from the last drought.
“The USDA has already captured that data to help our region prepare for this big crisis that is coming. It’s going to be twofold: It’s going to limit the number of crops we are going to be able to grow, but it’s also going to impact our workforce, because the same people who work the farms aren’t going to have the volume to work, so there’s going to be a higher percentage of unemployment, which means there’s going to be a greater need to feed that population,” said Hogg. “So if we can at least minimize, or use some predictive analytics to minimize some of the results that are going to occur through this drought, then that’s going to place our region in a better spot.”
Big data will be a big help to Californians struggling through the current drought, and offers hope as a way to predict — or maybe even prevent — shortages and other problems in the future.
“Predictive analytics based on big data, such as historic trends of years or even decades of weather patterns, can be leveraged to help us understand and, in the long term, tackle the impact that our actions have on our environment not only today but also for future years,” says WPI’s Rundensteiner. “When given impact of our actions supported by data, this will ultimately influence the actions of individuals, policies and society as a whole.”