
Goldsmith: 'Big data' to reinvent government

August 3, 2013

While some Americans question the National Security Agency’s habit of amassing citizens’ phone records, former Indianapolis Mayor Steve Goldsmith urges city governments to dive into “big data.”

“There are just huge opportunities here to make a difference,” Goldsmith said in a phone interview.


Goldsmith is project director of Harvard Kennedy School’s Data-Smart City Solutions initiative, a recently launched national repository of case studies.

Of course, cities have collected data on crime and property and citizen complaints for decades. But now some are analyzing numerous sources to predict future activity.

In one example promoted by Data-Smart City Solutions, the city of Chicago is developing computer models to predict rat outbreaks.

While Goldsmith, 66, is still influential in Indianapolis, which he considers home, his primary job is teaching in Harvard’s Kennedy School of Government, giving him a perfect bully pulpit to push big data.

He also was just named managing director at Chicago-based Huron Consulting Group, where he will work on operational excellence in the public sector and higher education. He has a dozen consulting contracts, including one with software giant SAP, according to a disclosure he filed with Harvard.

Indianapolis residents remember Goldsmith, who was mayor from 1992 to 2000, for privatizing city services, but his advocacy of data-driven government goes back to his days as Marion County prosecutor. He said his office drove annual child-support collections from less than $1 million to $30 million by pulling together numerous information sources and finding parents who hid under assumed names or took part in the cash economy.

“The learning in that experience was that if you analyze data from lots of different places, you can solve important public problems in a new way,” he said.

Indianapolis collects volumes of data about trash services, animal problems and the like through the Mayor’s Action Center. Sarah Taylor, director of constituent services, said she’s following Data-Smart City Solutions for ideas.

Goldsmith was here in May for the Association of Government Contact Center Professionals’ conference, Taylor said, and he was by far the most popular speaker. “He always, I think, is pretty inspirational as far as what does the future hold at a local level.”

Big data, big trouble?

While it’s hard to argue with more effective government, researcher Anthony Townsend said there are plenty of reasons for caution about big data.

Cities’ decisions could be misguided by faulty data collection and analysis, neither of which receives much scrutiny from the public, said Townsend, research director at the Institute for the Future, a Palo Alto, Calif., think tank.

There was a similar wave of excitement in the 1960s, when computers were introduced to government, he said, and it resulted in a well-documented policy disaster.

Based on computer models, the New York City Fire Department closed a number of stations in poor neighborhoods, leaving them more vulnerable to fires. Journalist Joe Flood’s book, “The Fires,” describes how 600,000 people were displaced by fires in the ensuing decade.

“I’m convinced those kinds of things are going to happen again, probably on an even larger scale,” Townsend said.

Goldsmith is excited not only about the possibilities of big data analysis but also about its increasing accessibility. Because of cloud-based computing, sophisticated analytical software is much more affordable, he said.

Both SAP and IBM see local governments as untapped markets for their big data solutions.

SAP last year helped the city of Boston roll out Boston About Results, which includes performance scorecards and new mobile applications for constituents. The initiative stemmed from Boston’s $650,000 contract with SAP to improve data analytics, the Boston Globe reported.

Goldsmith said using big data requires inter-agency cooperation, typically spearheaded by the mayor. While he was deputy mayor for operations in New York City, he started an initiative to zero in on illegally converted apartment buildings most vulnerable to fires.

“No one agency was capable of doing it on its own,” he said. “It took a group of really smart people working in the mayor’s office.”

Cities can’t afford the kind of talent the private sector hires to work on big data, but Goldsmith thinks they can accomplish a lot by attracting young, tech-savvy people who are willing to spend a few years in government to make a difference.

Consultants fill in the gaps. “It still can be pricey, but if you can do your work 20 percent better, you can save hundreds of millions of dollars,” Goldsmith said.

Privacy concerns

Townsend worries that cities will end up handing their “brains” over to private firms.

“A lot of the business models, they’re consulting arrangements, and they’re consulting arrangements that involve a lot of proprietary technology,” he said.

The fact that software firms promote using the cloud, a network of remote servers, only heightens privacy concerns, Townsend said. Local governments and citizens would never know if the NSA subpoenaed cloud vendors like Google and Amazon, he said.

“They’re going to want toll records. They’re going to want utility bills. They’re going to want all this stuff local governments collect on a regular basis.”

Goldsmith said recent revelations about the NSA’s data collection make promoting big data “more complicated.”

“There is always a possibility of privacy invasion,” he said. “It’s important that government be sensitive to that.”

But Goldsmith doesn’t think it should deter cities from collecting even more data, through mobile phones and sensors in the field.

“It’s necessary if you want to improve services,” he said.

Indianapolis isn’t working on anything that Public Safety Director Troy Riggs considers big data, but he’s paving the way for that possibility. Riggs assembled a task force, led by Department of Code Enforcement Director Rick Powers, to marry crime data with the city’s other data sources to identify “hot spots.”

“If we have a crime issue in an area, doesn’t it make sense that we have a code enforcement issue and an animal control issue as well?” Riggs asked.

The data merger is possible because the Department of Public Safety is installing a computer-aided dispatch system by InterAct, based in Winston-Salem, N.C. The city has a 10-year, $12.7 million contract with InterAct.
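The kind of cross-department analysis Riggs describes can be illustrated with a minimal sketch. The data here is invented and the tract names are hypothetical; Indianapolis’ actual schemas and the InterAct system are not described in detail in this article. The idea is simply to pool incident counts from separate city datasets by area and flag the places where problems cluster:

```python
# Illustrative sketch only: hypothetical records standing in for crime,
# code-enforcement and animal-control data, keyed by a shared area code
# (here, a made-up census tract). The point is that problems from
# different departments tend to cluster in the same "hot spots."
from collections import Counter

crime = ["tract-12", "tract-12", "tract-07", "tract-12"]
code_enforcement = ["tract-12", "tract-03", "tract-12"]
animal_control = ["tract-12", "tract-07"]

def hot_spots(threshold, *datasets):
    """Return areas whose combined incident count meets the threshold."""
    totals = Counter()
    for records in datasets:
        totals.update(records)  # tally incidents per area across datasets
    return {area for area, n in totals.items() if n >= threshold}

print(hot_spots(4, crime, code_enforcement, animal_control))  # → {'tract-12'}
```

In a real deployment the shared key would come from geocoding addresses in each department’s records, which is part of why a common dispatch and records system matters before any predictive analysis can start.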

Riggs thinks it will take at least two years to build the data system he envisions. It’s a necessary exercise before the city can do predictive analysis, he said. Riggs said he’s not impressed by the software that’s on the market now.

IBM promotes predictive policing in an ad that shows an officer timing his coffee break to head off a robbery. The would-be criminal turns on his heel when he sees the cop leaning against his patrol car, patiently sipping.

“There’s nothing that good,” Riggs said.•


By Kathleen McLaughlin
