5 Lessons from Chicago’s Open Data Evolution

January 2, 2018 10:04 am PST | Data as a Service

What’s it like to be a city’s first chief data officer?

Brett Goldstein’s roots are in the startup industry, where he was one of the early employees of OpenTable, the online reservation service. Goldstein also served for five years as a Chicago Police Officer, founding and commanding the CPD’s Predictive Analytics Group, and in 2011, Mayor Rahm Emanuel appointed Goldstein to be Chicago’s first-ever CDO — and one of the first city-level CDOs in the nation. Goldstein’s mission from Mayor Emanuel: use open data to make Chicago the most transparent and smart city in the United States.

At the 2017 Socrata Connect conference, Goldstein spoke with another of Chicago’s data stars: Charlie Catlett, Senior Computer Scientist, U.S. Department of Energy’s Argonne National Laboratory, Director of the Urban Center for Computation and Data, and the force behind Chicago’s groundbreaking Array of Things (AoT) project, which uses sensors placed throughout the city to track data on everything from pollution to traffic.

Goldstein and Catlett shared insights on how data programs progress on a maturity path, and wisdom from their time devoted to various data-driven projects in Chicago.

 

Chicago’s Data Evolution: From Releasing Data to Situational Awareness

After being appointed Chicago’s CDO, Goldstein first overhauled Chicago’s open data platform, an experience he describes as starting at “point zero” of the open data maturity journey.

 

“Putting data out there is not good enough. Having a data platform where you get information from that data is much more important.” —Brett Goldstein, Chicago’s first CDO

 

“What we found in those years was that releasing data was great. Giving transparency was amazing. But what was next? Because putting data out there is not good enough. Having a data platform where you get information from that data is much more important,” says Goldstein.

As Chicago matured in its data evolution, it moved past simply releasing data to using it to create what Goldstein calls “situational awareness.” For example, the WindyGrid project took all of the city’s data and used it to answer a basic question: What is happening in a given place?

 


 

“This situational awareness, or the ability to dive deep and realize that all this data is in fact interdisciplinary, was a game changer for Chicago. We went from having really no data program in 2011 to, as the next couple of years unfolded, having systems like WindyGrid, driven by data, which were able to drive all of our agencies and produce some of the analytics you see today,” says Goldstein.

The first sensors for Catlett’s AoT project were installed in the summer of 2016, well into Chicago’s evolution toward data maturity. Catlett describes the project as pursuing three data-related activities: measuring basic data (like air quality and weather), enabling edge computing (software that analyzes images and sound at the spot where the sensors are located), and providing a platform where people can plug in new kinds of sensors.
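To make the edge-computing idea concrete, here is a minimal sketch of the pattern Catlett describes: the node analyzes a raw image locally and transmits only a derived value, never the raw feed. The detect_pedestrians() stand-in and the node ID are hypothetical placeholders, not part of the actual AoT software.

```python
import json
import time

def detect_pedestrians(frame):
    """Stand-in for a real on-device vision model.

    In a real sensor node this would run image analysis on `frame`;
    here it simply returns a fixed count so the sketch is runnable.
    """
    return 0

def process_frame(frame, node_id="node-001"):
    # The raw image stays on the node; only the derived count is
    # packaged for transmission back to the data platform.
    count = detect_pedestrians(frame)
    return json.dumps({
        "node": node_id,
        "timestamp": time.time(),
        "pedestrian_count": count,
    })

if __name__ == "__main__":
    # None stands in for a frame from the node's camera.
    print(process_frame(frame=None))
```

The point of the pattern is that raw images and audio can be analyzed where they are captured, so only small derived values need to travel to the data platform.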

 

What Chicago’s Learned During Its Open Data Journey


From “point zero” of data usage, to using data to power agencies’ work, to mining data in innovative ways, what lessons have Goldstein and Catlett discovered?

1. Transparency is valuable. People can feel trepidation about the public presence of video cameras and microphones. That’s why AoT held two years of public discussions before installing its first device.

AoT opts for a transparent process overall, disclosing its software (which anyone can download) and hardware (all open source), as well as disclosing its policy development process and answering questions publicly on its website. AoT’s data is all open and free to use.

2. Urban ideas need to be pragmatic. It’s important for public policy ideas to be workable under real-world conditions, says Goldstein.

 

 “[Projects] have to start with something you’re trying to measure or fix.” —Charlie Catlett, Senior Computer Scientist, U.S. Department of Energy’s Argonne National Laboratory and Director of the Urban Center for Computation and Data

 

Before installing sensors, Catlett points out, the people involved in the AoT project spent two years asking the scientific community what data would be useful to collect and measure. Then, they asked the city — its partner — which of these various ideas from the scientific community matched challenges faced by city residents. “[Projects] have to start with something you’re trying to measure or fix,” says Catlett.

3. Have a message for the press. Catlett sees it as his responsibility to help the press find interesting angles for writing about AoT. “We worked very hard at a set of very clear messages about the value of what we were doing and our goals,” says Catlett.

4. Make the value of open data initiatives clear. “We realized that when people think about interacting with the city, it’s about paying taxes, a permit, a fine, or something like that,” says Catlett. That is to say, associations are not always positive. To show residents the value of installing sensors, Catlett therefore focused on specific concerns, such as flooding, that affect a given neighborhood.

 

“We start with not just the generic science problem to be solved, but what’s valuable to the residents of Chicago. When [residents] see we’re offering something we’re interested in, they take a different view.” —Charlie Catlett 

 

“We start with not just the generic science problem to be solved, but what’s valuable to the residents of Chicago. When [residents] see we’re offering something we’re interested in, they take a different view,” says Catlett.

5. Encourage data usage. City staffers aren’t the only users of data. There’s so much data that we don’t yet know precisely how to use, says Goldstein. That’s where opening up data to scientists, the community, economists, and policymakers can be helpful.

For WindyGrid, that means the next generation is OpenGrid — available on GitHub, where anyone can download it and make it their own. “Our strategy has been to push the data out into a platform that’s useful right off the bat — it goes to the Socrata data portal, it’ll go to Plenario [a platform that gathers up datasets onto a single map] — and then provide the APIs on the other end so that people can repurpose the data,” says Catlett.
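As a rough illustration of what “provide the APIs on the other end” can look like, here is a minimal sketch of pulling rows from a dataset published on the city’s Socrata portal through its SODA endpoint. The dataset identifier below is a placeholder, not a specific Chicago dataset.

```python
import requests

# Socrata exposes each dataset as a SODA endpoint of the form
#   https://<portal-domain>/resource/<dataset-id>.json
# The dataset id below is a placeholder; substitute the id of any
# dataset published on data.cityofchicago.org.
PORTAL = "https://data.cityofchicago.org"
DATASET_ID = "xxxx-xxxx"  # hypothetical id, for illustration only

def fetch_rows(limit=100):
    """Pull the first `limit` rows of the dataset as a list of dicts."""
    url = f"{PORTAL}/resource/{DATASET_ID}.json"
    resp = requests.get(url, params={"$limit": limit})
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for row in fetch_rows(limit=5):
        print(row)
```

The same $limit, $where, and $order query parameters work against any Socrata-hosted dataset, which is what makes the data easy to repurpose once it reaches the portal.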

 

Watch the Full Presentation

Hear more from Catlett and Goldstein about their experience as pioneers in Chicago’s open data world:


Join us in person at Socrata Connect ‘18 for thought-provoking presentations and conversations with peers. Save your seat!

