Learning To Be Smart: Using Data and Technology to Improve Services in Kansas City, Missouri, 2009–2019

Author
Tyler McBrien
Abstract

When Troy Schulte took over as interim city manager of Kansas City, Missouri, in 2009, the local economy was struggling and the government faced hard choices about how to use scarce resources. With a slashed budget and a diminished workforce, Schulte had to figure out how to deliver city services without reducing quality. Together with a small team of employees, he began to create a culture of data-driven decision making in municipal offices, to invest selectively in technology, and to give nonprofit organizations and firms an opportunity to develop their own innovative solutions to city problems by making more information available to them. Schulte found a kindred spirit in Mayor Sly James, who negotiated a public–private partnership with a view to developing what Kansas City’s chief innovation officer called “the smartest 54 blocks in the country” along the city’s new streetcar corridor. As initial efforts came to a close and a new mayor entered office, Schulte and other officials stepped back to assess what they had learned. The new, data-driven culture had yielded real improvements, whereas the technology-based smart-city initiative had had a more limited impact—at least in the shorter term. The experience generated important lessons about the scale of the benefits that technology could generate in midsize cities and in what kind of time frame.

Tyler McBrien drafted this case study based on interviews conducted in Kansas City, Missouri, in January 2020. Case published March 2020.


Making a Smart City a Fairer City: Chicago’s Technologists Address Issues of Privacy, Ethics, and Equity, 2011–2018

Author
Gabriel Kuris
Abstract

In 2011, voters in Chicago elected Rahm Emanuel, a 51-year-old former Chicago congressman, as their new mayor. Emanuel inherited a city on the upswing after years of decline but still marked by high rates of crime and poverty, racial segregation, and public distrust in government. The Emanuel administration hoped to harness the city’s trove of digital data to improve Chicagoans’ health, safety, and quality of life. During the next several years, Chief Data Officer Brett Goldstein and his successor, Tom Schenk, led innovative uses of city data, ranging from crisis management to the statistical targeting of restaurant inspections and pest extermination. As their teams took on more-sophisticated projects that predicted lead-poisoning risks and Escherichia coli outbreaks and created a citywide network of ambient sensors, the two faced new concerns about normative issues like privacy, ethics, and equity. By 2018, Chicago had won acclaim as a smarter city, but was it a fairer city? This case study discusses some of the approaches the city developed to address those challenges and manage the societal implications of cutting-edge technologies.

Gabriel Kuris drafted this case study based on interviews he and Steven S. Strauss, Lecturer and John L. Weinberg/Goldman Sachs & Co. Visiting Professor at Princeton University, conducted in Chicago in July 2018. Case published September 2018.


From Saving the Census to Google Maps: The US Census Bureau’s TIGER System, 1980–2010

Author
Pallavi Nuka
Abstract

After the 1980 US national census, 53 state and local governments sued to correct alleged errors in the count, and the US Census Bureau found itself at a crossroads. For years, the bureau had integrated information from paper-based sources to create maps for its census takers, a procedure that was slow and unreliable. An overhaul of the cumbersome system would be a complex and difficult task, and there was a deadline: the next national census would take place in 1990. Robert Marx, head of the bureau’s geography division, decided to take advantage of new advances in computing technology to improve performance. As part of that initiative—known as Topologically Integrated Geographic Encoding and Referencing, or TIGER—the bureau built interagency cooperation to create a master map, developed a software platform, digitized information, and automated data management. Its efforts generated a nationwide geospatial dataset that fed an emerging geographic information industry and supported the creation of online services such as MapQuest, OpenStreetMap, and Google Maps. This case offers insight into overcoming common obstacles in the collection, digitization, and publication of information in accessible formats—challenges that affect many open-data reforms.

Pallavi Nuka drafted this case study based on interviews conducted in August, September, and October 2017. Case published February 2018.