After the 1980 US national census, 53 state and local governments sued to correct alleged errors in the count, and the US Census Bureau found itself at a crossroads. For years, the bureau had integrated information from paper-based sources to create maps for its census takers, and the procedure was slow and unreliable. An overhaul of the cumbersome system would be a complex and difficult task, and there was a deadline: the next national census would take place in 1990. Robert Marx, head of the bureau’s geography division, decided to take advantage of new advances in computing technology to improve performance. As part of that initiative—known as Topologically Integrated Geographic Encoding and Referencing, or TIGER—the bureau built interagency cooperation to create a master map, developed a software platform, digitized information, and automated data management. Its efforts generated a nationwide geospatial dataset that fed an emerging geographic information industry and supported the creation of online services such as MapQuest, OpenStreetMap, and Google Maps. This case provides insights into overcoming common obstacles that arise in the collection, digitization, and publication of information in accessible formats—challenges that affect many open-data reforms.
Pallavi Nuka drafted this case study based on interviews conducted in August, September, and October 2017. Case published February 2018.