RETHINKING THE ROLE OF THE CABINET: MARYLAND'S CENTER OF GOVERNMENT REFORMS, 2007-2012

SYNOPSIS

When Maryland governor Martin O'Malley assumed office in January 2007, he took over a government hamstrung by ineffective coordination and imprecise monitoring. The state had 20 executive departments, 80,000 state workers, and a $30-billion budget. Winning cooperation to work toward shared goals and keeping tabs on progress posed daunting challenges. A Democrat who had served as mayor of Baltimore, Maryland's largest city, O'Malley had staked his campaign on promises to reform the education sector, increase public safety, safeguard the environment, make health care more affordable, and improve the state's economy. As he had done while mayor of Baltimore, O'Malley decided to focus his cabinet on several main priorities and set specific goals. Then he and his team (1) invested substantial amounts of time to understand the inner workings of the various departments, (2) developed measures of department performance, and (3) met regularly with department heads to evaluate progress. Because the governor was able to keep abreast of what was getting done and what wasn't, his team could focus on department performance in a timely manner and from a position of knowledge. O'Malley called the process StateStat. The implementation of StateStat kept departments focused on the governor's priorities, promoted greater coordination among departments and agencies, and produced measurable progress toward goals set by the center of government.

Michael Scharff drafted this case study based on interviews conducted in Annapolis, Maryland, in October and November 2012. Additional interviews were conducted by phone in February and March 2013. Although the governor was unavailable to be interviewed for this case study, aides and officials close to O'Malley described the reform process in detail. Case published July 2013.

INTRODUCTION

When Martin O'Malley moved into the Maryland governor's office in January 2007, he brought with him an innovative, data-driven performance management tool that he had fine-tuned during his preceding seven years as mayor of the state's largest city. Earlier, as mayor of Baltimore, 36-year-old O'Malley had found that city agencies were struggling to deliver services to residents in a timely and efficient manner. Absenteeism at city agencies, for example, slowed work and raised costs: on an average day, one in seven employees at the Department of Public Works was absent.1 And because other workers had to pick up the slack, overtime costs ballooned. In 2000, Mayor O'Malley decided to address the absenteeism problem by launching an adaptation of CompStat, an information system first used by the Police Department of the City of New York in the mid-1990s. O'Malley called his Baltimore version CitiStat. CitiStat enabled the mayor to view, in one place, information about (1) the volume and quality of services city agencies provided, (2) personnel data such as overtime and sick leave, and (3) citizen complaints. The system gave the mayor's office the ability to spot signs of trouble and to help diagnose and solve the underlying problems.
For instance, more-careful tracking of absences helped the city's public works department adjust its management practices, and within three months, overtime pay dropped by 25%.2 In its first year, CitiStat saved the city $6 million in overtime pay, according to a report by the Center for American Progress, a progressive public policy and research organization.3 O'Malley soon expanded his data-driven approach to all city programs. CitiStat tracked response times for snow removal and trash collection and gathered information on the number of vacant buildings and sewage overflows. The mayor's staff used a geographic mapping system to identify areas with common problems and analyzed trends. Every two weeks, agency managers met with the mayor and his senior team to review their results. In 2004, Harvard University's John F. Kennedy School of Government gave CitiStat an award for innovation in government. And by 2006, the last year of O'Malley's two terms as mayor, city officials estimated that CitiStat had saved the city more than $350 million since its creation.4 After two terms as mayor, O'Malley turned his sights on the state government, which he called "poorly managed" and "poorly governed."5 At the time, the state had a $1.7-billion structural deficit, and during his gubernatorial campaign, O'Malley pledged to improve Maryland's efficiency and effectiveness. Further, he said he would set targets to achieve concrete gains in a number of areas, such as building new schools, reducing class sizes, and increasing funding for the state's community colleges. He vowed to protect the environment by creating a subcabinet focused on improving the health of Chesapeake Bay and by rewarding farmers for planting cover crops and building buffer zones both to prevent soil erosion and to keep toxic fertilizers from washing into the bay. He also promised to make health care more affordable, to invest in agriculture-based technologies that would preserve the farming industry, and to strengthen the economy.6 Seeking to duplicate his Baltimore successes, O'Malley announced that if elected, he would adopt CitiStat as part of a broader campaign to increase efficiency and accountability in Maryland's sprawling state government. And he pledged to move quickly, promising to begin holding meetings with department heads during his first month in office. O'Malley won the election easily, with 56% of the vote to 42% for incumbent Republican Robert Ehrlich Jr. He took office in January 2007 and set to work.

THE CHALLENGE

Based on his seven years as mayor of Baltimore, O'Malley knew that from his position at the center of Maryland's government, with limited staff, limited financial resources, and myriad constraints on his own time, he would have difficulty ensuring the efficient and effective delivery of results. State-level departments, like city agencies, were the primary service providers. Whether he could make good on his campaign promises, which had ranged from reforming the education sector to growing the state's economy, hinged in large part on how those departments performed. The sheer size of the task was a problem in itself. With a $30-billion budget and a payroll of 80,000, the state government was a vastly larger enterprise than the city of Baltimore, which had a $2.4-billion budget and 15,000 employees in 2007.7 O'Malley's major challenges were to set up a system that would synchronize department priorities with his own and then to ensure the work got done.
When he took office, O'Malley concentrated initially on developing a system to gauge departments' performance and enable his office to assess results. He would then begin to work with department heads to come up with measurable goals. Roy Meyers, a professor of political science at the University of Maryland Baltimore County, said O'Malley realized "pretty quickly" that he would have to reconfigure CitiStat to make the system work effectively within the state structure. At the city level, problems like street potholes and uncollected garbage were clearly visible and relatively easy to detect and prioritize. In a far larger state bureaucracy covering a much broader geographic area, identifying problems at all levels required considerably more information that was both reliable and up-to-date. O'Malley found that state-level departments collected only limited data-and inconsistently. Further, because state departments often provided services in concert with county governments and third parties, it was usually hard to isolate the state's role and evaluate its performance. And the annual budget process allocated funding with little consideration of the things programs had achieved-or failed to achieve-in the past. Maryland's government had already tried to better organize and monitor the delivery of services: in the late 1990s, the state introduced Managing for Results (MFR), which required departments to set a handful of high-level targets, measure progress on them once a year, and report the findings as part of the annual budget process. Although the legislature had codified the initiative in 2004, O'Malley and his team considered MFR deficient because the program focused on broad outcomes once a year rather than tracking rates of progress for the many separate projects and programs that shaped outcomes. MFR was too high level, with too few indicators to determine fiscal needs and to identify and address implementation challenges. "It is a little bit inadequate to say that you're going to have four to six measures for a whole department that might be a billion or a $2-billion enterprise," said Matt Gallagher, the governor's chief of staff, who had managed CitiStat when O'Malley was Baltimore mayor. "It is even more inadequate to think that simply having those four measures is going to, in itself, move the numbers and move the performance of the department." The $1.7-billion structural deficit that O'Malley inherited when he took office in 2007 underscored the need for reliable metrics on department performance. Under the existing system, the Department of Legislative Services, which supported the legislature, was using MFR data to review department budgets and missions and to advise lawmakers. But the quality and accuracy of that advice were limited by the lack of specific, relevant measures. As a result, the governor could not establish a firm link between resources and outcomes when working on the budget he proposed to the legislature every spring. Because the departments lacked detailed performance measures of their own and reported on progress only once a year, the governor's office also was unable to identify short-term implementation problems and anticipate bottlenecks. Consequently, government programs could be well on their way to failure before anyone in the governor's office knew. In addition, O'Malley faced the sensitive task of securing the buy-in and cooperation of leaders and rank-and-file workers in the state's 20 executive departments.
Although the governor appointed nearly all of the department leaders in his cabinet, the permanent civil servants who staffed the departments usually had their own goals or were heavily involved in continuing work on programs that reflected past priorities. Further impeding efforts to align strategic priorities, department divisions typically operated in silos, even when common goals demanded cooperation with other arms of the government. O'Malley had to deal with the problem after he registered Maryland as a participating member of the Regional Greenhouse Gas Initiative in April 2007. The initiative, a multistate program aimed at limiting carbon dioxide emissions from power plants, required a coordinated effort by several Maryland departments. Failure to work together could lead to incomplete results, duplication of effort, and inflated costs. A significant part of the coordination problem was the lack of organized forums wherein department personnel could discuss cross-cutting issues and work out shared responsibilities. To deliver on his campaign promises, O'Malley had to fire up the cabinet's performance. That meant focusing departments on priorities, enhancing coordination, and tracking progress.

FRAMING A RESPONSE

To map out his strategy, O'Malley turned to his policy director, John Ratliff, whom he hired from the National Governors Association, a bipartisan organization of the governors of the 55 states, commonwealths, and territories. Ratliff was familiar with approaches taken by other states' governors who faced challenges similar to Maryland's. He began speaking with O'Malley during the transition period between O'Malley's election and inauguration and then joined O'Malley's office shortly thereafter. During the campaign, O'Malley had pledged to create a version of CitiStat at the state level. He had seen how CitiStat had helped clarify what Baltimore's government had to do-and, more important, how it had to do it-to meet the city's needs. The conviction that a CitiStat-like unit could be helpful at the state level also arose from an example the governor had seen in the United Kingdom during his tenure as Baltimore mayor. O'Malley had visited with then prime minister Tony Blair to share lessons about Baltimore's crime-fighting strategy. While in London, O'Malley learned about the Prime Minister's Delivery Unit, a team of advisers who focused government departments on the implementation of specific projects tied to specific goals. Like CitiStat, Blair's delivery unit made the collection and analysis of data central to the head of government's management strategy. Individual public service delivery agreements stated a single goal shared by relevant departments and listed the specific contributions of each department. The delivery unit monitored progress and intervened as needed to keep projects focused and on schedule. In crafting an approach that would work in Maryland, O'Malley's office first had to determine the metrics required to evaluate the performance of each of his major departments. That meant translating broad campaign promises into specific goals and then identifying the role each department would play in achieving results. By applying specific metrics for each department, the StateStat team could analyze and boil down the information into briefing papers for discussion at StateStat meetings. (However, in the beginning, StateStat differed from Blair's delivery unit in that it did not focus departments on statewide, strategic goals that cut across departments.
In 2007, O'Malley had yet to define those goals.) StateStat built on much of the existing Managing for Results framework but emphasized strategic performance measures, timely submission of data, and continuous review and assessment. (MFR continued to exist as a separate process.) Recognizing that seeing the big picture required visual presentation of complex information, O'Malley emphasized graphics in both briefing papers and presentations. So his staff outfitted a government office with projectors and computer monitors to support audiovisual presentations at StateStat meetings. The governor's team began with a pilot focused on the three departments that faced the most-pressing problems: juvenile services, public safety and correctional services, and human resources, which oversaw child protective services. They reasoned that a phased rollout also would demonstrate to other departments how the program worked, would ease skepticism, and would provide time to work out priorities and objectives with the remaining 17 executive departments. Even after the full introduction of StateStat, some problems persisted. Low levels of interdepartmental coordination remained a challenge. O'Malley's big campaign promises, like restoring the environmental quality of Maryland's cherished Chesapeake Bay, would require multiple departments to work together. In response, the governor's team decided to hold a series of subcabinet meetings-in addition to the StateStat sessions-that would bring departments together to consider shared issues like the environment, public safety, and the economy. Ratliff called the subcabinet meetings "learning sessions," at which the new administration heard about the challenges departments faced. That way, those in the new administration could begin to think about setting concrete targets that would measure progress.

GETTING DOWN TO WORK

The governor had to win cabinet agreement on priorities, improve department coordination, and track progress closely and accurately.

The pilot

Immediately after taking office, O'Malley worked with the state legislature to legitimize StateStat under state law. In securing statutory backing for the program, the new governor signaled to the public and to civil servants his intent to strengthen management control of government operations. The new law, which took effect on June 1, 2007, said the governor could require any executive branch department to participate in StateStat. According to the bill, a department selected for participation had to "adopt a comprehensive set of performance and citizen satisfaction measurements; regularly and frequently submit timely and accurate data, regularly and frequently attend accountability meetings to assess its performance, and continuously review its strategies and tactics to meet its goals," among other things.8 When he signed the bill into law, O'Malley made clear his intentions: "Through StateStat, we are going to do everything we can to make government more open and transparent, so that we understand what things are working, what things are not working, and how we can maximize the investment that the hard working people of our State make in the important work of state government," he said. "It is going to be our foundation for restoring accountability and for driving our progress."9 O'Malley assigned Gallagher, his deputy chief of staff and the former director of CitiStat, to take on the additional responsibility of overseeing StateStat during the pilot phase.
The StateStat bill provided Gallagher with a budget of $361,444 from the state's General Fund for fiscal year 2008, which began July 1, 2007. The funding covered salaries for analysts, costs of supplies, start-up costs, and ongoing operating expenses. The pilot phase lasted more than a year and developed important procedures, such as methods for setting department metrics and tracking performance, that made the broad rollout work better. In October 2008, O'Malley's office was ready to expand the initiative across state-level departments and scale up the size of the StateStat team. He hired Beth Blauer, who was then chief of staff at the Department of Juvenile Services, to serve as director of StateStat. Blauer had a degree from New York Law School and had been a probation officer for the state earlier in her career. Blauer's appointment freed Gallagher to focus more of his energy on his duties as deputy chief of staff. Blauer moved to buttress capacity at StateStat, which had three staff analysts when she became director. Over her four-year tenure as director, she grew the analytic team to 10. O'Malley expanded the size of the unit despite a tough budget constraint-in part by reconfiguring the existing staffing plan of his own office, directing one of the three deputy chiefs of staff and the deputy's support team to take StateStat roles. Blauer selected the analysts through an open recruitment process, seeking candidates who had strong analytic skills, one or two years of work experience, graduate degrees in areas like public policy and public administration, and proficiency with the Excel spreadsheet program that departments used to report data. In making hiring decisions, however, she strongly emphasized two factors that did not usually show up on résumés: "The two most important traits were that they have strong writing skills and intellectual curiosity." Indeed, some analysts without graduate degrees were recruited from other departments because of their demonstrated performance in those areas. Aware of the importance of planting the seeds of data-based management widely across Maryland's government, O'Malley viewed the StateStat unit as a training ground. "One was really meant to work at StateStat for only two years and then be spun into other leadership positions in government," Blauer said. "The governor saw it as his bench-deepening school." For example, in early 2013, former StateStat analysts included the deputy secretary of the Department of Transportation and a director in the Department of the Environment. By 2009, 16 of the state's 20 executive departments were part of the formal StateStat process. Leaders from the 16 departments attended StateStat meetings either once or twice a month, depending on how pertinent a department's work was to the governor's broad priorities. Only the Departments of Aging, Budget and Management, Disabilities, and Information Technology did not meet regularly for StateStat. However, representatives from budget and management and information technology sat on the panel at all department-specific StateStat meetings because budgetary and technology issues were common across all departments. Blauer, the director, assigned each StateStat analyst to work with specific departments, usually grouped by theme and matched to the analyst's background.
For example, analyst Phillip Stafford, who held a master's degree in environmental engineering, had a portfolio that included the departments of environment and natural resources.

Setting goals

When O'Malley became governor, he already had some knowledge about how state departments performed or failed to perform, having worked with numerous state-level departments on local issues while serving as Baltimore's mayor. Gallagher said O'Malley's experience provided insights into problems in state government. "We were acutely aware of what we thought were very significant performance deficiencies," he said. "Oftentimes, they impacted us as a local jurisdiction." Gallagher pointed to a backlog in DNA testing at the corrections department as an example of a state bottleneck that had local repercussions. Although the governor had articulated his major goals during the campaign, his office wanted to build a sense of ownership within each department by hosting conversations about priorities. The early discussions helped (1) identify the goals and targets each department pursued under the Managing for Results program and (2) integrate them with the governor's campaign pledges as well as commitments related to the state's participation in federal grant programs. Each department's success in delivering its core services also figured in the planning. From 2007 to 2008, O'Malley's staff reviewed the discussions at the subcabinet meetings. The governor's office subsequently released 15 strategic policy goals grouped under four pillars: skills (education, workforce training, economic development, and jobs), security (public safety and homeland security), sustainability (Chesapeake Bay, clean water, clean air, and transportation), and health (health care and public health). The goals were ambitious. For instance, under the skills pillar, one goal was to "Recover 100% of the jobs lost to the Great Recession"; one of the goals under the sustainability pillar was to "Reduce Maryland's greenhouse gas emissions by 25% by 2015."

Information reporting and accountability

Once the department and the governor's team agreed on what to measure, the department secretary worked with senior managers to assign responsibility for collecting the data. One person in each department gathered the measures from the various units within the department, compiled the information into a single spreadsheet, and submitted it to the relevant StateStat analyst about four days before the department's monthly StateStat meeting. Each analyst pored over the information, checking the status of actions flagged for follow-up and looking for trends or potential problems worthy of discussion at the upcoming meeting. "Before the meeting . . . I comb through every metric and look at major changes and anything that might cause concern," said Kristen Ahearn, an analyst whose portfolio included the Departments of Housing and Community Development, Education, and Energy as well as the Motor Vehicle Administration (MVA). While reviewing data from the Department of Housing and Community Development, for example, Ahearn noted that the number of loans made in 2012 for a program to help new or expanding small businesses and nonprofit organizations was significantly lower than the year before. She flagged the anomaly for discussion as part of the regular briefing memo sent to the governor and his executive team the night before the meeting. The memo included charts and graphs and suggested questions that served as the basis for discussion.
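The case does not describe any software behind this pre-meeting review; analysts combed through the Excel submissions themselves. As a rough illustration of the kind of screen an analyst could apply to a department's monthly spreadsheet, the short Python sketch below flags metrics that have moved sharply against the prior month or the same month a year earlier. The file name, column layout, and 25% threshold are hypothetical assumptions, not details drawn from the StateStat process.

```python
# Illustrative sketch only: StateStat analysts reviewed Excel submissions by hand;
# this shows how a simple month-over-month / year-over-year screen could be automated.
import pandas as pd

THRESHOLD = 0.25  # flag changes larger than 25%; the cutoff is an assumption, not from the case

def flag_metrics(path: str) -> pd.DataFrame:
    """Flag metrics whose latest value moved sharply versus last month or the same month last year."""
    # Assumed layout: one row per month, a 'month' date column, one numeric column per metric.
    df = (pd.read_excel(path, parse_dates=["month"])
            .sort_values("month")
            .set_index("month"))
    latest, prior = df.iloc[-1], df.iloc[-2]
    year_ago = df.iloc[-13] if len(df) >= 13 else prior

    rows = []
    for metric in df.columns:
        for label, baseline in (("vs. prior month", prior[metric]),
                                ("vs. year earlier", year_ago[metric])):
            # Skip zero baselines to avoid dividing by zero; flag large relative changes.
            if baseline and abs(latest[metric] - baseline) / abs(baseline) > THRESHOLD:
                rows.append({"metric": metric, "comparison": label,
                             "baseline": baseline, "latest": latest[metric]})
    return pd.DataFrame(rows)

if __name__ == "__main__":
    # Hypothetical file name for a department's monthly submission.
    print(flag_metrics("dhcd_monthly_metrics.xlsx"))
```

In practice, anything such a screen flagged would still require an analyst's judgment about whether it merited a place in the briefing memo.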
At the meeting, participants worked out the apparent loan discrepancy and resolved other matters on the agenda. StateStat analysts did not share their briefing memos with department staff or department secretaries ahead of the meetings. The StateStat team wanted department staff to gain proficiency in the process of reviewing data regularly and thinking strategically about how the findings would apply to day-to-day operations. "The idea was that if we gave them the memo, they then would study for the test instead of integrating the methodology into their own systems," Blauer said. That policy wasn't popular with everyone involved in the StateStat process, however. Uncertainty about what would be asked at meetings frustrated some of those who faced the questioning. But Blauer said she thought the criticisms were exaggerated: "It was pretty predictable what was going to be covered in the next meetings, because we always said what people should be prepared to talk about next time they came in." At the monthly meetings, StateStat representatives quizzed department officials on a variety of topics. For example, during a session in late 2012 with the head of the Motor Vehicle Administration and his deputies, analyst Ahearn zeroed in on data that showed the administration had spent 108% of its overtime budget in the fiscal year that had ended in June, with spending especially high in the Baltimore branch. Furthermore, the data revealed that overtime spending across all MVA branches in each of the first two months of the current fiscal year exceeded the total for any other single month in the budget year. Ahearn had noted the trends in her briefing memo and suggested that the panel ask the director why the overtime budget had increased on the whole-and specifically at the Baltimore City branch. At the meeting, the director's deputy explained that the overtime had increased largely because the branch office had been relocated closer to a major highway, so customer volumes went up while the number of employees remained the same. The governor's staff also sought answers on why traffic fatalities in the western part of the state had spiked in recent months and pressed for action to address the issue. Such targeted questioning sometimes created tension, especially among department officials who resented being interrogated by analysts who in some cases were half their age. Blauer was aware of the potential for antagonism. "There is a lot of power in the analyst position, and it has to be handled extremely delicately," she said. Blauer said she was careful to clarify the roles all participants played during the meetings. Analysts were paid to examine data, uncover apparent trends and anomalies, and raise the issues for discussion and resolution. Department secretaries and officials had the answers and explanations. "I reinforced the message that the secretaries are the data experts and own the data," she said. "I emphasized to the secretaries that we were there as collaborators, helping facilitate the process with the tools for them to do their jobs." The targeted questioning sometimes uncovered areas where departments struggled and could benefit from the governor's help. At one StateStat session, for example, the governor's team questioned the distribution rate of a mortgage assistance program set up for home owners in the wake of the global financial crisis.
Representatives from the Department of Housing and Community Development, which oversaw the program, conceded that they had difficulty in raising awareness of the availability of the assistance. To help, O'Malley visited religious organizations to announce the availability of the aid and explain how to enroll. Following each monthly meeting, the StateStat team posted two items to a dedicated public Web site: the memo that had been circulated in advance of the session and the department's Excel data sheets. The analyst then circulated a memo to the department, listing action items and follow-up questions that had been asked in the meeting. Most of the monthly StateStat meetings involved a single department, with the aim of getting the main decision makers at the same table to discuss a common purpose. Blauer said she understood the need for this approach, based on her own experience. "I came from a department [the Department of Juvenile Services] where there was an incredible labyrinth of red tape to move resources or shift positions around," she said, because multiple layers of internal management slowed decision making. "With StateStat, once we identified a need, we could triage it much faster. . . . When a child at juvenile services had to be sent out of state because no services were available, we knew immediately about that lack of services because it was in the data. Coming to a conclusion like that in the past could have taken six to eight months."

Working together

StateStat meetings sometimes involved representatives of several departments when a target or goal required cooperation. O'Malley's effort to improve the health of Chesapeake Bay was an example of how the process fostered such coordination. For decades, a toxic combination of runoff from numerous sources-including storm water, wastewater treatment plants, and nutrient-rich soils-had been poisoning Chesapeake Bay, the largest estuary in the United States. The pollution angered Maryland's citizens. "One of the more-visible problems is when you have a hot summer day right after a rain, a big algae bloom occurs, and tons of dead fish float on the surface, especially in Baltimore Harbor," said Stafford, the analyst whose portfolio included the departments working on environmental issues. "It smells, and it's a very visual, in-your-face view of what is wrong with the bay." Since 1987, Maryland had been part of a regional partnership called the Chesapeake Bay Program, which brought together federal and state agencies, local governments, nonprofit organizations, and academic institutions in an effort to restore and protect the bay. In 2009, the partnership began setting two-year targets. Maryland's 2009-11 targets included planting 325,000 acres of cover crops that reduced nutrient runoff from farms by holding soil in place; building 130 structures for storage of animal waste to limit runoff; and restoring 1,000 acres of wetlands. Four state departments-Environment, Natural Resources, Agriculture, and Planning-had to play important roles for the partnership to succeed. In a special program dubbed BayStat, O'Malley brought all four departments together to track progress rather than meeting with each department individually as part of the normal StateStat process. "In a cabinet meeting, you've got 30 people in the room and one hour, which is a lot less effective than when you've got five secretaries together and are talking about one subject," Blauer said.
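The case does not describe the tooling behind BayStat's progress tracking. Purely as an illustration, the Python sketch below shows how progress toward the 2009-11 milestones cited above could be expressed as percent-of-target figures; the interim numbers and the 90% "on pace" cutoff are hypothetical, not data from the program.

```python
# Illustrative only: percent-of-target tracking for the 2009-11 milestones named in the case.
TARGETS = {
    "cover crops planted (acres)": 325_000,
    "animal waste storage structures built": 130,
    "wetlands restored (acres)": 1_000,
}

def progress_report(reported: dict) -> None:
    """Print each milestone's share of its target, flagging those behind an assumed 90% pace."""
    for milestone, target in TARGETS.items():
        achieved = reported.get(milestone, 0)
        pct = 100 * achieved / target
        status = "on pace" if pct >= 90 else "needs attention"
        print(f"{milestone}: {achieved:,.0f} of {target:,} ({pct:.0f}%) - {status}")

# Hypothetical interim figures, for illustration only.
progress_report({
    "cover crops planted (acres)": 280_000,
    "animal waste storage structures built": 95,
    "wetlands restored (acres)": 600,
})
```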
O'Malley also invited scientists from the University of Maryland to participate in the meetings. Stafford said the bay-focused meetings were "a way to bring into one room-rather than the separate silos-all of these different departments that should be accountable for these achievements and have them hash out the solutions together." "I know my responsibilities and have a view of my industry. Bob has his particular view, and so do Rich and John," said Earl Hance, secretary of the Department of Agriculture, referring by first name to the other Maryland secretaries whose departments were active in the bay partnership. "By bringing us together once a month, we get a much better perception of each other's issues and responsibilities. You have a better understanding when you make decisions; you think through processes differently than you would individually." The regular BayStat meetings helped the departments develop a plan for reaching their two-year targets. "We took a look at what the burden for each of the sectors should be," Stafford said. "Our agriculture sector needed to reduce levels by this much; our urban sector needed to reduce by this much." Because many targets were group oriented rather than department specific, departments could help each other when unexpected problems arose. For example, when the Department of Natural Resources realized that some of the places it had planned to plant trees and introduce other types of runoff-reducing mechanisms, like stream buffers, were productive agricultural areas, the department enlisted the Department of Agriculture to work with farmers to plant cover crops that would serve the same purpose in preventing nutrient runoff. Data gathered by the departments informed policy decisions. For instance, the secretaries worked with the University of Maryland to determine that farmlands on the eastern side of the bay had the highest soil concentration of phosphorus, a major ingredient in crop and lawn fertilizers. The governor's office worked to pass a bill establishing limits on fertilizer use throughout the state. The office also steered through a bill that paid subsidies to farmers who planted cover crops.

OVERCOMING OBSTACLES

During the early development of StateStat, in the months before O'Malley held goal-setting conversations with departments, some personnel grumbled about the amount of energy invested in measuring department performance and expressed uncertainty about whether the data actually mattered. Early StateStat conversations focused mainly on the things individual departments could achieve, as opposed to the ways department outputs were tied to larger strategic outcomes. Within departments, staff sometimes felt they had to work on projects that seemed petty or appeared to lack any clear link to a larger mission. In 2008, a year after taking office, O'Malley tried to address that problem by convening a series of conversations with departments. He also began to build a system that would ensure follow-through. In July 2008, building on ideas he had seen at Tony Blair's office, O'Malley set up a separate delivery unit to ensure departments were contributing to the achievement of his broad policy goals. He repurposed the Department of Planning's Office of Smart Growth, which had the broad task of ensuring that Maryland's housing, transportation, and infrastructure projects encouraged environmental sustainability and the well-being of citizens.
To run the unit, O'Malley appointed David Costello, who was head of the Office of Smart Growth and had directed the Mayor's Office of Community Investment in Baltimore City when O'Malley was mayor. The Office of Smart Growth's three staffers became delivery unit employees. Based on the 15 strategic policy goals defined through the department conversations, Costello and his colleagues formulated goal-specific delivery plans as well as associated tracking templates to drive and measure progress. Then, once every two weeks, the unit invited the relevant departments to discuss and assess progress on a specific goal. The infrequency of the reviews meant it could be months before the unit circled back to review the same goal again. However, delivery unit staff worked daily with department staff to measure and evaluate progress. In theory, the delivery unit was supposed to help departments track progress toward the administration's goals, especially when multiple departments shared a single goal. But the new system was not perfect. The delivery unit placed a significant new burden on department heads and staff, who were already participating in StateStat and now had to report through two processes and attend twice as many meetings. Having anticipated that department staff would resist the idea of the delivery unit for those very reasons, Costello said that early on, he worked to head off the pushback. "We tried to pitch it and sell it on its merits," said Costello. "'Here's an opportunity for your department to shine-or conceivably acquire more resources and attention for your program,'" he recalled telling department leaders. "I don't think anyone disagreed that the governor's goals were compelling and important ones." But resistance was strong nonetheless. "Some departments dragged their feet," said Costello. "As we started to address delivery unit-related issues at Stat meetings, it was clear that many agencies weren't doing what they had been asked to do. They would commit to doing delivery unit goal-related work but then would often drag their feet by not helping draft the delivery plans or identify all relevant agency programs and initiatives. And even after they identified certain programs, they weren't particularly forthcoming regarding the status of their programs. With many agencies, it just became a long, frustrating dance." Costello said that, not surprisingly, department secretaries who were "more loyal and committed to supporting the governor" were inclined to participate willingly in the new system. But O'Malley also had appointed a number of subject-matter experts to his cabinet, and these secretaries were far less enthusiastic. Costello said their recalcitrance sapped some of the program's strength. By early 2010, the duplication of work created by running the delivery unit and StateStat in parallel-and the tension it caused-had become increasingly clear, and O'Malley decided to combine the two into a single StateStat process. StateStat staffing increased to reflect the added responsibility of monitoring O'Malley's big policy goals. Most of the delivery unit staff, familiar with the monitoring and reporting process, became StateStat analysts. Blauer retained her post as director of the expanded StateStat operation, and Costello moved on to become deputy secretary at the Department of the Environment.
Costello, who called his experience running the delivery unit "frustrating," said the merger strengthened StateStat: having progressively taken on many elements of the delivery unit, the expanded program could devote more of its time and attention to the state's most important programs and cross-agency challenges. "I can remember-before the delivery unit merged into StateStat-sitting in a StateStat meeting once where 20 minutes was spent discussing the management of a $50,000 grant, when broader questions regarding multimillion-dollar stimulus grants went unasked. The merger of the delivery unit and StateStat enhanced the administration's efforts to achieve its strategic goals by focusing greater analysis and monitoring on the agency programs that were apt to, or were intended to, contribute the most to achieving its key goals."

ASSESSING RESULTS

Officials close to the governor's office pointed to several outcomes that StateStat made possible by improving coordination, monitoring implementation, and encouraging accountability. StateStat's simplest attribute-getting people into the same room to discuss issues-produced concrete results. For instance, in 2007, at their first StateStat meeting, officials from the Department of Public Safety and Correctional Services presented data on injuries of inmates and correctional officers at an outdated prison built in 1874. The data also showed that a corrections officer had been killed in an attack at the prison in 2006. The governor's office responded quickly and ordered the facility closed. In addition to incidents involving staff at correctional facilities, StateStat tracked overtime at the department and prodded department leaders to reduce both measures. By 2010, the number of overtime hours had dropped 22%, and serious assaults on staff had decreased by 50%.10 StateStat also tracked the administration's efforts to clear the backlog of 24,000 unanalyzed crime-related DNA samples inherited in 2007 and to load nearly 83,000 DNA samples into a federal database. The efforts resulted in 212 arrests, including 16 for murder and 103 for sex offenses.11 Furthermore, in 2012, O'Malley reported that Maryland had met its 2009-11 milestones to restore the health of Chesapeake Bay. The state oversaw the planting of 430,000 acres of cover crops-23% more than the goal. The crops prevented an estimated 2.58 million pounds of nitrogen and 86,000 pounds of phosphorus from getting into the bay. And to reduce runoff, the state planted 895 acres of forest buffers-more than double its goal. Meanwhile, at the Motor Vehicle Administration, average wait times declined to 27 minutes in 2012 from 44 minutes in 2007, even with annual reductions in staff levels to reduce costs. During the same time period, the share of customers who used new, alternative services that eliminated the need to visit the administration's offices increased to 38.7% from 28.9%. StateStat analyst Ahearn pointed to what many observers considered the strongest attribute of the program: a fact-supported forum for discussion that created pressure for results. "Any time I can think of hard results, it's because we pushed the department to do something," she said. She noted the success of efforts to improve awareness among home owners regarding a program that offered mediation with banks and mortgage servicers.
"After months of our asking the department to increase participation in the program, the department acted by obtaining the addresses of all of the home owners in the state eligible for the program." With the addresses in hand, the department began sending out mailings advertising the program to qualified home owners. In 2009, Governing magazine named O'Malley one of nine "public officials of the year" and cited StateStat as an innovative management tool. "Underneath O'Malley's 'stat' talk is a deeper view about how to effect change in the public sector," the magazine wrote. "Whether the issue is education, public safety or economic development, O'Malley says, the problems are complex and the bureaucracies are fragmented. In his view, government won't improve unless leaders dissect the interrelations between different agencies and hold them accountable for meeting measurable goals."12 The administration's online postings of the Excel data sheets from StateStat sessions formed part of the effort to enhance transparency. However, the unwieldy nature of the data limited the public's ability to use the information. "The voluminous data on the StateStat website lacks analysis that explains basic performance trends to the average citizen," wrote University of Maryland professor Meyers in a Baltimore Sun editorial.13 And Len Lazarick, editor and publisher of MarylandReporter.com and former statehouse bureau chief at the conservative-leaning Baltimore Examiner, called the spreadsheets "unanalyzed data dumps" that "don't come close to providing real accountability."14 Not surprisingly, StateStat ruffled some civil servants, who spoke of the need to meet StateStat analysts' proliferating requests for more and better measurements. In 2007, the Department of Housing and Community Development measured about 400 data points, according to Hazel Heeren, manager of business performance at the Department of Housing and Community Development. By 2012, that number had swelled to more than 1,000 across the entire department. Dozens of department staff were involved directly or indirectly in gathering and reporting on the measures and in compiling the information for StateStat analysts, she said. In addition, preparations for the meetings often took valuable staff and executive time. Unsure of the exact nature of the questioning, departments often held premeetings to make sure they would be prepared. Heeren said her department sent as many as a dozen representatives, usually department heads and senior administrators, to the StateStat meetings. "We have to bring people who can answer questions should they arise," she said. Even though Annapolis is Maryland's capital, numerous departments were located in Baltimore, an hour's drive away, which added to the time burden. Furthermore, a session would typically run about an hour and a half, and department staff then spent additional time after the meeting addressing questions raised in the analyst's follow-up memo. The precise nature of the process also led to questions about StateStat's broad aims. "We want to have a clear understanding behind why we are measuring what we are measuring," Heeren said. She added that sometimes the new metrics appeared to detract from the core mission of the agency and create uncertainty among employees who did not fully understand the rationale for such thinking. 
And Warren Deschenaux, director of the office of policy analysis at the Department of Legislative Services, said StateStat prioritized the tracking of data points without a clear understanding of the outcomes each department sought. Deschenaux said StateStat's approach contrasted with that of program-oriented sessions like BayStat, which measured progress toward more clearly articulated, overarching goals. It was not always clear that the legislature wanted as much detail as StateStat provided. "Particularly in our environment, legislatures aren't all that interested in the details of administration," said Deschenaux. "They want the big picture, and they want you to tell them what to do." However, "if the analysis coming from Legislative Services is good enough to highlight measures relevant to policy disputes, it can have an impact on whether the legislature accepts the allocations to certain departments the governor has requested," said Meyers, the professor.

REFLECTIONS

Matt Gallagher, the governor's chief of staff, who had managed CitiStat while Martin O'Malley was mayor of Baltimore, asserted that StateStat was effective despite its shortcomings. "Bringing together secretaries and department personnel on a monthly or biweekly basis accelerates your decision making," he said. "It accelerates the feedback loop. It gives you an opportunity to observe where you have management capacity-and deficiencies-at the department level. And it improves the level of connectedness between the governor's office and what actually happens at the department level." Unlike Managing for Results, StateStat enabled the governor's team both to zero in on specific aspects of participating departments' performance from month to month and to use that information to analyze broader trends over time. But StateStat analysts had to be able to digest large amounts of information and exercise judgment in deciding which performance data to present at the meetings. And the governor and his senior team, including his chief of staff, had to commit to being present at StateStat sessions in order to foster a culture of accountability. Former Maryland governor Robert Ehrlich Jr., a Republican who lost the 2006 election to O'Malley and ran again unsuccessfully in 2010, said the number and specificity of data points that departments were asked to measure shifted focus away from core goals. "You start measuring everything, you're measuring nothing," he said. The former governor had relied on Managing for Results to track outputs on a broad scale during his tenure and suggested StateStat was unnecessary. "When something did not happen, I picked up the phone and got things moving." Mike Dresser, a Baltimore Sun reporter who covered politics in Annapolis, said Ehrlich's criticisms were to be expected and suggested that if a Republican had created StateStat, the opposing party would have voiced similar disapproval. Although O'Malley clearly believed that StateStat was a sustainable program because of its valuable role in the business of government, the outlook for sustainability remained unclear in early 2013 because of the uncertainties of politics. Maryland law did not allow O'Malley to run for a third term in 2014. In late 2012, O'Malley's lieutenant governor, fellow Democrat Anthony Brown, announced his candidacy for governor and vowed to retain the StateStat approach if elected; but a change in parties could bring new priorities, policies, and procedures.
Journalist Dresser suggested that StateStat, or some similar results-measurement system, was important for building the government accountability that the public valued.

References

1 Teresita Perez and Reece Rushing, "The CitiStat Model: How Data-Driven Government Can Increase Efficiency and Effectiveness," Center for American Progress, 23 April 2007.
2 Justin Fenton, "O'Malley Installing StateStat: Statistics-Based Management Is Coming to Md. Government," Baltimore Sun, 12 February 2007.
3 Perez and Rushing, "The CitiStat Model."
4 Fiscal and Policy Note, Senate Bill 102, Maryland General Assembly, 2007 session, Department of Legislative Services.
5 Doug Donovan, "Agency Audit Plan Vowed: O'Malley's Idea Based on CitiStat; Ehrlich Says Mayor Evading His Record; Maryland Votes 2006," Baltimore Sun, 10 October 2006.
6 Martin O'Malley and Anthony Brown, "An Action Plan for Maryland Families: Blueprint for Change," October 2006, http://omalley.3cdn.net/17533636ffc83ef40f_spm6bkr9h.pdf.
7 Fenton, "O'Malley Installing StateStat."
8 Fiscal and Policy Note, Senate Bill 102, Maryland General Assembly, 2007 session.
9 Bill Signing Ceremony, Office of Governor Martin O'Malley, 10 April 2007, http://www.governor.maryland.gov/blog/?p=1873.
10 Martin O'Malley and Stephanie Rawlings-Blake, "CitiStat: 10 Years of Measuring Progress: During Its First Decade, CitiStat Has Made Government More Accountable and Effective," editorial, Baltimore Sun, 30 June 2010.
11 O'Malley and Rawlings-Blake, "CitiStat: 10 Years of Measuring Progress."
12 "Driven by Data," 2009 Public Officials of the Year, Governing, accessed March 2013, http://www.governing.com/poy/Martin-OMalley.html.
13 Roy Meyers, "Too Little or Too Much? State Can Avoid Cuts by Educating the Public about Vital Services," editorial, Baltimore Sun, 8 September 2009.
14 Len Lazarick, "Vaunted StateStat Falls Short of Real Accountability," editorial, Baltimore Sun, 7 June 2009.

Innovations for Successful Societies makes its case studies and other publications available to all at no cost, under the guidelines of the Terms of Use listed below. The ISS Web repository is intended to serve as an idea bank, enabling practitioners and scholars to evaluate the pros and cons of different reform strategies and weigh the effects of context. ISS welcomes readers' feedback, including suggestions of additional topics and questions to be considered, corrections, and how case studies are being used: iss@princeton.edu.

Terms of Use

In downloading or otherwise employing this information, users indicate that:
a. They understand that the materials downloaded from the website are protected under United States Copyright Law (Title 17, United States Code). This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
b. They will use the material only for educational, scholarly, and other noncommercial purposes.
c. They will not sell, transfer, assign, license, lease, or otherwise convey any portion of this information to any third party. Republication or display on a third party's website requires the express written permission of the Princeton University Innovations for Successful Societies program or the Princeton University Library.
d. They understand that the quotes used in the case study reflect the interviewees' personal points of view.
Although all efforts have been made to ensure the accuracy of the information collected, Princeton University does not warrant the accuracy, completeness, timeliness, or other characteristics of any material available online.
e. They acknowledge that the content and/or format of the archive and the site may be revised, updated or otherwise modified from time to time.
f. They accept that access to and use of the archive are at their own risk. They shall not hold Princeton University liable for any loss or damages resulting from the use of information in the archive. Princeton University assumes no liability for any errors or omissions with respect to the functioning of the archive.
g. In all publications, presentations or other communications that incorporate or otherwise rely on information from this archive, they will acknowledge that such information was obtained through the Innovations for Successful Societies website. Our status (and that of any identified contributors) as the authors of material must always be acknowledged and a full credit given as follows: Author(s) or Editor(s) if listed, Full title, Year of publication, Innovations for Successful Societies, Princeton University, http://successfulsocieties.princeton.edu/

(c) 2013, Trustees of Princeton University