Innovations for Successful Societies
AN INITIATIVE OF THE WOODROW WILSON SCHOOL OF PUBLIC AND INTERNATIONAL AFFAIRS AND THE BOBST CENTER FOR PEACE AND JUSTICE

Series: Centers of Government
Interview no.: H1
Interviewee: Cristopher Johnston
Interviewer: Michael Scharff
Date of Interview: 19 July 2012
Location: Indiana

Innovations for Successful Societies, Bobst Center for Peace and Justice, Princeton University, 83 Prospect Avenue, Princeton, New Jersey, 08544, USA
www.princeton.edu/successfulsocieties

SCHARFF: This is Michael Scharff. The date is July 19th, 2012. I am joined by my colleague Rick Messick and together we're speaking with Cris Johnston, who is the Director of the Government Efficiency and Financial Planning Unit in the Office of Management and Budget in Indiana. On your background, how did you first become involved with the administration? JOHNSTON: Well, I picked up the phone at the wrong time, I guess. In the fall of 2004-unlike I think Betsy and Adam and maybe some of the others that you spoke with-I had been in state government before and in my private sector life I was in the consulting business and most of my clients were local and state governments. I had not participated in the campaign other than answering questions of the governor's policy team that was developing white papers and that sort of thing. In the fall, after he got elected I did get a couple of calls from people in the transition team and they asked us to serve. Betsy and I actually served on the transition team looking at the Department of Revenue. They were asking about whether I would be interested in certain agency positions and I really said I wasn't interested until this last-so Chuck Schalliol, who was the original OMB (Office of Management and Budget) director, whom I had never met, called and said, "We have this group that we're not sure what we're going to do with: the government efficiency team." 
He says, "For lack of a better name that's what it is-this is going to be the small team that has a statewide or enterprise-wide perspective on government performance." I said, "Well, you've piqued my interest there." That was really more of an interest because it was new. It was going to have a statewide perspective and something that not only could be-as an observer, I knew Governor Daniels was going to be a significant change agent and rethinking how government should perform. As a consultant, at least to local governments and some state government agencies prior to that, I was a big advocate that that needed to happen. So I got coaxed out of being a partner in a consulting practice to come back to the public sector. I had worked in the state treasurer's office about twenty years ago. I was the portfolio manager, deputy state treasurer back then. It wasn't until-I had only met the governor once maybe in a social setting, so this OMB director said, I'd like you to-. I kept asking, what do you think this government efficiency team is supposed to do? They weren't sure. Chuck, the person I was talking to, said, "Well, you're going to be able to form it, develop it from the ground up." But it wasn't until he asked me to meet with the governor-again, I had not met him before. I said, "What do you think the charge of this group is?" To this day I still remember it; it was so succinct. He said, "I just want you to ask a lot of questions and when you get the answers you'll know what to do." That's exactly what-I left there and I said, "Well, this could be fun." It made me start thinking, where I wasn't interested at all in coming to the administration, I said, this could be interesting. I made the decision to leave the consulting world and join the administration. 
I don't know if Adam talked to you much about the construction of OMB, but he created OMB, which is really an umbrella agency for all these financial management agencies that were previously standalone agencies. MESSICK: I was thinking, the way I took it maybe, it was just what had been the budget unit and you, your government-. JOHNSTON: Technically that's right, but the Department of Revenue, the retirement funds, our audit, state audit arm, the State Board of Accounts, the one that regulates or oversees local government budgets and finance, the department of local government finance-all of those department heads report up to Adam as OMB director. This group, the government efficiency team, the five or six people division, was the only new one that was created. All of the others were standalone agencies that never really talked to anybody. But the governor said, "I want an OMB and we've already got these; maybe we just need to coordinate the chain of command communication." While they all have distinct functions, they should all funnel up to an OMB director that reports to the governor. MESSICK: They're fiscal-. JOHNSTON: Absolutely. SCHARFF: There have been some years when the OMB director has also served as the SBA (State Budget Agency)? JOHNSTON: Budget director. SCHARFF: Head of the SBA? The State Budget Agency? JOHNSTON: Exactly. So the first one, Chuck Schalliol who hired me, served in dual rules. Then in the middle time span, there was Ryan Kitchell and then Chris Ruhl, who then broke it up. Ryan was the OMB Director; Chris Ruhl was the Budget Director. So he focused-when they split it up, the Budget Director did just solely manage the SBA, the State Budget Agency. SCHARFF: And reported to the OMB Director. JOHNSTON: Yes, reported up to the OMB Director. Now we're back to sort of the dual service. Adam is serving in both capacities. SCHARFF: Do you have a sense of why that-why those reporting lines had come together and again were sort of separated? 
JOHNSTON: I don't have-I have just my opinion and that was when Chuck decided to return to the private sector, it was at the beginning of a point of significant change or significant program development that I don't think the governor expected to fall in his lap. We had a big property tax mess back in 2007. That was the time when Chuck was returning to the private sector and I think we quickly realized that even though property taxes are really a local government issue, the citizens were saying, "Well, what is state government going to do about this?" So there were-I'm trying to think what else was going on. There were a couple of other things. I think that maybe we just said that we needed to manage the budget, we needed to have someone just solely focused on command and control over appropriations and spending at the state government level, but we had all these other things that were sort of distracting or were falling in our lap that we needed to pay attention to. So Ryan Kitchell, who was the OMB Director at the time, and I, because of my previous work with local government and my understanding of the property tax system, Ryan and I and others-it was a bigger team effort obviously-focused on a lot of the property tax reform issues of 2007 and 2008. And then Ryan continued to have all these other six or seven direct reports from the retirement funds, the audit arm and all of that, whereas Chris Ruhl had focused solely just on managing the state finances. In hindsight, it was probably fortunate because in two years, the state revenues just fell off a cliff like everything else in the country. So having just those dedicated eyes and ears on managing that was probably fortuitous-fortuitous that that decision was made a couple of years before to help manage through that process when the state had to really buckle down and say, "Well, how are we going to get through this revenue crisis?" 
MESSICK: Now, before all these disparate agencies-the local government, finance, and the retirement agencies-were they all direct reports to the governor in theory? JOHNSTON: Yes. Indiana had-you had of course the citizens or electorate up here, the governor-. A reasonable organization chart would have seventy boxes all in one line. MESSICK: Oh, I see. JOHNSTON: We still have a lot of agencies that are technically direct reports to the governor, but at least in the OMB world, they all come up through the OMB director who then reports to the governor. MESSICK: So in fact, there is no statutory change that says the local government finance is now a sub-unit of OMB. OMB just got sort of inserted-. JOHNSTON: Right. MESSICK: And a bunch of them just route their memos through-. JOHNSTON: I would have to check the statute to see whether that was changed formally, but in practice that is what actually happened. OMB was created and I think the legislation was very broad to say that it was meant to coordinate the financial management activities of state government. It started off as an executive order. In fact, I think it was the second executive order. It said that we were going to create OMB to have a better coordination of financial management practices. I think a lot of that executive order got drafted in the code but whether they formally report up to the OMB-the retirement funds-I don't know, I would have to go back and check that. MESSICK: So you got the seventy that are all direct and now you put OMB in and at least a bunch go through there. Was there any other sort of entity that picked up, sort of equivalent to OMB, that got inserted between the governor and-? JOHNSTON: There were some other agencies-well, here is where I give a lot of the transition folks a lot of credit to think through these things. There wasn't a Department of Homeland Security before. 
We created that where they merged the State Emergency Management but then there were other-I'm not sure you'd call them Homeland Security agencies, but smaller ones such as Fire and Building Services. I'm trying to think what else was merged into that. But Homeland Security was also one that created more of an umbrella agency for related functions. MESSICK: So you got some consolidation. Some of those seventy go into the Homeland Security entity. JOHNSTON: But here is where the governor's insight is effective. At the same time as this consolidation, one of the most critical things was separating out agencies. I don't know if people have talked to you about the Department of Child Services. MESSICK: We've read about that. JOHNSTON: That was part of this monolith called Family and Social Services Administration, but Child Protective Services was just a component of this massive agency that had been consolidated over the years. Because it fell under this umbrella, the services within DCS never got any visibility until you had tragic events and then that all came out. So this is where the governor said, "You know what, I'm going to raise the visibility of this function." By doing so, it raises the accountability and I think was an example-everyone does say the Daniels' administration is all about streamlining and consolidation. I use this example frequently, because the administration is about better service and more effective government. MESSICK: For example. JOHNSTON: For example, DCS used to be part of a much larger organization and it is performing much better now as a standalone. They hired more caseworkers. Again, everyone is talking about how state employment went down. Well, here is an example where caseworkers were doubled; they went from 800 to 1600 caseworkers. He said we're not delivering the level of service that should be delivered for a very needy population. 
So that is where-I think there can be a series of contradictions when people just assume that the Daniels' administration is all about consolidation. It is really about what is the right answer for more effective government. Back to your-are there others? Some others have evolved over time too. It wasn't at the very beginning. There was an entity called the State Student Assistance Commission, which decides what are the funding levels for higher ed support from the state in terms of scholarships. It was a standalone. Then we had a separate commission called the Commission for Proprietary Education that dealt solely with the specialty schools, the private schools, the DeVry Institutes and all of that. Then there was a separate commission for higher ed. Well, just in the last year all three of those-well, those two-have been merged into the Commission for Higher Ed, which takes a more holistic view of maybe not every high school graduate should be going to a four-year academic institution. Maybe there are other-. That was one that took a while but finally people thought, "Well, shouldn't the funding of higher ed scholarships be coordinated through the Commission for Higher Ed that works with the universities about what tuition rates should be?" Before, they were separate entities. You hoped that they talked to each other but like you said, some of the easiest things were just saying, "Hey, let's have a meeting so you can meet these other people." So there have been situations like that. I'm probably boring you to death here. MESSICK: Oh no, we're taking detailed notes. JOHNSTON: We had a State Board of Health, Department of Health, that had separate programs for smoking cessation, anti-smoking programs. But we created a separate Indiana Tobacco Prevention and Cessation Board just because of the tobacco master settlement from years ago, because people thought we needed a place to bring that money in, to manage that money. 
But they had anti-smoking programs, the Board of Health had anti-smoking programs. That's what my group really kind of came out and said. When we started looking, when we started doing our program reviews early on in 2005 and 2006, where we contributed was you know what? This is really a priority for this group, why don't we shift this program responsibility from one entity to the other, or maybe we should merge some of these organizations because not that we want to cut back on smoking which was-we had all these people who received grants from the tobacco prevention and cessation group that were worried that we were going to cut their grants. No, we had anti-smoking here, anti-smoking there, we just want you guys to work together and hopefully make a greater impact in this area which really is a health-should be under the Department of Health. SCHARFF: How empowered do you, as a unit, as director of the unit, feel to be able to make decisions and to move on things without bringing it to the governor? JOHNSTON: Well, we try not to use that leverage much. We try to use reason and common sense but sometimes you do have to say this needs to be a legislative program or something that we need to pursue. But early on, just to give you a little history, because we were the new unit-this government efficiency was just a new unit within OMB-we had our own challenges to create our own name, if you will. MESSICK: Who are these guys? JOHNSTON: Exactly, that's what it was. Also, the governor at the same time brought in these high-level executives from private sector firms: Eli Lilly, Cummins and all that. They didn't know what we were all about. We had to make our own name. We had the good fortune though-and actually this was part of a report that we did write with the preamble to the report really coming out of the legislation. We had the governor who gave us the charge: ask a lot of questions, you'll know what to do. 
Very open-ended but still, how do you go communicate that to seventy agency heads? We had the good fortune that we used out of the first budget bill in 2005, where the general assembly, I don't know what their motivation was, but they wrote this very glossy-all revenues that fund state government come from the people and it is the responsibility of every elected official to carefully guard against misuse of this revenue. So we're going, "Oh, this is great." They basically instructed the governor to direct the Office of Management and Budget to review thoroughly the budget of each executive department, agency and instrumentality and overall functions of the executive department for the same purpose the governor intended, finding efficiencies and cost savings. So we go ah, and they gave us a deadline: this has to be done in eighteen months. So we had the good fortune of the governor basically empowering us to do this but then we had a legislative mandate to go out and really just look under the covers everywhere. What we decided to do-and there had been recent government efficiency commissions where businessmen, blue-ribbon panel type-. SCHARFF: Sure. JOHNSTON: They came in and they would write these reports that are huge and that no one would read. What we decided was instead of going down that path and saying this agency is good, this one is mediocre or A, B, C, grading them, we took a step back and thought, we ought to look at the programs that these agencies administer. Because if that is the toolkit that they have, that somehow they have the power to administer these programs, are these programs effective or not? So we took a step down, even though we stayed within the agency, we said, define what your programs are. So we started looking at, well what do you do within Department of Transportation, Department of Health? We tried to identify programs. SCHARFF: How did you do that? What did the structure look like? 
JOHNSTON: That was the challenge because you had new agency heads. You had others that said, "Well, I can tell you where my money came from, but I can't really tell you what my programs are." We sort of really kind of made it up as we went along. We relied first on their funding streams. That is how the agency was thinking: "I use this money to do what?" We would go in and we used an eighteen-question template that-I can direct you to the link for this. MESSICK: Great. JOHNSTON: What we did-it was broken into four sections. I don't know if you've seen, or heard people talk about this, called the probe. MESSICK: Yes. JOHNSTON: We asked, "Is the program purpose clear? Does it address a specific and existing problem? Is it not redundant or duplicative of other state, federal, local or private efforts?" So we talked about the design, how well the program was planned, the program management, and can it show any results. We used the same template over and over again. Actually this was a knockoff of PART (the Program Assessment Rating Tool), which was started in the Bush administration. We spent-after we got the charge to do this, we looked at what other states were doing and there were a lot of good things, but this one-we boiled this down from what the federal government had done. I wasn't a big advocate. In fact, I stayed away from it, saying, "I don't want to copy what the federal government is doing." But the questions in this were so direct, and we forced ourselves to a yes and no-not a lot of gray or maybes. We're going to say mostly-it is going to be yes or no. It made a lot of people upset. We used this tool for over 400 programs. So we would-then this was part of the report too. We just listed them all by agency, what the program was, and what their score was. But the biggest-the most common score was "results not measured." We basically said, "We can't tell whether this program is effective or not because there is no objective way to measure. Is it, do we have better healthcare? 
Can the kids read at the third-grade level?" We would get a lot of here's what I do, here's how much money I spend, maybe here's how many people come through this program, but the question wasn't how many people come through a worker retraining program. The question is how many people actually got jobs, or how many still had jobs six months after they finished the worker retraining program? So we tried to change the question from here is what I do, here is how much I spend, to why do you do it? Or if we were having trouble getting through the template, what happens if we stop doing this? "Well, I don't know." That just led to other questions. So we did this for fifteen to eighteen months, just the six of us, looking at about 420 programs. SCHARFF: Did you travel as a team to each agency or did you split it up? JOHNSTON: Good question. SCHARFF: If another state were to try to mimic specifically these steps. JOHNSTON: Every person on my team had their portfolio of agencies. So someone dealt with just the education world, someone dealt with health and human services. MESSICK: Transportation. JOHNSTON: Transportation and infrastructure. Then general government, sort of the nuts and bolts of administration, technology, and all of that. So everybody went, visited their agencies, looked at their programs. Then we did a peer review. There was an element of subjectivity; that drew a lot of criticism, and it was fair criticism. What we did was-every week, and as the time got shorter, we would have multiple peer reviews in a week where someone-. I would come in and for the agencies that I looked at I would present my findings. Here's why I rated them this way. Then for the others that covered different disciplines, we would try to say, "You're being too harsh compared to the way they were looked at before" or "You're being too easy on them." So we would do our own internal peer review. 
What we found at the end of the day was that the tool was effective enough that a poorly performing program could not score well. A program that performed well, if they weren't prepared and didn't bring the evidence to answer the questions, a well-performing program could also score poorly. That was the danger in this: if the agency wasn't prepared or didn't bring the evidence-because part of the template was, well what evidence did they show? Did they have metrics? Do they have a strategic plan? Do they have a benchmark against other states? Those sorts of things. This really helped us grab the attention of the agencies about what this group was all about. We were rather irreverent in the report. We had chapters called "Is Indiana State Government One Big Faith-Based Program?" "Would You Like to Supersize That?" "Would You Do that with Your Money?" "Are Dinosaurs Really Extinct?" "Why Buy Two When One Will Do?" "What's Wrong with Competition if the Taxpayer Wins?" Those were names of our chapters. We tried to make this very short. Actually we had two versions of this. We knew that the other blue-ribbon commissions wrote really big reports and no one read them. So we tried to keep it under fifty pages, including the appendices and charts. We got a lot of blowback from the agency: "How can you do this?" MESSICK: I bet. JOHNSTON: "The governor is going to kill you." I still chuckle. Now, at the end of the administration, I look back and kind of laugh at some of the concerns. When the governor finally read-he let us do our own thing. He knew this was going on and said, "You come back and tell me what you guys think." At the end when he finally read it-because we told him, "We're getting a lot of pushback from your agency heads, your appointees." He said, "Well, I'm not sure we can take all of this on but I generally agree with most everything that is in there." So he basically said, "Why shouldn't we ask these questions?" 
Continuing to fund programs without showing that they're performing well is a past practice and it is not going to be our future practice. So when we got that, when the agency heads-then the newspapers really wanted to make a stink out of this too. Oh, this report came out-a lot of people thought after two years that we were going to write this glowing report of how everything was fixed. This was pretty condemning in some areas. So people were trying to stir the pot between our group and the other agencies. When the governor came out and basically said that this was the right thing to do, it calmed the waters and really helped establish-gave us our sea legs moving forward, if you will, on being able to continue to ask questions and push for better performance. MESSICK: There must have been some agency heads that by this time were Daniels' appointees in place, 18 months and so forth, who took this report as a hit against them. "What do you mean, you've been there 18 months and you still don't know if that program works or not?" JOHNSTON: I think that was a fair comment to us because there were many agency heads that were doing very transformative things in their agencies. They had legitimate-. Some said, "I'm not done yet." The phrase: "You're trying to change the tires while the car still is rolling-give me some time." I understood that and I think that was a legitimate point, but it created-even enhanced-the sense of urgency that the governor has always been about. He said that we had a limited amount of time. How do we keep that momentum going? SCHARFF: So the report is published, the findings come out. What does your office then do? What is the next step in the process? JOHNSTON: All this time, the basic charge of this group was to develop a performance measurement system. In state government, we hadn't measured much of anything unless the federal government mandated it for additional grants or to report on our grants. 
The best advice that I got from somebody was that if you wait to design the best performance management system, you'll never get started. MESSICK: Right. JOHNSTON: We had no software. We started doing conditionally formatted Excel spreadsheets. We just said we're going to measure something. The very first performance report-the governor said we're going to have periodic performance reports-they were pretty crude looking. We started down this process; while we were out there we were asking, "Why does your agency exist? What should be the key performance metrics? You have a variety of audiences; it could be the taxpayer, it could be the legislature, it could be the constituents you serve in addition to the governor. How are we going to capture that and present it so that people understand?" Those same teams, or the individuals, went out to their portfolio of agencies and started that discussion saying, "What should we be measuring?" What was reinforced, as I mentioned, was that most of the scores were "results not measured." So we had a lot of work to do. We were doing that because a lot of the intuitions, or inferences, we were making were just that. We didn't have the data. What we did do is we wrote-there were probably 200 recommendations in this report. Just to show that we don't-I think I pulled this out here someplace-we hold ourselves to the same standards as everybody else. This is our scorecard of those 200 recommendations. We basically plotted through it, and zero was "nothing has been done" and one was "fully implemented." So we update this every six months just to see where we are. So on average, out of the 200 recommendations, we're still-our goal was to get to 50%. If we could have some movement on all these recommendations and get to 50%, we would say we've made a reasonable impact. We still have obviously a ways to go. MESSICK: This is the current number? JOHNSTON: Right. 
I think we take some pride in that out of a total of 198 recommendations, 100 of them have been partially or fully implemented. So there has been movement on over half of them. Something has been done. They've invested. They've done further research. There are some recommendations in here that we probably wouldn't make today. Remember, we were doing 400 programs in fifteen months. As we get more information, today we probably wouldn't make some of these recommendations, but still we're holding ourselves to that standard. In fact, I was updating it with the results of the most recent legislative session, with the civil service reform, which we highlighted in here. I would say 34 out of the 200 recommendations have been fully implemented. SCHARFF: Is the format that we see here the same format that your office designs to sort of track the metrics of performance at the agencies themselves? JOHNSTON: To go back, we started off with just the conditional formatting of Excel spreadsheets. Then what we did-because somebody said, I don't know what-because from the private sector, money can be both an input and an output. Here is how much revenue, or earnings per share. So what we did-this is pretty crude-we basically said, here is how we're going to develop our metrics system. To reinforce it, we started calling it the "Governor's Dashboard." We said there is going to be very limited-basically, no matter how big the agency is, we're going to limit it to three metrics. Tell us the story in three metrics. That way when the governor says, "What should I be paying attention to?" we had not only a limited number, but it helped to crystallize what their mission is all about and how they tell that to the public. Here are the program measures that really came out of the probe. While we looked at 400 programs, we probably have 1500 measures here now. This is probably down around 200. If you have three each for 70 agencies, it is in the 150 to 200 range; the program measures are probably 1500 here. 
MESSICK: These 1500 are across all 70 agencies? JOHNSTON: Yes, so we concentrate here and here. This is the day-to-day management. You guys figure out what you need to do to manage your business here. We'll help you with that and we helped agencies develop those measures, but that is really what they are going to use internally. This is the area that we stayed in. MESSICK: These won't all necessarily be outcome measures? JOHNSTON: These will not. These are probably more of how many people are coming through the program, budget-those sorts of things. We really try to make these-these are tougher. We really try to make these outcomes. What is the proxy for clean water or clean air or smooth roads? That's what I tried to do with my folks. With the first team in my group, when they would have-we'd go out and try to develop these types of measures to capture the mission. I said, "Get the agency head to tell you in fewer than five words why they exist: clean and safe roads, strong families, educated kids." Once we start with that, then you can work down to what is the challenge. You're not going to have something that necessarily just blares clean air. What is going to be the right proxy to say that IDEM, the Indiana Department of Environmental Management, is promoting clean air or we have clean air in Indiana? MESSICK: Compliance across all 92 counties to EPA (Environmental Protection Agency) standards. JOHNSTON: There you go. So we started off, again, with very basic Excel spreadsheets. Then we bought a very small package because again, going back to that same guy that said, "If you wait to design the best system and buy the best software and all that, you'll never get started." We used a local vendor to start capturing this and putting it out on our website. Then when we finished this process and we actually went back, one of the things that we tried to do with Adam was to link this with the budget development process. 
We were focused mostly on performance but then one of the things we tried to do was integrate performance measures with the funding. I used-what am I buying for the five million dollar appropriation here to start that discussion? SCHARFF: That's where you were trying to get in touch with the entire-. JOHNSTON: Yes, that's probably the area we've not progressed very well, that is engaging the General Assembly. I think we've got the executive branch thinking about performance, how do I justify my budget request, how do I communicate to the various stakeholders about our performance. There are some legislators who worry about this, there are others who just say, that's my favorite program, I don't care if you're telling me it is not performing well, I want to fund that program. That's probably the one area that we've not engaged enough. Now we have a system through Oracle that there is a performance measurement dashboard that is out on our transparency portal. That is where all of it resides right now. We still have a ways to go; we're not pleased with it, it is not aesthetically pleasing. It shows that we're measuring and it's information that we can use but I think in order for it to attract visitors and get citizens engaged, we need to change it from saying everyone is going to have a bar chart. It is sort of-there is a mindless glitz element to it but you need that in order to attract people to the site. MESSICK: What is the hardest area you've encountered to come up with outcome measures? JOHNSTON: I think it is any area that deals with the human condition. MESSICK: For example? JOHNSTON: You're not going to move the needle on diabetes. If you want to lower diabetes or have healthier-that may be a ten-year measure. So you have to work with little proxies. What are you doing to get there? MESSICK: So you're actually measuring in some cases not outcomes but milestones? JOHNSTON: Milestones towards an outcome. 
MESSICK: That might be numbers of people in the diabetes area that have been diagnosed or are on medication-. JOHNSTON: Getting treatment or maintaining their treatment, those sorts of things that also serve as sort of proxies to that measure of have we lowered the incidence of diabetes in Indiana citizens. MESSICK: So that's the biggest issue. State government is so service oriented, and of course services are a lot easier to measure than things like how well has the foreign affairs department done. Well, we haven't been invaded for two years. JOHNSTON: Right. MESSICK: Well, is that your fault-. JOHNSTON: Right. How do you measure the negative, the things that never occur? We've got cases like that. There are others where you think, boy, how do you measure-go to DCS. When I talk about this with other folks I use what I think are two fairly stark examples, because people usually aren't in this world. With the Department of Corrections, it is recidivism. How many offenders come back into the system within three years? MESSICK: You can also measure escapes, right? JOHNSTON: That's something that is more visible. But here, if you're living up to the name-the Department of Corrections, not just jails-are you rehabilitating the person so that they can be a productive member of society? MESSICK: Does DOC have responsibility over the parole function? JOHNSTON: Right, yes. MESSICK: So at least they can't claim that that's another department. JOHNSTON: They could have just one measure here around recidivism. Then what influences that is whether their substance abuse programs are working. So if somebody comes back and violates parole, why are they violating parole? Are they back on drugs? Do they not have a job? So then you want to go back and see whether the programs are positively affecting the top of the triangle, my key performance indicator here. 
Because we've got education programs, job-training programs, substance abuse programs, maybe sexual abuse programs, all this counseling that goes on here-the next step for us, we have all those measures. Now can we connect the two? SCHARFF: It is wonderful to have these measures of how well programs are working or not working, and I can see how important that becomes when you're engaged in the budgeting process. But I wonder, once you have these measurements, is there then an effort by a group here or individuals here, either within your unit or the larger office itself, to go in and help to improve performance? JOHNSTON: We try to do that. One of the things that we did not institute early on, but we have instituted in the last two or three years-and you probably read about CompStat in New York City, CitiStat, and all that. We don't do-it is not as formal, but every quarter we will bring in those portfolios of agencies. We usually have a two- or three-item agenda. It may be completely different than measures, but they know that they can be called upon, and we'll say, "We notice this measure is trending off, what is going on there?" That is a good reinforcement. Also using the measures and then-. They were sort of defensive at first, but they now come back and say, yes, it is trailing, and they supply why. That is the whole issue. I guess the other big challenge that we've had-I don't want to say we have had it completely across 70 agencies-is that much of this started off as an OMB compliance exercise. Part of it is using this data to help make decisions about program performance or resource allocation. At first this was just, "I've got to get these guys from OMB off my back, just send the metrics into these guys." MESSICK: Right. JOHNSTON: Now, meeting and talking about these things reinforces why this is happening and gets them thinking about ah-. The other big pushback was, "There are things I can't control." We said, "That's right." 
It still happens occasionally: "I can't control that." We understand, as long as you can explain it. I remember one thing was State Police and traffic fatalities on state roads. The State Police said they couldn't control the weather. We understand that. If there is a big bus accident and unfortunately a bevy of traffic fatalities because of snow, that's all we want to know. If it spikes up in the March quarter because there is a bunch-well, we had a big snowstorm in northwest Indiana along Lake Michigan and there were terrible accidents or something like that. But it is more about coming back and learning from what you're reporting and saying why this is happening. If it is a successful outcome, let's do more of it; if it is a poor performance, what do we do to correct it? Where we come in is maybe there is something more-they're the experts in what they're trying to deliver. Where we come in is, if it is part of the system that is holding you back, let us be your advocate to bridge-. "I've got these rules from the Department of Administration that won't let me buy this." Well, let's talk about it. Convince us that this is really needed to help deliver your service and we'll be your advocate. MESSICK: I can't tell you how many countries I've been involved with, with performance measures, mostly in the justice sector, which is very hard to measure. JOHNSTON: Right. MESSICK: One of the real differences-you're the third person we've talked to who makes this clear-is you're using it as the conversation beginner and not as the be-all and end-all. JOHNSTON: Boy, if you're the penalizer it is not going to work. MESSICK: But when you ask the question that elicits the, "It was because of a snowstorm that led to a bus crash and therefore...." You see, getting that follow-up has been really difficult, and you're right-you don't need a ruler on the knuckles. Maintaining the dialogue so that it becomes a tool to probe and not-. JOHNSTON: Right. 
MESSICK: That's a huge thing that we need to focus on. That's a really huge thing. Otherwise it just becomes a box-checking exercise. OMB wants this, so here it is. That kind of thing. JOHNSTON: I think you've nailed it. That just takes time. MESSICK: And constant devotion to it. JOHNSTON: I think there are more state governments that are looking at developing groups like ours. That is their main charge: to keep asking those questions. Let's get the data out on the table. So much of what happens on the third floor with our legislature-and we can't control this-is based on conjecture or emotion. MESSICK: If you think it is based on conjecture and emotion in a state like Indiana, can you imagine what the legislature of Bangladesh is like? JOHNSTON: Absolutely. MESSICK: The Philippines? JOHNSTON: We're going through this right now in local government reform. That's off topic, but after the property tax reform the governor created this commission to look at how local government is structured. You want to talk about how-. MESSICK: Everybody works for local government in some counties. JOHNSTON: It is all emotion. Every legislator knows every local government official. What we're trying to do-you might think this is funny. We used this after the probe. These are all the excuses we heard about why we don't have to measure. SCHARFF: What was your response? What did your unit do to sort of push back? JOHNSTON: We used this and they thought it was funny because they had used it on us. So this was one of those lists, like David Letterman's; they would fly across from ten all the way up to one. So we'd just get all of these. They would say, "Yes, I fed you that line." So it was an icebreaker to move from the probe into the program reviews. MESSICK: I have half a dozen chief justices from around the world that have given me answer number five. JOHNSTON: Here is the-I stole this actually from a governor's presentation. This was back in 2007. I think it was in Governing magazine-. 
What are the key elements for success? He said set the bar high. Progress starts with data. SCHARFF: So what happened in the probe report? There were quite a few programs, as you noted, that couldn't be measured. MESSICK: Or weren't. SCHARFF: Or weren't being measured. Over time did your team then either try to figure out a way to measure them or begin measuring them? JOHNSTON: One of the chapters here was the action plan: "Where Indiana State Government Should Go From Here." Part of it said we are going to help you, the agencies, to develop measures. After the probe, we went out and did that for the next year or so. We were trying to identify and confirm the programs, what was important, and work through this pyramid. We created very crude tools like this to help them think through-here's how you should think about this. We're not going to get involved in your day-to-day business, but we're going to help tell the story of your agency. MESSICK: You know what I should send you-all this came about, I mean the intellectual roots started in New Zealand, Australia and the UK (United Kingdom). After it went to those places, it went to my former agency, the World Bank, where we then walked into countries like Haiti in the mid '90s and tried to get them to operate a program evaluation unit along the lines of what New Zealand and Australia were doing. JOHNSTON: Right. MESSICK: Now what you see in Allen's piece that I have sent you is that there has been a reaction to performance measurement. JOHNSTON: Okay. MESSICK: I have to go back and read this. Part of it, as I said, is because people aren't-it has become the answer and not the opening question. That might be an interesting thing for you to talk about in your paper. There is now a rather significant literature saying that this is all off base. The one answer is you have to have public-spirited, civic-minded public employees to carry this out. You can't measure morale. 
You can, but if you beat them up with numbers all day long you get automatons. What you're suggesting, I think, is really interesting. Engaging at the level you're engaging, with this ongoing discussion, is preventing this kind of by-rote exercise. I have friends who worked in the cabinet, the same unit you worked in, for Tony Blair. They gave up in despair because all they did was train lots of clever, very smart civil servants in the UK on how to game this system so that they could continue to get their money. JOHNSTON: Right. MESSICK: We continue to do what we're doing and still feed the beast in the prime minister's office. The first thing, by the way, that Gordon Brown did when he took over from Blair was to scrap the unit. JOHNSTON: Right. SCHARFF: Can we go down that path of how you prevent people from learning to-? MESSICK: Yes, because in fact, you see, from part of the-there will be no developing country government that hasn't had a dozen of these, but this is now sort of-. It has become a rote exercise. So everything you're saying is really, really interesting, even from the point of view, of course, of 50 pages not 400. JOHNSTON: Right. MESSICK: One of my colleagues once prepared a 300-page analysis of the land tenure situation in Yemen in English. There are probably five people in Yemen that read English. They didn't have time, and all the people that really needed to know it of course did not read English. So those kinds of things. JOHNSTON: Yes. So part of the effort was-when I walked in on Betsy, breaking down the silos. We obviously focused more on the larger agencies, but now we have migrated to everybody. When everybody knew that they were going to have performance measures and that in these quarterly meetings we-. MESSICK: Well, the quarterly meetings-. JOHNSTON: The large agencies, they don't want to sit there and listen to the small agencies. But you know what, that's their opportunity to say, "Here's what I'm about." 
So there is a plus that the smaller to mid-size agencies say, "I'm part of Indiana state government." MESSICK: And here's my contribution. JOHNSTON: Here's my contribution and by golly this big agency is listening to what I have to say. There's Betsy and there's Adam and me at the head table. I guess that's the other-. I don't know if anyone has talked about this or not, I know I'm jumping all over the place. MESSICK: This is good. JOHNSTON: One of the things that I thought was effective in the governor's office in coordinating with OMB, was that we early on, in the first couple of years, determined that if you look at the governor's policy director, Adam's budget analyst that covers that agency, and my person-we call my group the M in OMB, management. Let's say an agency head comes in and says, "I've got a great idea." Before it used to be that you would walk in and visit with-I'm not saying this governor but a governor. All you'd have to do is walk across the hall to the budget agency and say, "The governor's office said make this happen." MESSICK: Five million dollars for X. JOHNSTON: Give me the money. We learned early on that if the agency head comes in, deals with the governor's policy director, Adam's budget analyst that covers that agency, and my person who covers it from a management and performance perspective, then we usually make the right decision if they all get on the same page. Even though we're separate agencies, if you get those three people, the governor's office person, the budget analyst and the OMB person, working with that agency on a particular issue, it is an easier sell and you usually come up with the right decision instead of somebody finding-it is sort of like a child saying, "Mom said 'no' so I'm going to Dad." We learned that that was an effective tool. If somebody was trying to get splintered off it was "Oh, have you talked to Joe who covers your budget?" or "Have you talked to Sally who covers your agency for OMB?" 
That sort of thing. SCHARFF: We have a section in our case study that we call the "Overcoming Obstacles" section. Essentially what that section does is look at a reform and how it is being carried out, how it is progressing, and what the team is doing and the steps they're taking. But then oftentimes there is a hiccup along the way. It is a chance for recalibration or to try something new or to look at a way to navigate around that roadblock. Sometimes the reform team is successful; sometimes they're not. I wonder, in the last few years here, in your experience, has there been an obvious roadblock moment when you had to say to yourself, "Hmm, this is going to be a real challenge, I need to figure out-this was unexpected, I need to figure out a way to get around it?" JOHNSTON: Well, not to be flippant, but it happened a lot. If our charge is to get people out of their comfort zone, it happens daily. I guess at this stage of the administration I'm a little more reflective. If I had to do something over or change how we positioned something, it would be that triangle that I was telling you about: the one with the budget agency, policy director and ourselves. When the revenues were really starting to tank, we should have repositioned ourselves when Adam's group said, "You need to cut 10%; we need to reserve 10% or more," multiple years in a row to get through the fiscal situation. We should have positioned ourselves more to help them find the savings. Within the overall OMB, my small group got painted as part of the budget cutters, just finding the cuts, instead of being more proactive. We know you've been called upon to find 5%, 10%, whatever the percentage was. We should have been out there as an advocate to help them achieve-find those savings. Now most of them found it. I was happy to hear, when we were doing agency head reviews, several agency heads came in and said, "You know what? 
If you hadn't started us down this path of measuring, when we had to cut we would have never known where to start. We would have just said it's across the board, 10% for everybody in my agency, all these programs. I at least knew where to start: the least effective programs, because you made me measure." In that time of people saying, "I don't know where to go next. I thought I had cut as much as I could. Where do I go next?" we got presented as the budget guys who want to cut more. Instead we should have gone in and said, "Let's think about this, let's help you think through this. Here is where you can probably make the most effective cuts to meet the goals of the budget agency and what the governor is saying in terms of putting our budget together." That is more internal politics, but I wish we had done that more, just getting out there when those memos went out for the next budget cycle. We should have said that they had to make these cuts and gotten out there in a more proactive way. SCHARFF: But you still feel that those cuts were informed cuts? JOHNSTON: Yes. SCHARFF: Informed primarily by-. JOHNSTON: I think it helped those agency heads think about what programs had the least impact, and so those were the areas that they should probably focus on first. SCHARFF: Just to be clear, so OMB was dictating to each agency the percentage of the cut, and then it was up to the agency to figure out how to reach that? JOHNSTON: There was generally a directive. We think we need to cut here. There is a lot of literature that says you should be more informed across all of state government. You should pick-here are the programs. So then you're pitting, say, transportation against child services programs. I originally thought that was the way to go. But even when you get an across-the-board 10% cut for all agencies, what we found is that internally they knew where to go. 
Because of the portfolio of programs that they were managing, they knew which ones were the least effective and the ones where the cuts would be the least disruptive. So they took their 10% charge, but I don't think most of them did 10% across everybody. They figured out where to go within their agency to make the 10% cut. There is a lot of literature out there that says you should just list all of the programs in state government, regardless of function, and make a determination of which are the most effective and which aren't. I see the theory behind that, but then you're really pitting different constituencies against each other and different goals that you're trying to achieve. Again, a good transportation system versus child protective services. That's tough. MESSICK: That's a tough tradeoff. JOHNSTON: From what I've seen here and in other states, you can still get there within the agency by giving everybody that goal of 10% but then letting them choose, within the portfolio of programs that they manage in their agencies, where to make the most prudent cuts. MESSICK: Now, is the number that came out 10% for both transportation and child services, or did somebody say child services 8%, transportation 10%? JOHNSTON: I think, and you can confirm this with Adam, it started off with the same percentage for everybody, but once you got into the details, then we-. MESSICK: There was an appeal? JOHNSTON: There were some adjustments to it. MESSICK: Within each agency was there a consistent counterpart? Was there a performance management unit within child services, within DOC? JOHNSTON: I'm not sure there was a unit, but at least somebody was assigned as the performance person. Now did they actually create a Performance Information Officer? I'm not sure that happened. MESSICK: Were they typically part of the budget unit of the agency? JOHNSTON: They were usually more on the budget side, the financial management side. 
MESSICK: When you meet on a quarterly basis, that's the person that shows up? JOHNSTON: No. We usually ask the agency head. He has to sit there too. He brings that person with him. If somebody decides not to show up, we follow up and ask, "Why weren't you there?" Again, they're all in it together. MESSICK: I have this wonderful-I was going to send you this four-page article. It is about employee performance assessment, but it makes the point that what happens is, when the agency head doesn't show, then it just becomes a routine function: no discussion, everything answers itself. Of course it is easier when it is like that. You don't have to do as much work. JOHNSTON: Absolutely. MESSICK: It is a substitute for management, not a tool. JOHNSTON: The other agencies are very observant. As soon as they realize, oh, this guy gets a free pass-. I had this discussion yesterday with the Department of Financial Institutions. MESSICK: That regulates-? JOHNSTON: State-chartered banks. They're not funded by the general fund; they're funded by fees that they charge their regulated entities. We had this discussion where they said, "We're unique; we're different." MESSICK: We generate our own money. JOHNSTON: "We generate our own money, how come we have to observe cutbacks?" I said, "I understand your point, but you're part of state government. The governor says, 'We are the State of Indiana.' As soon as it leaks out that DFI got its carve-out because-then what do we tell an agency that is half federally funded, half state funded? How do we manage all of this?" They still didn't like it, but it was all part of the governor's belief that we are acting as one, we're serving as one. 
Everyone has their responsibility within their agencies, but this whole thing about going to performance-based pay, procurement reform-the fact that everybody knew they were in it together really is beneficial, instead of a bunch of carve-outs, being independent contractors and being able to say here's why I'm different. Yes, we hope you're different because you're serving a certain responsibility, but you're still part of the team. MESSICK: I assume your program-related outcome measures would help you guard against-you see, if I were the prime minister of Kenya and I told the minister of the Arid Lands Ministry to cut 10%, I can tell you the 10% would be all taken by people that weren't members of my tribe, irrespective of-. JOHNSTON: Right. MESSICK: A political appointee in a state government, if he is from the north, might see to it that most of the cuts were absorbed in the south. So did the program-related-? JOHNSTON: Are you asking whether we were able to detect that differentiation within a program? MESSICK: Were you able to check whether the 10% that they cut to meet their number was really from the least-performing programs or just from the program of the non-Kikuyu, for example? JOHNSTON: Did we go out and verify? No, but when they made their cuts, we had that connection between the budget analysts and our performance people, at least at the beginning, where they said where they were going to be making their cuts-. MESSICK: Your budget and performance people knew this-. JOHNSTON: They had a sense of, yes, that's probably where you should cut. At the last quarterly meeting, we were again taking this reflective mode. After all this work, we said, "Give us the performance metric that you are most proud of and give us the one that you think still needs improvement." Just one of each. MESSICK: Tell me some of the answers you got. JOHNSTON: We still have people that don't want to bring the bad one. 
You can't tell me that every single program is performing the way you want it to. MESSICK: What was somebody willing to bring forth as a bad one? JOHNSTON: The governor has ingrained customer service so much-there are still some that said turnaround times. They were more process-driven than outcome-driven measures, but it was error rates, customer service, turnaround times, and those sorts of things. MESSICK: What was the best one? JOHNSTON: You know what, the DOC admitted our recidivism isn't dropping. The positive thing that I took away from this, even though it wasn't a great report and you always want to do better, is that they actually connected it, because the measure here talked about the number of offenders in their time-cut programs-they're actually doing something, they're learning a trade or they're getting their GED. They had realized that the percentage of idle offenders had been growing. They said maybe that has an impact long term, but they actually picked up on this. We have more offenders that are idle than is ideal. Our target is to be at X percent, or to have a low percentage, and they had exceeded it for the last couple of quarters. They actually picked up on that. Now they're not-it used to be really embarrassing. Now they come out and say, "Look, our measure says this." They're not happy about it, they're not happy about their dirty laundry, but DOC at least mentioned that. We have too many idle offenders in our system. MESSICK: Which means they're not pursuing vocational or educational training while they're incarcerated. JOHNSTON: Right. So the ones-obviously the maximum-security people aren't going to be participating-but for the ones that are eligible, the percentage of idleness was too high. They came back and said, "We think that's a problem and we need to address it." That's when we asked, "How are you addressing it?" 
They said they're able to break this measure down by facility to see which are the offending facilities and where the problems are. Then they're going to go back and focus on why, at Pendleton, a medium-security facility, there is so much idleness. Then they start sort of diagnosing their own problem out loud, which is interesting. So we just sit back and sort of listen or just ask really short but direct questions. Do you have prison industries at that location where they can work? Maybe there is a prison industry at a different facility where you can open up the line of business at this facility. That sort of thing. So again it goes back to continuing to ask the question: why is this result the way it is? MESSICK: What is your indicator-I assume you still have a series of state hospitals for the mentally insane? JOHNSTON: We've been having that debate a long time about community-based services. MESSICK: But what is your indicator for-this appears in a classic book I'm going to send Mitch by James Q. Wilson-. JOHNSTON: I'm reading-I'm going back to his old book Bureaucracy and reading that right now. MESSICK: That's exactly the book I'm sending Mitch. JOHNSTON: He passed away. Someone said, "If you're in the public sector, you've got to read this book Bureaucracy." So I just got it. MESSICK: When you get to chapter six you'll know exactly. In chapter six, he makes the point that there are some agencies where we can see what people do but we can't see the outcome. One of the classics is a mental hospital. We can see-so how do we measure? Moreover, this one is also one where we don't have any-these people, if we had a cure-so the idea is you just hire the best people you can and give them the freedom to do whatever they can. Then that presents, "Well, how do we measure what they do?" JOHNSTON: So you're talking about the really forensic cases, the criminally insane? MESSICK: We certainly have a unit here in the state that deals with that. 
JOHNSTON: You know what, that's one where we probably don't have a good measure. What we measured before are the ones that aren't at that critical stage, so it is length of stay. Are they getting the services they need and are they able to be mainstreamed back into a stable environment? So we have a length-of-stay measure for all the others, but in terms of the ones that are here and they're not-I mean they're residents instead of patients-we probably don't. I don't think we have a measure for that. Again, maybe that's the tough one of a negative measure. They didn't harm themselves. They didn't hurt other people. MESSICK: Or it becomes what we talked about earlier, which is the milestone. Well, we got them the medicine every day that they required. They didn't harm anyone else. JOHNSTON: Is it six months without an aggravated assault or incident or something like that? Those are the tough ones to measure. MESSICK: I'm deeply involved in trying to help these courts come up with these measures, and this is really hard, and you're often stuck with what's justice-. JOHNSTON: Well, our criminal justice institutes were challenged there because they have these programs. Is it really-is the crime rate going down really the right measure? You can't influence all crime. MESSICK: Right, it is an attribution issue. Wilson actually has a nice piece, an article in 1992, on how to measure police effectiveness. I can send you a link to that. It is all about performance measures in the criminal justice system. What you'll learn is that one of the best-run agencies in the US government has historically been the Bureau of Prisons, because it was easier, relative to other agencies, to come up with measures of effectiveness, of which one was the number of escapes, one was attacks on other persons, and of course the recidivism thing, which is very difficult, but at least historically administrators had these fairly clear-cut goals that therefore led to a better management cadre. JOHNSTON: Sure. 
MESSICK: The most interesting thing that I think I've learned, and I've read, and Betsy made this point, is that Mitch set up that single objective of this enterprise: if you're not contributing to increasing the take-home income of a Hoosier, then why not? That clarity of objective. SCHARFF: Well, we should probably-. Thank you so much for your time, this is really wonderful, thank you. Innovations for Successful Societies Series: Centers of Governance Oral History Program Interview number: H1 Use of this transcript is governed by ISS Terms of Use, available at www.princeton.edu/successfulsocieties