Research Study:
People-Centered Oversight

Introduction
People-centered oversight (PCO) is about shifting the focus of oversight from bureaucratic compliance to human impact. PCO is more than a vision or framework. It is taking shape in state legislatures around the country every day. Its practitioners — legislators and staff from both parties, in both full- and part-time legislatures — are already using these principles to make government more responsive, effective, and accountable. We explore the condition of people-centered oversight in the states, both through broad data and through case studies that highlight innovative approaches in data collection and service delivery across government. We also offer recommendations – both for individual legislators and staff and at the institutional level – to advance people-centered oversight in the nation’s legislatures.
In an era of historically low public trust in American government and democracy,[1] government must prove that it can achieve its goals, keep its promises, and noticeably improve the lives of the people whom it serves.
“People have not rejected the goals of our programs,” said Senator Carl Levin in 1979, “but they have begun to question whether our programs have anything to do with our goals.”
Nearly half a century later, those words ring truer than ever: our system of democratic governance struggles to make the case for itself, because well-intentioned programs often fail to achieve the results that were promised when they were enacted.
To regain the trust of the American people, lawmakers and administrators must ask the questions Senator Levin raised throughout his career. Are our programs achieving their goals? Is government not only efficient and honest but also easy to work with?
Our system charges legislators and executive branch officials with the responsibility to monitor the performance of public programs and make changes to those programs based on the information obtained – that is, to conduct legislative or administrative oversight. Today, low levels of trust in government suggest that oversight as currently practiced may not be producing the kinds of information needed to enable government programs to meet the needs of the public.
In fact, the disconnect between the government and the public it serves may be a sign that oversight efforts are often focused on issues that are of little public concern. Oversight hearings, especially the kind that make headlines, are frequently exercises in partisan competition where one side of the aisle or the dais is primarily concerned with embarrassing the other, rather than getting to the heart of what is keeping government from succeeding.
But what if lawmakers conducting oversight and their counterparts in the executive branch viewed oversight as a collective opportunity to better understand how government was serving the people? What if, rather than challenging the integrity of the executive branch officials testifying at hearings, lawmakers challenged themselves and the administration to learn about the lived experience their constituents were having with government – to learn about the shared experience of working with government and how it compares to the standards the public expects in the 21st century?
Shifting the perspective of the inquiry from “what did the bureaucracy do” to “what are the needs and experiences of the people being served” puts the emphasis on the mission of government in a democracy. It can also lower the barriers between legislators who authorize and fund programs and the administrators who are responsible for implementing those programs.
We have given this approach to oversight a name: People-Centered Oversight (PCO). People-centered oversight uses the same sweeping powers of inquiry American legislatures have long enjoyed but shifts the emphasis toward prioritizing the experience and outcomes of policy for the people it is meant to serve. Moving beyond mere questions of policy compliance, people-centered oversight asks: Are policies delivering real results? Are programs improving lives? What is the end-user’s[2] experience with government services? When the answer is “no,” or “not good,” where and why are these policies and systems falling short?
People-centered oversight is about shifting the focus of oversight from bureaucratic compliance to human impact. This approach is centered on two foundational elements:
- Emphasizing outcomes by engaging with those who experience government services firsthand, particularly program beneficiaries and frontline workers responsible for delivering those programs.
- Transforming oversight by strengthening feedback loops between legislators, program implementers, and the public, ensuring that oversight is a dialogue rather than a one-way evaluation.
PCO is more than a vision or framework. It is taking shape in state legislatures around the country every day. Its practitioners — legislators and staff from both parties, in both full- and part-time legislatures — are already using these principles to make government more responsive, effective, and accountable.
[1] “Public Trust in Government: 1958-2024” (Washington, DC: Pew Research Center, June 24, 2024), https://www.pewresearch.org/politics/2024/06/24/public-trust-in-government-1958-2024/.
[2] We use the term “end-user” to describe the people on the receiving end of government programs and services in place of more variable terms like “taxpayer,” “inmate,” or “beneficiary.”
The landscape of people-centered oversight in the United States today is one of significant gaps alongside promising innovations. Like legislative oversight generally, the condition of what might be considered PCO varies widely between — and even within — state legislatures, and it is challenging to definitively measure or compare. Institutional components like committee rules and digital tools can support people-centered oversight, but so can legislative culture or the practices of an individual legislator.
To capture the full spectrum of people-centered oversight in every state would be a monumental task, and we do not claim to have done so with the data we have summarized here and attached in the State-by-State Summaries. Nor have we tried to definitively say who is “doing” people-centered oversight and who is not. Instead, we have done our best to compare states on several standard criteria that provide a general sense of their ability to monitor program implementation and outcomes, receive and process information from end-users of programs, and generally engage with the public in the oversight process.
These criteria include direct opportunities for public input, the collection of implementation data from agencies and casework, and other general indicators of oversight efforts. In this way, we can see where the basic infrastructure for conducting people-centered oversight is in place and how legislators and staff make use of it. We have also supplemented these findings with a collection of state-by-state summary notes with additional findings for each state.
While some states have laid a solid foundation for people-centered work with digital tools, proactive engagement, and a culture of meaningful oversight, others are limited by basic oversight mechanisms or cultures of indifference around oversight. In our collection and analysis of this state-by-state data, several patterns emerged in the implementation of people-centered oversight across the states:
First, people-centered oversight depends on good oversight. PCO sets a high bar for engaging with the public, strengthening feedback loops between the legislative branch and program implementers, and enhancing existing oversight efforts. However, it cannot function where there is little in the way of existing oversight infrastructure, staffing, or effort. If the underlying oversight mechanisms in a state are weak or underused to begin with, even the most innovative PCO initiatives will struggle to drive meaningful accountability and reform.
Second, most attempts at people-centered oversight through public hearings fall short. Americans generally do not make a habit of testifying before legislative committees. The barriers to doing so (taking time out of a weekday, traveling to a hearing or figuring out remote testimony, and the social pressure of speaking publicly, to name a few) are very high.[1] Even where legislative oversight committees devote agenda space to public comment and testimony (as in Delaware, Colorado, Hawaii, and Wyoming) there is little evidence of consistent testimony by day-to-day users of government programs — much less evidence of use of that testimony by committees to inform their oversight.[2]
If legislators rely on hearings as a primary method of accepting feedback on programs and policies, they will continue to hear exclusively from those who do make a habit of testifying at committee hearings: executive branch staff and members of interest groups. These entities can offer valuable feedback and information, though they do not necessarily incorporate the perspectives of end-users that legislators can find beyond the confines of a hearing.
While committee “field trips” and meetings held at locations other than a capitol committee room can help, public hearings alone are too limited in scope to form the basis of people-centered fact-finding activity, and their inherent obstacles limit participation by most citizens. For a truly inclusive approach, people-centered oversight must extend beyond the committee room.
Third, we found that web forms are an increasingly popular way to solicit information from the public, despite lackluster results. Perhaps in an effort to lower the barriers to participation associated with hearings or other high-cost methods of engagement, many institutions have turned to tools like web forms. These forms take many shapes, ranging from general tools to contact members or submit written committee testimony to specifically designed mechanisms for auditors to collect information on fraud, waste, and abuse in programs.
These web forms make it easier for the public to engage with legislative institutions in many cases, and they can play an important role in a balanced strategy for people-centered oversight. However, like other ways for the public to engage with legislatures, web forms come with tradeoffs. They are more open to manipulation by spam, bots, and advocacy campaigns. Their reach is also limited, much like participating in public committee hearings, to those Americans with the interest and means to fill out legislative web forms.
We paid particular attention to web forms that go beyond mere generic contact forms (“Fill out this form if you have a suggestion that will help [our caucus] investigate and bring accountability to government,” “Share your thoughts in my legislative survey.”). While some of these forms appeared to be in regular use, as with a public input form from the South Carolina House Oversight Committee, many others seemed to exist as merely symbolic gestures toward public engagement on a given issue, and it was unclear how or if the results would be used to inform oversight.
While web forms can serve an important role, they are only a starting point. PCO at its best requires more than providing a mechanism for the collection of feedback from the public. It demands careful, deliberate use of information – gathered by lawmakers or by administrative agencies or third parties – to inform policymakers about the public’s experience with a program. Without robust systems to collect, analyze, and integrate feedback, web forms remain an incomplete solution.
Fourth, other inconsistencies and gaps in digital infrastructure hinder progress in multiple aspects of PCO. Almost all state legislatures have some kind of centralized bill tracking system, often with years of searchable records from previous legislatures. Even in cases when similar resources for tracking, enabling, or reporting on oversight work are available, the use of those systems can be sporadic and can change with political leadership.[3]
This is also true when it comes to tracking casework. As we will discuss later, casework serves as an important basis for people-centered oversight, but the use of computer programs to harness the data it generates is often weak. As part of our research, we spoke to staff in many legislatures to determine whether they had a centralized system for processing and monitoring trends in constituent requests. Fewer than half of states, based on our sample, have adopted these systems.[4] Further, when such a system had been implemented across an entire chamber or legislature, use by individual offices was often inconsistent. In one state, Tennessee, staff we spoke to indicated they previously had a system for tracking constituent requests but had discontinued it because of low adoption by staff. In many states without such systems, it can be difficult for legislators and staff to monitor trends in constituent engagement in their own offices, let alone at an institutional level.
Despite these challenges, people-centered oversight is still happening in all corners of government. Even in the absence of institution-wide momentum, individual legislators and staff and other offices are centering implementation and the end-user experience in their work. As we will discuss in our case studies, legislators, staff, and researchers are engaging with new ways of gathering and using casework data. Some analytic bureaucracies[5] are conducting fieldwork to gather data and provide answers about program implementation and user experience in new and interesting ways. In bright spots across government, attention to human-centered design and program implementation by legislators, agencies, and the public is leading to meaningful changes in administration.
Aspects of PCO are also happening every day in small but foundational parts of legislative work. When a staffer helps a constituent navigate unemployment insurance, when a legislator holds a town hall in their community, or when a program evaluator conducts fieldwork, they are serving as the eyes and voice of the public and building up systems of people-centered oversight.
Opportunities for PCO are expanding. Even before the COVID pandemic, some states were breaking down the old geographic barriers inherent in legislative work with technology to enable options like remote committee testimony.[6] At the same time, other digital tools are making it easier to conduct outreach work, gather information on legislative casework, and even aggregate casework information collected across caucuses or entire legislatures. Executive agencies are also collecting and sharing more information about service delivery than ever before. With just a handful of exceptions, states make much of this data available to legislators and the public through online open data tools. The data shown in these tools — typically generated directly through the provision of government services — can provide a far more complete and candid picture of reality than legislators and staff can hope to capture through their own data collection efforts.[7]
Critically, it is not just the volume of collected information that is increasing. Advances like artificial intelligence make it possible to analyze and make more sense of large amounts of data than ever before, even with typically low levels of staffing in legislatures. Because people-centered oversight, like legislative work generally, depends on the ability to collect, manage, and analyze information effectively, these advances represent a significant opportunity for practitioners of PCO.
The following case studies demonstrate how legislators and staff, oversight partners, agencies, and contractors are using people-centered oversight to great effect for both themselves and the people they serve.
[1] Pamela Ban, Ju Yeon Park, and Hye Young You, “How Are Politicians Informed? Witnesses and Information Provision in Congress,” American Political Science Review 117, no. 1 (February 2023): 122–39, https://doi.org/10.1017/S0003055422000405.
[2] The exception is when the salience of a topic rises to a level that invokes intense attention and scrutiny from the public, as in Oregon’s Joint Committee on Addiction and Community Safety Response hearings on drug decriminalization. In these cases, however, it appears to be the salience of the topic encouraging a surge of public participation rather than the institutionalized behavior of the legislature.
[3] The exception is in legislative analytic bureaucracies, where records tend to be routinely updated and generally complete. Elsewhere, spotty records mean it can be challenging for citizens (and researchers) to get a sense of what is happening with oversight in a particular legislature.
[4] For more detailed state-by-state information, see the attached State-by-State Summaries.
[5] We use the terms “analytic bureaucracy” and “analytic agency” to refer to “any state government entity that helps legislators assess agency program performance as well as financial performance.” The broader term “oversight partner” also includes non-governmental groups upon which legislators rely when conducting oversight. For more information, see Lyke Thompson and Marjorie Sarbaugh-Thompson, “Checks and Balances in Action: Legislative Oversight across the States” (Wayne State University, 2019), http://go.levin-center.org/50state.
[6] Eyragon Eidam, “Remote Testimony: Cure for State Democratic Barriers?,” Government Technology, January 22, 2016, https://www.govtech.com/network/Remote-Testimony-Cure-for-State-Democratic-Barriers.html.
[7] It should be noted that the mere presence of an open data or performance portal in a state is not a magic bullet. The accuracy and timeliness of the information collected are important, as is the type of information presented. That is, a spreadsheet of the names of organizations licensed to drill wells is open data but does not lend itself to people-centered oversight in the same way as information about, say, the time it takes for someone to acquire such a license.


Conclusions: Oversight that Engages and Delivers for People
Government should work, and it should work for the American people. It should be efficient, honest, and easy to deal with. It should achieve our goals. It should, in the words of Robert Caro, transform people’s lives for the better. When we talk about government working, it is easy to fall into the habit of doing so in terms of the mechanics that make up government in the twenty-first century: contracting, procurement, compliance, regulation, civil service rules, and so on. All those factors are important to making government work. Yet, to most Americans with other things to worry about, they do not matter. What matters is implementation and outcomes — the real, tangible effect government can have on their lives.
If American democracy is to make a case for itself in our time, it must prove that it can deliver government that works. This is a tall order but, in an era of historically low trust in American government and in democracy generally, an essential one. If we are to get it right, it will require not just changes to policy and to implementation of that policy, but a reorientation of legislative oversight toward the kinds of people-centered approaches we have outlined here.
The stakes are high, but the good news is that this more responsive, experience-driven model of oversight has already spread its roots across the country. People-centered oversight happens every day. It happens when legislative staff record the details of a phone call on a computer. It happens when a program evaluator carves out a few extra days on a project plan for fieldwork. It happens when legislators tour facilities or when agency analysts compile statistics on service delivery. PCO is already a part of legislative life across the country.
What we need now is not to invent something entirely new, but to recognize the value of these practices and commit to them with greater intention. By building on what already works, institutionalizing what is often informal, and directing our oversight energy at experience and outcomes, we can strengthen the link between government and the people it serves. What we stand to gain is not just improved government programs but improved lives, and the proof so many are craving that our democratic system is capable of listening, adapting, and delivering for people.
There is evil and injustice that can be caused by political power, but there is also great good. It seems to me sometimes that people have forgotten this. They’ve forgotten, for example, what Franklin Roosevelt did: how he transformed people’s lives. How he gave hope to people. Now people talk in vague terms about government programs and infrastructure, but they’ve forgotten the women of the [Texas] Hill Country and how electricity changed their lives. They’ve forgotten that when Robert Moses got the Triborough Bridge built in New York, that was infrastructure[…] And that one bridge created thousands of jobs: 31,000,000 man hours of work, done in twenty states, went into it. We certainly see how government can work to your detriment today, but people have forgotten what government can do for you. They’ve forgotten the potential of government, the power of government, to transform people’s lives for the better.
-Robert Caro[1]
[1] Robert A. Caro, Working, First Vintage Books edition (New York: Vintage Books, 2020), 183.

Recommendations for Institutions and Individuals
In our work to understand the condition of people-centered oversight in the United States today, we have come across a number of best practices that form a foundation for people-centered oversight. Some of them are more concrete (changing computer systems, for example) while others (like shifting institutional culture) are longer-term, more nebulous changes. We outline these recommendations below, divided into sections for whole institutions and for individual legislators and staff.
For Legislative Institutions
People-centered oversight — like oversight more broadly — is a habit. Sweeping powers of inquiry, plentiful resources to conduct people-centered oversight, and even a history of meaningful oversight work cannot guarantee that people-centered oversight will happen. Effective PCO requires commitment and sustained attention on an institutional level. Activities like training and professional development programs can contribute to that commitment and attention, as can building people-centered practices — like user-centered research or analysis of casework data — into the rhythm of legislative work. What ultimately matters is cultivating a shared belief in the power of people-centered work and translating it into action in the long term.
“Oversight” conducted to take down perceived opponents or produce clips for social media rarely involves the people it purports to serve, and it is more likely to result in a culture of risk aversion and lowered effectiveness of government down the line. The same is true of oversight that seeks only to monitor strict adherence to policies and procedures. Michigan’s old application for public benefits, described in our case study, is the result of this kind of strict oversight. The result may comply with policies, but there is no guarantee it will produce its intended outcomes. By agreeing on shared goals and transforming oversight into a routine, non-punitive dialogue with agencies, legislative institutions can also transform the incentives that oversight creates, resulting in better outcomes.
Adopt user-centered research and design and support analytic bureaucracies and agencies in taking on user-centered work. The best way to understand how policies are achieving their goals and make course corrections is to gather perspectives from the people most directly affected by those policies. Analytic bureaucracies (as in our case study on Minnesota’s Legislative Auditor), agencies, and contractors (like Civilla in its partnership with the Michigan Department of Health and Human Services in our case study) are in the best position to take on this work and feed their findings back to legislatures. Legislative institutions can facilitate the collection of this information and empower agencies to use it in service of positive change, not only through their oversight work but also by appropriating resources for user-centered work and by setting clear expectations for it through legislation and other tools.
Bill tracking is the low-hanging fruit of legislative computing because it requires relatively simple databases and is often managed by concentrated groups of professional staff. It makes sense, then, that it was the first part of legislative operations to go digital. Other aspects of legislative work, however, have been left behind. Oversight is not so easily fed into a database or quantified (though the agency performance and open data portals in some states are a great start), but that does not mean it has no place in the digital world.
Oversight reports from committees — and even basic committee records like meeting minutes — can be quite challenging to find and can even vanish online as a consequence of changes in leadership and partisan control or technology upgrades. Tools for tracking casework and other constituent interactions also offer many benefits to legislative institutions and can be relatively simple to implement. Our data, however, point to very spotty implementation between — and even within — state legislatures. While some of these gaps are explained by technology systems, others are the result of institutional culture that fosters partisan separation of computer systems, staff resistance to new technology, or inattention to the implementation of policies authorized by the legislature. Understand that changes to legislative computer systems also involve work on organizational culture.
Make the public a meaningful part of oversight — and that means more than just public hearings and web forms. Meaningfully including the public in oversight involves meeting people where they are, which means going beyond legislative websites or committee rooms. Conducting hearings in remote locations, touring state facilities, and having conversations with program users and frontline staff can help. Routinely incorporating information gathered from member casework in oversight activities can also be beneficial. Directing analytic bureaucracies or agencies themselves to collect more detailed, people-centered information can also help take the pressure of data collection off legislators and staff and gather better and more complete information than would otherwise be possible.
Think carefully about biases inherent in how your systems and policies define who “the people” are and work to find effective and feasible ways of ensuring that, as often as possible, the data you gather comes from a representative sample of the public. Use objective data wherever possible to ensure your oversight work is rooted in reality and understand that not all of the people will be happy with every program or decision.
State governments produce an enormous amount of data, much of which is provided to legislatures — often in reports required by statute — or to the public through tools like open data portals. The choice of datasets included in this reporting can have major impacts on oversight, policymaking, and public understanding of how programs are working.
For those in state government responsible for determining which data to request or report, it is important to consider the extent to which the information it yields will provide real insight into how programs are achieving their goals or produce information to assist in the pursuit of people-centered oversight. Gathering or reporting too much data thoughtlessly can overwhelm institutions with largely useless information and slow down the work of agencies, which must reallocate resources to collect that information. Data on performance and service delivery (e.g. wait times, service levels, program outcomes, customer satisfaction) provide far more insight for PCO purposes than other raw data that agencies typically produce (e.g. names of licensees, financial statements, volume of cases). Performance-related data also make it easier to set expectations for agencies that more closely align with program goals, and routine reporting allows nimbler, more data-driven oversight action than if data must be gathered through means like an audit or subpoena.
For Legislators and Staff
Think of and talk about casework as an oversight function — not a strictly political one. It is one of the best tools available to the legislative branch to understand how policy implementation is working on the ground and to build capacity and understanding of how the executive branch functions. Treat casework like the valuable tool it is by involving caseworkers in oversight and other legislative work, and by collecting and monitoring data thoughtfully — ideally with a robust computer system. Because more data can yield better insights, aggregate casework data across offices when possible. Watch for trends in casework, both qualitative and quantitative, since the most important issues in program implementation and user experience often reveal themselves in casework first.
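To make the idea of monitoring casework trends concrete, here is a minimal sketch in Python of how a shared casework system might tally cases by topic and month and flag unusual volume. The records, topic tags, and alert threshold are all hypothetical, offered only as an illustration of the kind of aggregation described above, not as a description of any system in use.

```python
from collections import Counter
from datetime import date

# Hypothetical casework records: (date opened, agency/topic tag).
# In practice these would come from a shared constituent-services database.
cases = [
    (date(2024, 1, 9),  "unemployment_insurance"),
    (date(2024, 1, 17), "drivers_license"),
    (date(2024, 2, 3),  "unemployment_insurance"),
    (date(2024, 2, 11), "unemployment_insurance"),
    (date(2024, 2, 20), "snap_benefits"),
    (date(2024, 2, 25), "unemployment_insurance"),
]

def monthly_counts(records):
    """Tally cases by (year, month, topic) so trends are visible over time."""
    tally = Counter()
    for opened, topic in records:
        tally[(opened.year, opened.month, topic)] += 1
    return tally

def flag_spikes(tally, threshold=3):
    """Return (period/topic, count) pairs whose monthly volume meets a simple alert threshold."""
    return sorted(
        (ym_topic, n) for ym_topic, n in tally.items() if n >= threshold
    )

counts = monthly_counts(cases)
for (year, month, topic), n in flag_spikes(counts):
    print(f"{year}-{month:02d}: {n} cases tagged '{topic}' -- worth a closer look")
```

A real system would draw on far richer records, but even a simple tally like this can surface the month-over-month spikes that often signal implementation problems before they appear in audits or required reports.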
Dig into proactive casework. Hold town halls. Offer assistance and work to understand what constituents are thinking and feeling. Use proactive casework as a tool to correct for gaps in who might contact your office for casework services. Use the information that you gather from proactive constituent work to inform your oversight work.
Develop systems to separate signal from noise when engaging with public concerns. Work to understand government programs and how they are operated so that you can better distinguish between when a program has failed to deliver intended results and when it has issued a determination that a constituent or member of the public does not like. Draw a bright line between constituent casework and regular correspondence or advocacy campaigns.
The fundamental challenge of people-centered oversight — and all legislative work, for that matter — is in handling information. AI, even with its flaws, offers a promising way to process large amounts of information and identify trends more quickly, and often more accurately, than would otherwise be possible, especially in a resource-constrained legislative environment. Take time to learn the practical details of how you can use AI to enhance your work without sacrificing data integrity and cybersecurity.
The legislative branch is the people’s branch — an institution designed to keep its members in closer touch with citizens than anyone else in government. This is a great advantage in conducting people-centered oversight, and many of the basic practices that facilitate PCO are just good legislative practices. Form long-term relationships with constituents and learn about their experiences with government and how they change over time. Get to know your district, visit facilities, and ask questions. Remember that all of oversight, all of legislating, and all of government begins and ends with people.

Case Studies
Casework and Oversight
Casework has long been an important part of the American legislative tradition. John Quincy Adams, serving in the House of Representatives after his presidency, worked with his constituents to help with everything from pensions to postal appointments.[1] Today, casework has grown into a significant proportion of legislative work at all levels of government. A recent estimate from POPVOX Foundation suggests that job titles for around 1,500 congressional staff — about fifteen percent of the congressional workforce — include some mention of casework.[2]
Casework at the state level varies widely across the country. Some members of state legislatures handle their own casework as it arrives by email, telephone, text message, and letter. Some have staff assigned to handle casework. Still others operate full-scale district offices with specialist staff and in-person service. No matter the structure, though, the basics of casework are the same: a constituent faces some issue with government (or a government-adjacent matter), and they go to a legislator for help.
Casework is more than just good politics or a relic of a bygone era in American legislatures. It has the potential to be a transformative tool for oversight.
There are two central benefits to casework as an oversight tool. First, casework builds institutional capacity for making sense of how well government programs are being implemented. If, for example, members and staff handle casework related to the licensing of cosmetologists, they come to better understand the practicalities of the licensing system and the situations of the people who use it, putting them in a better position to oversee (and legislate on) that system. Just as the constituent benefits from particularized assistance with their cosmetology license, so members, staff, and the legislative branch more broadly benefit from an improved understanding of how the relevant policies are implemented and where they may need improvement.
Second, casework can help lawmakers identify changes, trends, and snags in existing programs, providing an early warning system when things go wrong. Constituent casework often happens as the result of some kind of failure on the part of government or a constituent’s dissatisfaction with a public program or service. A process lags, someone does not get the outcome they hoped for or expected, bureaucracies clash, well-intended procedures get in the way, or systems get overloaded. Eventually, the facts of these circumstances would make it to legislators’ desks through audit results or required reporting of one kind or another, but the first signs of trouble often come from constituent contact.
Casework data is a valuable input to legislative oversight, and more is better. Increasingly, legislative institutions nationwide are coming to recognize the value of data gathered through casework, and they act on it by using more powerful tools to collect that data and even aggregate data from multiple offices to draw better, faster conclusions. In most places, casework data aggregation is still in its infancy, and the quality of data collected from many legislative offices makes aggregation as much an art as a science. Still, casework data aggregation promises a bright — and deeply people-centered — future for data-driven legislative oversight.
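To make the aggregation idea concrete, here is a minimal sketch of what pooling categorized case records from multiple offices might look like. The schema and the records are hypothetical; real systems vary widely in the fields they track.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Case:
    office: str    # which legislative office logged the case
    agency: str    # agency the constituent was dealing with
    category: str  # standardized issue category
    opened: str    # ISO date the case was opened

# Hypothetical records pooled from several offices.
cases = [
    Case("District 4", "DMV", "license-delay", "2024-03-02"),
    Case("District 9", "DMV", "license-delay", "2024-03-05"),
    Case("District 4", "DHS", "benefits-denial", "2024-03-06"),
    Case("District 12", "DMV", "license-delay", "2024-03-07"),
]

# Aggregating across offices surfaces trends no single office would see:
# here, a cluster of DMV licensing delays spread across three districts.
by_issue = Counter((c.agency, c.category) for c in cases)
for (agency, category), n in by_issue.most_common():
    print(f"{agency}/{category}: {n} cases")
```

The hard part in practice is not the counting but the standardization: aggregation only works if offices record agencies and issue categories consistently.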
But there is a problem with casework data as it exists today, one that raises a familiar, glaring question in American government. Not everyone who needs help or is negatively affected by a program seeks legislative assistance. Who, then, are “the people” around whom casework — and casework as oversight — is centered?
To answer that question, we can examine who engages with elected officials generally: the people who generate casework and whose experiences its data reflects. The statistics in this area are thin, but if casework follows the broader patterns of participation in the American political system, those who seek it tend to be whiter, wealthier, and better connected.[3] Their experiences with government, the services they use and the problems they encounter, reflect those realities, skewing casework data away from experiences with, for example, social services that serve less privileged populations.
So, what can be done to correct this bias while maintaining the broader benefits of casework? One tool is proactive casework, outlined in a recent paper by graduate student Megan Rickman Blackwood of UNC-Chapel Hill entitled, Proactive Casework Theory.
Proactive casework
Proactive casework theory, writes Blackwood, “moves casework from a responsive to an anticipatory model of representation. Rather than responding to an uneven stream of complaints that reflect preexisting inequalities in political knowledge, trust, and efficacy, PCT initiates outreach to widen the aperture of service provision…. This structural change in who is the first mover in service provision requests reorients the entire access structure around the office’s willingness and capacity to seek out needs, rather than a constituents’ ability to advocate for themselves.”[4] In other words, when elected officials reach out to their constituents to offer casework services — with a phone call or a knock on the door — the result is that a wider and more representative section of the population can benefit from those services. This benefits not only the constituents receiving the service but also the legislator for whom casework serves as a source of data for oversight. Proactive casework theory, then, has the potential to correct for the bias in the data generated by traditional casework.
And proactive casework theory is more than just a theory; it is a blueprint for action that could be adopted by legislators and staff – a blueprint with clear benefits. When Blackwood ran a pilot program with one office in the Virginia House of Delegates in the fall of 2023, it not only increased the number of constituent contacts (by more than five thousand percent, in fact) but also increased the share of cases representing marginalized groups by 24.5 percentage points — from 53.6% to 78.1%.[5]
Getting proactive casework right takes effort. It requires a commitment to take on more casework, data collection, and, of course, lots and lots of phone calls or door knocks. In Blackwood’s pilot, a group of volunteers made many of the calls to constituents using a script and logging cases for staff to follow up on as they arose.[6] Because proactive casework does not necessarily require year-round attention, it lends itself to sporadic work from existing volunteer networks.
Improving oversight inputs with proactive casework requires no sweeping legislative reforms or major institutional overhauls — just a shift in approach. Many legislators have the infrastructure, from casework operations to volunteer networks, in place already. The other building blocks of a successful program include focusing proactive casework intentionally, building and using robust data systems, and integrating the resulting casework data into legislative oversight efforts.
Focusing proactive casework intentionally is key in correcting for some of the data bias produced by the old, reactive casework model. The people who most frequently seek assistance under the reactive model — and whose inquiries are reflected in most existing casework data — do not represent a legislator’s entire constituency. Proactive casework allows for better targeting in outreach, but if it simply replicates the same old patterns, it risks reinforcing the bias rather than correcting it.
To fully enjoy the benefits of proactive casework, legislators and staff must intentionally structure their outreach around communities that might otherwise be left out of casework. Modern voter data permits in-depth targeting based on a multitude of factors,[7] but there are other, simpler ways of targeting proactive casework, too. Think about underserved geographic areas of the district — or just plot existing case requests on a map and look for gaps. Consider blind spots in voter data that might limit outreach to groups like students, people who have recently moved, non-citizens, and so on. Other community institutions where people already engage with government services like transit hubs, social service offices, and libraries also offer great opportunities for in-person engagement. With or without tools like voter data, it is easy to make proactive casework a mechanism for correcting existing disparities in constituent service and the associated bias in oversight data.
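The "plot cases on a map and look for gaps" idea can be sketched with nothing more than per-area counts. The ZIP codes, population figures, and threshold below are hypothetical, purely for illustration: the point is to flag areas whose casework rate falls well below the district-wide rate.

```python
# Hypothetical counts: constituents and logged cases per ZIP code.
constituents = {"48201": 12000, "48214": 9000, "48226": 6000}
cases = {"48201": 40, "48214": 2, "48226": 18}

# District-wide casework rate: total cases per constituent.
district_rate = sum(cases.values()) / sum(constituents.values())

# Flag ZIPs whose rate is less than half the district average —
# likely candidates for proactive outreach.
gaps = [z for z in constituents
        if cases.get(z, 0) / constituents[z] < 0.5 * district_rate]
print(gaps)  # → ["48214"]
```

A low rate does not prove an area is underserved (it may simply have fewer problems with government), but it tells staff where to look first.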
A good proactive casework approach also requires building a robust data system. This is important for handling data outputs later, but it is also a critical part of scaling up a casework operation. In a traditional, reactive casework model in which requests come in at a (mostly) manageable, even pace, it is possible (if inadvisable) to skate by on email threads and sticky notes. Proactive casework, however, generates a surge of incoming cases, many of which require considerable follow-up from staff. Nonetheless, Blackwood’s case study suggests practitioners of proactive casework do not need to invest enormous sums of time and money in expensive and complicated software.
In one test run, Blackwood used a Google form with built-in intake scripts for volunteers to respond to a multitude of scenarios. Volunteers entered information as they spoke with constituents, and the data moved automatically to a spreadsheet for case tracking and data analysis. The system allowed volunteers — the first point of contact in proactive casework — to record initial information from constituents in a structured way, making it easier for legislative staff to process and respond to them later. Of course, there are many other proactive outreach and casework tools available to legislators, and anything that permits good data collection, analysis, and case management will make a fine solution.
Beyond the operational benefits, a well-structured data system also lays the groundwork for integrating proactive casework data into legislative oversight. Rather than just a service function, casework at its best is a tool for understanding how policies are implemented — whether and how government keeps its promises to the people it serves. The offices that do the best job of using casework data for oversight make a habit of reviewing case trends, sharing findings with their whole teams, and thoughtfully including the data in other legislative work. When patterns emerge — widespread issues with unemployment benefits, delays in cosmetology licensing, complaints about a correctional facility — they let those insights serve as early warning and incorporate them into legislative work from oversight planning to policy interventions.
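The early-warning habit described above — reviewing case trends and reacting when a category spikes — can be sketched as a simple comparison of the latest month against each category's prior average. The categories, counts, and threshold factor here are hypothetical.

```python
from statistics import mean

# Hypothetical monthly case counts per issue category (oldest → newest).
history = {
    "unemployment-benefits": [4, 5, 3, 4, 21],
    "cosmetology-licensing": [2, 1, 2, 3, 2],
}

def spikes(history, factor=2.0):
    """Flag categories whose latest month well exceeds their prior average."""
    flagged = []
    for category, counts in history.items():
        *prior, latest = counts
        if prior and latest > factor * mean(prior):
            flagged.append(category)
    return flagged

print(spikes(history))  # → ["unemployment-benefits"]
```

Even this crude rule captures the core insight: a sudden surge of unemployment-benefits cases is a signal worth investigating long before an audit report arrives.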
Proactive casework, then, offers a practical way to improve the value of casework for legislative oversight. By shifting from reactive to proactive engagement with constituents, legislators do more than just correct for the bias of traditional constituent service — they harness the power of the tradition of casework to build stronger, more responsive feedback loops and strengthen their role as the “eyes and voice” of the people they serve.
[1] Leonard Dupee White, The Jacksonians: A Study in Administrative History, 1829-1861 (New York: Macmillan, 1954), 143-145.
[2] “Who Does Casework for Congress? An Unscientific Survey,” POPVOX Foundation Blog, May 6, 2024, https://www.popvox.org/case-notes/who-does-casework.
[3] Henry E. Brady, Sidney Verba, and Kay Lehman Schlozman, The Unheavenly Chorus: Unequal Political Voice and the Broken Promise of American Democracy (Princeton: Princeton University Press, 2012), 117-146.
[4] Megan Rickman Blackwood. Forthcoming. 2025.
[5] Megan Rickman Blackwood, “A Theory of Proactive Casework at the State Level as a Means of Bridging Access Gaps” (UNC Chapel Hill, 2024), 26.
[6] Ibid., 23-24.
[7] “Constituent Data | Contact Lists at Your Fingertips,” L2 Data, https://www.l2-data.com/constituent-data/.
Program Evaluation in Minnesota
Legislative oversight is only as strong as the information that informs it. While hearings and constituent casework can provide valuable snapshots of how government is working, they are not a substitute for more formal, systematic tools for gathering information. For that, legislative institutions often turn to analytic agencies – entities like auditors and inspectors general, legislative research agencies, and so on – that help legislators assess conditions in society, examine financial performance, and evaluate the impact of government programs. The field of program evaluation is a broad one, but its general focus on operational performance, service delivery, and program outcomes, along with aspects like statutory compliance, makes it a natural venue for people-centered oversight. In this case study, we will look to Minnesota, where the Office of the Legislative Auditor’s efforts to incorporate citizens’ lived experience with government make it a model of people-centered oversight in action.
The Minnesota Office of the Legislative Auditor’s (OLA) Performance Evaluation Division exists to “determine the degree to which the activities and programs entered into or funded by the state are accomplishing their goals and objectives” and whether they are using resources efficiently.[1] As a part of the legislative branch, it answers to a bipartisan, bicameral commission of legislators that appoints the Legislative Auditor and selects topics for review.
In recent years, OLA staff have used tools like surveys and interviews in their program evaluation work to collect feedback from groups as wide-ranging as cosmetologists, court officials, parents of new drivers, and individuals incarcerated in state prisons. OLA’s consistency in gathering this user- and implementer-centered information, their care in working with vulnerable populations, and their commitment to using the resulting information to produce substantive data and insights to drive oversight make them a valuable national model for the role that legislative analytic agencies can play in people-centered oversight.
While not every program evaluation topic lends itself to extensive interviews and surveys, most OLA program evaluations in recent years have included some component in which OLA staff solicit direct feedback from end-users or implementers of programs. A report on driver testing and licensing in 2021 included not just information garnered from surveys of supervisors at testing sites, but in-person surveys on the experience of scheduling a test conducted with 45 parents while they waited for their child to take a road test.[2] A 2023 evaluation of a COVID-era rental assistance program included feedback gathered from 207 survey responses from landlords on their experiences with program aspects like application processing times and call center support.[3] Even a recent report on the implementation of task force recommendations on aggregate resources (a term encompassing materials like sand and gravel) – not a subject on which surveys might typically be used as an evaluation tool – included survey information gathered from county zoning administrators across the state.[4]
Including perspectives from people who have lived experience with government programs adds a great deal of value to OLA’s program evaluations. This is especially true in cases when those perspectives come from vulnerable populations or other groups whose perspectives might not traditionally be considered in oversight work. Working to include voices from those populations – and doing so in a thoughtful, ethical way – is challenging but, in recent reports examining state correctional facilities and parts of the state’s child protection system, OLA did just that.
As part of a program evaluation examining safety in state correctional facilities, OLA collected survey responses from 246 prisoners and 1,469 prison staff.[5] These surveys yielded valuable insights from both groups on the safety of correctional facilities, but the process of collecting data from them – particularly from prisoners – required careful planning and execution. An appendix to the report discusses some of the challenges involved.[6]
Department of Corrections staff needed to develop methods for prisoners to access an online survey tool without access to the wider internet. Prison staff had to arrange for prisoners selected by random sample to take the survey and supervise the process without observing survey answers. Because of the unique ethics of conducting such a survey, OLA voluntarily consulted the Department of Corrections’ Institutional Review Board when developing survey protocols. These protocols included emphasizing the voluntary nature of the survey and making audio instructions available for respondents with limited reading ability. All these measures and others described in the survey methodology appendix (which stands on its own as a worthwhile read for anyone interested in the work of people-centered oversight) took considerable effort. The result, however, was a richer and more valuable program evaluation for the Legislature, one that carried with it the voices and points of view necessary to understand the full picture of the state’s corrections system – voices and perspectives that might otherwise never have made it beyond the walls of a prison.
Similarly, a 2022 report on Child Protection Removals and Reunifications includes, along with survey data gathered from law enforcement agencies and county child protection agency administrators, information gathered from a series of interviews with young people who were directly impacted by child protection removals. From the report: “We worked through a DHS-coordinated youth advisory council to interview several teenagers and young adults who had been removed from their homes and placed in foster care. We appreciated their willingness to share their stories with us.”[7] As it had done in its correctional facilities project, OLA staff consulted with an agency Institutional Review Board to assess the ethical challenges of conducting research with a vulnerable population.
While these interviews did not produce the same type of statistically rigorous data as some of OLA’s other work, they did provide valuable perspective from people who had experienced the effects of child protection removals firsthand. “Most of the young people we spoke with acknowledged that they had been in abusive or neglectful situations prior to their removal from the home,” reads the report. “However, a common concern in these interviews was that the young person was not aware of what was happening at the time nor did they know the reason for their removal from the home. The young people we spoke with expressed a desire for greater communication at the time of removal.”[8] Once again, this information – key to understanding policy’s impact on the people to whom it matters most – is only available in service of legislative oversight because of its inclusion in OLA’s reports.
OLA’s work in Minnesota is an excellent example of people-centered oversight because it so consistently includes frontline perspectives on program implementation – from both staff and “end-users” of programs. By meeting people where they are (sometimes very literally, as in the case of parents during their teen’s road test) OLA can include more candid and meaningful information on how policy impacts the people government serves. This work is challenging and takes real effort – far more than any web form – but the results speak for themselves.
Around the country, legislative analytic agencies with the ability to conduct program evaluations were born out of the idea that legislatures, to serve as a coequal branch of government, must develop better capacity for independent information gathering and oversight.[9] The information they collect and report is one of the best tools the legislative branch has to answer the central questions of people-centered oversight – questions of whether and how well government delivers on its promises. By collecting information from the front lines of program implementation thoughtfully and consistently, the Minnesota OLA does more than just enhance its reports – it strengthens the legislative branch and its oversight efforts with valuable, people-centered data. The Minnesota approach shows the power of people-centered oversight that extends beyond token efforts and includes deliberate, systematic integration of feedback from the people whose perspectives matter most. It serves as a compelling blueprint for people-centered oversight that is well within the reach of most analytic agencies.
[1] Minnesota Statutes 2024, 3.971, subd. 7.
[2] “Driver Examination Stations” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2021), 20-21.
[3] “RentHelpMN” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2023), 37.
[4] “Aggregate Resources” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2025), 18-24.
[5] “Safety in State Correctional Facilities” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2020), 87-90.
[6] Ibid.
[7] “Child Protection Removals and Reunifications” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2022), 2.
[8] Ibid., 27.
[9] “NLPES and Its History,” National Conference of State Legislatures, February 25, 2025, https://www.ncsl.org/legislative-staff/nlpes/nlpes-and-its-history.
Human-Centered Design in Michigan
We have discussed the benefits of people-centered work in the context of legislative oversight and information collection, but what do government and the people it serves stand to gain when that people-centered mentality filters back to program implementation? What does it look like for programs to deliver better results, to work with efficiency and honesty, and to meet the people they serve where they are? And what can practitioners of oversight learn from these practices? As governments invest in human-centered design for modernizing processes and improving digital services, we are finding out. For a case study, we need look no further than the Levin Center’s home state of Michigan.
The story of how Michigan used human-centered design to overhaul its public benefits application process begins, naturally, with a numbered form: DHS-1171. Clocking in at more than 40 pages, the old version of DHS-1171 was used by a quarter of Michigan residents every year to apply for a variety of public benefits.[1] At Civilla, the Detroit-based nonprofit that undertook the monumental process of redesigning it, the old form has a sort of legendary status. The form, they note, packed 1,000 questions and more words than all of Shakespeare’s Macbeth into a jumbled format.[2] It requested the date and city of conception for most people under the age of 22. It was the longest form of its kind in the United States.[3] Because of its length and complexity, this ostensible tool for accessing public assistance, in reality, served as a barrier to service for millions of Michiganders. It also consumed agency staff time with error corrections and other processing difficulties. “People described,” writes Civilla, “feeling like they were interacting with a fraud prevention system, rather than working with a service provider.”[4]
Today, the equivalent form is twenty pages with large text, bright colors, and questions written in clear language. Civilla reports that 90% of applicants feel confident they can complete the application on their own and 90% of applicants do complete it within 20 minutes. For staff, the new form contributed to a 42% drop in processing time.[5] And the reviews from applicants and agency staff are outstanding. “After filling out the new application I feel like I can breathe again. The old application would have taken me a whole day. This one was more understandable and less stressful — it asks you the questions but with respect,” said one applicant.[6] A caseworker observed that, after the new form came out, “People are coming to me with a different tone. They said it felt like the application was more caring.”[7]
It may just be a form, but Michigan’s new public benefit application is a triumph. It improves the agency’s efficiency and applicants’ experience in applying for benefits. It helps government deliver services better and helps applicants get the assistance they need. And its transformation was only possible because of a deliberate focus on people.
Design of forms — design of entire programs, for that matter — comes with many competing priorities and choices of where to focus attention and effort. The choices the designers of the old DHS-1171 made are clear: it was a form built to adhere as closely as possible to regulations and eliminate risk. Writes Civilla, “The benefits application was originally meant to serve the people of Michigan, but it had lost sight of that intention along the way. Over the course of 30+ years, it had been designed around process and policy, lawsuits and audits — additions that had accumulated, without considering how they might actually impact people applying for benefits every year.”[8] By choosing to focus on people instead, Civilla’s redesign changed all that.
Choosing to focus on people is not easy, especially with thirty years of institutional inertia in the other direction (Civilla’s effort was the sixth to redesign the form in that time).[9] It is also challenging because of people — people in all their messy, diverse unpredictability. To address this, Civilla used a process of human-centered design, built around user research — weeks of interviews with benefit recipients and frontline staff — to “understand the public benefits through the eyes of those who interacted with the system every day.”[10] As the form neared completion, a limited pilot program and continuous improvements based on feedback from the front lines of service delivery helped shape it into an even more powerful and user-friendly tool — and demonstrated how sustained, people-centered work can transform even the most entrenched processes.
The story of DHS-1171 holds plenty of lessons for practitioners of oversight. First, the old form serves as a cautionary tale of oversight incentives gone wrong. When oversight efforts — on the part of legislators, analytic bureaucracies, and executive branch agencies — focus solely on process, compliance, and risk mitigation, those efforts create a set of incentives that can build barriers and create systems that are ultimately cumbersome, impersonal, and unproductive. The old form ticked all the boxes, asked the right questions carefully aligned with statute, and satisfied years of recommendations and requirements from audits and litigation, but it did not do the one job it needed to do – enable people to access services to which they are entitled under law. Oversight can do wonderful things but, when oversight powers are wielded without regard for their effect on real people, oversight can get in its own way.
The field of human-centered design offers valuable direction for anyone looking to improve service delivery, both in terms of implementation and oversight. It is difficult work to shift priorities away from compliance and toward end users of programs. It is difficult work to run interviews and focus groups and program pilots and to make constant changes to programs to improve results. But it is worthwhile work, and it is work from which almost any government program can benefit. And, when government does prioritize people – when it does commit to learning from feedback and improving continuously – the result is what government so desperately needs: programs that serve the public better and more efficiently, programs that improve lives, and programs that build trust.
[1] “Our Work: Project Re:Form,” Civilla, https://civilla.org/work/project-reform.
[2] “Making the Case for Change,” Civilla, https://civilla.org/stories/making-the-case-for-change.
[3] “The Road to Rollout,” Civilla, https://civilla.org/stories/the-road-to-rollout.
[4] “Project Re:Form Case Study,” Civilla, https://civilla.org/work/project-reform-case-study.
[5] Ibid.
[6] Ibid.
[7] Ibid.
[8] “Making the Case for Change.”
[9] Ibid.
[10] Ibid.