5 takeaways on tech, from the authors of 'Power to the Public'
Local government leaders have had little choice but to launch one experiment after another in the face of COVID-19. And whether those experiments were efforts to redeploy staff or to stand up new programs in a matter of days, adopting new technology was often part of the process. Now, as local leaders consider the role tech will play in their cities’ recovery—and how to invest the billions available in the American Rescue Plan—a new book provides a blueprint for how best to approach technology in service of policy goals.
"Power to the Public," co-authored by Tara Dawson McGuinness and Hana Schank, two members of the New America think tank and former Obama administration officials, is loaded with tales of government reforms that take a user-centered approach in design, data, and delivery to advance the public interest and promote the public good in the digital age. From a years-long project to humanize and expedite a cumbersome government aid form to removing a logjam that kept kids waiting to be placed in a foster-care home, the authors emphasize that change that improves people’s lives doesn’t start with a product. It starts with people.
During a recent interview, McGuinness, who founded the New Practice Lab, a research and design lab focused on family economic security, and Schank, the director of strategy for the Public Interest Technology program at New America, offered five lessons policymakers should take away from their blueprint.
Try it before you change it.
The best way to understand how a process is or isn’t working is to try it yourself. That was how the nonprofit Civilla sold the state of Michigan on fixing an overly complicated emergency financial-assistance application that contained more than 1,200 questions. Rather than give a presentation to top administration officials on the form’s ineffectiveness, Civilla handed them the form, had them sit outside the conference room in a noisy hallway, and gave them 15 minutes to fill it out. “That was the transformational moment,” McGuinness said. “That [project] wouldn’t have happened if senior executives hadn’t walked through the shoes of someone filling out that application.”
This approach focuses the process on the user experience, which then becomes the starting point for problem solving. Civilla interviewed applicants and the frontline state workers who processed the form to craft the framework that ultimately produced a form that was 80 percent shorter and processed in half the time.
Key Takeaway: McGuinness said policymakers should ask themselves these questions as they examine the user experience: Is your policy, program, or tax credit working for the people using it? How do you know it’s working? How often do you check? Who is eligible, and how do you know the program is accessible to them? What would you drop or evolve to make it better? What are the blockers?
Apps don’t change the world. People do.
Technology alone won’t solve a problem, and new tools won’t fix a broken policy or a convoluted process. Technology only helps if you have a clear understanding of the problem you are trying to solve, and getting to the root of a problem starts with asking people what’s going wrong for them. Technology is useful when it’s brought in where appropriate—and it doesn’t have to be groundbreaking. For example, Schank noted that a prior project to end veteran homelessness entailed the labor-intensive work of building relationships with, and keeping track of, housing-insecure people in one community. “We were asked by someone about the technology we used to do all that,” she said. “The answer was Google Docs.”
Key Takeaway: “People in government are barraged with vendors [who say they have the next best thing],” Schank said. “What your mindset should be is one that has an enormous dose of skepticism. And you need to make them prove to you why it works.”
Data can reveal inequalities, but it can also hide them.
Using real data to understand how you are serving people is a powerful tool, but the data also reflects the people who produce it—with positive and negative implications. For example, a team in New York City in recent years used 311 data to map rat complaints across the city as a way to track infestations and target abatement efforts. But one area that a team member knew had rats didn’t show up on the map as infested. The issue wasn’t the data itself—it was the way it was collected. Residents of that team member’s neighborhood, which was lower-income and majority-minority, didn’t know about 311 and weren’t reporting the problem. The team discovered the gap only by chance, because someone from the neighborhood was part of the mapping effort. Being intentional and inclusive with data collection can stop these blind spots before they occur.
Key Takeaway: When looking at your data, Schank said, always ask “who or what is not there that should be? What are the barriers?” McGuinness added: “And remember that a lot of the systems we work with today were designed a long time ago. If you work within the bounds of how something exists, you can sometimes forget it was created in a different time with a different set of leaders with different functions than we use today.”
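As a hypothetical illustration (not something drawn from the book or the New York City team), one way a data team could surface this kind of reporting gap is to compare complaint volume against an independent signal, such as inspection findings, neighborhood by neighborhood. The data and column names below are invented:

```python
# Hypothetical sketch: flag neighborhoods where residents may be underreporting.
# Compares 311 complaint counts against independently collected inspection
# evidence; all names and numbers here are invented for illustration.
import pandas as pd

complaints = pd.DataFrame({
    "neighborhood": ["A", "B", "C"],
    "rat_311_calls": [120, 95, 4],      # resident-reported complaints
})
inspections = pd.DataFrame({
    "neighborhood": ["A", "B", "C"],
    "rat_signs_found": [40, 30, 35],    # evidence gathered by inspectors
})

merged = complaints.merge(inspections, on="neighborhood")
# A low ratio means inspectors see a problem residents aren't reporting,
# which is a prompt to ask who is missing from the data and why.
merged["reports_per_inspection_hit"] = (
    merged["rat_311_calls"] / merged["rat_signs_found"]
)
print(merged.sort_values("reports_per_inspection_hit"))
```

In this made-up example, neighborhood C would surface immediately, echoing Schank’s question about who is not in the data that should be.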
Big policy changes can start with small projects.
While many leaders today want to tackle systemic inequities, doing so is complicated and overwhelming. Starting small—picking one process to reevaluate with an equity lens—gives you practice right away in dissecting the ways systemic bias is baked into many of our systems. Small tests not only feel more feasible, they also provide immediate, measurable results. For example, Schank and McGuinness said they are now involved in New York City’s push to create more equitable programs and have proposed using a food aid program to pilot ideas. “Usually, it’s stand it up, get it done,” Schank said, referring to policy projects. “But here, we’re not saying everything in New York has to be done this way—we’re starting small and trying a thing to see how it works.”
Key Takeaway: “You don’t have to change the system all at once. Small interventions can make all the difference,” Schank said. Ask where people are getting bogged down in a process. Then run a small test of a fix for that logjam so you can learn, improve, and retest before rolling it out more broadly.
Meet people where they are.
The authors say the success of the American Rescue Plan will depend on policymakers making sure its resources actually reach the people who need them. “The ideas are big and bold,” McGuinness said. “But if 20 percent of people can’t access the rental assistance or their tax credit, then it won’t be a success.” She and Schank hope policymakers will learn from what they see as a shortcoming of the CARES Act, passed in March 2020 in response to the then-growing pandemic: while the $2.2 trillion package provided much-needed relief to the economy and millions of families, it fell short in how those benefits actually reached families. If the people who need the help can’t access it, Schank said, “the program may as well not exist.”
Key Takeaway: “Whether you’re working on rental assistance, unemployment system modernizations, or getting child subsidies out the door,” McGuinness said, “success means understanding who you’re serving, checking in with them by asking, and having data instrumentation that allows you to see if you are really delivering to all people.”
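As a rough sketch of what that kind of data instrumentation could look like in practice (a hypothetical example, not a method described in the book), a program team might track the share of eligible people a benefit actually reaches, broken out by group. The groups and figures below are invented:

```python
# Hypothetical sketch: measure how much of the eligible population a benefit
# actually reaches, broken out by group so that gaps are visible early.
# All groups and figures are invented for illustration.
import pandas as pd

records = pd.DataFrame({
    "group":    ["renters", "homeowners", "renters", "homeowners"],
    "eligible": [10_000, 8_000, 5_000, 2_000],   # people who qualify
    "enrolled": [6_500, 7_900, 4_800, 1_950],    # people actually receiving aid
})

by_group = records.groupby("group")[["eligible", "enrolled"]].sum()
by_group["delivery_rate"] = by_group["enrolled"] / by_group["eligible"]
print(by_group)  # a rate well below 1.0 shows who the program is missing
```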