When private tech meets public service

Navigating the tension between innovation and accountability


Technology is full of contradictions. It can equalize and marginalize, increase efficiency while adding complexity, illuminate or obscure. Sometimes all at once. 

For government leaders, becoming fluent in navigating these contradictions is a key to unlocking the potential technology promises—while protecting the democratic values it must uphold.

A robotic hand holds the scales of justice while a human hand points at it, representing the balance and tension between artificial intelligence, fairness, and human responsibility.


Contradictions in government and private sector approaches

As governments have invested heavily in technology to modernize their systems over the last two decades, we have learned that adapting private sector approaches to democratic contexts presents myriad challenges. Today, the influence of private sector technology firms and practices has only increased: their platforms have proliferated across government systems, and their executives increasingly hold administrative roles.

These approaches deliver on private sector priorities like optimization, data extraction, and rapid scalability, but they often conflict with democratic values like inclusion, deliberation, and accountability, creating new tensions in how, and whether, government serves all citizens. Here are a few examples of the contradictions that arise when technology attempts to balance private tech approaches with democratic principles.

Scale vs citizen-centeredness

In an attempt to achieve private-sector scale in democratic processes, many civic leaders have turned away from in-person community input towards digital engagement tools. With the ability to process thousands of comments in real time, these digital platforms promise increased participation. But participation from whom? Digital platforms commonly raise concerns about language and digital access barriers to engagement.

One example was Seattle’s “Your Voice, Your Choice” participatory budgeting platform, which allowed residents to vote online for neighborhood projects using city funds. While the platform made significant strides in participatory methods, its online voting raised concerns about favoring tech-savvy, English-speaking residents.

While representative participation is a democratic value, scale is not a synonym for representation. Digital tools broaden participation and make it more representative only if they are equitably accessible and transparent.

Open data vs data sovereignty

Open data portals have become a key strategy for how governments modernize services, foster transparency, and enable civic innovation. At the same time, once the data is released, governments and communities have limited control over how data is interpreted, repurposed, or monetized.

Chicago’s police data portal illustrates this tension. Publishing crime incident data has helped residents to understand neighborhood safety and enabled watchdog groups to hold police accountable. Yet, independent evaluations have flagged governance gaps and privacy risks: the City has faced criticism for inconsistent compliance with its policies, and concerns persist that even anonymized, block-level crime data can expose sensitive information or reinforce neighborhood stigmas.

Opening civic data up for access or accountability serves democratic goals. The contradiction surfaces when outcomes conflict with citizens’ privacy rights and with the City’s ability to protect and enforce those rights.

Efficiency vs accountability

Outdated government systems translate into real problems for citizens. Automated decision-making systems are appealing to civic leaders for their ability to process complex applications and eligibility decisions at high speed. But efficiency can come at the cost of transparency and fairness.

One example is Idaho’s automated Medicaid eligibility system, which significantly improved beneficiaries’ outcomes through expedited benefits decisions. Still, the algorithm’s opaque inner workings restricted appeals and oversight, resulting in a class action lawsuit in which the court deemed the algorithm “unconstitutionally arbitrary.”

Government benefits must be distributed equitably and in adherence to policy to be democratic, and the opacity of algorithmic systems makes it difficult to track compliance and hold programs accountable.

Building systems that serve both innovation and democracy

Technology outcomes are a mirror: they reflect the incentives, requirements, and values of the contexts in which technological decisions are made. The choice isn't between private sector innovation and democratic accountability; it's designing systems that seek the right balance in each context. The practical approaches below offer a way to navigate these contradictions:

  • Build for reliability over features. Though citizens may say they want experiences to work like Netflix or Amazon, studies show that government services focused on reliability and customer experience outperform those prioritizing flashy features.

  • Maintain transdisciplinary approaches and capabilities. Tech solutionism assumes every problem has a technological solution, when some democratic challenges require political, social, or cultural considerations and outcomes that can’t be automated or optimized. 

  • Track democratic outcomes. In digital environments, metrics tracked are often informed by private efficiency practices. Yet, democratic outcomes, like privacy, transparency, representation, and accountability, are rarely tracked. Articulate democratic values and outcomes alongside technical requirements, develop metrics that reflect them, and pilot systems with the people most likely to be excluded before full deployment. 

  • Test iteratively. Even well-executed services can develop vulnerabilities that must be addressed. Operationalize approaches like agile or six-week sprints that allow for cycles of learning and iteration.

  • Maintain human touchpoints alongside digital channels. Automated systems work best when combined with accessible human support for complex cases, appeals, and users who can't navigate digital interfaces. 

  • Build technological literacy in those overseeing projects and procurement. Leadership and staff must have a contextual understanding of technologies and their dynamics so that they can effectively oversee vendors and make informed procurement decisions.
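The "track democratic outcomes" point above can be made concrete with even very simple metrics. As a hypothetical sketch (the function, group names, and numbers are illustrative, not drawn from any city's actual data), a team might compare who participates on a digital platform against census demographics to flag underrepresented groups:

```python
# Hypothetical sketch of a "representation gap" metric for a digital
# engagement tool. All data and group names below are illustrative.

def representation_gap(participants: dict[str, int],
                       census_share: dict[str, float]) -> dict[str, float]:
    """For each group, return participant share minus census share.

    A negative value flags a group that is underrepresented on the
    platform relative to its share of the population.
    """
    total = sum(participants.values())
    return {
        group: round(participants.get(group, 0) / total - share, 3)
        for group, share in census_share.items()
    }

# Illustrative numbers: non-English-speaking households are 12% of the
# city's population but only 4% of platform participants.
gaps = representation_gap(
    participants={"english": 960, "non_english": 40},
    census_share={"english": 0.88, "non_english": 0.12},
)
print(gaps)  # the negative "non_english" value flags underrepresentation
```

Tracking a number like this alongside standard efficiency metrics (processing time, cost per transaction) is one way to make a democratic value measurable before and after deployment.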

Civic leaders can ask not just "Does this work?" but "Does this strengthen our democratic compact with citizens?" From this perspective, contradictions become context-aware design challenges grounded in real leadership decisions, not merely technical problems. When approached with intention and care, government innovation can incorporate the best of private sector practice while strengthening democratic values and outcomes.


At Public Servants, we help governments design systems that honor democratic values while embracing innovation. Learn about our services or contact us for support bridging the gap.


Lauren Sinreich

Lauren Sinreich is a strategic design and innovation consultant with two decades of cross-sector experience at the intersection of technology, systems thinking, and human-centered design. She has worked across federal agencies including the Veterans Health Administration, civic innovation including projects with The Mayor's Office in San Francisco, and Fortune 100s like ServiceNow and Bank of America.
