
Data-Driven Public Private Partnerships: 3 Areas of Risk for Public Organisations to Understand

Authors: Hannah Wood, Joanna van der Merwe and Melissa Amoros-Lark

Public Private Partnerships (PPPs) aren’t inherently good or bad, but when they deal with data they can be risky. Yet the innovations of the private sector offer efficient solutions to many of the problems facing public organisations.

Last year we asked what it would take for a private company like Palantir to become an acceptable ally for the public sector. As we dug deeper into this topic, we found that in many cases PPPs are the only option. They’re increasing in both quantity and depth, and the public sector is becoming more and more dependent on private services.

Another trend in these partnerships is the centrality of data. Transfers of large datasets containing personal information are becoming more and more common. In response to this convergence of dependency and data-centrality in PPPs, we’ve identified three areas of risk in data-driven public private partnerships (DDPPPs) for decision- and policy-makers to be aware of, to help them assess the potential dangers of handing over data:

  • Public sector dependency on private data services 
  • Entrenchment and opacity of privately supplied data services 
  • Disparities in risk-taking 

Our advice to public sector policymakers: use these areas as building blocks of a framework that responsibly arranges DDPPPs to reduce the risks involved.  

What are data-driven public private partnerships?

Traditional PPPs, according to the Organisation for Economic Co-operation and Development (OECD), can generally be defined as “long term contractual arrangements between the government and a private partner whereby the latter delivers and funds public services using a capital asset, sharing the associated risks.”

DDPPPs are a distinct form of PPP, however, and the Centre has defined them as follows:

Data-Driven Public-Private Partnerships (DDPPPs) are agreements between a public organisation and a private actor where the latter provides a product and/or service to the former addressing a public sector challenge. The solution relies heavily on the exchange of significant amounts of data, often including personal, demographic and/or sensitive data. A key element of this definition is the active engagement of both parties, throughout the course of the partnership, in developing the solution. This leads to a co-developed bespoke solution to the public sector’s challenge.

Partnerships that rely heavily on data are more frequent than ever, and often include services that the public sector may not have the capacity to handle. This year, the UK’s National Health Service (NHS) partnered with Amazon, Microsoft, Faculty and Palantir to assist with COVID-19 contact tracing applications. Palantir has worked with government partners such as the U.S. Immigration and Customs Enforcement and the Netherlands National Police Services Agency. Google’s Sidewalk Labs has partnered with the city of Toronto to develop the first fully smart city. In the humanitarian sector, Mastercard, Alibaba and Palantir have provided logistical and financial services in support of the World Food Programme’s (WFP) goal of fighting global hunger.

Risk Area 1: Public sector dependency on private data services 

Within eight years, the average person is predicted to interact with a data-generating device 4,800 times a day. Making relevant and meaningful use of this enormous amount of data is a challenge facing the public sector. Large tech firms such as Amazon, Google and Microsoft provide some of the most sophisticated data services, and they aren’t going anywhere. While competition may vary, it’s likely that the public sector will be forced to rely on such private partners.

This risk of dependency runs not only at the partner level but also at the service level. Public sector organisations enjoying bespoke solutions provided by companies may find themselves unable to replace these unique services, deepening their dependence on their private partners. This is problematic because once a privately supplied data service is embedded into an organisation’s internal ways of working, extracting it becomes difficult.

With this increased dependence on private sector services comes a reduced ability to mitigate inherent technical, reputational and operational risks, due to a lack of manoeuvrability. After awarding a $600 million contract to Amazon Web Services in 2014, the CIA recognised its potential dependence problem. Last year the intelligence agency announced that in 2021 it would procure around $10 billion of private data services from multiple providers to diversify its supplier portfolio.

Risk Area 2: Entrenchment and opacity of privately supplied data services 

A second area of risk emerges from the obscured nature of the algorithms used in many DDPPPs. The way these algorithms are built, and the way their intellectual property is protected, reduces the level of control held by the public agencies that procure these services.

Many of these algorithms are created by feeding large amounts of similar data into a system that learns to categorise it, and then makes predictions about new cases based on what it has previously seen. Such algorithms often form the basis of the analytical tools that private vendors offer to public organisations in DDPPPs.
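
To make this pattern concrete, here is a minimal sketch of the training-and-prediction loop described above. It is our own illustration, not any vendor’s actual system: the data is synthetic, the attribute names are hypothetical, and we use scikit-learn’s off-the-shelf RandomForestClassifier simply as a stand-in for a proprietary model.

```python
# A minimal illustration (not any vendor's real system) of the pattern
# described above: historical records go in, the model learns to
# categorise them, and it then predicts outcomes for new cases.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical historical records: each row is a past case described by
# a handful of numeric attributes (all synthetic here), labelled with
# the outcome that was later observed.
X = rng.normal(size=(1_000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1_000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "system that categorises": an off-the-shelf model fitted to the
# historical data.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Predictions for new, unseen cases are based entirely on patterns in
# what was previously inputted.
new_case = X_test[:1]
print("predicted category:", model.predict(new_case)[0])
print("predicted 'risk' score:", model.predict_proba(new_case)[0, 1])
```

Note that nothing in this loop explains *why* a given case received its score; the mapping from inputs to outputs lives inside the fitted model.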

Algorithms are often described as a black box, even to their engineers: the relationship between input and output cannot be fully explained. Not only are they unexplainable, but the accuracy of their predictions is also opaque. Researchers and ethicists are debating the extent to which these decision-making algorithms can be trusted, and point to the need for better literacy on the side of both the user and the creator.
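
Part of that literacy is knowing that a black box can still be probed from the outside. As a sketch, again on synthetic data of our own invention, scikit-learn’s standard permutation_importance utility measures how much each input actually drives a model’s predictions, and held-out accuracy benchmarks how reliable those predictions are, without ever opening the box:

```python
# Probing a black box from the outside: we never inspect the model's
# internals, only how its accuracy responds to perturbed inputs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1_000, 5))                # synthetic cases
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic outcomes
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# First question for any opaque tool: how accurate is it on data it has
# never seen? A vendor's claims should be benchmarked, not assumed.
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Second question: which inputs actually drive the output? Shuffling one
# feature at a time and measuring the accuracy drop (permutation
# importance) reveals what the model leans on, without opening the box.
probe = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=0)
for i, drop in enumerate(probe.importances_mean):
    print(f"feature {i}: accuracy drop when shuffled = {drop:.3f}")
```

Checks of this kind require access to the model and representative test data, which is precisely what procurement contracts often fail to secure.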

Using algorithms that are potentially flawed or misinterpreted poses significant challenges when making policy-altering and life-altering decisions. In 2016, ProPublica analysed COMPAS, a risk assessment tool used in the justice system, and found that black defendants were 45% more likely than white defendants to be determined higher risk of committing a subsequent crime. In Wisconsin, for instance, a man sentenced to six years in prison in part due to a COMPAS prediction that he had “a high risk of violence, high risk of recidivism, high pretrial risk” was “free to question the assessment and explain its possible flaws.” Yet while the legal system allowed the assessment to be questioned, neither the justice system nor the lawyers were prepared or able to sufficiently explain either the benefits or the harms of the algorithm in court.

Another factor that obscures the decision-making processes of algorithms is the need for businesses to protect their intellectual property. An executive of the company behind COMPAS pointedly stated that “the key to our product is the algorithms, and they’re proprietary…it’s certainly a core piece of our business.”

It’s common practice for consulting firms such as KPMG, Deloitte and McKinsey to make analytical decisions for public sector organisations. But the risk in DDPPPs is that the data tools introduced into the public sector become entrenched and institutionalised within public sector decision-making. No longer just supplemental processes, these tools may end up beyond an organisation’s control and beyond its capacity for due scrutiny. Once a tool has been institutionalised, removing it from the standard way of working becomes increasingly difficult.

Risk Area 3: Disparities in risk-taking 

Becoming dependent on a data service isn’t necessarily a bad thing. However, when that dependence is coupled with entrenched, obscured decision-making tools and little choice of substitute, it is the public sector that takes on the higher risk in the partnership.

This imbalance can generate many risks and lead to many forms of rights violations. In 2016, the UK National Health Service shared non-anonymised data on 1.6 million patients with Alphabet’s DeepMind without notifying the data subjects. In this exchange the NHS risked exposing the personal information of many people to a private enterprise. Despite the clear risks, the partnership continued.

In 2019, the NHS partnered with Amazon Alexa so that general practitioners and pharmacists could offer verbal advice through the voice-activated device. The anonymised data also helped Amazon create, advertise and sell new products and features of its own. The health service is becoming more and more reliant on private services and is likely to engage in more high-risk DDPPPs as technology develops.

This year in the UK, after A-Level exams were cancelled due to coronavirus measures, an algorithm downgraded 40% of students’ expected results. These results determine access to education and affect employment prospects. Notably, results tended to rise for students from schools with better track records and fall for students from schools with poorer ones. There is now an individualised review process for students who received lower-than-predicted scores, but it is unclear to what extent this review will rectify the scale of the problem the algorithm caused. The episode has forced the allocation of extra resources and inflicted major reputational damage on Ofqual, the public qualifications body.

Public sector organisations often lack the knowledge to push back or to grasp the potential impacts, and therefore end up bearing more of the risk. This risk exposes the organisations’ service users to greater privacy and rights infringements.

Conclusion  

Ideally the public sector would be able to choose to partner with acceptable allies, and incentivise the private sector to reduce the risks inherent in DDPPPs. But for now, the dynamic between the two sectors is one marked by a stark power imbalance. 

To gain some ground, public organisations should pursue frameworks for procurement of private data services which respond to these risk areas of DDPPPs and emphasise the security of the data subjects. Such frameworks would address dependence at both supplier and service levels. They would seek to ensure that private services cannot be so entrenched that they cannot be discontinued or have no substitute. Moreover, they would cement transparency into the partnership and provide a foundation for scrutiny and accountability of these services.  

Another starting point is to look to the shared challenges that face both the private and public sectors. By finding common ground and tackling pain points of the two sectors, frameworks for good behaviour could incentivise private actors into acting more like acceptable allies.  

Until such frameworks emerge, the public sector is left with a trade-off between the innovative solutions DDPPPs offer and the risks inherent in them.