
Renowned Encryption Experts Sound the Alarm on Client-Side Scanning (CSS)

While Facebook may be mired in Congressional controversy in the U.S. and legal battles worldwide, public safety and law enforcement officials continue to search for a viable technological means of lawfully gaining access to information on smartphones. The approach currently under consideration, Client-Side Scanning (CSS), grows out of an August 2021 proposal from Apple, Inc., which claims a limited scope: searching smartphones only for illegal images of abuse; preventing the acquisition of non-targeted materials through advanced cryptography; and letting users opt out of iCloud backup of their Camera Roll to avoid client-side scanning altogether.

A group of technologists who, for the last 25 years, have come together to co-author seminal articles warning about threats to encryption has released yet another warning, this time concerning CSS. The paper, titled Bugs in our Pockets: The Risks of Client-Side Scanning, includes recommendations for lawmakers and policymakers.

What is CSS?

The authors first explain how CSS works:

“Some in industry and government now advocate a new technology to access targeted data: client-side scanning (CSS). Instead of weakening encryption or providing law enforcement with backdoor keys to decrypt communications, CSS would enable an on-device analysis of data in the clear. If targeted information were detected, its existence and, potentially, its source, would be revealed to the agencies; otherwise, little or no information would leave the client device. Its proponents claim that CSS is a solution to the encryption versus public safety debate: it offers privacy in the sense of unimpeded end-to-end encryption and the ability to successfully investigate serious crime.”
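Mechanically, the on-device analysis the authors describe amounts to fingerprinting local content and checking it against a database of targeted material. The sketch below is a minimal, hypothetical illustration only: the hash function, target database, and file-walking logic are placeholders, not any vendor's actual design (Apple's proposal, for instance, used perceptual hashing and private set intersection rather than plain cryptographic hashes).

```python
import hashlib
from pathlib import Path

# Hypothetical database of "targeted" content fingerprints distributed to
# the device. An exact cryptographic hash is used here purely for
# illustration; deployed proposals use perceptual hashes instead.
TARGET_HASHES = {
    # sha256 of the placeholder content b"test"
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_file(path: Path) -> bool:
    """Return True if the file's fingerprint matches the target database."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in TARGET_HASHES

def scan_device(root: Path) -> list[Path]:
    """Scan every regular file under `root`; matches would be reported."""
    return [p for p in root.rglob("*") if p.is_file() and scan_file(p)]
```

Because an exact hash matches only bit-identical files, real proposals substitute perceptual hashes that tolerate re-encoding and cropping — a substitution that, as the authors argue, opens its own avenues for evasion and abuse.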

They then state the core issue surrounding the implementation of CSS proposals as an alternative to allowing law enforcement access to encrypted user data or decryption capabilities on smartphones:

“At a casual glance, CSS systems may seem an opportunity to provide a compromise approach to surveillance. Data can be encrypted end-to-end in transit and at rest (e.g., in encrypted backup systems), rather than being available in cleartext on services on which governments can serve warrants. Involving the user’s device in the CSS process may allow for some sort of transparency; perhaps some cryptography can help verify properties of the scan prior to its execution or limit the purpose or pervasiveness of scanning. CSS may also allow for some rudimentary user control, as users may be able to decide what content can be scanned or remove the scanning altogether.

The introduction of CSS would be much more privacy-invasive than previous proposals to weaken encryption. Rather than reading the content of encrypted communications, CSS gives law enforcement the ability to remotely search not just communications, but information stored on user devices.”

What are the Implications of the Deployment of CSS?

In the end, however, the authors do not mince words in their warning about the technological implications of the recent CSS proposals:

“CSS makes law-abiding citizens more vulnerable with their personal devices searchable on an industrial scale. Plainly put, it is a dangerous technology. CSS by its nature creates serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic. There are multiple ways in which client-side scanning can fail, can be evaded, and can be abused.

Its proponents want CSS to be installed on all devices, rather than installed covertly on the devices of suspects, or by court order on those of ex-offenders. But universal deployment threatens the security of law-abiding citizens as well as lawbreakers. Technically, CSS allows end-to-end encryption, but this is moot if the message has already been scanned for targeted content.

In reality, CSS is bulk intercept, albeit automated and distributed. As CSS gives government agencies access to private content, it must be treated like wiretapping. In jurisdictions where bulk intercept is prohibited, bulk CSS must be prohibited as well. Although CSS is represented as protecting the security of communications, the technology can be repurposed as a general mass surveillance tool.”

CSS and Previous Threats to Encryption

Professor Steven Bellovin, the lead co-author of the paper, is the Percy K. and Vida L.W. Hudson Professor of Computer Science at Columbia University and an affiliate faculty at Columbia Law School. In 2012-13, Prof. Bellovin was the Chief Technologist for the United States Federal Trade Commission. In a brief phone interview, Prof. Bellovin shared the history of collaboration between the co-authors of this paper:

“Most of the authors have been working together for 25 years addressing any threats to encryption. Our initial work was a white paper in response to weaknesses in the “key escrow” system of the government’s Clipper Chip initiative back in the late 1990s. In 2015, the group on this paper published another paper, ‘Keys Under Doormats’, in response to [then] FBI Director Comey and the Prime Minister of the UK at the time, David Cameron, reopening the debate on what should be considered ‘exceptional access’ to encrypted data for law enforcement purposes.

The bulk of the authors have worked together on these issues for many years. On this “Bugs in our Pockets” paper concerning CSS, we made sure to invite international experts to co-author with us, as well as offer a bridge on the topic to some younger scholars in the field to make sure we had a broad representation.

Our objective with this paper is to influence policymakers and legislators around the world. Australia already has laws on the books. The European Union is discussing CSS. My policy perspectives are as valid as any other person’s on the planet. But I caught my first hackers 50 years ago – and every technologist contributing to this paper is coming from the perspective of why CSS is a bad idea from a technology perspective based on years of experience and expertise.”

The usual collaborators with Prof. Bellovin, who have once again contributed to this paper, include:

Hal Abelson: Class of 1922 Professor of Computer Science and Engineering in the Department of Electrical Engineering and Computer Science at Massachusetts Institute of Technology.

Ross Anderson: Professor of Security Engineering at the University of Cambridge and at the University of Edinburgh.

Josh Benaloh: Senior Principal Cryptographer at Microsoft Research and an Affiliate Professor in the Paul G. Allen School of Computer Science and Engineering at the University of Washington.

Matt Blaze: Professor of Law; Robert L. McDevitt, K.S.G., K.C.H.S., and Catherine H. McDevitt L.C.H.S. Chair, Department of Computer Science at Georgetown University.

Whitfield Diffie: Chief Security Officer of Sun Microsystems, Retired.

Susan Landau: Bridge Professor of Cyber Security and Policy at The Fletcher School and at the School of Engineering, Department of Computer Science, at Tufts University.

Peter G. Neumann: Principal Scientist in the Computer Science Laboratory at SRI International.

Ronald L. Rivest: Institute Professor at Massachusetts Institute of Technology.

Jeffrey I. Schiller: Enterprise Architect at Massachusetts Institute of Technology.

Bruce Schneier: Fellow and Lecturer at Harvard Kennedy School, a fellow at the Berkman Klein Center for Internet & Society at Harvard University, and Chief of Security Architecture at Inrupt, Inc.

New co-authors, brought on board to offer additional generational and international perspectives on CSS, include:

Jon Callas: Director of Technology Projects at the Electronic Frontier Foundation.

Carmela Troncoso: Assistant Professor at École Polytechnique Fédérale de Lausanne (EPFL).

Vanessa Teague: CEO of Thinking Cybersecurity and an Associate Professor (Adj.) at the Research School of Computer Science at the Australian National University.

Conclusions and Recommendations

The Bugs in our Pockets paper offers the following conclusions and recommendations on CSS as an approach to solving the frustrations of law enforcement and public safety officials:

  • Even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope. We would then be hard-pressed to find any way to resist its expansion or to control abuse of the system.
  • The ability of citizens to freely use digital devices, to create and store content, and to communicate with others depends strongly on our ability to feel safe in doing so. The introduction of scanning on our personal devices – devices that keep information from to-do notes to texts and photos from loved ones – tears at the heart of privacy of individual citizens. Such bulk surveillance can result in a significant chilling effect on freedom of speech and, indeed, on democracy itself.
  • CSS has been promoted as a magical technological fix for the conflict between the privacy of people’s data and communications and the desire of intelligence and law enforcement agencies for more comprehensive investigative tools. A thorough analysis shows that the promise of CSS solutions is an illusion.
  • Technically, moving content scanning from the cloud to the client empowers a range of adversaries. It is likely to reduce the efficacy of scanning while increasing the likelihood of a variety of attacks.
  • Economics cannot be ignored. One way that democratic societies protect their citizens against the ever-present danger of government intrusion is by making search expensive. In the US, there are several mechanisms that do this, including the onerous process of applying for a wiretap warrant (which for criminal cases must be essentially a “last resort” investigative tool) and imposition of requirements such as “minimization” (law enforcement not listening or taping if the communication does not pertain to criminal activity). These raise the cost of wiretapping.
  • By contrast, a general CSS system makes all material cheaply accessible to government agents. It eliminates the requirement of physical access to the devices. It can be configured to scan any file on every device. And it has become part of some agencies’ vision (such as the UK Government Communications Headquarters’ [GCHQ] pitch document AI for national security).
  • It is unclear whether CSS systems can be deployed in a secure manner such that invasions of privacy can be considered proportional. More importantly, it is unlikely that any technical measure can resolve this dilemma while also working at scale. If any vendor claims that they have a workable product, it must be subjected to rigorous public review and testing before a government even considers mandating its use.
  • This brings us to the decision point. The proposal to preemptively scan all user devices for targeted content is far more insidious than earlier proposals for key escrow and exceptional access. Instead of having targeted capabilities such as to wiretap communications with a warrant and to perform forensics on seized devices, the [GCHQ’s] direction of travel is the bulk scanning of everyone’s private data, all the time, without warrant or suspicion. That crosses a red line. Is it prudent to deploy extremely powerful surveillance technology that could easily be extended to undermine basic freedoms?
  • Were CSS to be widely deployed, the only protection would lie in the law. That is a very dangerous place to be. We must bear in mind the 2006 EU Directive on Data Retention, later struck down by the European Court of Justice, and the interpretations of the USA PATRIOT Act that permitted bulk collection of domestic call detail records. In a world where our personal information lies in bits carried on powerful communication and storage devices in our pockets, both technology and laws must be designed to protect our privacy and security, not intrude upon it. Robust protection requires technology and law to complement each other. Client-side scanning would gravely undermine this, making us all less safe and less secure.
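The evasion point raised above — that moving scanning to the client is likely to reduce its efficacy — is easy to demonstrate for exact-match fingerprinting: a single appended byte produces an entirely different hash, so an altered copy slips past a blocklist built from the original. This toy example uses a cryptographic hash for clarity; perceptual hashes tolerate small edits but are in turn vulnerable to adversarially crafted collisions.

```python
import hashlib

# Evasion sketch: with exact hashing, a trivial one-byte change to the
# content yields a completely different fingerprint, so a match list built
# from the original file no longer flags the altered copy.
original = b"targeted image bytes"
altered = original + b"\x00"  # trivial, imperceptible change

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(altered).hexdigest()
assert h1 != h2  # the altered copy evades an exact-hash blocklist
```

This is precisely the tension the authors highlight: making the matcher more tolerant (perceptual hashing) trades one failure mode for another, rather than eliminating the problem.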

Related Reading:

A direct link to the paper: Bugs in our Pockets: The Risks of Client-Side Scanning.

For the previous position papers by the co-authors regarding proposed technology solutions and the threat they pose to encryption, see The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption and Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications | Journal of Cybersecurity.


About the Author

Daniel Pereira

Daniel Pereira is research director at OODA. He is a foresight strategist, creative technologist, and an information communication technology (ICT) and digital media researcher with 20+ years of experience directing public/private partnerships and strategic innovation initiatives.