P. W. Singer is an incredibly influential author with a proven ability to grasp the complexities of megatrends at the intersection of technology, international relations, and modern conflict. His books have shaped the strategies of both global corporations and nations. His latest book, written with Emerson T. Brooking, dives deep into how technologies available to all of us are now part of modern warfare. In LikeWar: The Weaponization of Social Media, Singer and Brooking provide insights into how new dynamics around social media have forever changed conflict, bringing billions of users into the battle through the applications installed on their smartphones.
From the book overview:
P. W. Singer and Emerson Brooking tackle the mind‑bending questions that arise when war goes online and the online world goes to war. They explore how ISIS copies the Instagram tactics of Taylor Swift, a former World of Warcraft addict foils war crimes thousands of miles away, internet trolls shape elections, and China uses a smartphone app to police the thoughts of 1.4 billion citizens. What can be kept secret in a world of networks? Does social media expose the truth or bury it? And what role do ordinary people now play in international conflicts?
OODAloop’s Matt Devost and Bob Gourley had an opportunity to interview Singer for additional context on key issues examined in the book. Our questions and his responses are below:
Matt Devost: We’ve already seen influence operations move into the realm of social media platforms like Twitter and Facebook. Is that the primary battleground of the future, or is there an emerging technology or platform we should be thinking about?
P.W. Singer: Whether it is marketing for a new widget or a military information operation, the most successful operations move across the platforms, and indeed leverage activity in one for the other. Like so much else, the Russian operations targeting the US elections in 2016 and again now in 2018 illustrate this diversified approach well. They were active on Twitter and Facebook, but also planting imagery into Instagram, Reddit, etc. They used each platform both as a test group for what worked and as a way to push those successes wider. This also points to another key change. While there was a massive amount of turnover in the early years of social media (in the book we also talk about Six Degrees, Friendster, and MySpace (remember those?)), it has reached a kind of adolescence with less turnover. Part of this is because we’ve become so enmeshed in the networks (Facebook has all your vacation photos), but more so because of their sheer power (e.g. Facebook buying up new firms like Instagram before they become rivals, so that even if youth move away from it, they’ll still control the next platform).
Bob Gourley: What in your view can businesses do to reduce risk in this modern age of information war?
P.W. Singer: The lesson of the book for business is the same as it is for governments: Start planning for the new ways that people fight online, or be the next victim. Just as they have had to do with cyberwar and cybersecurity (hacks of the network) over the last decade, business needs to adjust for the new side of what we call “LikeWar” (hacks of the people on the networks via likes and lies). It is interesting that many cybersecurity companies have set up new divisions for this new area. For broader industry, it involves catching up their understanding of the new threats building beyond their networks, as well as wargaming out their likely responses. Companies like Toyota or Domino’s Pizza have all had to deal with this issue over the last few years. Indeed, Nike just became the target of a massive push by the same Russian bots that were targeting our election (the Russians were trying to layer their op onto the Colin Kaepernick controversy as a means to create further division). Companies might even deploy the very same tools in response, using their own armies of bots and sockpuppets to steer the debate (Amazon, for example, has set up an effort that is a near mirror of units created by the Israeli and Russian militaries to influence online debate).
Bob Gourley: What did your research lead you to conclude about the potential collateral damage of cyber attacks? Are individuals and small businesses at risk? Who can they turn to for protection in this new age?
P.W. Singer: When you think about the potential risks to a company (targeting its overall brand or reputation, or planting information to affect a share price), the impact of this side could be the same as or even greater than a traditional network breach. The problem is akin to cybersecurity in that it is often not clear whom to turn to. Government is not well equipped to respond, while the tech platforms are struggling with their own roles and responsibilities. That was part of the reason for the book: to help lay out the problem, establish the new rules of the game, and then propose a series of actions that can be taken by government, by business, and by each of us as individual users.
Matt Devost: We are already seeing some less technologically sophisticated cultures being influenced through bad fakes (e.g. Photoshopped images), as they aren’t finely tuned to question images. With emerging deepfake technology, even the most sophisticated technologists will not be able to detect whether videos are real or fake. How do we establish standards or mechanisms for introducing trust into the social graph? Do you envision that this is a role for AI?
P.W. Singer: Like everything else, there will be no silver-bullet solution to this, as it’s a war of two sides going back and forth. That makes LikeWar almost Clausewitzian. The difference in the future is that the two dueling sides will likely be AI: one side creating information and personas that are incredibly difficult for a human to recognize as real or fake, and the other trying to police the network for them. But, I should note, the companies are conflicted on this, as the technology of chatbots and the like may be useful as a weapon for tricking people, but it is also good for digital marketing, help-desk savings, etc.
That means there may be a role for policy here as well, to decide for them in ways that keep the public interest in mind. Senator Warner, for instance, has floated the idea of requiring bots to be labeled as such, so that humans know when they are interacting with a fake persona online. It is both fascinating policy that could help with a lot of impending problems…and it’s effectively the Blade Runner rule!
Bob Gourley: What are your views regarding the challenges of unfair AI (AI that is biased or unethical, or that leads to conclusions that are unfair to consumers)? Have you seen indications of solutions that companies can put in place to mitigate problems with the AI they are fielding?
P.W. Singer: Algorithmic bias is a real problem. We see it happening both from design teams that don’t understand the implications of what they are building and from beta tests gone awry, where the online crowd manipulated the results (the worst example being when a chatbot was steered toward spewing Nazi propaganda).
In the book, we push the idea that, given the growing stakes, the companies should more rigorously “wargame” their technology before it is pushed out into the world. It was one thing to beta test a food-rating app by just pushing it out there and seeing what happens. It is another when you are talking about the nervous system of the modern world. That is, akin to how good firms now test for cybersecurity vulnerabilities, the same needs to be done for the LikeWar side: explore how bad guys might use a product, and how even good guys might accidentally misuse it. Doing so beforehand would limit a lot of the worst surprises, save companies major brand headaches, and maybe even save lives.
OODA Loop Comment: This book deserves a place on the bookshelf of every corporate strategist and government leader.