
Commissioners Krebs, Hurd, et al. Deliver Commission on Information Disorder Final Report

We posted an analysis back in early November about former CISA Director Krebs’ and former Congressman William Hurd’s participation as Commissioners on the Information Disorder Report. At that time, the Commission had released an interim report (along with a webcast) discussing its findings to date and its roadmap for completing the final report. Congressman Hurd is a member of the OODA Network. The final report is now available.

The Commission on Information Disorder final report opens with a letter from the co-chairs (Katie Couric, Chris Krebs, Rashad Robinson) clearly sounding an alarm:

“Information disorder is a crisis that exacerbates all other crises. When bad information becomes as prevalent, persuasive, and persistent as good information, it creates a chain reaction of harm.”

In our recent coverage of CSET’s AI and the Future of Disinformation Campaigns, Part 1: The RICHDATA Framework, we noted that what the Aspen Institute Commission has chosen to characterize as “information disorder” is more commonly characterized as “mis- and disinformation.” The CSET authors offered this perspective on the issue, which complements the characterization offered by the Commission co-chairs:

“Artificial intelligence (AI), specifically machine learning (ML), is poised to amplify disinformation campaigns—influence operations that involve covert efforts to intentionally spread false or misleading information.”

We promised an expanded analysis of the CSET framework as soon as Part 2 of the series, which “examines how AI/ML technologies may shape future disinformation campaigns and offers recommendations for how to mitigate them,” is made available. We also mentioned that we would take a look at the recently released Commission on Information Disorder Final Report, with an eye toward why the Commission chose to differentiate its misinformation research efforts as “information disorder.” What sets this research apart?

Working Definitions

The Commission provides these working definitions in a sidebar of the report. They go a long way toward framing the issue, and we thought the OODA readership would want them as a quick reference as you frame your organization’s research efforts:

The terms disinformation and misinformation are defined in a variety of ways.  The Commission employed the following definitions for the purposes of this report.

Information disorder, coined by First Draft Co-Founder Claire Wardle, denotes the broad societal challenges associated with misinformation, disinformation, and malinformation.

Disinformation is false or misleading information, intentionally created or strategically amplified to mislead for a purpose (e.g., political, financial, or social gain).

Misinformation is false or misleading information that is not necessarily intentional.

“There is an incentive system in place that manufactures information disorder…”

A Call for Leadership and a Framework for Action

As with the recent ‘call to action’ over at DHS CISA, we are seeing a pattern in a few advisory boards and research projects that commit to populating their ranks with a mix of researchers, academics, and practitioners. The practitioners seem to be winning the day, as the deliverables on these projects make clear that they are not business as usual (i.e., delivering a weighty white paper with no real “there” there). They want their recommendations to be immediately actionable, as they feel time is of the essence. They are also creating accessible frameworks, a baton pass of sorts, made publicly available for those who need to tackle the issue at hand.

So too with the Commission on Information Disorder. They start with leadership and share the research questions they were asking themselves during their six-month exploration of misinformation and how to solve it:

“Proactive leadership, rising from within every sector and institution in our society, is our only way out of this crisis. And yet it is sorely missing. The committed and powerful leadership we need is not yet the leadership we have. Accordingly, the biggest question we faced as co-chairs of the Aspen Institute’s Commission on Information Disorder was simply this: How can we help increase the breadth, depth, honesty, and efficacy of leadership for tackling information disorder?

“The shared belief of the Commission co-chairs is that one critical catalyst for bringing about the leadership we need is the establishment of a framework for action—a path toward change. It must be paved with well-researched and real-world solutions, which people affected by mis- and disinformation can demand their leaders walk down. And it must be clear enough to help responsible leaders stay on track toward something real.”

One thing that definitely sets this report apart is something we have not seen in any of our other analyses of disinformation research projects. The Commission addresses “the biggest lie of all,” which has become fatalistic conventional wisdom, and “which this crisis thrives on…that the crisis itself is uncontainable. One of the corollaries of that mythology is that, in order to fight bad information, all we need is more (and better distributed) good information. In reality, merely elevating truthful content is not nearly enough to change our current course. There is an incentive system in place that manufactures information disorder, and we will not address the problem if we do not take on that system, nor will we improve if we fail to address the larger societal issues that continue to divide us.”

“Saying that the disinformation is the problem— rather than a way in which the underlying problem shows itself—misses the point entirely.” – Mike Masnick

Scope and Approach

The Commission very consciously did not want to boil the misinformation ocean, acknowledging early in the final report that while “both mis- and disinformation caused harm in many areas, [the Commission’s] recommendations place special emphasis on a set of narrower categories of misinformation harms from empirically grounded domains (e.g., threats to public health, elections) which can be evaluated for information quality and accuracy by professional bodies with established standards and domain expertise.”

This commitment to categories from domains that can measure quantifiable negative outcomes from misinformation campaigns is a powerful insight that sets this report apart from other research efforts. As a result, the Commission chose to focus its attention on three priorities:

1. Increasing transparency and understanding: Enhancing access to and inquiry into social media platforms’ practices, and a deeper examination of the information environment and its interdependencies.
2. Building trust: Exploration of the challenges the country faces in building and rebuilding trust in the institutions people count on to support informed public discourse and debate, and the role that access to reliable facts and content plays in those conversations.
3. Reducing harms: Interventions that reduce the worst harms of mis- and disinformation, such as threats to public health, democratic participation, and targeting of communities through hate speech and extremism.

The immediate commitment to harm reduction – actionable now – is worth noting as another unique deliverable from this Commission relative to other projects we have researched.

“An “everyone’s responsible” stance does not absolve tech companies—from social media platforms to search engines to digital messaging services—of the sins of its various products and services.”

Key Insights and Context

The bulk of the final report grew out of the Commission’s conversations with over 25 subject matter experts (SMEs), “creating more than 600 minutes of expert understanding on information disorder.” Besides Commissioner Hurd and Commission Co-chair Krebs, OODA Loop team members have a deep intellectual familiarity and/or personal history with some of the SMEs who contributed to this report, having crossed paths with them professionally on projects that were the intellectual precursors to this study of the misinformation crisis, including Commissioner Deb Roy, Danah Boyd, Jim Steyer, Ethan Zuckerman, and Jeff Kosseff. They have all done vital work in the past, and their participation makes this research effort that much more impressive; they are among the heaviest of hitters in their respective disciplines. In a brilliant open-source move, the Commission provides videos of each of these “Disinfo Discussions” (links to which are available at the end of this post).

Building on its limited scope and focused approach, the report offers the following insights and context, which inform its final recommendations:

  • Disinformation is a symptom; the disease is complex structural inequities.
  • The absence of clear leadership is slowing responses.
  • Trade-offs between speech and misinformation are not easy.
  • Disinfo doesn’t just deceive; it provides permission: supply meets the demand.
  • The platforms’ lack of transparency is hampering solutions.
  • Online incentives drive ad revenue, not better public discourse.
  • Broken norms allow bad actors to flourish.
  • Local media has withered, while cable and digital are unaccountable.

Commission Final Report Recommendations

“One of the most challenging aspects of addressing information disorder is confronting the reality that “disinformation” and information campaigns by bad actors don’t magically create bigotry, misogyny, racism, or intolerance—instead, such efforts are often about giving readers and consumers permission to believe things they were already predisposed to believe.”

Recommendations to Increase Transparency

Public interest research: This recommendation contains two distinct proposals: one focused on public data and the other focused on private data. Congress should implement protections for researchers and journalists who violate platform terms of service by responsibly conducting research on public data of civic interest. Separately, it should also require platforms to disclose certain categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.

High-reach content disclosure:  Congress should require all social media platforms to regularly publish the content, source account, and reach data for posts that they organically deliver to large audiences.

Content moderation platform disclosure:  Congress should require all social media platforms to disclose information about their content moderation policies and practices, and to produce a time-limited archive of moderated content, in a standardized format that will be available to authorized researchers.

Ad transparency:  Congress should mandate that social media companies regularly disclose, in a standardized format, key information about every digital ad and paid post that runs on their platforms. Paid posts, including political advertising, can be a powerful vector for misinformation—often insulated from scrutiny and correction thanks to techniques targeting small communities.

“We are in a crisis of trust and truth.”

Recommendations to Build Trust

Truth and transformation: Commissioners endorse efforts that focus on exposing how historical and current imbalances of power, access, and equity are manufactured and propagated further with mis- and disinformation—and on promoting community-led solutions to forging social bonds that can resist it.

Healthy digital discourse:  Develop and scale communication tools, networks, and platforms that are designed to bridge divides, build empathy, and strengthen trust among communities.

Workforce diversity:  Increase investment and transparency to further diversity at social media platform companies and news media as a means to mitigate misinformation arising from uninformed and disconnected centers of power.

Local media investment: Promote substantial, long-term investment in local journalism that informs and empowers citizens, especially in underserved and marginalized communities that are most likely to be harmed by, or are most vulnerable to, mis- or disinformation.

Accountability norms: Call on community, corporate, professional, and political leaders to promote new norms that create personal and professional consequences within their communities and networks for individuals who willfully violate the public trust and use their privilege to harm the public.

Election information security:  Improve U.S. election security and restore voter confidence with improved education, transparency, and resiliency. This will include proactive outreach communications, updated educational content, and greater transparency and resiliency around elections, election infrastructure, and audits as a means to counter false narratives.

“Currently, the U.S. lacks any strategic approach and clear leadership in either the public or the private sector to address information disorder. The federal government has been ill-equipped and outpaced by new technologies and the information ecosystems that take shape around them.”

Recommendations to Reduce Harms

Comprehensive federal approach: The Administration should establish a comprehensive strategic approach to countering disinformation and the spread of misinformation, including a centralized national response strategy, defining roles and responsibilities across the Executive Branch, and identifying gaps in authorities and capabilities.

Public Restoration Fund: Establish an independent organization, with a mandate to develop systemic misinformation countermeasures through education, research, and investment in local institutions.

Civic empowerment:  Major online platforms should provide investment and innovation in online education and platform product features to increase users’ awareness and resilience to online misinformation.

Superspreader accountability:  Online platforms should hold superspreaders of mis- and disinformation to account with clear, transparent, and consistently applied policies that enable quicker, more decisive actions and penalties, commensurate with their impacts—regardless of location, political views, or role in society.

Amendments to Section 230 of the Communications Decency Act of 1996:  This recommendation contains two separate proposals to amend Section 230. First, withdraw platform immunity for content that is promoted through paid advertising and post promotion. Second, remove immunity as it relates to the implementation of product features, recommendation engines, and design.

In our final analysis, of the many formative efforts to research and provide solutions to the misinformation crisis, this report is the seminal document to date on how best to frame the issue.

Further Resources

Former CISA Director Krebs and Former Congressman William Hurd both Commissioners on Information Disorder Report

The aforementioned Disinfo Discussions at The Aspen Institute, of which the report is comprised:

Fundamentals of Mis- and Disinformation – Danah Boyd

Decline of Trust – Ethan Zuckerman

Section 230 and the First Amendment – Mary Anne Franks and Jeff Kosseff

Youth and Media Literacy – Jim Steyer

Congressman Will Hurd is an OODA Network Member.  See:

Will Hurd on Skills For Success In The Modern Age

OODAcast with Congressman Will Hurd on AI, 5G, Cybersecurity Risk and Geopolitical Risk

A CIA Officer and Delta Force Operator Share Perspectives on 9/11

About the Author

Daniel Pereira

Daniel Pereira is research director at OODA. He is a foresight strategist, creative technologist, and an information communication technology (ICT) and digital media researcher with 20+ years of experience directing public/private partnerships and strategic innovation initiatives.