Tech Should Advance Standards to Assess Information Quality 

By David Bray

David Bray, PhD is both a Distinguished Fellow and co-chair of the Alfred Lee Loomis Innovation Council at the non-partisan Henry L. Stimson Center. He is also a Distinguished Fellow with the Business Executives for National Security, a Fellow with the National Academy of Public Administration, and a CEO and transformation leader for different “under the radar” tech and data ventures. David has received both the Joint Civilian Service Commendation Award and the National Intelligence Exceptional Achievement Medal, and served as the Executive Director for two National Commissions involving advances in technology, data, national security, and civil societies.

By Vint Cerf

Vint Cerf, PhD is considered one of the founders of the Internet, along with Robert Kahn, with both later receiving the ACM A.M. Turing Award, the highest honor in computer science, for their "pioneering work on internetworking, including the design and implementation of the Internet’s basic communications protocols, TCP/IP, and for inspired leadership in networking." Vint contributes both to global policy development and to the continued standardization and spread of the Internet, and has served on many government panels related to cybersecurity and the national information infrastructure.

Over the past five decades, digital advances have enabled the democratization of information content and associated technologies in unprecedented ways. With this comes an increasing challenge in discerning credible and authentic digital information from that which is not. Organizations in the technology sector as well as standards bodies have a responsibility to advance tools and standards to help address this growing issue.

The democratization of digital information production and distribution has led to an unprecedented volume of online content. However, the ability of individuals to judge the quality and veracity of content for themselves has lagged. This is an alarming deficit that needs addressing.

Historically, assessing the credibility of information and testimony has been a complex and resource-intensive process, even in national security and judicial settings. Discerning “the truth, the whole truth, and nothing but the truth” requires triangulation and truth adjudication, even in cases of purported eyewitness accounts. The rapid pace and scale of online information exacerbates these challenges dramatically. New methods accessible to the public are urgently needed, building on information quality standards, which our colleagues the Hon. Ellen McCarthy and Doowan Lee have posited are a necessity in the era of AI.


This is sponsored content.  Consider publishing your national security-related, thought leadership content in The Cipher Brief, with a monthly audience reach of more than 500K national security influencers from the public and private sectors.  Drop us a note at [email protected].


For reasons of scale, relying solely on human moderation of digital content creates too great a risk of limiting free speech and enabling abuse. Often, what communities regard as truths are in fact beliefs, subject to perspectives, experiences, and change. Thus, any attempt at content moderation – either by humans or by machines – is inherently complex. For free societies which value individual thought and freedoms, content moderation may even backfire dramatically in hybrid conflicts, as actors seeking to polarize communities might turn blame, dissent, and disgust towards the moderators themselves as a “wedge” issue. Instead, civil societies need information quality standards and associated tools that aim to equip individuals to evaluate content quality for themselves.

Possible information quality standards could include metrics like those in development by the Trust in Media Initiative, including the degree of match between headlines and article content, the extent to which an article resembles a press release, the level of emotional language used, the number of named sources referenced, and the degree to which unedited source material is provided. Information quality standards should focus on objective, quantifiable measures as much as possible. Moreover, if a site is aggregating or reusing content from another source, including images or video clips, then the provenance of that content should be displayed as well.
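To make concrete how such metrics could rest on objective, quantifiable measures, here is a minimal sketch of three of them in Python. These heuristics are purely illustrative assumptions on our part, not the Trust in Media Initiative's actual definitions, and the small emotion lexicon is a placeholder for a real one.

```python
import re

def headline_body_overlap(headline: str, body: str) -> float:
    """Fraction of substantive headline words (length > 3) that also
    appear in the article body; a rough headline-content match score."""
    head = {w for w in re.findall(r"[a-z]+", headline.lower()) if len(w) > 3}
    text = set(re.findall(r"[a-z]+", body.lower()))
    return len(head & text) / len(head) if head else 0.0

def named_source_count(body: str) -> int:
    """Rough count of attribution phrases such as 'according to' or 'said'."""
    return len(re.findall(r"\baccording to\b|\bsaid\b|\btold\b", body.lower()))

# Placeholder lexicon; a real metric would use a vetted sentiment resource.
EMOTIONAL = {"outrageous", "shocking", "disaster", "terrifying", "unbelievable"}

def emotional_language_ratio(body: str) -> float:
    """Share of words drawn from the (illustrative) emotion lexicon."""
    words = re.findall(r"[a-z]+", body.lower())
    return sum(w in EMOTIONAL for w in words) / len(words) if words else 0.0
```

The point of keeping each measure this mechanical is that it can be computed, published, and independently re-checked by anyone, rather than depending on a moderator's judgment.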

Long-term trends on these information quality metrics for any online content producer are also valuable. For example, repeated mismatches between headlines and article content from a source may indicate broader credibility issues. Information quality standards could enable greater transparency of historical indicators for any website or content producer over time.

However, no one metric defines credibility. The overall aggregate of different metrics as well as the collective trends over time are what should matter. The metrics must allow individuals themselves to triangulate signals that matter to them to discern authenticity.

By working to advance visible information quality standards, organizations in the technology sector would allow the development of publicly available tools to present these metrics clearly to readers. Such standards and tools would be part of a suite of indicators that empower readers to personally evaluate credibility and authenticity.

Moreover, the advent of Generative AI (GenAI) models enables realistic but false content like deepfake images, audio recordings, and video content. This reality now makes the need for information quality standards even more pressing. Metrics assessing sourcing and links to unedited versions of such multimedia content are especially important here.
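One minimal way to link published multimedia back to an unedited original is a provenance record carrying the source location and a cryptographic digest that any reader can recompute. This sketch is our own simplified assumption of how such a check could work; real provenance standards (such as those being developed for content authenticity) carry far richer, signed metadata.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class ProvenanceRecord:
    """Minimal provenance claim: where the unedited source lives and a
    digest readers can recompute to confirm their copy is unaltered."""
    source_url: str   # hypothetical location of the unedited original
    sha256: str       # hex digest of the original media bytes

def verify(record: ProvenanceRecord, fetched_bytes: bytes) -> bool:
    """True when the fetched media matches the published digest."""
    return hashlib.sha256(fetched_bytes).hexdigest() == record.sha256
```

Any edit to the media, including a deepfake substitution, changes the digest and fails verification, which is what makes this kind of metric objective rather than a matter of editorial judgment.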




The hope is that by making less subjective and less politically charged information quality standards visible, producers of information content will strive to improve their “scores” on those metrics. This is why the choice of which intended behaviors and outcomes such standards embody matters immensely. Ideally, less subjective information quality standards should embody those elements that civil societies collectively value in assessing whether information is more authentic and trustworthy.

Yet there are some risks with a standards-based approach. Some actors may adopt “gaming tactics” to improve their scores without truly improving the quality of the information they provide. Analysis and triangulation of information quality standards and sourcing requires effort that not all individuals will choose to exert. Also, adjudicating the credibility and veracity of online information will remain subjective, shaped by one’s worldview. However, the status quo – namely the absence of any indicators for readers to assess for themselves – is untenable.

Even after acknowledging these risks, we believe the technology sector should advance standards to assess information quality, because something must be done to address the challenge of discerning credible and authentic digital content. This is especially important given what GenAI can produce: both informative synthesis of different information sources and unhelpful, misleading, or inauthentic synthesis of poor-quality sources. Empowering people is essential, since de-politicizing information quality assessments is a crucial shared good for civil societies that value individual freedoms. More objective information quality scores will equip people to analyze information critically for themselves.

In free societies, people have a right to their own perspectives. Navigating the digital “fog of the unknown” online requires individuals to personally triangulate multiple data points on information quality. Information quality standards and tools to support these standards should do so, without limiting freedom of expression or of thought.

In summary, adoption of information quality standards represents a compelling way for the technology sector to address the increasing challenge of discerning credible and authentic information, ideally helping readers bridge the gap between what they see online and what they ultimately judge to be of higher quality than other sources. Making people-centered progress on these challenging issues requires transparency, standards, and empowering individuals with methods and tools for critical analysis. Our shared future depends on it.

