A Manifesto for Science Communication as Collective Intelligence
May 2022
CITE AS: Holford, D., Fasce, A., Tapper, K., Demko, M., Lewandowsky, S., Hahn, U., Al-Rawi, A., Alladin, A., Boender, T. S., Bruns, H., Fischer, F., Gilde, C., Hanel, P. H. P., Herzog, S. M., Kause, A., Lehmann, S., Nurse, M. S., Orr, C., Pescetelli, N., Petrescu, M., Sah, S., Schmid, P., Sirota, M., & Wulf, M. (2022). A manifesto for science communication as collective intelligence. https://doi.org/10.17605/OSF.IO/TZUFW
LICENSE: Creative Commons - Attribution-NoDerivatives 4.0 International (CC BY-ND 4.0)
A commentary article based on our full-length paper is published in Science Communication:
CITE AS: Holford, D. L., Fasce, A., Tapper, K., Demko, M., Lewandowsky, S., Hahn, U., Abels, C. M., Al-Rawi, A. K., Alladin, S., Boender, T. S., Bruns, H., Fischer, H., Gilde, C., Hanel, P. H. P., Herzog, S., Kause, A., Lehmann, S., Nurse, M. S., Orr, C. A., Pescetelli, N., Petrescu, M., Sah, S., Schmid, P., Sirota, M., & Wulf, M. (2023). Science Communication as a Collective Intelligence Endeavour: A Manifesto and Examples for Implementation. Science Communication.
https://doi.org/10.1177/10755470231162634
“Ulrike Hahn - Redesigning Science Communication: Science Communication as Collective Intelligence”
Ulrike Hahn gave a brief talk at the ACM Collective Intelligence Conference 2022 (CI 2022) in which she discussed the SciBeh initiative and its efforts to redesign science communication as collective intelligence.
- Read the manifesto below
- Download the manifesto as a PDF file
- Sign the manifesto
- Join the conversation
Table of contents (also serves as an executive summary)
Why citizens need reliable knowledge
Many of the most pressing challenges societies face today—from climate change to global pandemics—require large-scale, collective decisions informed by the best available evidence. It is only when public beliefs are built on reliable knowledge, rather than poorly informed opinions, that we can successfully address these challenges. However, there are barriers to effective science communication, especially in rapidly evolving crisis situations or when evidence conflicts with political or commercial interests.
Barriers: social media
Social media notoriously prioritises emotion above evidence-based information and it is especially vulnerable to very active, extreme voices, which can skew users’ perceptions of the opinion landscape. The rejection of authoritative sources can also create an “epistemic vacuum,” leading people down the rabbit hole of conspiratorial sources and low-credibility content as they seek alternate sources and explanations.
Barriers: misinformation
Organised efforts to misinform or confuse the public, or to propagate conspiracy theories, endanger informed public discourse. For example, disinformation lobbying groups can disrupt science communication such that collectively supported opinions come to be treated as equal to collectively supported evidence, preventing citizens from implementing scientifically sound solutions. Individual scientists are poorly matched against organised disinformation campaigns, as they are vulnerable to direct attack from those opposed to specific types of scientific data.
Barriers: communicating as individuals
There are further disadvantages to scientists communicating as individuals. First, some scientists may hold views that are at odds with the scientific majority, and these lone voices can alter public perceptions of consensus, create a false sense of balance, or paint an inaccurate picture of the controversies surrounding a particular issue. Second, incentives for individual science communication tend to be focussed on achieving short-term goals, such as generating media coverage, rather than on developing trust or achieving specific behaviour- or policy-related outcomes. This lack of strategic planning can put scientists at a disadvantage compared to others who are more adept at communication strategy. Finally, the pressure to publicise one’s findings can in some cases lead to results being presented to the public as facts even where the research has important limitations, such as methodological weaknesses or small effect sizes, or where it has not yet undergone peer review or been replicated.
Why collective intelligence can help
We will be better able to counteract misinformation and contribute to an informed public if we can leverage the pooled expertise of a large number of scientists. A collective intelligence approach to science communication would use expert consensus to weigh information. In public debate, this may allow more decisive conclusions to be drawn, since greater expert consensus typically reflects the quality of the evidence supporting a particular position and the degree of certainty around a hypothesis. Communicating scientific consensus can thus help boost public acceptance of scientific findings and support for action, even across partisan lines. Furthermore, scientists, like all individuals, need to guard against cognitive biases when seeking out evidence, including confirmation bias. This is particularly important when seeking evidence in an environment where the rapid pace of scientific production outpaces the ability to critically evaluate it. The judgements and decisions of a diverse collective are likely to be less prone to bias, and to be perceived as less biased, a quality that policymakers often consider important and which should therefore facilitate communication with them.
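To make the statistical intuition behind pooling judgements concrete, here is a minimal toy simulation in Python (with invented numbers; it illustrates the general principle only and is not part of the manifesto process): averaging many independent, noisy expert estimates typically yields a smaller error than a typical individual estimate.

```python
# Toy illustration (invented numbers): pooling many noisy, independent
# expert estimates of an unknown quantity usually beats a single expert.
import random
import statistics

random.seed(1)

TRUE_VALUE = 42.0        # hypothetical quantity the experts estimate
N_EXPERTS = 100          # size of the scientific collective (assumption)
EXPERT_NOISE_SD = 10.0   # spread of individual expert errors (assumption)

# Each expert's estimate = truth + independent noise
estimates = [random.gauss(TRUE_VALUE, EXPERT_NOISE_SD) for _ in range(N_EXPERTS)]

individual_errors = [abs(e - TRUE_VALUE) for e in estimates]
collective_error = abs(statistics.mean(estimates) - TRUE_VALUE)

print(f"Mean individual error: {statistics.mean(individual_errors):.2f}")
print(f"Error of the pooled (average) estimate: {collective_error:.2f}")
```

The benefit shrinks when experts share the same biases, which is one reason the manifesto stresses epistemic diversity below.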
What should science communication as collective intelligence look like?
It should communicate the strength of the evidence
Science should not be viewed as an all-or-nothing process in which hypotheses are either entirely disproved or entirely supported. Instead, the prominence given to different positions, both in terms of attention and importance, must be proportionate to the strength of the evidence supporting them. Collective intelligence in science communication should function as a source of reputable information that establishes the strength of evidence and variability of study findings, through evidence syntheses such as meta-analyses, systematic reviews, narrative reviews, and expert surveys.
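As a rough, hypothetical sketch of how such a synthesis weighs studies (the effect sizes and standard errors below are invented, and real meta-analyses involve many further steps, such as assessing heterogeneity and bias), more precise studies receive more weight and the pooled estimate carries its own uncertainty interval:

```python
# Minimal fixed-effect meta-analysis sketch (illustrative only; the study
# effects and standard errors are made up).
import math

# (effect estimate, standard error) for three hypothetical studies
studies = [(0.30, 0.10), (0.25, 0.15), (0.40, 0.08)]

# Inverse-variance weights: more precise studies count for more
weights = [1.0 / se ** 2 for _, se in studies]
pooled_effect = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval for the pooled effect
ci_low = pooled_effect - 1.96 * pooled_se
ci_high = pooled_effect + 1.96 * pooled_se

print(f"Pooled effect: {pooled_effect:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```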
It should be honest about uncertainty and error
Uncertainty and deliberation are integral to the scientific process and should be communicated openly, as should methodological constraints and contentious theories. Although politically motivated actors may weaponise uncertainty to undermine scientific evidence, communicators can counter such attempts by being clear about the sources and extent of uncertainty. To aid this, evidence syntheses should integrate and communicate uncertainty within and across scientific studies, as well as explain its causes and types. In crisis situations, which are characterised by high uncertainty, experts will disagree, and in these cases it may be inappropriate to communicate a single, aggregate scientific conclusion.
It should be diverse
To ensure accurate evidence synthesis, a scientific collective must be epistemically diverse and must guard against bias. Scientific research is a complex iterative process that requires methodological variety. As such, consensus formation should not be hasty but should instead genuinely reflect the breadth of views brought by different disciplinary perspectives and life experiences. Such diversity has the additional advantage of enabling more groups to identify with the scientific collective, rather than perceiving scientists as an out-group.
It should be open to alternative perspectives
A collectively intelligent science communication system must allow for evidence-based dissent. This should take a constructive form: when scientists engage with stakeholders, all parties should first seek to understand the other party’s norms, values, power structures, and experiences. All parties should also be clear about the purpose of the interaction, and they should be willing to persuade, and be persuaded by, the strength of the other’s arguments. New incentives should promote listening to, engaging with, platforming, and amplifying the views of others.
It should be transparent
Transparency allows people to see how decisions were made based on the available information and to understand how applying different personal values would have affected the decision. Scientific transparency can be improved in several ways: through pre-registration and openly sharing data and analysis code, by communicating the degree of scientific consensus, and by highlighting how single studies relate to the overall body of scientific evidence. Finally, when like-minded scientists collectively publish a declaration about a specific proposition, the process by which this statement was reached should be thoroughly scrutinised to minimise bias.
It should build trust
Building trust in science should be a priority for future emergencies. To this end, participatory research (where scientists develop research in collaboration with stakeholders) could help promote trust between the scientific community and the public.
It should be motivated by the common good
Science communication must be viewed as more than an opportunity to promote one’s own research or preferred scientific viewpoint. During an emergency, the scientific community should re-prioritise and coordinate its communication efforts towards the activities most likely to protect communities from harm.
It should be easy to understand
Science communication should foster a shared understanding of scientific methodology. Simpler language can help achieve this, while also reaching larger audiences and promoting multidisciplinary knowledge sharing and knowledge retention.
Meeting the challenges of science communication as collective intelligence
To address these challenges, science communication needs to embrace innovation. We suggest that scientists focus on the following:
- Scientists as a collective need to define what constitutes expert consensus, as opposed to just group opinion.
- More research is needed to determine how audiences perceive and understand sources of scientific uncertainty, so that scientists can communicate this effectively to society.
- Researchers and social media businesses should continue to develop powerful artificial intelligence tools for sifting through large datasets, identifying misleading content, and flagging it for users.
- Systems that allow for comments on published research could enable experts in the field to draw on scientific consensus to provide ongoing re-evaluation of peer-reviewed publications. Machine learning algorithms that monitor new publications could be used to keep evidence syntheses up to date.
- Online platforms could be used to help facilitate rapid knowledge exchange between scientists as well as discussion of evidence, and consensus formation.
- A “machine of scientific accumulation” might be constructed to depict the global state of scientific evidence over time. As further data and evidence are generated for or against certain policies, a global state variable could evolve like a drift-diffusion process, reflecting the amount of evidence and the degree of confidence within the scientific community (a toy sketch of this idea follows this list). This could assist scientists in making specific policy recommendations.
- Scientists could develop more strategic communication programmes, similar to those used in public relations.
- To minimise risks of polarisation, entrenchment, and degeneration of discourse, new methods are needed to enable scientists to prioritise ideas, evidence, and arguments.
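To make the drift-diffusion idea from the “machine of scientific accumulation” bullet more tangible, here is a toy simulation (hypothetical parameters, purely illustrative; it is not a specification of any existing system): each new study nudges an accumulated evidence variable up or down, and crossing a threshold marks the point at which the collective signal could support a recommendation.

```python
# Toy drift-diffusion sketch of evidence accumulation (hypothetical
# parameters; purely illustrative of the idea in the list above).
import random

random.seed(7)

DRIFT = 0.05     # average evidential support contributed per new study (assumption)
NOISE_SD = 0.5   # study-to-study variability (assumption)
THRESHOLD = 5.0  # confidence level required before recommending action (assumption)

evidence = 0.0
for study in range(1, 1001):
    # Each new study nudges the accumulated evidence up or down
    evidence += DRIFT + random.gauss(0.0, NOISE_SD)
    if abs(evidence) >= THRESHOLD:
        direction = "for" if evidence > 0 else "against"
        print(f"After {study} studies, the evidence {direction} the "
              f"proposition crossed the threshold ({evidence:.2f}).")
        break
else:
    print("Evidence remained inconclusive after 1000 studies.")
```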
Recent examples of collective intelligence in the science communication domain
As a starting point, in Table 1, we provide some examples of where collective intelligence has been harnessed in science communication efforts.
Table 1. Recent examples of collective intelligence in the science communication domain.
Collective intelligence platform | Description | Notes and references |
---|---|---|
The Debunking Handbook 2020 | This written guide is a prime example of a science communication collective intelligence effort. Because scientific knowledge about debunking has evolved noticeably over the last ten years, this collective project updated the first edition of the Handbook. It also aims to unify knowledge and convey scientific consensus about the academic field of debunking to its audience. The Handbook was designed using a preregistered approach, with both the methodology and the intended completion schedule registered in advance. The collective production of the document was informed by research on consensus formation in a medical context (e.g., Rosenfeld, Nnacheta, & Corrigan, 2015), and the authors drew on precedents from psychology to enrich it. The creation process, from author selection and the preregistered production process to scope definition, methodology design, and presentation of results, involved steps that leverage collective intelligence. | Lewandowsky, S., Cook, J., & Ecker, U. K. H. (2020). Under the Hood of The Debunking Handbook 2020: A consensus-based handbook of recommendations for correcting or preventing misinformation. [Working paper on the document creation process] Rosenfeld, R. M., Nnacheta, L. C., & Corrigan, M. D. (2015). Clinical consensus statement development manual. Otolaryngology—Head and Neck Surgery, 153, S1-S14. |
Indie-SAGE | Indie_SAGE, also known as the Independent Scientific Advisory Group for Emergencies, is a group of scientists not connected to the government which works collectively to provide impartial scientific advice to the UK government and policy makers about the COVID-19 pandemic. The group uses multiple outlets, such as reports, weekly briefings, and events, to inform the public and aid policy makers. Its advice is aimed at reducing deaths and the strain on the UK health system, as well as helping to manage the pandemic response and educate the public about the crisis. Indie_SAGE sees transparency as a key factor in fighting the pandemic and in supporting UK leadership and decision-making. | Landler, M., & Castle, S. (23 April 2020). “The secretive group guiding the UK on coronavirus.” The New York Times. “Revealed: Cummings is on secret scientific advisory group for Covid-19.” The Guardian, 24 April 2020. |
Red Team C19 NL | The Red Team C19 NL is a diverse collective of experienced professionals not connected to the government, working collectively to contribute to the prevention and control of COVID-19 by sharing expertise and relevant experience. Its starting point is the interruption of virus transmission within all groups of society. The Red Team offered constructive criticism to the “blue team”, i.e. the Dutch government, the national public health institute (RIVM), and their Outbreak Management Team. It aimed to broaden the scope of the response by sharing expertise and experience in emergency response, frontline health work, social sciences, public health, microbiology, anthropology, data science, governance, and beyond. | Example of the group’s first letter of advice to the Prime Minister and Ministry of Health, offering organised constructive criticism: Aanbod van georganiseerde constructieve tegenspraak, from the members of the ‘RedTeam’ group, 2 August 2020. |
openIDEO | openIDEO is a digital creation and innovation platform that acts as a collaborative incubator for developing solutions to current problems. The space focuses on environmental, ecological, and climate issues and challenges and on how to solve them. This maker-space (see also the Stanford d.school) enables its users to work collaboratively on solutions to contemporary problems, and through this process openIDEO also aims to encourage community involvement and societal change. Different parties, such as experts, sponsors of particular projects, and outside consultants, can get involved in these friendly but somewhat competitive projects. | Suran, S., Pattanaik, V., & Draheim, D. (2020). Frameworks for Collective Intelligence: A Systematic Literature Review. ACM Computing Surveys, 53(1), Article 14. |
Wikipedia | Wikipedia is a free online encyclopedia supported by the Wikimedia Foundation, covering a wide range of topics related to scientific knowledge. Its content is produced collaboratively by anonymous volunteers and is freely editable, except in limited cases to prevent disruption or vandalism. The fundamental principles of the project are summarised in its five pillars: encyclopedic knowledge, neutrality, openness, civility, and adaptability. Because everyone can help improve it, it has become more comprehensive than any other encyclopedia, and its contributors work to improve article quality by removing and repairing misinformation and other errors. Over time, articles tend to become more comprehensive and balanced. Wikipedia has grown into the world’s largest reference website, with more than fifty-eight million articles in more than 300 languages, 127,165 active contributors, and 1.7 billion visitors monthly. | Livingstone, R. (2016). Models for Understanding Collective Intelligence on Wikipedia. Social Science Computer Review, 34(4), 497-508. |
The COVID-19 Vaccine Communication Handbook & Wiki | The COVID-19 Vaccine Communication Handbook and Wiki project is run by the SciBeh research group. The wiki is a comprehensive online guide offering practical, evidence-based tips for talking reliably about vaccination and the myths and fears associated with it. The interdisciplinary team includes experts in vaccine psychology, education, and virology who volunteer to produce a living document that will continue to evolve as the vaccine rollout gains pace. The key contents of the wiki have been edited into a handbook and translated into 12 languages, including summaries for policy-makers. | SciBeh. (2021). How this Handbook came about. |
Authors
Development of the Manifesto
Documentation of the five development steps
This manifesto was itself a collective intelligence effort, in which we attempted to harness the knowledge of a diverse group of researchers in writing and refining the text you see above.
Below is a summary of the steps involved in developing the manifesto.
- The SciBeh workshop on Science Communication as Collective Intelligence featured a series of panel sessions and group discussions on the challenges facing science communication and how collective intelligence could help. See notes from the workshop here.
- We invited attendees to sign up as contributing authors or co-ordinating lead authors (CLAs) to craft the manifesto. After reviewing the workshop notes, meeting to discuss them, and conducting a voting exercise on the points raised at the workshop and the sections in which to include them, the CLAs set out the structure for the manifesto manuscript. This resulted in the division of the writing process among six groups, each overseen by one CLA.
- Contributing authors had two months to produce their allocated sections, which were then combined into a long-form document.
- Once the long draft was complete, the CLA team condensed the paper into a list of brief statements and used a voting exercise to collectively determine which key “propositions” extracted from the lengthier draft were critical to the short manifesto.
- The CLAs went over the voting report (which you can see below) and extracted the propositions that received at least 60% support from authors (a minimal sketch of this filtering rule follows this list). These were re-organised into a more coherent order, turned into prose, and edited by all the CLAs, resulting in the final short-form manifesto presented on this page.
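As a minimal sketch of the filtering rule in the final step (the propositions and vote counts below are invented, not taken from the actual pol.is report), the 60% threshold can be expressed as follows:

```python
# Minimal sketch of the 60%-support filter described in the final step.
# The propositions and vote counts are invented for illustration.
propositions = {
    "Communicate the strength of the evidence": {"agree": 20, "total": 24},
    "Be honest about uncertainty and error":    {"agree": 22, "total": 24},
    "Example of a proposition that is dropped": {"agree": 10, "total": 24},
}

SUPPORT_THRESHOLD = 0.60  # at least 60% of voting authors must agree

retained = [
    text
    for text, votes in propositions.items()
    if votes["agree"] / votes["total"] >= SUPPORT_THRESHOLD
]

print("Retained propositions:")
for text in retained:
    print(f" - {text}")
```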
Report of the consensus outcomes
Please visit the pol.is report to see the consensus outcomes from the manifesto authors who voted on the inclusion of specific propositions from the collaboratively-written manuscript in the short-form manifesto presented on this page.