Science communication and policymaking

Prepared by Bruce Lewenstein, facilitated by Rick Borchelt

This session addressed one big question: How reactive can science communication research/practice be in polarized political situations? We began with three subsidiary questions:

  • Do your practices formally target policymakers? If so, what is different from how you target other stakeholders?
  • Does your research formally address communication to, with, or among policymakers? If so, who funds that research?
  • What is the desired outcome of your practices, or what questions are you trying to address in your research?

In the discussion, we failed to address these questions. Instead, we talked more generally about what we know about science communication policy (which is different from “science communication with policymakers”).

We agreed that science itself has changed in recent years. The knowledge production process has changed both in its pace and in the push for impact (whether that push is driven by commerce or by the need for greater social appropriation of science). These changes raise issues for both science communication practice and research. What should science communication practice talk about (the content of science, the commercial pressures, the social needs, etc.)? What questions should science communication researchers address (public attitudes, public knowledge, construction of publics, etc.)? Those in the middle, between practice and research – evaluators – need to be careful about their relationship to these changes in science.

For science communication itself, we identified a variety of issues:

  • Audience reception of information is affected by politics, religion, ideology, authoritarianism, etc. [NOTE: this was one of the few times we discussed audiences.]
  • Covering the policy dimensions of science can better inform public discussion.
  • Climate change may be a special case (because of its politicization), but we didn’t fully discuss this – for example, health communication might also need its own analysis, because of other socially based conflicts (abortion, vaccines).
  • Overall, we thought that more science critique is needed (along the lines of art or music critics – people who love the field but cast a careful eye across it).

All these issues raise a fundamental tension of democracy – between expertise and participation. How can communities be brought into discussions that require expertise? Examples from citizen science and local knowledge show the challenge.

Because of the challenge of “anti-science,” science communication practice needs to provide more context, differentiating between frame-based issues (such as climate change prevention vs adaptation) and deep-belief issues (rejection of scientific reasoning, though often not of technological products). We need more discussion of cases like Thabo Mbeki’s rejection of established knowledge about HIV/AIDS – was his rejection based on an unwillingness to read the science, a need to establish his profile as a leader, or a distrust of scientific institutions with a clear history (in South Africa) of bias? This kind of case shows why we need a more nuanced discussion, pulling apart the meanings of science – as a body of knowledge, as an approach to the world, as a set of social institutions.

These challenges show why science communication research needs to be relevant to practitioners, to help them think about how to deal with different flavors of anti-science. Some topics one might address:

  • Why did the hepatitis B vaccine stay under the radar when the HPV vaccine did not?
  • Cross-national studies of particular controversies, looking at public perceptions and at strategies of communication
  • Understanding how community action and political engagement interact with public communication of science and technology
  • The links between PCST practice and development communication – what can be learned from the long history of research on development communication?
  • The role of cultural differences (for example, different beliefs about personal hygiene, or the way in which political culture defines some topics as in or out of science – in the US, for instance, public health researchers are explicitly prevented from gathering data on gun violence)

But we also agreed that we need to be careful: science communication research is usually driven by controversies, and controversy is not the only context that matters. What other perspectives do we need?

As a final topic, we addressed policy ABOUT science communication. This is both a research and a practice issue. For example, both the 1985 Bodmer Report and the 2000 House of Lords report on public engagement led to government commitments to research funding and to policies about expected science communication practice. Science communication practitioners and researchers need to be at the table when scientists and agencies are creating communication strategies. Government funding is also important for the professionalization of the field. We’ve seen that PCST and other science communication meetings are often funded by government agencies in part to address national policies – in South Africa, China, India, Israel, Mexico, and Brazil (at least; probably more places too).