Scientists are often confused and surprised when their work is met with distrust from members of the public. Though many instances of distrust lack warrant, failures in the trustworthiness of scientific communities can justify such distrust. In the first part of this paper, I examine the conditions for scientists’ trustworthiness as expert knowers, emphasizing scientists’ dual role as both generators and communicators of knowledge and revealing some of the complications and challenges of being a trustworthy communicator. Importantly, trustworthy communicators must not only be (morally) sincere and (epistemically) competent, as testimony theorists have argued, but must also understand the needs and expectations of their audience. Moreover, scientists are not simply communicating facts to the public; they are making judgements about the state of research and its relevance as these pertain to the concerns of their public audience.
In the second part of this paper, I turn my attention to the distrust of scientific communities sometimes evident in socially marginalized groups. Theorists such as Naomi Scheman (2001) have provided interesting arguments to the effect that a legacy of poor interactions with scientific communities can provide good reasons for members of marginalized groups to distrust scientific communities and their claims. Examples include historical relations between geneticists and indigenous communities, medical researchers and African-American communities, environmental scientists and Inuit communities, and sex-difference researchers and women. I focus on two aspects of such a position that require further development. First, the mechanisms of marginalization play out differently for various social groups, and hence the reasons such groups have for distrust will vary accordingly. One does not simply get from social marginalization to a singular set of reasons for scientific distrust, and it is important to analyze how certain features of social marginalization play out in providing various reasons for distrust. Second, the target of reasonable distrust needs to be more carefully specified than simply “science” or “scientific communities”. There may be strong reasons for distrusting specific communities of science or particular research areas without a blanket scientific distrust being justified. Distrust does at times “travel”, expanding to a broader range of scientific communities than its original target, and it can do so in a reasonable way. Scientific communities need to attend to such travelling distrust (regardless of whether or not it is reasonable) if they want to be successful communicators. But it is also important to recognize when such travels of distrust are reasonable. 
Only a strong scientism that overemphasizes the similarities across different scientific fields and their histories could support a broad-based reasonable distrust of scientific communities, and such scientism is unwarranted. I conclude with some suggestions concerning how scientific communities might improve their trustworthiness across a variety of social groups.
According to Nature, the sentencing of seven members of the Italian National Commission for the Forecast and Prevention of Major Risks was one of the most striking events of 2012. The seven commissioners, all university professors in seismology, volcanology and engineering, had been sentenced to six years in prison for manslaughter. The charge was that they had misled the public about the tremors that had been striking the town of L’Aquila in the first months of 2009, tremors that eventually culminated in a deadly magnitude-6.3 earthquake on 6 April. The fact that on 10 November 2014 an appeal court cleared six of the seven accused does not reduce the importance of the case, especially as the trial will continue in the Court of Cassation, Italy’s highest court of appeal.
The case has long occupied Italian public debate but has received little attention from scholars in academia. In particular, the few scholars who have analysed the case adopted highly polemical tones and a prescriptive register (directed either at the sentence or at the accused). Only very rarely have scholars investigated the case from the perspective of the philosophy and sociology of science, leaving its implications largely unexplored.
The paper will focus on a specific aspect of the case, namely the expert witnesses summoned by the two sides (the public prosecutors and the lawyers of the defendants) during the first instance of the trial. These witnesses had to address one of the decisive points of the dispute: the existence of a causal link between the declarations made by the seven commissioners during the commission meeting of 31 March 2009 and a decrease in the citizens’ risk perception. It is indeed argued that this decrease played an important role in inducing the population of L’Aquila to stay in their houses - contrary to their habits - after the two tremors that preceded the earthquake on the night of 6 April.
The prosecution summoned a cultural anthropologist, a lecturer at the University of L’Aquila, whereas the defense summoned several researchers in the fields of neuroscience and media studies. The various expert witnesses proposed wildly different theoretical frameworks and empirical evidence in order to refute or support the hypothesis of the causal link mentioned above. From a detailed analysis of their depositions, two diametrically opposed conceptions of science and of scientific responsibility emerge. Moreover, the witnesses conceived of their role and engagement in heterogeneous ways: some as neutral and professional specialists, others as intellectuals with a moral mission.
The paper will then illustrate the norms, motivations and constraints at stake in this situation of expertise, and will formulate some hypotheses on the difference between knowledge produced in academia and knowledge produced in the course of judicial expert assessments. It will discuss these questions in the light of classical and recent works in the field. In order to develop the argument, the paper draws on a considerable body of documents: the nine-hundred-page statement of reasons for the first verdict, the written reports submitted by the expert witnesses, interviews with the expert witnesses, and media releases.
Many writers claim that effective and ethical scientific communication should be “sincere”: that is to say, scientists should, insofar as possible, report what they believe. For example, in a recent paper, Keohane, Lane and Oppenheimer (2014) describe honesty (which incorporates sincerity) as “intrinsic to science: the sine qua non for this form of human activity”; Nordmann (2011) has argued that sincerity is essential if science is to achieve its Enlightenment ideals. Such suggestions relate to a broader sense that “sincerity” is, in Bernard Williams' phrase, one of the key virtues of truthfulness (Williams, 2002).
This paper argues that sincerity cannot be a norm for ethical scientific communication, on the grounds that it is an unattainable ideal of questionable ethical significance.
Three arguments are presented for this position. The “argument from collaboration” states that given the collaborations necessary to produce scientific papers, reports and so on, it is hopeless to expect these paradigm acts of scientific communication accurately to reflect the views of each contributor. This point is developed by appeal both to recent work on collective deliberation and empirical case studies of authors' attitudes to “their” papers.
The “argument from standards” states that, given the practical difficulties of ensuring trust in scientific communities, there are often good reasons for scientists to use standardised tools and techniques to generate and report their results. Using these tools and techniques may sometimes lead them to report claims which they do not, themselves, believe. However, I argue that in these cases, the importance of enabling communication overrides the value of sincerity.
The final “argument from non-ideal conditions” states that even if there is a sense in which scientists can and should be “sincere”, the force of this demand rests on an assumption that they are communicating in “epistemically ideal” circumstances, where other, non-experts are engaged in an honest attempt to distinguish true from merely putative experts. However, in many real-life cases, scientists are communicating in “epistemically hostile” environments, where they know that others might maliciously twist their words to distort their meaning. In these contexts, I argue, the ethical arguments in favour of sincerity have little weight.
That sincerity is not a norm of ethical scientific communication may seem rather surprising. In the conclusion, then, I note a tricky problem for my arguments: even if there are arguments against sincerity as an ethical or epistemic virtue in science communication, the widespread belief that sincerity is a virtue may give scientists prudential reasons to be sincere (insofar as possible). I suggest that these risks of being “caught out”, although important, do not suffice to justify sincerity as a basic norm for scientific communication.