3rd Copenhagen Conference in Epistemology: Epistemic Trust and Inclusiveness – University of Copenhagen




Conference Program
August 15-17, 2011

August 15
09:50 - 10:20: Registration, coffee, and tea
10:20 - 11:05: MIKKEL GERKEN: Testimony and the Ethics of Expertise
11:10 - 11:55: ARNON KEREN: Deference and Trust in a Community of Experts
12:00 - 13:00: Lunch
13:00 - 13:45: NIKOLAJ J. L. L. PEDERSEN:  Trust, Absence of Evidence, and Epistemic Rationality
13:50 - 14:35: KRISTOFFER AHLSTROM-VIJ: Required Trust
14:35 - 15:00: Coffee and tea
15:00 - 16:00: KIRK MICHAELIAN: The Limits on Vigilance
16:05 - 17:05: DUNCAN PRITCHARD: Epistemic Paternalism and Epistemic Value

August 16
09:00 - 09:30: Coffee and tea
09:30 - 10:15: CHRIS THOMPSON: A General Model of a Group Search Procedure, Applied to Epistemic Democracy
10:20 - 11:05: PRECIOUS IGHOROJE: Procedural cum Epistemic Democracy and the Intellectual Virtue of Inclusiveness
11:10 - 11:55: ERIK J. OLSSON: What Is the Optimal Size of a Deliberating Jury?
12:00 - 13:00: Lunch
13:00 - 14:00: PAUL FAULKNER: The Dogmatism of Trust
14:00 - 14:10: Coffee and tea
14:10 - 15:10: RONALD DE SOUSA: Disestablishing Epistemic Peerage
15:15 - 16:15: ALVIN GOLDMAN: Social Epistemology and Theories of Liberal Democracy
19:00 - late: Conference Dinner at Public House

August 17
09:00 - 09:30: Coffee and tea
09:30 - 10:15: THOMAS ATCHISON: Trusting Glenn Beck, Distrusting Climate Science: Reflections on the Epistemic Difficulties of Citizens in Modern Democracies
10:20 - 11:05: SHANE GAVIN RYAN: Trust and the Epistemic Environment
11:10 - 11:55: KATHERINE HAWLEY: Injustice in Trust
12:00 - 13:00: Lunch
13:00 - 14:00: RACHAEL BRIGGS: How to be Open-minded (but not so Open-minded Your Brain Falls Out)
14:00 - 14:10: Coffee and tea
14:15 - 15:15: KLEMENS KAPPEL: Believing on Trust: An Outline of a Theory of Epistemic Trust

Abstracts (alphabetically by author)

Kristoffer Ahlstrom-Vij (University of Copenhagen)

There has been a lot of discussion in recent epistemology about the conditions under which we are permitted to trust others. The present talk seeks to illuminate a closely related yet largely overlooked aspect of trust, by giving an account of the conditions under which we are required to trust certain sources on epistemic grounds. The account proposed holds that a subject is required to trust at least one of the sources in a set O if and only if (a) the alternative to trusting any of the sources in O involves a significant epistemic downside for the subject, (b) the sources in O are sufficiently reliably powerful sources on the relevant matters, and (c) the subject’s evidence does not, on balance, indicate that they are not.

Thomas Atchison (Metropolitan State University)

Citizens need to know something, on even the most minimal view of their role in a democratic society.  Some would hold, ambitiously, that they ought to know enough to participate meaningfully in self-government.  Others would say that it is enough if they can learn that the elites they have been passively allowing to rule (by voting or not voting) are no longer serving them well enough and it is time to 'throw the bums out'.  Even this last, minimal, view, though, requires citizens to know when their interests are no longer being served (and perhaps a bit about why).  And this may not be so easy to know.
In this paper I briefly (very briefly, since I think the point is widely understood) present findings from public opinion research, political psychology and media studies, suggesting that time-constrained citizens in contemporary democracies face great difficulties in arriving at even a minimally accurate picture of the world and of the relevant policy options. Unfortunately, these difficulties do not seem to diminish with more education or with more effort and attention.  (For example, a recent Pew Center poll reports that college educated Republicans in the US are more likely to reject the findings of climate science than less educated party members, while college-educated Democrats move in the opposite direction. In a 2003 PIPA study, Kull, et al., report that for most US news sources, people who paid more attention to the news were more likely to have misconceptions related to the 2003 US invasion of Iraq.) This suggests that the problem is not, as it is sometimes said to be, mainly due to apathy or to a (possibly rational) decision to ignore political affairs. It is a problem even for those who are making some significant efforts to inform themselves.
It is tempting to suppose that these dismal facts are the result of epistemic vices, perhaps culpable failures to reason well or to inquire honestly. But, building on an argument made by Thomas Kelly in his 2008 J. Phil. paper "Disagreement, Dogmatism, and Belief Polarization," I argue that reasonable choices (about whom to trust and about how to assess new evidence) can lead to poor epistemic results. Moreover, unfavorable epistemic conditions (a highly polluted information environment wherein one is naturally led to trust unreliable sources) can defeat the sorts of strategies generally recommended to individuals under the label 'critical thinking,' at least when combined with plausible time and resource constraints and a realistic picture of human cognition. Tentatively, I conclude that there are no solutions to these problems at the individual level short of requiring what we might call heroic or supererogatory epistemic effort.
Finally, I consider what sort of institutional or systemic conditions would be necessary to provide citizens with a manageable epistemic task.  This is partly a matter of a properly functioning media system (and proper norms of journalistic practice -- norms that would lead to a trustworthy press, which would help to identify trustworthy experts), but it may also be a matter of brute epistemic luck, since we will inevitably trust before we are able to figure out who is worthy of our trust.

Rachael Briggs (NYU/University of Sydney)

What should you do when you discover that you disagree with a trusted peer about the answer to some question? Recent work on peer disagreement clusters around two main poles. At one pole—the conciliatory pole—are views that say you should always change your opinion so that it is closer to your peer’s. At the other pole—the steadfast pole—are views that say you should always stick to your guns, and keep your opinion exactly as it was before you learned of the disagreement. I propose a way of modeling conciliatory views in a partial belief framework. I consider an argument to the effect that strong conciliatory views are incoherent, and discuss how one might weaken a conciliatory view to make it coherent.

Paul Faulkner (Sheffield)

We value trust and trustworthy behaviour. We value them because they are necessary for the collective good of cooperation. And one such good is the pooling of knowledge. Testimonial knowledge is in the very first instance knowledge got on trust. And the knowledge possessed by modern liberal democracies requires an extensive division of epistemic labour that is underpinned by trust. However, an attitude of trust also runs counter to other things that we value. It runs counter to notions of individual autonomy and to notions of epistemic justification. In forming beliefs on trust, it is a speaker's intention that the audience act (take up their testimony) that leads to the audience's acting (testimonial uptake). In this respect, trust has more in common with akratic action than autonomous action. And one plausible account of justification is that it is a matter of being sensitive to and articulating the evidence. In forming beliefs on trust we do exactly the opposite: an audience gives a speaker the benefit of the doubt and forgoes, and is insensitive to, matters of evidence. In this respect, trust might be illuminatingly compared with a paradigmatically unjustified attitude, namely that of dogmatism. Conversely, an emphasis on epistemic autonomy can then be presented as an emphasis on the good of individual justification. So we have a conflicting set of values. On the one side, we value trust because it is necessary for the realization of the good of collective knowledge. But on the other side, trust conflicts with our valuation of individual autonomy, both practical and epistemic. This paper aims to establish these contrasts, and then to propose that this conflicting set of values is necessary for correct epistemic function at the social level. It makes this point by reference to Kuhn's notion of scientific progress. According to Kuhn, what is essential to scientific progress is that there be periods of normal science and periods of revolutionary science.
And just as trust is necessary for normal science and incompatible with revolution in science, so epistemic autonomy is incompatible with normal science but necessary for revolutionary progress. For this reason, the social generation of knowledge, that is the practice of collaborative science, requires a complex set of social values that make a virtue out of both trust and autonomy even when these virtues pull the individual in different directions.

Mikkel Gerken (University of Copenhagen)

In his “Toward an Ethics of Expertise,” John Hardwig urged that it was “… high time we got to work on the ethics of expertise. Indeed, it is past time” (Hardwig 1994, p. 100). In this paper I get to work on a complicated and important phenomenon concerning trust in expert testimony. We may approach it by considering the following cases.
1.a: Shoshanna the meteorologist: Shoshanna, a meteorologist working in atmospheric composition research, is being interviewed on the topic of anthropogenic global warming for a local news broadcast. During the interview, she is asked about the impact of global warming on aquatic life. Shoshanna recalls from an airline magazine article that warmer waters cause coral bleaching and figures that coral reefs form the base of the food-chain in the aquatic habitat. So, she answers that the impact is likely to be that there will be no fish to eat in 50 years.
1.b: Susanna the vision scientist: Susanna, a cognitive psychologist working in vision science on color vision, is called upon to testify in court on the reliability of a certain eyewitness testimony. However, prior to her appearance in court she has learned that the defendant has had a very troubled upbringing. Susanna firmly believes that the impact of the social environment in the formative years should influence the sentence. Consequently, she asserts, during her cross-examination, that anyone with an upbringing like the defendant's would be psychologically wired to commit a similar crime.
1.c: Sanjana the philosopher: Sanjana, a professional philosopher of mind, is giving a popular lecture on mental causation at a public library. During the Q&A she is asked whether unrestricted freedom of speech should be retained if it leads to hurtful talk. Sanjana takes free speech to be morally valuable. Also, she recalls having heard that Mill argued that free speech promotes the distribution of true belief in a society, and she gathers that it is epistemically valuable as well. So, she answers that unrestricted free speech should be retained at all costs.
I argue that the experts' assertions are bad and attempt to characterize their badness. Here is a preview: the assertions are both epistemically bad and morally bad. They are morally bad, at least in part, because they are epistemically bad. More specifically, the experts violate an obligation to qualify their assertions when they speak outside their domains of expertise. They do so in a context in which the audience is likely, or even rational, to trust them to speak within their domain of expertise. I label such contexts “domain-strain contexts” and the assertions by an expert in such contexts “domain-strain testimony.”
In Section 1, I argue that the expert’s assertions are epistemically bad. In Section 2, I argue that if the expert’s assertions are epistemically bad, they are morally bad. In Section 3, I consider a number of complicating conditions. In Section 4, I briefly consider philosophical expertise. In Section 5, I conclude.

Alvin Goldman (Rutgers University)

Epistemology and political philosophy are generally assigned to different sectors of philosophy.  But several movements are abroad that speak to a considerable degree of convergence.  The so-called "epistemic" approach to democracy invites room for mutual engagement.  This paper begins by reviewing traditional approaches to liberal and democratic thought that provide little in the way of common ground.  It then moves into a territory (deliberative democracy) where there appears to be a convergence, but this (I argue) is an illusion.  Finally, the paper focuses on species of epistemic approaches to democracy that truly engage important issues in epistemology -- at least social epistemology.

Katherine Hawley (University of St. Andrews)

In this paper I build on Miranda Fricker’s (2007) account of testimonial injustice, to develop an account of injustice in trusting more generally; along the way, I raise some (non-fatal) criticisms of Fricker’s account. 
For Fricker, the central case of testimonial injustice is identity-prejudicial credibility deficit.  This occurs when an audience gives too little weight to a speaker’s words, because of negative stereotypes about a social group the speaker belongs to, perhaps her race, gender or class.  It is clear that such injustices can damage speakers in important practical ways – undermining their career prospects, or their criminal defences, for example.  In addition, however, Fricker argues that such injustices harm speakers in distinctively epistemic ways.
Testimony – and testimonial injustice – involve trust and distrust.  When we receive testimony, we trust or distrust the speaker to speak honestly and knowledgeably.  But this is a special case of trusting and distrusting more generally.  We trust or distrust certain people to look after our children, to govern wisely, to invest our money, or simply to show up on time for an appointment.  Starting from Fricker’s discussion of injustice in epistemic trust, I explore the opportunities for justice and injustice in trusting in more practical matters.  In addition, I consider the nature of the wrong we do people when we unjustly fail to trust them: is there a practical analogue of the epistemic harm which is perpetrated in testimonial injustice, at least according to Fricker? 
This exploration of justice and injustice in trusting is interesting in its own right.  But it also raises some challenges for Fricker’s treatment of testimonial injustice.  I pay particular attention to the different forms of injustice involved in (a) ignoring or disbelieving what someone says, and (b) not asking someone’s opinion in the first place. I argue that, although people can be wronged epistemically – wronged in their capacity as knowers – this is not the distinctive or most central feature of testimonial injustice.  Rather, both testimonial injustice and injustice in trusting more generally wrong people by disrespecting them as commitment-meeters, or promise-keepers. 

Fricker, Miranda (2007): Epistemic Injustice, Oxford: Oxford University Press.

Precious Ighoroje (University of Ibadan)

Based on Condorcet’s Jury Theorem, both procedural and epistemic democrats agree that the more voters there are, the greater the probability that the outcome of a decision will be correct in a two-option case. The theorem is used to prove that a group is a better truth tracker than an individual. While procedural democrats lay emphasis on the fact that a set of procedures must be followed for us to reach the correct decision, epistemic democrats insist that the aim of democracy is to ‘track truth’ at the end of every decision. And the only way we can track truth is to include ‘everyone’ in the decision-making process. Epistocracy is rejected in favor of epistemic inclusiveness.
In making decisions, both procedural and epistemic democracies aim at attaining political equality to a large extent, and this is where the virtue of inclusiveness comes in. Epistemic inclusiveness basically means that members of a human social system must be permitted to have equal access to, and participate in, the learning, innovation, or knowledge-processing activities of the collective group. This means that when it comes to making decisions in any social group, all should be included.
This paper attempts to further our understanding of the epistemic virtues of inclusiveness and trust. I note the general normative character of virtue epistemology as a prescriptive science. Rejecting the view of Ernest Sosa, who sees epistemic virtues as innate abilities and dispositions of the knowing subject to attain truth, I propose that epistemic virtues are acquired traits: they are not innate, but require cultivation and deliberate practice. As acquired habits, they aid or improve our cognitive abilities and the probability of attaining true outcomes in decision making.
Allusion is made to Aristotle’s theory of Virtue Habituation for further clarification on how virtues can be cultivated. This idea of epistemic virtues is largely based on James Montmarquet’s view that intellectual virtues lay emphasis on epistemic conscientiousness which is the motivation to arrive at the truth. 
It is on the basis of these assumptions that epistemic inclusiveness qualifies as an intellectual virtue. As an intellectual virtue, it does not come naturally to us; it requires deliberate effort to include all necessary parties in making a social decision. This is because it is a natural human tendency to be selfish and to want to ignore others in decision making. Another reason is that in a democratic political setting, it is difficult and expensive to reach everyone to learn their opinions.
Having considered these basic issues, I conclude that participatory democracy best epitomizes the epistemic virtue of inclusiveness and that epistemic trust necessarily follows inclusiveness. This is based on the obvious idea that people are more willing to adhere to a rule or decision which they took part in forming. Epistemic inclusiveness, therefore, necessarily breeds epistemic trust.

Klemens Kappel (University of Copenhagen)

The concern of the paper is believing on trust, that is, believing a proposition in part because one trusts some person (or institution) reporting it. Generally, we cannot assume that trust is a distinct type of relation or attitude. The word 'trust' in ordinary parlance does not denote a single unified psychological attitude or relation. Rather, it seems likely that there is a rather heterogeneous family of related phenomena that may be variously picked out when we talk about trust. In view of this, my strategy will be the following. Though my aim is not exegetical, I will focus on the kind of trust that figures in John Hardwig's well-known papers (Hardwig, 1985, 1991). Hardwig argued that the sort of pervasive division of cognitive labour that we see in the sciences requires quite extensive believing on trust (more on this below). My focus is the notion of trust that plays the role depicted by the kind of cases that concern Hardwig. I will use the label 'epistemic trust' to refer to those types of trusting relations. The overall aim of the paper is to outline a theory of epistemic trust, and show how epistemic trust might be a respectable notion given process reliabilism about knowledge and justification, or similar general externalist assumptions (Goldberg, 2010).

Arnon Keren (University of Haifa)

Deference to an expert or an epistemic authority is a particularly strong form of epistemic trust. It does not only involve conforming one's judgment to the authoritative judgment of the expert, but also treating the expert's opinion that p as issuing a preemptive reason for believing that p: that is, as issuing a higher order reason against basing one's opinion regarding p on one's own consideration of all the available evidence. 
Both epistemic and pragmatic reasons can justify deference to an expert under certain conditions. However, it is often not easy to determine whether under current conditions one ought to defer to a (purported) expert. The difficulty is exhibited by the questions facing a layperson who confronts the opinion of a single scientific expert, or the consensus view of a group of scientific experts: Is the opinion of the purported expert properly based on the evidence available to her? Is the expert indeed in a better position to judge? The difficulty is even greater when laypersons confront disagreement among scientific experts. Here the question raised is not only whether to defer, but also to whom. Even if laypersons may sometimes be able to determine the answers to these questions, often it would be difficult, if not impossible, for laypersons to determine whether and to whom they ought to defer. At least, it would be extremely difficult for them to determine this without carrying out extensive research of their own.
The paper will study the implications of this difficulty for the norms governing a community of experts. Assuming that it is a function of a community of experts to serve as authorities to which laypersons can defer, the paper will ask what type of norms should govern experts' belief-forming practices. In particular, what kind of belief-forming practices would allow laypersons to better determine whether they ought to defer to members of the community? The paper will suggest two such norms to which a community of experts should adhere. The first is a norm disallowing deference within the expert community, that is, disallowing deference by one member of the expert community to another member of the community. The second is a norm limiting the epistemic inclusiveness of the community. It will be argued that whether or not such norms contribute to the accuracy or reliability of a scientific expert's opinion, they can contribute to the ability of laypersons to determine whether they ought to defer to her opinion.

Kirk Michaelian (Institut Jean-Nicod/Bilkent University)

Sperber and collaborators (Sperber et al. 2010) have recently argued that humans are ''epistemically vigilant'' with respect to information communicated by other agents -- that, given the evolutionary stability of communication, and given that it is often in the communicator's interest to deceive the recipient, recipients must have a capacity to filter out dishonest communicated information. This descriptive account dovetails with the normative account defended by Fricker in the epistemology of testimony (Fricker 1995), according to which, while an agent is epistemically entitled to trust herself, she is not so entitled simply to trust the testimony of other agents but is required to monitor testifiers for dishonesty. Drawing on empirical and evolutionary work on communication and memory, I argue that this ''vigilantist'' line gets things backwards: while agents neither need nor have an effective capacity for vigilance with respect to communicated information, they both need and have such a capacity with respect to internally generated information.

Erik J. Olsson (Lund University)

The size of deliberating juries in court varies across countries. In the English-speaking world, there are normally 12 jurors, except in Scotland, where the jurors are 15 in number. The greater the number of jurors, the higher the cost of administering court proceedings. Accordingly, the question has been raised whether smaller juries would be just as good as larger ones. There seems to be little consensus, however, on what the correct answer to that question should be. Thus, while some American courts have ruled in favor of the admissibility of smaller juries, a recent evaluation of the Scottish tradition found the number 15 to be “uniquely right”. Are there, then, any rigorous arguments favoring a specific jury size? Obviously, we want jury deliberation to be as reliable a process as possible. This suggests the use of Condorcet’s famous jury theorem, which states that the majority is, under certain conditions, more reliable than any single member, and that the reliability of the majority increases with the size of the jury. The latter conclusion would indicate that a jury should be as large as possible, time and money permitting. Unfortunately, the application of the theorem to deliberating bodies is severely problematic because we find, among its conditions, that of independence of opinion, which tends to be violated if the jurors deliberate before they reach their final views. This paper proposes a different model which, while being as exact as Condorcet’s, differs from it in allowing for deliberation and for jurors to influence each other’s views, so long as they contribute to the deliberation process in an independent way. It is argued that the latter condition is a realistic assumption. The model makes it possible to study the reliability of a jury setup by computing the resulting veritistic value in the sense of Goldman (1999).
How this is done in practice is illustrated by computing optimal jury sizes for various realistic background conditions within the Bayesian simulation framework Laputa.

Nikolaj J. L. L. Pedersen (Yonsei University)

According to some views on the structure of warrant, it is not possible to acquire an evidential warrant to accept certain very basic propositions (e.g. anti-sceptical hypotheses). Due to epistemic circularity, any attempt to acquire such a warrant will be subject to a principled failure of warrant transmission. Owing to the principled absence of evidence, it has been suggested that the proper attitude—the kind of attitude that one ought to have—is not one of belief, but one of trust. I investigate whether trust can be epistemically rational, despite the principled absence of evidence. I adopt a consequentialist framework and suggest that trust can be regarded as epistemically rational in the sense of maximizing expected epistemic value.

Duncan Pritchard (University of Edinburgh)

My concern is with the merits (or otherwise) of a form of paternalism which is specifically epistemic, one that mirrors familiar forms of paternalism in the ethical and political spheres. To this end, I critically explore Alvin Goldman’s seminal discussion of this topic, which runs along veritistic lines. While broadly sympathetic to the general position that Goldman develops in this regard, I argue that in order to properly evaluate the potential of epistemic paternalism we need to broaden our focus away from veritism and consider the non-instrumental value of epistemic standings over and above true belief.

Shane Gavin Ryan (University of Edinburgh)

This paper examines trust relations between non-experts and experts. The goal of my paper is to articulate the epistemic benefits of trust relations; to describe the pattern that non-expert to expert trust relations take; to explore how one can foster appropriate trust relations. 
I begin by setting out why trust is of epistemic importance. I argue that successful trustings that manifest virtue allow for an increase in the dissemination and circulation of knowledge. Too little trust, on the other hand, risks forgoing knowledge, while too much trust risks the acquisition of false beliefs and calls into question whether the true beliefs held on the basis of testimony really count as knowledge.
I discuss the relations between experts and non-experts as characterised by an asymmetry. The expert is in a position of epistemic advantage vis-à-vis the non-expert. The non-expert is generally dependent on the expert; he can’t come to know certain matters without the expert’s testimony. Ordinarily the non-expert will not have trust relations particular to specific experts; rather, whether experts are trusted or not will depend on the level of trust there is for them as a group. However, such an approach raises ethical concerns about an individual being wronged, insofar as she is not trusted because of her group membership although she may be entirely trustworthy.
The general level of trust afforded to the expert testifier can in turn be expected to be influenced by the level of trust present in the epistemic environment. An epistemic environment in which there is little confidence that testifiers generally are not, or would not be, influenced in their testimony by epistemically irrelevant considerations (considerations that are nearby possibilities in that environment) will not be conducive to trusting. We can expect such possibilities to be strongly influenced by such conditions as the politics, material situation and values of the society. (For example, prima facie, we would expect there to be less trust in a totalitarian society recognised by its inhabitants to be such than in a democratic society.) (Uslaner 1999).
Focusing on factors most pertinent to our own epistemic environment, I argue that measures can be taken to foster an epistemic environment in which it is appropriate for the non-expert to trust to a greater degree than currently is the case, and so accrue the epistemic benefits that come from successful trusting that manifests virtue. The measures that can be taken to foster such an environment include more actively incentivising testimony that is worthy of trust, and disincentivising – through means of both informal and formal sanctions – testimony that constitutes a breach of the public’s trust. In discussing the latter I consider whether current arrangements that formally sanction untrustworthy expert testimony are sufficient to appropriately assure public trust in the epistemic environment. 

Ronald de Sousa (University of Toronto)

Who are “epistemic peers”? There is a growing literature about what to do when two such peers disagree. In this talk, I will not attempt directly to contribute to that debate. Rather, I cast a skeptical eye on the very idea of identifying epistemic peers, and raise some puzzles about attempts to extend that status to people whose different backgrounds determine different interpretive frameworks and assumptions. A variety of mechanisms govern confidence levels for any given proposition. (And the term “confidence level” is importantly ambiguous between degree of belief and constancy of that degree.) The transmission of belief is commonly said to rest on “trust”; but that term is also ambiguous. As I shall use the terms, “affective trust” implies emotional commitment; “routine epistemic trust” does not. Among the mechanisms of belief change, deductive argument generates good intersubjective adjustment, but plays a role in only a few practical situations. Recognizably inductive or abductive arguments are more important and less conducive to intersubjective harmony, but are still relatively rare. Among other more important sources of belief change are mechanisms of cultural transmission, which are also multifarious. Some beliefs pass from one believer to another much like subliminal primings, without arousing explicit awareness of their adoption. These involve only routine epistemic trust, and probably rest on innate mechanisms of imitation known to be uncommonly powerful in our species. Others, such as religious or tribal practices and beliefs, are rooted in affective trust, especially that conferred on parents or teachers by children. Some of these come with a meta-level label designating them as desirable or credible because they are traditional. The label itself can seem questionable in virtue of other epistemological and methodological considerations, including skeptical questioning of common assumptions.
Indeed, there are good reasons to think that tradition, when invoked on its own terms in opposition to other considerations, while highly likely to persuade, is also almost guaranteed to provide bad reasons for belief or practice. Once philosophical considerations about framework assumptions become a topic for discussion, they can give rise to their own counter-claims, illustrated by the hand-wringing complaint by some Westerners that their own values of sexual equality, openness or democracy merely reflect ethnocentric bias. How then might we find yet higher-level principles that will adjudicate between framework assumptions? The complicated imbrication of natural psychological processes, cultural mechanisms, and normative standards involved in such assessments makes it unlikely that we should ever arrive at reliable criteria for recognizing genuine “epistemic peers”.

Chris Thompson (London School of Economics)

The Condorcet Jury Theorem (CJT), and its various extensions, provides one epistemic justification for inclusiveness in political settings.  In the standard CJT framework a group of agents will use majority rule to determine which of two alternatives on an agenda is correct.  The CJT states that the probability the correct alternative will receive a majority of the votes increases as group size increases and in the limit approaches certainty. 
The CJT results hold provided that agents are competent (they are more likely to vote for the correct alternative than not) and are independent. For the competence and independence assumptions of the CJT to hold agents need to receive private pieces of information regarding the true state of the world.  The reason that majority voting allows a group of agents to track the truth is that it pools these privately held pieces of information.  This is the point at which most epistemic defences of democracy that rely on the CJT stop – it is just a happy accident when the competence and independence assumptions of the CJT hold, when agents have private pieces of information regarding the true state of the world.
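The CJT's quantitative claim is easy to check directly. As a minimal sketch (not part of Thompson's own model), assume an odd number n of independent voters who each pick the correct one of two alternatives with a common competence p > 1/2; the majority's reliability then follows from the binomial distribution:

```python
import math

def majority_correct_prob(n, p):
    """Probability that a strict majority of n independent voters,
    each correct with probability p, selects the correct one of
    two alternatives (n assumed odd, so ties are impossible)."""
    smallest_majority = n // 2 + 1
    return sum(
        math.comb(n, k) * p**k * (1 - p)**(n - k)
        for k in range(smallest_majority, n + 1)
    )

# With p = 0.6, majority reliability rises toward certainty
# as the group grows:
for n in (1, 3, 11, 101):
    print(n, round(majority_correct_prob(n, 0.6), 4))
```

With p below 1/2 the same computation runs in reverse, with the majority less reliable than any individual, which is why the competence assumption matters.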
I extend the explanation of how groups of political agents are able to track the truth by providing an account of how agents are able to extract pieces of truth-conducive information from the environment in the first place. I provide a general model for a search procedure involving groups of agents. A single agent searching for an object of interest may only have a small probability of finding it. But if we employ a group to search for the object, the probability that at least one of the group members will find it can be significantly higher. I present a theorem stating that, under certain conditions, the probability that a group will identify a particular object is strictly increasing in group size and in the limit reaches certainty. The theorem is confirmed by simulation results.
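Thompson's theorem and its exact conditions are not stated in the abstract, but the simplest special case is easy to sketch: assuming each of n agents searches independently and finds the object with the same probability p, the chance that at least one succeeds is 1 - (1 - p)^n, which is strictly increasing in n and tends to 1.

```python
def group_find_prob(n, p):
    """Probability that at least one of n independent searchers,
    each finding the object with probability p, succeeds."""
    return 1 - (1 - p) ** n

# Even weak individual searchers (p = 0.05) make a large group
# very likely to find the object:
for n in (1, 5, 20, 100):
    print(n, round(group_find_prob(n, 0.05), 4))
```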
In a CJT framework, the objects of search are the pieces of information regarding the true state of the world.  Before agents cast their votes they engage in a search for information.  If agents find pieces of truth-conducive information the competence assumption of the CJT will hold.  If there is diversity in the search procedures of agents then agents will find different pieces of truth-conducive information and the independence assumption of the CJT will hold. 
Search procedures provide a second epistemic justification for inclusiveness in political settings: the greater the number of diverse agents included in the search for information, the more information the group will find.