Implications of the Spinozan Belief Model for Science and Society
April 19-20, 2018
St. Paul, MN
As our society has moved into the age of instant social communication, it has become increasingly apparent that the growing abundance of misinformation and propaganda is altering many individuals’ behavior, from the political domain to the financial. Concern about this “fake news” in our new “post-truth” world is widespread. Many questions need to be explored: How dangerous to society is this? How does misinformation influence individuals? Can we counter its effect? Is censorship required? If so, who should censor? Would this violate the United States’ guarantee of free speech? In this unique, interdisciplinary symposium composed of noted philosophers, cognitive scientists, neuroscientists, and political scientists, we will begin to grapple with some of these questions. We will examine the evidence for the 17th-century philosopher Baruch Spinoza’s conception of the processes of belief and doubt. On his view, understanding a statement and believing that statement are one and the same: the process of comprehension is the process of believing. Only later, after this mandatory belief, can an individual assess and disbelieve with a separate mental process. Belief is first, easy, and inexorable with comprehension, whereas doubt is retroactive, difficult, and disruptible. If the Spinozan belief model is correct, individuals would be biased toward belief simply because of this imbalance between the two processes. As more and more false information enters our new and easily shareable online marketplaces of ideas, the Spinozan perspective, on which false ideas instantly and automatically alter the behavioral tendencies of their receivers, suddenly carries much more weight. When millions comprehend a false idea, we can be assured that many will lack the time, cognitive resources, correct information, or neural integrity to perform the mental work of doubting it.
The overall aim of this symposium is to address two central questions: 1) Is the Spinozan belief model the process humans use to believe and doubt information? 2) If so, what are the societal implications of instant social communication for Spinozan minds?
Co-sponsored by Hamline University’s Department of Psychology, Department of Philosophy, Department of Political Science, Neuroscience Program, and the Wesley and Lorene Artz Cognitive Neuroscience Research Center
April 19th (Thursday)
Session 1 Location: Klas Center — Kay Fredericks Room
2:35-2:45pm — Opening Remarks (Dr. Erik Asp)
2:45-2:55 — Dr. Daniel Gilbert, Department of Psychology, Harvard University
Session 1: Spinoza’s philosophy and belief model
3:00-3:35 — Dr. Edwin Curley, Department of Philosophy, University of Michigan
Title: Spinoza on belief
Abstract: In spite of Spinoza’s best efforts to present his metaphysical views as clearly as possible, by expounding them in the geometric fashion, beginning with axioms and definitions, and proceeding largely by formal proofs, the interpretation of Spinoza’s metaphysics remains highly controversial among Spinoza scholars. I’ll begin with an outline of my interpretation, which emphasizes his belief that everything which happens in nature is to be explained by scientific laws, and is necessary given those laws and the antecedent conditions. I’ll then pass to his theory of mind-body union, which holds that mind and body are one and the same thing, conceived in different ways, and analyse Spinoza’s theory of belief, explaining its origins in a critique of Descartes’ alternative, voluntaristic theory. I’ll conclude with a discussion of the implications of that theory for the ethics of belief and the problem of religious toleration.
3:40-4:10 — Dr. Jake Quilty-Dunn, Department of Philosophy, Oxford University
Title: The lower border of Spinozan belief: Object perception as an origin of propositional structure
Abstract: Proponents of the Spinozan model of belief acquisition emphasize a continuity between perception and belief. In both cases, incoming information is accepted without hesitation. Gilbert (1991) and Mandelbaum (2014) speculate that this continuity has a sensible evolutionary origin. Given the reliable accuracy of perception, it was economical for cognition to evolve simply to accept the deliverances of the senses and integrate perceptual information into the rational planning of action without intermediating analysis. But many have argued that perceptual representations lack the propositional structure required for belief. Even Gilbert distinguishes “the propositional system of representation that underlies cognition” from “the imaginal system of representation that underlies perception” (1991, 116). This divergence in representational format raises a serious problem for the Spinozan evolutionary hypothesis that cognition evolved to accept the deliverances of the senses: if the outputs of perceptual systems are not propositional, then cognition cannot simply accept them as is. There must instead be some sort of translation mechanism—but in that case there is significant computational mediation from perception to cognition, which undermines the Spinozan evolutionary hypothesis. This talk dissolves this problem by arguing that perceptual systems deliver propositional structures that can be cognitively exploited without mediation. Evidence from perceptual psychology strongly suggests that perceptual object representations (or “object files”) have a non-iconic, propositional format. These perceptual structures can be immediately believed without any intervening translation. The Spinozan model of belief acquisition can thus be integrated with perceptual psychology to yield a fuller picture of the architecture of the mind.
4:15-4:45 — Dr. Eric Mandelbaum, Department of Philosophy, CUNY
Title: Troubles with Bayesianism: Ballistic believing and psychological defense mechanisms
Abstract: A Bayesian mind is, at its core, a rational mind. Bayesianism is thus well suited to predicting and explaining mental processes that best exemplify our ability to be rational. However, evidence from belief acquisition and change appears to show that we don’t acquire and update information in a Bayesian way. Instead, the psychological mechanisms by which we buttress our sense of self play a critical–and decidedly non-Bayesian–role in belief acquisition and updating.
4:50-5:15 — Session 1 Panel Discussion
Special Public Lecture Location: Sundin Music Hall
7:00-8:00pm — Dr. Daniel Gilbert, Department of Psychology, Harvard University
Title: Happiness: What your mother didn’t tell you
Abstract: Most of us think we know what would make us happy and that our only problem is getting it. But research in psychology, economics, and neuroscience shows that people are not very good at predicting what will make them happy, how happy it will make them, and how long that happiness will last. Is the problem that imagination has limits, or is the problem that society misleads us about the true sources of human happiness? The answer to both questions is yes. Professor Gilbert will explain why, when it comes to finding happiness, we can’t always trust our minds and our mothers.
8:00-8:30 — Questions for Dr. Daniel Gilbert
April 20, 2018 (Friday)
Session 2 and 3 Location: Anderson Center 111-112
Session 2: Challenges to Spinoza
8:00-8:25am — Dr. Justin Steinberg, Department of Philosophy, CUNY
Abstract: Spinoza not only thinks that ideas are intrinsically belief-like, he also thinks that we apprehend the world affectively, and that our affective representations constitute evaluative judgments. We are, by default, judgmental beings. In this talk, I explore how this account of evaluative judgments, taken together with other psychological dispositions—like the tendency to imitate others’ affects, reduce cognitive dissonance, and resist modifications to our belief-systems—contributes to civil strife. After delineating Spinoza’s account of the psycho-social roots of superstition, tribalism, and intolerance, I briefly consider some possible institutional and individualist remedies to this situation, concluding with a reflection on what it could possibly mean on a Spinozistic scheme to become less judgmental, and why (and when) it might be valuable.
8:30-8:55 — Dr. Ruth Mayo, Department of Psychology, Hebrew University in Jerusalem
Abstract: The Spinozan model holds that comprehension equals acceptance and that rejection is a secondary process, demanding motivation, ability, and cognitive resources. If this were always the case, we would all think nothing of clicking on links offering us a monetary prize or hopping into cars of strangers who offer us a quick ride home. But we don’t act on those suggestions. In my talk I will propose that the primary rejection response to a specific untrustworthy source reflects a basic underlying mode of thought that operates when we are in a skeptical mindset due to contextual cues or personality traits. I will report studies demonstrating that while in a trust mindset the primary process is one of acceptance and a congruent flow of activation, in a skeptical mindset the primary process is one of rejection, falsification, and an incongruent flow of activation. I will first elucidate the nature of the skeptical mindset and show that skepticism inherently entails the activation of alternatives to the original accessible concept, thereby undermining the preeminence of the prime. I will then demonstrate that skepticism blocks congruent accessibility effects using the “Donald” task, the embodiment paradigm, and an applied context of web advertising, in which ads embedded in a distrust-provoking article promoted the naming of alternative (competing) brands. The offered conclusion is that the human mind is sensitive and flexible enough to block any influence from the environment if it seems unreliable, suggesting that the Spinozan model is context dependent.
9:00-9:25 — Dr. Uri Hasson, Center for Mind/Brain Sciences, University of Trento
Title: Consideration of truth and falsity affects perceived veracity in a manner consistent with inference generation: Behavioural data, philosophical implications, and a neurobiological hypothesis
Abstract: People are often tasked with assessing claims that are presented as either false, fake, or true. Such meta-linguistic attributions are a routine part of language, but we do not know how they impact the perceived veracity of the statements in question, if at all. We asked participants to evaluate the likelihood of attributive statements made about individuals whose faces were presented on the screen, for instance, how likely is it that this person is a liberal; how likely is it that this person drinks tea for breakfast. This established a baseline for two studies in which truth and falsity were suggested and in which participants evaluated: i) how likely it was that such statements were true [or false] of the person (Study 1) or ii) whether they agreed or disagreed with a guess of truth/falsity made by a computer (Study 2). We find that suggestions of truth increased endorsement rates as compared to baseline, but only when the statements were highly informative when true or weakly informative when false. The findings indicate that when evaluating suggestions of truth or falsity, individuals construct inferential models of what would follow if the relevant statement were true, and of what would follow if it were false. The impact of the suggestion is mediated by the accessibility of these inferred contents. A neurobiological hypothesis is presented to explain these effects. Finally, we discuss whether our results can be taken to bear on truth-theoretical debates in philosophy, and in particular on minimalist or deflationary accounts of truth according to which the truth predicate is redundant except when used for formulating very specific generalizations.
9:30-9:55 — Dr. Maj-Britt Isberner, Institute for Psychology, University of Kassel
Title: Knowledge is power: Evidence against default believing of information that violates well-known facts
Abstract: According to the Spinozan model, comprehension and assessment/validation are strictly separate stages of information processing, and by default, comprehension entails the initial acceptance of the comprehended information. Theories and findings from the domain of language comprehension, however, call into question the conceivability of a comprehension process that is uninfluenced by the real-world truth or plausibility of information. In my talk, I will discuss these theories and findings in relation to the Spinozan model, including research from our own lab which – using reaction times and eye tracking as indicators – speaks for a fast and obligatory rejection of unambiguously false information. I conclude that while belief may be the default processing result for the comprehension of (at least predominantly) new information, the Spinozan model does not seem to apply to the comprehension of violations of well-known facts.
10:00-10:25 — Session 2 Panel Discussion
Session 3: Spinozan Implications for Science
10:30-10:55 — Dr. Lisa Fazio, Department of Psychology and Human Development, Vanderbilt University
Abstract: Politicians, advertisers and propagandists often repeat false or misleading claims. Research on the illusory truth effect suggests that this repetition is likely to be effective; repeated statements are given higher truth ratings than novel statements. Prior research has assumed that this effect occurs only when individuals do not possess relevant knowledge. In contrast, we find that prior knowledge does not protect against the illusory truth effect. False statements that were read twice were given higher truth ratings than novel statements, even when the statements contradicted prior knowledge (e.g., “A date is a dried plum”). Multinomial modeling demonstrated that participants sometimes rely on fluency even when knowledge is available. However, illusory truth is not inevitable. When participants were prompted to explain how they knew that a statement was true or false before giving a truth rating, repeated statements were judged similarly to novel statements. Thus, prior knowledge can protect against the illusory truth effect, but only if it is used.
11:00-11:25 — Dr. Vikram Jaswal, Department of Psychology, University of Virginia
Title: Biased to Believe Testimony?
Abstract: Young children have a well-deserved reputation for credulity: They believe much of what they are told even when it conflicts with first-hand experience. This credulity could stem from a general, undifferentiated trust in other people, or it could reflect a specific bias to trust what other people say. I’ll describe two studies showing that 3-year-olds have a specific, highly robust bias to trust testimony and that responding skeptically requires inhibiting this bias. These results are consistent with the Spinozan account of belief: At least by three years of age, young children accept what they are told by default. Learning how and when to “unaccept” information represents a major challenge throughout development.
11:30-11:55 — Dr. Erik Asp, Department of Psychology, Hamline University
Abstract: In the early 1990s Dr. Daniel Gilbert contrasted two psychological models of belief and doubt: the Cartesian model (belief is subsequent to and separate from comprehension) and the Spinozan model (belief and comprehension are the same process). Typically, cognitive resource depletion has been used as the standard method to adjudicate between the two models. Here, neuroscientific data will be brought to bear on the issue. The diverse constellation of deficits and symptomatology following prefrontal cortex damage will be examined, and several empirical studies conducted in lesion patients will be described. Broadly, our results suggest that damage to the prefrontal cortex tends to increase credulity generally. Of the lesion patients studied, none showed a dissociation between comprehension and belief. These neuropsychological data argue for a Spinozan perspective: belief is inextricably linked to comprehension, and the prefrontal cortex mediates retroactive doubt. Finally, a rudimentary neural circuitry model of belief and doubt will be offered using recent discoveries in fear conditioning processes as a guide.
12:00-12:25 — Dr. Tali Sharot, Department of Experimental Psychology, University College London
Title: Human valuation of knowledge and ignorance
Abstract: The pursuit of knowledge is a basic feature of human nature. Yet, in domains ranging from health to finance people sometimes choose to remain ignorant. I will present data showing that emotion is central to the process by which the human brain evaluates the opportunity to gain information, explaining why knowledge may not always be preferred. The brain reward circuitry selectively treats the opportunity to gain knowledge about favorable, but not unfavorable, outcomes as a reward to be approached. This coding predicts biased information-seeking: participants choose knowledge about future desirable outcomes more than about undesirable ones, vice versa for ignorance, and are willing to pay for both. This work demonstrates the critical role of affect in how the human brain values knowledge.
12:30-1:00 — Session 3 Panel Discussion
Session 4 Location: Klas Center — Kay Fredericks Room
Session 4: Spinozan Implications for Society
2:45-3:10 — Dr. Tim Levine, Department of Communication Studies, University of Alabama-Birmingham
Abstract: This talk describes Truth-Default Theory (TDT). TDT provides a new theoretical perspective on human deception and deception detection, and is contrasted with various cue theories of deception. TDT centers on robust findings of truth-bias in deception detection experiments and holds that the tendency to believe others is not really a bias at all but is instead adaptive. A belief default is necessary for efficient communication and social coordination. The belief default, however, makes humans vulnerable to deception. The structure and logic of TDT are outlined, and key empirical findings are summarized.
3:15-3:40 — Dr. Briony Swire, Network Science Institute, Northeastern University
Title: Remembering fact from fiction: Familiarity and the continued influence effect of misinformation
Abstract: People frequently continue to use inaccurate information in their reasoning even after a credible retraction has been presented. This is known as the continued influence effect of misinformation. If comprehending a correction requires temporary acceptance of its truth, then corrections could inadvertently make the “myth” more familiar. This is problematic because familiar information is more likely to be accepted as valid. From a dual-process perspective, familiarity-based acceptance of myths is most likely to occur in the absence of strategic memory processes. We thus examined factors known to affect whether strategic memory processes can be utilized: age, detail, and time. Participants rated their belief in various statements of unclear veracity; facts were subsequently affirmed and myths were retracted. Participants then re-rated their belief either immediately or after a delay. We compared groups of young and older participants, and we manipulated the amount of detail presented in the affirmative/corrective explanations, as well as the retention interval between encoding and a retrieval attempt. We found that (a) older adults over the age of 65 were worse at sustaining their postcorrection belief that myths were inaccurate, (b) a greater level of explanatory detail promoted more sustained belief change, and (c) fact affirmations promoted more sustained belief change than myth retractions over the course of one week.
Abstract: If we situate conspiracy theory belief under the Spinozan model, we assume the act of comprehending a conspiracy theory is the act of believing it, and the act of disbelieving it is a second, more effortful step (that may or may not be taken, given an individual’s motivation and/or ability). In the contemporary political and media environment, we consider the psychological and political explanations of conspiracy belief in the context of recent research that has provided evidence in support of the “conspiracy theories are for political losers” hypothesis. Specifically, perceiving oneself as being on the losing side of politics induces a loss of control and feelings of uncertainty and anxiety. It is this induction of loser status that reduces the motivation to unbelieve a conspiracy theory, especially if belief in that conspiracy theory reduces uncertainty and anxiety. In other words, we argue that political losers will stop at the first step of accepting because doing so fulfills a psychological need to regain control and grapple with uncertainty. We present data from a nationally representative pre-post 2016 election panel and a convenience sample collected in March 2018 to test our theory, and we discuss the implications of our work for theories about the causes of conspiracy theory endorsement.
4:15-4:40 — Dr. Josh Compton, Institute for Writing and Rhetoric, Dartmouth College
Abstract: Inoculation theory offers a way to think about the requisite conditions to reject a claim under the Spinozan belief model—motivation and resources. Threat (a perception of belief vulnerability raised by an inoculation treatment message, implicitly through the presence of counterattitudinal content and/or explicitly through the presence of a forewarning) triggers the motivation, and refutational preemption (the raising and refuting of counterattitudinal challenges to an existing position that are usually present in an inoculation message) provides the resources. In this talk, I’ll consider whether inoculation messaging might offer a useful strategy to confer resistance to influence in the contexts of politics, health, commerce, education, and others.
4:45-5:10 — Session 4 Panel Discussion
5:15-5:30 — Closing Remarks