In 1939 two representatives of the Institute for Propaganda Analysis (IPA), Alfred and Elizabeth Lee, stated that propaganda "may distort our views and threaten to undermine our civilization" (12). Anxiety about persuasive communications motivated IPA members to reach out to the U.S. public, with the hope of cultivating media literacy among the citizenry. Lee and Lee's ambition to neutralize negative media effects by educating the public, as well as ancillary desires to frustrate adversarial broadcasts with strategic counter‐propaganda, are objectives that have since been echoed by numerous twentieth‐ and twenty‐first‐century communications analysts. Nevertheless, identifying the potential sway of media is one thing, while successfully marshaling a response to such persuasive communications is another. This same concern is shared by many supporters of evolutionary theory, faced with the potential impact of religiously motivated anti‐evolutionism in stimulating public distrust of established science. Taking into account the persuasive characteristics of Darwin‐skeptic mass communications identified in previous studies, uncertainties remain regarding how best to promote evolutionary theory to religious audiences. It is to this quandary that the present study is dedicated, with the intent of translating the most successful methods of science promotion employed in other contexts to the so‐called Evolution Wars. In what follows, various strategies used to address vaccine hesitancies and shape public policy for improving immunization uptakes will be conceptualized for pro‐evolutionist contexts. Recommendations for enhancing the endorsement of evolution will then be systematized into general guiding principles, as well as direct and auxiliary intervention procedures.

Having previously analyzed religiously motivated anti‐evolutionist persuasive techniques (Aechtner 2010, 2014, 2016), a supervening question raised in one form or another has been: What might this research tell us about how to better advocate for consensus science in the face of religious Darwin‐skeptic influences? In response to this enquiry I have, at most, conveyed that those aspiring to counter Darwin skepticism must acknowledge that religiously motivated anti‐evolutionist media tend to harness a broader, more ubiquitous suite of persuasion techniques than do communications supportive of evolutionary theory (Aechtner 2014, 203). Consequently, pro‐evolutionists should consider how persuasive heuristic cues can be used for the promotion of the biological sciences. In an effort to progress beyond this rudimentary finding, the present study aims to improve evolution advocacy to religious audiences by reframing some of the most successful methods of science endorsement for Evolution Wars contexts. To do this, it will build upon my own analyses by incorporating a growing corpus of pro‐science intervention research associated with endorsing scientific notions to doubting audiences.

Critically, numerous research groups have utilized empirical means to test and tackle counter‐science attitudes relating to vaccine hesitancies and climate change denial. In particular, vaccination hesitancies have received substantial attention because of the prospective influences of anti‐vaccination pundits, and the immediate need to formulate better public health policies for addressing immunization refusal. Such studies provide valuable insights concerning optimal methods for brokering constructive science communications with skeptical religious publics and, as with my own work, they have also considered the oblique ways that audiences can be swayed by various persuasion attributes. To be sure, anti‐evolutionism cannot simply be equated with resistance to anthropogenic climate change or opposition to vaccinations. Even so, despite differing science denial contexts and cohorts, research addressing various science‐skepticisms can reveal important details about decision‐making trends, as well as what has assisted other science promotion efforts.

COMPARABLE TACTICS: ANTI‐VACCINATION AND ANTI‐EVOLUTIONIST PERSUASION ATTRIBUTES

Anti‐vaccinationism and Darwin skepticism differ in their goals and retinue, yet counter‐vaccine media often exhibit similar persuasion characteristics to those identified in religiously motivated anti‐evolutionist communications. Such anti‐vaccine media also display a rhetorical edge not generally found in pro‐vaccination materials (Ma and Stahl , 304). This includes repeated fear appeals emphasizing purported risks associated with the toxic ingredients of vaccines, as well as professed hazards of overstimulating infant immune systems (Zimmerman et al. ; Kata ). In concert with this emotive language, anti‐vaccination media feature a suite of allegations that coincide with persuasive cues identified in Darwin‐skeptic media. These include the rhetorical devices described as Asking Questions as well as the Contrast Principle and Negativity Effect. Asking Questions incorporates intention queries, in which people are asked to make predictions regarding future behaviors, as well as rhetorical inquiries (Aechtner 2014, 192). The Contrast Principle and Negativity Effect, on the other hand, involves using comparisons to accentuate a point, with negative information and defamatory contrast proving to be more powerful in shaping people's attitudes (Aechtner 2016, 89).

In relation to these two persuasive elements, audiences are told that pro‐vaccination policies encroach upon civil liberties, and vaccine‐skeptics ask about the fairness of such rules. Counter‐immunization media also frequently question whether there is truly enough empirical data supporting vaccine efficacy, and rhetorically cross‐examine the motives of vaccination advocates (Wolfe ). In a similar fashion to Darwin‐skeptic media, anti‐vaccination negative contrast involves biting attacks against supporters of vaccines, who are depicted as blind adherents irrationally rejecting an alleged mountain of evidence favoring vaccine refusal. Just as anti‐evolutionists argue that scientists have provided insufficient proof of evolution's validity, anti‐vaccination media contend that vaccine safety has not been scientifically substantiated (Arthur 2016). Moreover, akin to anti‐evolutionist claims that their opposition is despotically enforcing Darwinian orthodoxy, counter‐vaccine spokespeople assert that governmental vaccination policies are totalitarian, while continually questioning the moral character of their adversaries (Davies, Chapman, and Leask ).

Notably, these verbal assaults are replete with conspiracist ideation, featuring a vile Big Pharma–supporting cadre hiding the useless and dangerous nature of vaccines (Heller ). Such messages coincide with Scarcity Principle prompts that can be identified in Darwin‐skeptic communications. The Scarcity Principle stipulates that people's perceptions of the value and subjective desirability of items increase when those items appear to be in limited supply. Relatedly, when items or ideas become banned or censored, appetites for these commodities and notions are often intensified due to their perceived scarcity (Aechtner 2014, 192). With respect to the persuasive influence of scarcity, vaccine‐denialists and Darwin‐skeptics both routinely announce that they are being censored by powerful science‐advocating juntas (Zimmerman et al. ). Anti‐vaccination media further resemble Darwin‐skeptic broadcasts by continually appealing to Source Cues, which involve accentuating the expertise of communicators (Rodriguez ). Some broadcasts even liken anti‐vaccinationists to Galileo, in much the same way that anti‐evolutionists tether Darwin‐skepticism to the credibility of the Scientific Revolution's luminaries (Arthur 2016). Added to these cases are numerous celebrity endorsements besprinkling counter‐vaccination broadcasts, further reinforcing messages with the influential magnetism obtained via famous personalities (Arthur 2016). Anti‐vaccination materials also emphasize the shared values and ideals adhered to by members of the counter‐vaccine movement and various publics. In the case of anti‐vaccinationists, these tend to be values associated with professed notions of equality and fairness, as well as the virtues of academic freedom and personal autonomy in opposition to discrimination and authoritarianism (Leask and Chapman ). Such discourse mirrors the articulation of similar sociocultural values expressed by Darwin‐skeptic media makers, who also state that anti‐evolutionism coheres best with the worldview credenda of audience members.

Counter‐immunization attempts at enhancing credibility are also built around claims that there exists a multitude of rebel doctors secretly endorsing anti‐vaccination efforts, and millions of vaccine‐hesitant individuals around the world. These pronouncements correspond with Social Consensus cues identified across Darwin‐skeptic media, as they utilize an audience's tendency to favor commodities and ideas perceived to be the most dominant or popular (Davies, Chapman, and Leask ). Counter‐vaccination media further cultivate this suasion tactic through copious anecdotal testimonials of vaccine injury, along with underdog biographies of virtuous anti‐vaccinationists braving persecution from the alleged Big Pharma superstructure. Such stories capitalize on the persuasiveness of underdog effects in Social Consensus cues, which involve accentuating the struggle of righteous dissenters contending against a highhanded majority (Arthur 2016). These narratives are also frequently supplemented with specialist language and the use of statistical data, which can be classified as instances of Statistics and Technical Jargon. It is frequently the case that complex, field‐specific language and numerical data can act as markers of expertise, leading audiences to defer to the credibility of communicators (Aechtner 2014, 193). Although differing in subject matter and delivery, such persuasive elements approximate the analogous use of jargon and statistics located throughout Darwin‐skeptic communications.

In broad strokes, the analogous persuasion motifs observed across anti‐vaccination and Darwin‐skeptic media are summarized in Table 1. These attributes are deployed throughout an extensive range of communication outlets by representatives of both parties, to degrees that often rhetorically outstrip their pro‐vaccine and evolution‐supporting rivals. Along with these likenesses, researchers have documented how persuasive anti‐vaccination messages can have enduring effects that trade upon people's information‐seeking proclivities (Kata ; Betsch and Sachse ; Narayan and Preljevic ). In like manner, I have established how anti‐evolutionist media are infused with sizeable quantities of persuasive elements, which suggests a persuasive potency likely to influence targeted audiences (Aechtner 2014, 201–02). With these resemblances in mind, we are presented with a picture of two different forms of counter‐science media exhibiting comparable persuasion attributes, and the potential to trigger the contestation of established science. What then, we can ask, have researchers of vaccine hesitancies concluded about how best to respond to anti‐vaccination influences bearing many of the same persuasive attributes as Darwin‐skeptic mass media? To start, by all accounts it appears that trying to frustrate such counter‐science initiatives and their affiliated attitudes through policies of intensified fact dissemination is unlikely to succeed.

Table 1. Persuasive commonalities of Darwin‐skeptic and counter‐vaccine media

Asking Questions
  Darwin‐skeptic media:
    • Questioning an opponent's understanding of scripture.
    • Questioning empirical data confirming evolution.
    • Querying the moral principles of pro‐evolutionists.
  Anti‐vaccinationist media:
    • Questioning the fairness of pro‐vaccine policies.
    • Asking whether empirical data support vaccine efficacy.
    • Queries regarding vaccine supporter motives and morals.

Contrast Principle and Negativity Effect
  Darwin‐skeptic media:
    • Contrasting Darwin skepticism's empirical support with evolution's supposed deficits.
    • Comparing anti‐evolutionist honesty, open‐mindedness, and commitment to science with pro‐evolutionist duplicity and blind faith.
    • Contrasting the religious corroboration for Darwin skepticism with the spiritual insufficiencies of accepting evolutionary theory.
  Anti‐vaccinationist media:
    • Contrasting data supporting vaccine efficacy and safety with the claimed empirical and anecdotal support for vaccine hesitancies.
    • Comparing anti‐vaccination honesty with vaccine lobby corruption.
    • Contrasting a devotion to critical thinking and personal autonomy with pro‐vaccine authoritarianism and close‐mindedness.

Scarcity Principle
  Darwin‐skeptic media:
    • Conspiracist notions of evolutionists suppressing incriminating scientific findings, and censoring creationist/ID‐supporting empirical data.
    • Discriminatory hiring practices against Darwin skeptics.
  Anti‐vaccinationist media:
    • Conspiracy theories involving Big Pharma and government cover‐ups of vaccine dangers and ineffectiveness.
    • Doctors and researchers co‐conspiring to censor counter‐immunization data.

Source Cues
  Darwin‐skeptic media:
    • Appeals to sacred authority, including divine sanction and scriptural support.
    • Appeals to academic qualifications and scientific proficiencies.
    • Linking anti‐evolutionism with founders of science and religious pacesetters.
  Anti‐vaccinationist media:
    • Mentioning anti‐vaccine scientific expertise and purported scientific papers.
    • Superior insider knowledge, as well as a deeper understanding of personal wellbeing and family members' health.
    • Celebrity endorsements.

Social Consensus
  Darwin‐skeptic media:
    • Testimonials from supporters.
    • Descriptions of anti‐evolutionism's popular support.
    • Mention of underdog anti‐evolutionist scientists.
  Anti‐vaccinationist media:
    • Testimonials of vaccine injury.
    • Claims of rebel anti‐vaccine scientists and doctors around the globe.

Statistics and Technical Jargon
  Darwin‐skeptic media:
    • Technical jargon from the biological sciences, earth and atmospheric sciences, mathematics, physics, astronomy, and theology.
    • Scientific and survey statistics.
  Anti‐vaccinationist media:
    • Medical sciences terminology.
    • Statistics validating the dangers of vaccinations, or figures invalidating the safety and efficacy of vaccines.

Fear Appeals
  Darwin‐skeptic media:
    • Invoking fears regarding social evils resulting from evolutionary theory.
    • Fears concerning a decline in Christianity precipitated by evolution.
  Anti‐vaccinationist media:
    • Narratives of toxic vaccine ingredients and vaccine injury.
    • Frightful accounts of medical/corporate malfeasance.

Highlighting Shared Values
  Darwin‐skeptic media:
    • Emphasizing shared religious and moral values.
    • Underscoring mutual sociopolitical values associated with national narratives.
  Anti‐vaccinationist media:
    • Asserting a commitment to the values of equality and fairness.
    • Mentioning values linked with individualism.

THE CHALLENGE OF MODIFYING ATTITUDES AND ACTIONS

There has been broad assent that resistance to scientific premises such as evolutionary theory or vaccines does not simply occur because of a deficiency in public rationality. This is despite the Information Deficit Model of science communication, which has assumed that people are skeptical of scientific premises because they lack knowledge or scientific literacy (Hart and Nisbet ). Deficit model suppositions have traditionally suggested that the solution to erroneous beliefs about science can be found in supplying publics with more information, making seemingly inexplicable science understandable, and giving people the capacity to comprehend scientific fundamentals. It has been thought that this will result in a reciprocal acceptance of scientific ideas. However, increasing people's factual knowledge and testable cognizance about vaccines, for instance, often does not translate into elevated levels of vaccination confidence or improved immunization behaviors (McClure, Cataldi, and O'Leary ). Worse still, trying to correct misinformation with accepted scientific facts can be counterproductive, leading to reductions in people's intentions to vaccinate (Nyhan and Reifler ; Kahan et al. , 78–79). In actuality, an individual's understanding of the data and theory underlying consensus science is frequently not a reliable indicator of the reception of those same scientific premises (Kahan et al. ). As a result, when it comes to publicly contested science such as anthropogenic climate change, biological evolution, or vaccinations, trying to change minds through the intensification of fact communiqués is unlikely to succeed. This is because people are liable to make decisions not only on the basis of understanding data and fact claims, but also in relation to persuasive sociocultural influences, allied with their identity associations and values (Lewandowsky and Oberauer , 218).

Although increasing scientific knowledge is an important goal, facts and science comprehension levels are clearly not the only ingredients that matter when it comes to the decision‐making dynamics around vaccines or other publicly contested science topics. Instead, it has been found that vaccination beliefs and behaviors are also driven by an assortment of psychological, sociopolitical, and cultural considerations. For example, community processes involving social networks and norms can greatly impact vaccine choices, because people tend to tailor their own behaviors to match the actions of their peers and the members of groups with whom they self‐identify. Individuals are also inclined to interpret and accept vaccination facts through the cognitive lens of shared cultural values rather than only via scientific perspicacity (Brunson 2013; Browne 2018). Such findings correspond with the central premises of the cultural cognition thesis, and in many ways are unsurprising. Be that as it may, if increasing data dissemination and science comprehension does not always impact vaccination decisions, what approaches to altering vaccine perceptions and behaviors are efficacious for improving immunization acceptance? Are there any pragmatic lessons from research into increasing vaccination uptakes that may be applied to, or adapted for, other counter‐science contexts such as communicating to religious audiences in the Evolution Wars?

As things stand, attempts at modifying vaccine beliefs and behaviors provide the immediate lesson that altering people's perspectives and actions around contested science seems exceptionally difficult. Most attempts at changing the underlying attitudes guiding behavior around consensus science, such as vaccines, have proven ineffectual, limited in outcomes, or even disadvantageous, and are frequently designed with only sparse empirical verification (Brewer, Chapman et al. 2017, 186–87). In reality, there have been few successful, evidence‐based communication strategies that have positively influenced vaccine beliefs and mitigated opposition to vaccinations. As the authors (Dubé, Gagnon, and MacDonald , 4200) of one meta‐analysis concluded, there is "no strong evidence on which to recommend any specific intervention to address vaccine hesitancy/refusal." Even so, despite these dispiriting findings there have been a handful of techniques that have demonstrated more promising results overall.

Of the health policies trialed for improving vaccination uptakes, those strategies that have met with the greatest success and empirical substantiation involve behavioral interventions and delivering improved access to vaccines. These interventions include providing monetary and nonmonetary incentives for vaccinating, imposing penalties for vaccine refusal, providing on‐site vaccination drives at people's places of work, as well as reducing logistical and economic barriers inhibiting people from getting vaccinated. Moreover, automatically setting vaccination appointments as the default option for individuals at their local clinic improves vaccine coverage. This latter approach is grounded within the notion of choice architecture, or nudging, which incorporates noncompulsory influences that do not restrict choices, but instead subtly alter the ways in which choices are presented. Hence, in making vaccination appointments the default condition for patients, the choice to opt out of automatically scheduled meetings is still available. However, the opt‐out choice construction often impels individuals to take part more readily than if the decision environment was structured with an opt‐in design.

Nonetheless, while these programs may be effective in rallying vaccination uptake, porting them over to Evolution Wars contexts is not necessarily straightforward. People cannot be enrolled into default meetings to receive doses of evolutionary theory in the same way that general practitioners might automatically schedule medical appointments. That being said, increasing access, choice construction, and behavior‐modification interventions have demonstrated some of the most operative results in vaccination scenarios. Therefore, questions ought to be asked about whether increasing access, providing incentives, or even experimenting with nudging can be used for the advocacy of evolutionary theory in religious communities expressing Darwin skepticism. Could religious leaders sympathetic to consensus science implement choice architecture methodologies to improve acceptance of, or exposure to, positive noncreationist notions about evolution? Perhaps when planning in‐house religious education options for congregants on evolution–religion consonance, or the nuances of science–religion interactions, these could be organized as opt‐out rather than opt‐in parts of training curricula. Are there incentives that clerics can use, such as hosting a free meal or providing free learning materials, which may draw congregants to a meeting on evolutionary theory and faith? In terms of access, can religious leaders do more to increase the availability of pro‐evolutionist materials in church libraries, on local and denominational websites, or through other information distribution networks used by group members?

In addition to these considerations, what is notable regarding religious leadership, and their ability to nudge or supply access, is that recommendations from local authorities likely function as a primary lever in decision making around contested science. Along with analyses of behavioral interventions, increasing access and choice design, numerous studies have identified that healthcare provider recommendations are frequently a key factor in increasing vaccination rates. In fact, of all determinants at play in vaccination choices, one of the strongest predictors of an individual's assent to vaccines is whether their general practitioner delivered positive vaccine advocacy and advice in one‐on‐one appointment settings (Edwards and Hackell ; Myers ). Crucially, this is an observation with upshots readily translatable to the Evolution Wars.

Although doctor–patient relationships are not wholly reflected in religious leader–congregant interactions, a local cleric's positive recommendations on evolution may correspondingly serve as a vital influence on a religious adherent's reception of scientific ideas. That is to say, people unsure about the science of vaccines frequently report that, aside from online searches, their first trusted expert source of information is their local healthcare practitioner. It seems safe to opine that for questions of a religious nature, adherents would likewise turn to the authoritative guidance of their own theological caregivers, taking cues from religious leaders' perceived expertise on such matters as evolutionary theory. This, of course, may vary by religion and by perceptions of religious expertise from tradition to tradition. But it could be ventured that, broadly speaking, just as doctor recommendations are strongly correlated with vaccine receipt, so too a religious leader's duologues on evolution would be highly predictive of the views held by congregants with whom the leader has discussed the topic. A corollary is that if you are a religious mentor supportive of biological evolution's veracity, or you know of such leaders in your midst, face‐to‐face endorsements from religious protagonists could serve as one of the most effectual ingredients in garnering science acceptance (Aechtner and Buchanan 2018). Consonantly, if research into what stimuli actually improve vaccination receipt is taken as a prompt for other science‐questioning scenarios, those aiming to make genuine headway in promoting evolutionary theory to religious publics need to muster support from local religious leaders.

There are, of course, several difficulties with this suggestion, the first being that, whereas most doctors understand the science of vaccinations and endorse vaccine schedules, there may be many religious leaders who do not approve of evolutionary theory in the same manner, and who may not have a robust comprehension of the science validating it. Second, religious leaders may be reluctant to campaign for evolutionary theory because of widespread resistance to the biological sciences existing in certain religious communities. In relation to this point, there is the third complication that even if a cleric is supportive of evolutionary theory, she may face opposition from church boards and lay leaders who can wield significant control over what is and is not taught within the confines of religious institutions. Fourth, direct one‐to‐one interventions by leaders can be logistically challenging. As a matter of fact, this same problem has been enunciated by healthcare practitioners, who have indicated that recurrently providing vaccine recommendations and answering questions for hesitant patients is not only time‐consuming, but can lead to lower levels of job satisfaction (Kempe et al. ). In the long run, person‐to‐person intervention with individuals questioning consensus science is practically difficult and often vocationally unsatisfying. Likewise, clergy have limited time, finite relational capital and resources, and must balance competing concerns, for which defending evolutionary theory may be far less of a priority than other community matters. Added to this is the fifth issue that many people in positions of authority, who could most constructively engage in direct, one‐on‐one interventions, are not always doing so in the best ways possible. As investigators have indicated regarding immunization contexts, for example, clinicians engaging with vaccine‐hesitant patients often lack training in how to communicate using the methods most conducive to vaccine acceptance (Dempsey and Zimet ). In fact, researchers have concluded that many science communicators require improved rhetoric, and upgraded media sensitivities, as they seek to challenge various science‐skepticisms (Cook, Bedford, and Mandia ; Masaryk and Hatoková ). This includes learning how to employ tools of persuasion rather than merely discussing scientific facts. Taking this into account, the communication strategies and persuasion techniques benchmarked for improving vaccination uptakes also need to be assessed.

SUPPLEMENTARY INFLUENCE STRATEGIES

Most studies that have analyzed the use of persuasion methods to shape vaccine opinions report mixed results across a variety of strategies, with a scattering of techniques proving more effective than others. What is notable about the tactics that have garnered relatively better results is how they map onto the persuasive cues identified previously in my own work (Aechtner 2014, 190–94). For instance, with reference to the aforementioned results of one‐on‐one advice from doctors, it would follow that this form of intervention is at least partly successful because of the expertise clinicians exhibit in medical practice settings, which coincides with Source Cues influences. Regarding such parallels, investigations of vaccine choices have also identified the force of bandwagoning, which indicates that many individuals elect to get vaccinated because they believe that the majority of people are doing so (Hershey et al. ). In several ways these findings dovetail with persuasive Social Consensus cues; as researchers of vaccine uptakes have noted (Buttenheim and Asch 2013, 2675), there exists "considerable evidence that letting people know what other people do is one of the most effective ways of increasing that behavior." Correspondingly, it has been found that individuals tend to be more amenable to immunizations when getting vaccinated is described as a prosocial norm, complied with by a preponderance of citizens (McClure, Cataldi, and O'Leary ). The social norm element can also include appealing to ethical mores in a strategy described elsewhere as moral framing, which presents vaccination as the moral standard adhered to by an overwhelming majority of the population (Amin et al. 2017). Alongside recommendations for vaccine stakeholders to mention social and moral norms are acknowledgments that people often socialize in geographically clustered networks with other likeminded individuals, and that health messaging is improved by affirming the shared sociopolitical values maintained within these groups (Beard et al. 2016).

Social clustering effects have been linked to homophily: the tendency for individuals to be socially attracted to, and to associate with, people bearing similar characteristics or cultural attitudes (Brewer, Chapman et al. 2017, 168). On account of such homophilic clustering, vaccine hesitancy tends to be a highly networked phenomenon, in which social connections can be markedly predictive of trends in vaccine reception. For this reason, it has been suggested that positive health promotions should employ social marketing that connects with people's networked values (Leask, Willaby, and Kaufman ). In other words, effectively reaching vaccine‐hesitant individuals requires influencing the social networks in which these people abide, while being cognizant of the commonly held worldviews and value concerns maintained within those networks (Betsch and Sachse ; Lewandowsky et al. ). With regard to this, it has been recognized that people endorse positions on science that support rather than threaten their cultural values, and are more likely to trust experts who expressly share their own worldviews (Kahan et al. ; Kahan ). Experts should, therefore, affirm cultural values instead of attacking them, while emphasizing culturally validating explanations of science (Kahan, Jenkins‐Smith, and Braman ). Also, when a general practitioner is sensitive to and affirmative of a patient's value associations, vaccine promotion is aided through cultural cognitive salience. Additionally, to best avoid values‐related polarization it has been suggested that science communicators use "pluralistic advocacy" (Kahan , 297), which involves featuring representatives from numerous worldview positions. When science advocacy is delivered by experts from a range of sociopolitical backgrounds, culturally cognitive polarization tends to diminish because the publicly contested science is no longer associated with any one messenger's group affinities.

There are several implications of homophilic clustering, social networks, and the import of cultural cognition for the Evolution Wars. To start, science advocacy and advocates bearing connections to cultural meanings will be responded to according to those cultural attributes rather than merely the scientific facts being communicated. Consequently, pro‐evolutionist communications must affirm the networked values maintained within specific socioreligious grids. Also, pluralistic advocacy from experts across a diverse range of cultural worldviews should be employed when possible. Hence, in appreciating cultural cognition it would be rather inane to assume that religious anti‐evolutionists might be persuaded to accept evolutionary theory if the biological premise is represented as being fundamentally atheist in nature, or when the advocates themselves are vehemently opposed to religion and deeply held religious values. In such circumstances, the acceptance of evolution's empirical facts would hinge upon the cultural values thought to be linked with atheism, such that acceptance and rejection polarization would likely splinter between atheists and nonatheists accordingly. If we are not earnestly affirming and connecting with an audience's core values, and/or delivering pluralistic advocacy, there will invariably be culturally antagonistic responses, the rejection of a messenger's expertise, and communication failure.

THE SECOND WEAPON: PERSUASION AND MASS MEDIA

Coupled with the affirmation of values and the importance of local authorities, various persuasion techniques can further improve pro‐science messaging. For example, it has been found that how healthcare practitioners communicate about vaccinations in face‐to‐face meetings can further influence immunization behaviors. Case in point: vaccine receipt occurs more readily if doctors frame the option of getting vaccinated in presumptive announcements, specifying that vaccination is the norm and the assumed default action for all patients. Instead of listing options, including the choice of not getting vaccinated, clinicians simply make it known that a patient is due for a vaccine, and that the individual will be receiving the shot today or will be scheduled for an appointment, as is routine for everyone attending the practice (Brewer, Hall et al. 2017). In effect, this communications approach utilizes both choice architecture and the persuasive cue Social Consensus, because the presumptive announcement can reference a participative majority who are getting vaccinated.

Yet another technique demonstrating positive results includes prompting patients to consider their future intentions to vaccinate, and encouraging clients to formulate date‐specific implementation plans to fulfill these targets (Milkman et al. ). This approach corresponds with the Asking Questions persuasion strategies hinted at above, in which self‐predictive questions about future behaviors increase the likelihood that people will follow through on their planned actions. Additionally, studies indicate that to get patients in the door for vaccination appointments, primary care providers should send clients vaccination reminders via postcards, update letters, SMS texts, or phone messages (Pich ). To an extent, such reminders are conceptual cognates of the persuasive tactics associated with Message Repetition, given that their task is to iteratively increase exposure to vaccine requirement notices beyond clinical settings, which improves patient recall of immunization protocols (Aechtner 2014, 194).

With regard to how reminders, presumptive announcements, and future intention questions can enhance pro‐vaccine messaging, it may well be the case that assimilating similar techniques into evolution‐supporting ventures would be advantageous. Accordingly, can local religious leaders combine presumptive statements with Social Consensus appeals, while also delivering community reminder announcements when inviting congregants to hear messages about evolution–religion concordance? Could clerics trial future intention questions to get community members thinking about whether they might consider noncombative science and religion interactions in the future, or ask congregants to forecast whether they would read materials supporting evolutionary theory by a forthcoming date? Additionally, there are still other communication strategies analyzed in vaccine research contexts that may be adapted for different pro‐science enterprises. Particular attention has been given to improving online media messaging, and parsing the best rhetorical methods for endorsing immunization via mass communications. These concerns have arisen in acknowledgment of the part that persuasive communications and online influences seem to be playing in public perceptions of vaccines, along with the admission that pro‐vaccination actors could be leveraging both old and new media apparatuses more readily to reduce vaccine hesitancies (Rosselli, Martini, and Bragazzi ).

Although direct one‐on‐one medical provider interventions are efficacious, there are also benefits to implementing large‐scale, online media communications campaigns. As John M. Barry (2009, 324) has concluded, when it comes to combating future viral outbreaks such as an influenza pandemic, the most vital weapon will first be vaccination, while the "second most important will be communication." In a modern context, where individuals are frequently using online sources to do their own research into making medical decisions, the need for a compelling online pro‐vaccine second weapon is as important for public health policy as it ever was (Ninkov and Vaughan ). This is especially the case because of the practical limitations of face‐to‐face meetings. One aspect of this includes increasing media coverage while also expanding the range of communications being used for immunization support. A second facet is the need to communicate in better ways within science‐supporting promotions.

It has been suggested that the rhetorical tactics employed by anti‐vaccinationists make their communications a good deal more persuasive than pro‐vaccine messages (Shelby and Ernst ). Researchers also note the general failure of science communicators to persuasively defy such counter‐science rhetors in ways that garner popular support from nonexpert publics (Leask ; Masaryk and Hatoková ). With regard to this, several researchers have specified that more effective science advocacy can be achieved by taking into account the Elaboration Likelihood Model (ELM) of persuasion (Kata ; Seyranian ; Okuhara et al. ). The ELM is a theory that postulates two major avenues of persuasion resulting from any exposure to communications: the central and peripheral routes (Petty and Cacioppo ). The first route involves attitude change ensuing from an individual's diligent scrutiny of a persuasive communication. The peripheral route, on the other hand, occurs when there is a lack of ability and/or motivation to thoroughly investigate and process a persuasive message's contentions. This low‐elaboration course involves comparatively less cognitive exertion, and features a dependence upon various "mental shortcuts" or "cues" (Anastasio, Rose, and Chapman 1999, 154) to help formulate ensuing opinions and behaviors. Integrating the ELM entails identifying the peripheral cues, cognitive biases, interpretive schemas, and heuristics that can obliquely shape science‐related decision making, and then strategically applying such components for science advocacy (Seethaler ; Morgan et al. ). It is on this specific point that my own analyses serve as a frame of reference for Evolution Wars media.

Having previously identified how pro‐evolutionist broadcasts tend to offer a smaller assortment of persuasive heuristics, at lower rates of recurrence than are exhibited in religiously motivated Darwin‐skeptic media, it can be reiterated that those supporting evolutionary theory should reflect on how to better integrate peripheral cues for science promotion. More to the point, science commentators need to weigh up methods of outclassing Darwin‐skeptic mass media by going one better in adopting the comparable varieties of persuasion cues readily employed by anti‐evolutionists. Work is required to implement the ethical but shrewd use of Source Cues, statistics and technical jargon, contrast framing, Social Consensus, the Scarcity Principle, and even Message Repetition, along with other persuasion cues and influence techniques not yet readily employed in science‐defending Evolution Wars media. This may include using counter‐heuristics and reworking practices already being utilized by science‐skeptics themselves. Such practices include enlisting more well‐known and respected individuals as spokespeople, while also using linguistic cues of public consensus, and being much more concerted in appealing to expertise heuristics (Morgan et al. ). This is because, for members of the general public, trying to figure out “what science (and which scientists) to trust without relying too heavily on simplistic heuristics is not so easy,” and peripheral cues such as “institutional affiliations, degrees earned, and consistency with what other scientists are saying do, in fact, matter” (Priest , 116).

All in all, pro‐evolutionists ought to become more persuasively savvy, embracing rather than eschewing the conscious use of message cues and identity affirmation, since reliance on social values and message heuristics is part and parcel of human cognition. This is the case for scientists and nonscientists alike, because relying upon cues, such as those signaling expertise, is the "only reasonable way that reasonable people (including scientists) can make sense of science on a day‐to‐day basis" (Priest , 122). Given the prospective and universal influence of heuristics, it would be judicious for evolution advocates to tactically deploy persuasion cues in communications. Moreover, if studies into the triggers of vaccine hesitancies are to be taken as being at least moderately instructional for other counter‐science contexts, it would be prudent to accentuate persuasive cues imparting expertise and trustworthiness. This is because in many areas of the world there seems to be a growing distrust of individuals and institutions in scientific and political authority, while counter‐science pundits such as vocal anti‐vaccinationists and religiously motivated anti‐evolutionists are quick to relay their own professed scientific knowhow (Aechtner , 194–96; Ward ). Employing persuasive cues can also be particularly relevant for Internet environments, where many individuals turn for guidance when experiencing uncertainties about science. This is because people frequently venture to reduce cognitive loads while accessing information online by resorting to the use of heuristics (Sundar ; Metzger and Flanagin ). In practical terms, not integrating persuasive cues into science promotion efforts overlooks a common element of everyday message processing, and a potential determinant of science message reception in the Evolution Wars (Sundar ; Chen ).

In tandem with the measured application of heuristics, researchers have also suggested that tackling science skepticism requires the use of brief and easier‐to‐read counter‐communications. As things stand, media opposed to consensus science such as vaccines are often succinct and "cognitively more attractive" (Lewandowsky et al. , 123), because such content is less cerebrally taxing for audiences to interpret. Researchers have thus advised designing science communications with laypeople's conceptual understandings in mind, while not reducing technical jargon to the point of hampering perceptions of expertise, since some specialized language heuristically denotes intellectual competencies (Toma and D'Angelo ). Comprehensibility may be further aided by the careful use of repetition, which can make clearly stated phrases more memorable and easier to recall following media receipt. These same recommendations pertain to Evolution Wars contexts and reaching religious publics. Pro‐evolutionists should endeavor to strike a balance between simplifying language to improve readability and cognitive fluency, and retaining the technical jargon that denotes expertise, all the while judiciously using repetition.

Together with simplicity and readability, science‐skeptical media also tend to be punctuated by anecdotal narratives. This is especially the case for anti‐vaccination messages, which are steeped in personal testimonies about immunization dangers (Heller ). Although such anti‐vaccine narratives are factually dubious, stories have proven to be influential communication tools that may trigger peripheral route processing, and often appear to be more persuasive than systematic, well‐reasoned arguments (Cunningham and Boom 2013). Narratives also seem to transcend education levels, catalyze salient emotional reactions, and garner empathy as people can personally identify with a storyline's characters, contexts, and communicated values (Cuesta, Martínez, and Cuesta 2017). At the same time, narrative framing, which involves positioning information within culturally sympathetic narratives, has been proposed as a means for reducing culturally cognitive reactance to scientific ideas (Kahan, Jenkins‐Smith, and Braman ). Accordingly, it has been suggested that vaccine stakeholders adopt storytelling‐based communication techniques, such as sharing narratives about people suffering from vaccine‐preventable diseases or personal anecdotes about choosing to get one's own family members vaccinated (McClure, Cataldi, and O'Leary ). What is more, it has been recommended that other persuasive cues, such as statistics, should be included within these pro‐immunization narratives (Okuhara et al. ). Bearing these observations in mind for Evolution Wars purposes, evolution advocates might also consider using narrative‐based communication strategies. Becoming better raconteurs, rather than just architects of evidence‐based treatises, is vital because religiously motivated Darwin skeptics have long been recounting their own anecdotes about evolution and the purported harm the scientific theory is having on individuals, religious communities, and society in general.

Wherever possible, researchers have also advised tailoring pro‐vaccine communications for specific audiences (Lustria et al. ). Health advocates are instructed to customize health messages in view of vaccine psychographics (the psychological and behavioral characteristics associated with people's attitudes and opinions, life experiences, interests, social values, and personality traits) which may influence vaccination decisions (John and Cheney ). The recommended goal of such tailoring is to make pro‐science messages congruent with a targeted group's worldviews, including shaping media to address particular socioreligious values. Needless to say, employing message tailoring could also be an advantage in the Evolution Wars, because in short this strategy involves applying the tried and true marketing adage: know your audience.

Apart from gaining a deeper understanding of one's audience, it has been postulated that health spokespeople should also avoid trying to convert ardent vaccine deniers. Fervent anti‐vaccinationists are the least amenable to argumentation, and generally represent only a very small fraction of a nation's total population. By contrast, attention should be given to undecided individuals, who represent much larger segments of the public. These "fence‐sitters" (Leask , 444) can express a range of vaccine hesitancies, yet they have not altogether taken a side on the matter. Focusing on this larger, more ambivalent subsection of the population is strategic in terms of audience size and conceivable outcomes. Furthermore, there are benefits to not challenging resolute anti‐vaccine campaigners directly; as Julie Leask (, 3) has noted, a "highly adversarial strategy could give oxygen to anti‐vaccination activists, who may believe that persecution legitimizes their efforts within a martyrdom frame." Additionally, targeting strident anti‐vaccinationists can draw unnecessary attention to counter‐immunization pundits, provoking "highly polarized discussions in social and traditional media, and perpetuating a false sense that vaccination is a highly contested topic" (Leask , 3). Of note regarding confrontational approaches to science advocacy, vaccination behavior change research has found that personal attacks, as well as judgmental and defensive language, are to be avoided. Ridiculing people's sources of counter‐vaccine information is also unhelpful, whereas being respectful rather than dismissive of patients' concerns, and dialogically affirming people's autonomy in making immunization decisions, can positively open up conversations (Gupta ; Betsch and Sachse , 3724). In the same vein, trying to shame or frighten vaccine‐hesitant patients can be counterproductive, and researchers have advised being cautious about using emotionally evocative messages to unnerve audiences (Greenberg, Dubé, and Driedger ).

Still another concern is that publicly targeting vaccine skeptics carries with it the communicative risk of repeating anti‐vaccinationist misinformation. This potential hazard arises from what has been described as the Familiarity Backfire Effect, a phenomenon in which challenging misinformation paradoxically increases people's belief in the falsehood itself (Lewandowsky et al. ). If science advocates contest erroneous but familiar information, and mention the myth during the refutation, over time people may forget the pro‐science facts while retaining the now repeated, and more familiar, misinformation due to automatic memory processes. Although evidence supporting the existence of these familiarity effects is mixed, it seems to be the case that the cognitive influences of misinformation are more intense when false information is repeated prior to offering a correction to the fiction (Swire, Ecker, and Lewandowsky ). With respect to this, it also appears that the impact of stating misinformation is lessened when people are explicitly warned, before being exposed to challenged fallacies, that they are about to hear incorrect claims (Ecker, Lewandowsky, and Tang ; Lewandowsky et al. ).

Relative to Evolution Wars contexts, it is also likely that publicly combatting prominent anti‐evolutionists will fail to cause loyal religiously motivated Darwin skeptics to modify their beliefs, while gratuitously heightening publicity for creationism. Science promoters should instead tailor science advocacy for the fence‐sitters, who may express more tentative objections and represent a much larger audience share. It may also be helpful to reduce any unnecessary mentions of religiously motivated anti‐evolutionist claims, and to avoid inadvertently reinforcing Darwin‐skeptic misinformation through its repetition. Moreover, before misinformation is raised, explicit warnings should be given so that audiences know the anti‐evolutionist allegations being referred to are erroneous. Pro‐evolutionists ought also to communicate respect for those doubting evolution, their hesitancies, and their decision‐making autonomy, while avoiding personal attacks and adversarial approaches.

A CALL TO PERSUASIVE ARMS: FRAMING A BETTER RESPONSE

In pursuing better methods for endorsing evolutionary theory to religious audiences, there is much to be gained from extant science intervention research. When the findings of studies endeavoring to address vaccine hesitancies are conceptualized for Evolution Wars contexts, and integrated with my own examinations of Darwin‐skeptic media, they can be codified for prescriptive purposes. To this end, it is helpful to systematize the resulting science promotion strategies into three broad categories, described here as General Guiding Principles, Proximate Interventions, and Auxiliary Interventions. The first category includes the following nine broad recommendations through which to build better pro‐evolutionist campaigns:

  • Discard the information deficit model.

  • Account for cultural cognition.

  • Enlist local religious leaders.

  • Improve message readability.

  • Tailor communications.

  • Target fence‐sitters.

  • Affirm people's autonomy of choice.

  • Respect the audience.

  • Avoid the familiarity backfire effect.

With regard to getting religious leaders on side, such key individuals doubtlessly possess deep insights concerning the sociopolitical factors shaping cultural cognition. Also, with their insider positioning, clergy can most readily put into action the following Proximate Interventions:

  • Behavior modification and choice construction.

  • Improve access.

  • One‐on‐one mediation.

These more direct techniques reflect the types of in‐person and behaviorally targeted policy strategies that have been trialed in counter‐immunization contexts, including nudging, face‐to‐face consultations, and increasing accessibility.

Along with such direct strategies, there remains a need to enhance the ways in which evolution is being communicated, not only in one‐to‐one settings but also in persuasive media outreach. Although mass communications are almost certainly less operative than proximate science advocacy methods, the practical limitations of such mediation techniques underscore the necessity of also using media communications strategies in the Evolution Wars. This is especially the case as researchers have suggested that there are important media effects at play when it comes to the propagation of science skepticism. Needless to say, the Evolution Wars are already marked by Darwin‐skeptic boosterism, featuring widespread messages riddled with rhetorical tactics and persuasive heuristics that call out for convincing pro‐evolutionist responses. Recommendations for such counter‐persuasion ripostes are delineated in Table 2, and they coincide with many of the persuasive cues described throughout my previous analyses of anti‐evolutionist media (Aechtner 2010, 2014, 2016). Importantly, these same techniques can be used to optimize both one‐on‐one interventions and pro‐evolution mass communications efforts.

Table 2. Auxiliary interventions
Source Cues
  • Make direct and indirect references to scientific and/or religious expertise.

  • Employ celebrity endorsements and famous spokespeople.

Social Consensus
  • Indicate that a significant majority of a population supports evolutionary theory.

  • Refer to the acceptance of evolution as a social norm.

  • Communicate matching statements from several different sources within the same message to reflect a social consensus.

Presumptive Announcements/Intention Questions
  • Make presumptive statements about the acceptance of evolutionary theory, referencing a presumed majority of religious believers or respected religious adherents who also accept the science.

  • Ask intention questions regarding whether people would consider accepting evolution in future, or contemplate their likelihood of engaging with pro‐evolutionist materials and training.

Message Repetition/Reminders
  • Provide reminder announcements about evolution–religion instruction sessions.

  • Carefully utilize message repetition about key pro‐evolution points.

Statistics and Technical Jargon
  • While making messages easier to read, incorporate statistics and technical jargon as markers of expertise.

The Contrast Principle and Negativity Effect
  • Rather than only communicating facts, contrast the data supporting evolutionary theory with anti‐evolutionist claims.

  • Appeal to the importance of fairness, and the equitable comparison of Darwin‐skeptic notions with evolutionary science.

  • Cautiously employ the negativity effect, recognizing that ideas expressed as being against something tend to be more resilient than those stated for a position.

The Scarcity Principle
  • Identify when the freedom of information is being restricted, or when evolution‐supportive facts are being censored.

Employ Narratives
  • Relate stories about historical and modern consonance between religion and evolutionary theory.

  • Provide one's own account of coming to terms with evolutionary science.

Highlight Shared Values
  • Emphasize shared religious and moral values.

  • Underscore mutual sociopolitical values.

In many respects, the list of Auxiliary Interventions in Table 2 reflects several of the persuasive devices that are already being used, to a greater degree, by a myriad of science‐skeptic pundits around the globe. Since counter‐science commentators are currently interfacing with vast global audiences via such potentially instrumental persuasive devices, it seems incumbent upon science communicators to refine the use of the same sorts of strategies in the interest of improving religious publics' reception of evolutionary biology. It is on this note that we can return to the question regarding what my own research tells us about how to better advocate for consensus science in view of religiously motivated anti‐evolutionist influences. When my observations are added to the results of previous science advocacy studies, I contend that we should be left not only with the guidelines and intervention advice detailed here, but also with a call to persuasive arms. There is an added impetus to put better pro‐evolution communication activities into effect. Evolution advocates should take into account how best to blend pro‐evolutionist designs with the inventory of general guiding principles for science interventions, in multipronged approaches utilizing both proximate and auxiliary techniques. Evolution campaigners need to avoid becoming sclerotic in how they undertake science promotion. Instead, science supporters ought to appropriate the guileless yet tactical use of persuasion, with improved intervention practices to reach religious audiences, for the sake of combatting counterfactual, science‐skeptical influences. When it comes to how people make decisions about publicly contested science, it may be the case that the way science communications are delivered is as important as their factual bases.

ACKNOWLEDGMENT

The author gratefully acknowledges support from the Westpac Scholars Trust. He would also like to thank Rev. Andrew Demoline and Dr. Carrie Kollias for providing valuable feedback on the first draft of this article.

References

Aechtner, Thomas. 2010. “Online in the Evolution Wars: An Analysis of Young Earth Creationism Cyber‐Propaganda.” Australian Religious Studies Review 23 (3): 277–300.

Aechtner, Thomas. 2014. “Darwin‐Skeptic Mass Media: Examining Persuasion in the Evolution Wars.” Journal of Media and Religion 13 (4): 187–207.

Aechtner, Thomas. 2016. “Challenging the Darwin‐Skeptics: Examining Proevolutionist Media Persuasion.” Journal of Media and Religion 15 (2): 78–99.

Aechtner, Thomas, and Malcolm Buchanan. 2018. “Science and Religion Perspectives at St. John's University of Tanzania (SJUT).” Journal of Contemporary Religion 33 (2): 337–45.

Amin, Avnika B., Robert A. Bednarczyk, Cara E. Ray, Kala J. Melchiori, Jesse Graham, Jeffrey R. Huntsinger, and Saad B. Omer. 2017. “Association of Moral Values with Vaccine Hesitancy.” Nature Human Behaviour 1 (12): 873–80.

Anastasio, Phyllis A., Karen C. Rose, and Judith Chapman. 1999. “Can the Media Create Public Opinion? A Social‐Identity Approach.” Current Directions in Psychological Science 8 (5): 152–55.

Arthur, Donald C. 2016. “Negative Portrayal of Vaccines by Commercial Websites: Tortious Misrepresentation.” UMass Law Review 11 (2): 122.

Barry, John M. 2009. “Pandemics: Avoiding the Mistakes of 1918.” Nature 459 (7245): 324–25.

Beard, Frank H., Brynley P. Hull, Julie Leask, Aditi Dey, and Peter B. McIntyre. 2016. “Trends and Patterns in Vaccination Objection, Australia, 2002–2013.” Medical Journal of Australia 204 (7): 275–81.

Betsch, Cornelia, and Katharina Sachse. 2012. “Dr. Jekyll or Mr. Hyde? (How) the Internet Influences Vaccination Decisions: Recent Evidence and Tentative Guidelines for Online Vaccine Communication.” Vaccine 30 (25): 3723–26.

Brewer, Noel T., Gretchen B. Chapman, Alexander J. Rothman, Julie Leask, and Allison Kempe. 2017. “Increasing Vaccination: Putting Psychological Science into Action.” Psychological Science in the Public Interest 18 (3): 149–207.

Brewer, Noel T., Megan E. Hall, Teri L. Malo, Melissa B. Gilkey, Beth Quinn, and Christine Lathren. 2017. “Announcements Versus Conversations to Improve HPV Vaccination Coverage: A Randomized Trial.” Pediatrics 139 (1): 1–9.

Browne, Matthew. 2018. “Epistemic Divides and Ontological Confusions: The Psychology of Vaccine Scepticism.” Human Vaccines and Immunotherapeutics 14 (10): 2540–42.

Brunson, Emily K. 2013. “How Parents Make Decisions about Their Children's Vaccinations.” Vaccine 31 (46): 5466–70.

Buttenheim, Alison M., and David A. Asch. 2013. “Making Vaccine Refusal Less of a Free Ride.” Human Vaccines and Immunotherapeutics 9 (12): 2674–75.

Chen, Nien‐Tsu Nancy. 2015. “Predicting Vaccination Intention and Benefit and Risk Perceptions: The Incorporation of Affect, Trust, and Television Influence in a Dual‐Mode Model.” Risk Analysis 35 (7): 1268–80.

Cook, John, Daniel Bedford, and Scott Mandia. 2014. “Raising Climate Literacy through Addressing Misinformation: Case Studies in Agnotology‐Based Learning.” Journal of Geoscience Education 62 (3): 296–306.

Cuesta, Ubaldo, Luz Martínez, and Victoria Cuesta. 2017. “Effectiveness of Narrative Persuasion on Facebook: Change of Attitude and Intention towards HPV.” European Journal of Social Science Education and Research 4 (6): 100–09.

Cunningham, Rachel M., and Julie A. Boom. 2013. “Telling Stories of Vaccine‐Preventable Diseases: Why It Works.” South Dakota Medicine (Special Issue): 21–26.

Davies, P., S. Chapman, and J. Leask. 2002. “Antivaccination Activists on the World Wide Web.” Archives of Disease in Childhood 87 (1): 22–25.

Dempsey, Amanda F., and Gregory D. Zimet. 2015. “Interventions to Improve Adolescent Vaccination: What May Work and What Still Needs to Be Tested.” American Journal of Preventive Medicine 49 (6): 445–54.

Dubé, Eve, Dominique Gagnon, and Noni E. MacDonald. 2015. “Strategies Intended to Address Vaccine Hesitancy: Review of Published Reviews.” Vaccine 33 (34): 4191–203.

Ecker, Ullrich, Stephan Lewandowsky, and David Tang. 2010. “Explicit Warnings Reduce but Do Not Eliminate the Continued Influence of Misinformation.” Memory and Cognition 38 (8): 1087–100.

Edwards, Kathryn M., and Jesse M. Hackell. 2016. “Countering Vaccine Hesitancy.” Pediatrics 138 (3): 1–16.

Greenberg, Joshua, Eve Dubé, and Michelle Driedger. 2017. “Vaccine Hesitancy: In Search of the Risk Communication Comfort Zone.” PLoS Currents 9. Available at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5346025/

Gupta, Vidya Bhushan. 2010. “Communicating with Parents of Children with Autism about Vaccines and Complementary and Alternative Approaches.” Journal of Developmental and Behavioral Pediatrics 31 (4): 343–45.

Hart, P. Sol, and Erik C. Nisbet. 2012. “Boomerang Effects in Science Communication: How Motivated Reasoning and Identity Cues Amplify Opinion Polarization about Climate Mitigation Policies.” Communication Research 39 (6): 701–23.

Heller, Jacob. 2016. “Trust in Institutions, Science and Self—The Case of Vaccines.” Narrative Inquiry in Bioethics 6 (3): 199–203.

Hershey, John C., David A. Asch, Thi Thumasathit, Jacqueline Meszaros, and Victor V. Waters. 1994. “The Roles of Altruism, Free Riding, and Bandwagoning in Vaccination Decisions.” Organizational Behavior and Human Decision Processes 59 (2): 177–87.

John, Robert, and Marshall K. Cheney. 2008. “Resistance to Influenza Vaccination: Psychographics, Audience Segments, and Potential Promotions to Increase Vaccination.” Social Marketing Quarterly 14 (2): 67–90.

Kahan, Dan. 2010. “Fixing the Communications Failure.” Nature 463 (7279): 296–97.

Kahan, Dan. 2012. “Cultural Cognition as a Conception of the Cultural Theory of Risk.” In Handbook of Risk Theory: Epistemology, Decision Theory, Ethics, and Social Implications of Risk, edited by Sabine Roeser, Rafaela Hillerbrand, Per Sandin, and Martin Peterson, 725–59. Dordrecht, The Netherlands: Springer.

Kahan, Dan, Donald Braman, Geoffrey Cohen, John Gastil, and Paul Slovic. 2010. “Who Fears the HPV Vaccine, Who Doesn't, and Why: An Experimental Study of the Mechanisms of Cultural Cognition.” Law and Human Behavior 34 (6): 501–16.

Kahan, Dan, Hank Jenkins‐Smith, and Donald Braman. 2011. “Cultural Cognition of Scientific Consensus.” Journal of Risk Research 14 (2): 147–74.

Kahan, Dan, Hank Jenkins‐Smith, Tor Tarantola, Carol L. Silva, and Donald Braman. 2015. “Geoengineering and Climate Change Polarization.” The Annals of the American Academy of Political and Social Science 658 (1): 192–222.

Kahan, Dan, Ellen Peters, Erica Dawson, and Paul Slovic. 2017. “Motivated Numeracy and Enlightened Self‐Government.” Behavioural Public Policy 1 (1): 54–86.

Kata, Anna. 2010. “A Postmodern Pandora's Box: Anti‐Vaccination Misinformation on the Internet.” Vaccine 28 (7): 1709–16.

Kempe, Allison, Matthew F. Daley, Mary M. McCauley, Lori A. Crane, Christina A. Suh, Allison M. Kennedy, Michelle M. Basket et al. 2011. “Prevalence of Parental Concerns about Childhood Vaccines: The Experience of Primary Care Physicians.” American Journal of Preventive Medicine 40 (5): 548–55.

Leask, Julie. 2011. “Target the Fence‐Sitters.” Nature 473 (7348): 443–45.

Leask, Julie. 2015. “Should We Do Battle with Antivaccination Activists?” Public Health Research and Practice 25 (2): e2521515.

Leask, Julie‐Anne, and Simon Chapman. 1998. “‘An Attempt to Swindle Nature’: Press Anti‐Immunisation Reportage 1993–1997.” Australian and New Zealand Journal of Public Health 22 (1): 17–26.

Leask, Julie, Harold W. Willaby, and Jessica Kaufman. 2014. “The Big Picture in Addressing Vaccine Hesitancy.” Human Vaccines and Immunotherapeutics 10 (9): 2600–02.

Lee, Alfred M., and Elizabeth B. Lee. 1939. The Fine Art of Propaganda: A Study of Father Coughlin's Speeches. New York, NY: Harcourt Brace.

Lewandowsky, Stephan, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook. 2012. “Misinformation and Its Correction.” Psychological Science in the Public Interest 13 (3): 106–31.

Lewandowsky, Stephan, and Klaus Oberauer. 2016. “Motivated Rejection of Science.” Current Directions in Psychological Science 25 (4): 217–22.

Lustria, Mia Liza A., Seth M. Noar, Juliann Cortese, Stephanie K. Van Stee, Robert L. Glueckauf, and Junga Lee. 2013. “A Meta‐Analysis of Web‐Delivered Tailored Health Behavior Change Interventions.” Journal of Health Communication 18 (9): 1039–69.

Ma, Jinxuan, and Lynne Stahl. 2017. “A Multimodal Critical Discourse Analysis of Anti‐Vaccination Information on Facebook.” Library and Information Science Research 39 (4): 303–10.

Masaryk, Radomír, and Mária Hatoková. 2017. “Qualitative Inquiry into Reasons Why Vaccination Messages Fail.” Journal of Health Psychology 22 (14): 1880–88.

McClure, Catherine C., Jessica R. Cataldi, and Sean T. O'Leary. 2017. “Vaccine Hesitancy: Where We Are and Where We Are Going.” Clinical Therapeutics 39 (8): 1550–62.

Metzger, Miriam J., and Andrew J. Flanagin. 2013. “Credibility and Trust of Information in Online Environments: The Use of Cognitive Heuristics.” Journal of Pragmatics 59: 210–20.

Milkman, Katherine L., John Beshears, James Choi, David Laibson, and Brigitte C. Madrian. 2011. “Using Implementation Intentions Prompts to Enhance Influenza Vaccination Rates.” Proceedings of the National Academy of Sciences 108 (26): 10415–20.

Morgan, Melanie B., William Collins, Glenn G. Sparks, and Jessica R. Welch. 2018. “Identifying Relevant Anti‐Science Perceptions to Improve Science‐Based Communication: The Negative Perceptions of Science Scale.” Social Sciences 7 (4): 64–82.

Myers, Kristen L. 2016. “Predictors of Maternal Vaccination in the United States: An Integrative Review of the Literature.” Vaccine 34 (34): 3942–49.

Narayan, B., and M. Preljevic. 2017. “An Information Behaviour Approach to Conspiracy Theories: Listening In on Voices from within the Vaccination Debate.” Information Research 22 (1). Available at http://informationr.net/ir/22-1/colis/colis1616.html

Ninkov, Anton, and Liwen Vaughan. 2017. “A Webometric Analysis of the Online Vaccination Debate.” Journal of the Association for Information Science and Technology 68 (5): 1285–94.

Nyhan, Brendan, and Jason Reifler. 2015. “Does Correcting Myths about the Flu Vaccine Work? An Experimental Evaluation of the Effects of Corrective Information.” Vaccine 33 (3): 459–64.

Okuhara, Tsuyoshi, Hirono Ishikawa, Masahumi Okada, Mio Kato, and Takahiro Kiuchi. 2018a. “Contents of Japanese Pro‐ and Anti‐HPV Vaccination Websites: A Text Mining Analysis.” Patient Education and Counseling 101 (3): 406–13.

Okuhara, Tsuyoshi, Hirono Ishikawa, Masahumi Okada, Mio Kato, and Takahiro Kiuchi. 2018b. “Persuasiveness of Statistics and Patients' and Mothers' Narratives in Human Papillomavirus Vaccine Recommendation Messages: A Randomized Controlled Study in Japan.” Frontiers in Public Health 6: 1–9.

Petty, Richard E., and John T. Cacioppo. 1984. “The Effects of Involvement on Responses to Argument Quantity and Quality: Central and Peripheral Routes to Persuasion.” Journal of Personality and Social Psychology 46: 69–81.

Pich, Jacqueline. 2018. “Patient Reminder and Recall Interventions to Improve Immunization Rates: A Cochrane Review Summary.” International Journal of Nursing Studies 91: 144–45.

Priest, Susanna. 2016. “Critical Science Literacy: Making Sense of Science.” In Communicating Climate Change: The Path Forward, edited by Susanna Priest, 115–35. London, UK: Palgrave Macmillan.

Rodriguez, Nathan J. 2016. “Vaccine‐Hesitant Justifications: ‘Too Many, Too Soon’, Narrative Persuasion, and the Conflation of Expertise.” Global Qualitative Nursing Research 3. https://doi.org/10.1177/2333393616663304

Rosselli, R., M. Martini, and N. L. Bragazzi. 2016. “The Old and the New: Vaccine Hesitancy in the Era of the Web 2.0. Challenges and Opportunities.” Journal of Preventive Medicine and Hygiene 57 (1): 47–50.

Seethaler, Sherry L. 2016. “Shades of Grey in Vaccination Decision Making: Tradeoffs, Heuristics, and Implications.” Science Communication 38 (2): 261–71.

Seyranian, Viviane. 2017. “Public Interest Communications: A Social Psychological Perspective.” Journal of Public Interest Communications 1 (1): 57–77.

Shelby, Ashley, and Karen Ernst. 2013. “Story and Science: How Providers and Parents Can Utilize Storytelling to Combat Anti‐Vaccine Misinformation.” Human Vaccines and Immunotherapeutics 9 (8): 1795–801.

Sundar, S. Shyam. 2008. “The MAIN Model: A Heuristic Approach to Understanding Technology Effects on Credibility.” In Digital Media, Youth, and Credibility, edited by Miriam J. Metzger and Andrew J. Flanagin, 73–100. Cambridge, MA: MIT Press.

Swire, Briony, Ullrich K. H. Ecker, and Stephan Lewandowsky. 2017. “The Role of Familiarity in Correcting Inaccurate Information.” Journal of Experimental Psychology: Learning, Memory, and Cognition 43 (12): 1948–61.

Toma, Catalina L., and Jonathan D. D'Angelo. 2015. “Tell‐Tale Words: Linguistic Cues Used to Infer the Expertise of Online Medical Advice.” Journal of Language and Social Psychology 34 (1): 25–45.

Ward, Paul. 2018. “To Trust or Not to Trust (in Doctors)? That Is the Question.” Archives of Disease in Childhood 103 (8): 718–19.

Wolfe, Robert M. 2002. “Vaccine Safety Activists on the Internet.” Expert Review of Vaccines 1 (3): 249–52.

Zimmerman, Richard, Robert Wolfe, Dwight Fox, Jake Fox, Mary Nowalk, Judith Troy, and Lisa Sharp. 2005. “Vaccine Criticism on the World Wide Web.” Journal of Medical Internet Research 7 (2): e17.