Introduction

Risk perception is inevitably biased. For instance, the relationship between benefits and risks tends to be inverted in the human mind, even though the two are usually positively correlated in real-world environments. People tend to believe that beneficial activities and objects involve only low risks, and vice versa (Sunstein, 2002). Typical examples of this tendency are found in controversial topics such as GMOs, stem cell research, tuition payments, handgun possession, euthanasia, abortion, vaccination, or helping refugees. At the same time, such domains often represent major public health and policy issues, since refusing child vaccination or engaging in xenophobic reactions are potentially dangerous tendencies. Analyzing how people assess risks and benefits in such public-interest domains therefore has important practical implications. Indeed, behavioral (e.g., Brewer et al., 2007; Renner & Reuter, 2012) and neuroscientific (e.g., Chua et al., 2011; Falk, Berkman, Mann, Harrison, & Lieberman, 2010) evidence shows that how worried or threatened people feel predicts real-life behavior, with severe consequences for both individuals and society.

People often evaluate risks and benefits in complex domains even though they lack the necessary competence, experience, information, and time. Instead, they engage in heuristic reasoning, using mental shortcuts based on simple rules such as the similarity of cases or the ease with which examples can be recalled from memory (Tversky & Kahneman, 1974). Employing these principles, they are able to make accurate judgments and decisions, but only in appropriate environments. However, heuristic reasoning can also lead to illusions and errors, especially when it crowds out more systematic thinking. Two of the many well-documented heuristic mechanisms of human reasoning are the confirmation bias and the credibility heuristic.

Confirmation bias is the tendency to favor, seek, interpret, and recall information in a way that corresponds to the reasoner's prior expectations, beliefs, or hypotheses (Nickerson, 1998). It poses a substantial cognitive difficulty, since open-minded critical thinking requires decoupling one's existing views and attitudes from the systematic evaluation of arguments and evidence (Stanovich, West, & Toplak, 2013). Distorted information search, interpretation, and recall due to confirmation bias have been suggested to explain several robust phenomena of human cognition, such as illusory correlation and belief persistence. Highly relevant consequences of this cognitive deviation range from biased evidence evaluation by jurors to conservatism among scientists (Nickerson, 1998). Interestingly, susceptibility to confirmation bias is unrelated to intelligence (Stanovich et al., 2013).

The concept of credibility has been defined in numerous ways: the trustworthiness and attractiveness of a source of influence, its prestige, or its history of previous accuracy (Nesler, Aguinis, Quigley, & Tedeschi, 1993). Expertise is an important factor closely related to perceptions of trustworthiness, accuracy, and validity, and thus to judgments of credibility (Hilligoss & Rieh, 2008). People rely on the credibility of a source especially when they lack prior attitudes toward the issue and knowledge about the phenomenon (Kumkale, Albarracin, & Seignourel, 2010).
Thus, a rule of thumb suggests: the more credible the source of an argument, the stronger the argument. According to recent studies, the use of particular heuristic principles in processing information on risks and benefits is also closely related to people's cultural worldviews.

The Cultural Cognition Thesis of Risk Perception

According to the cultural theory of risk (Douglas & Wildavsky, 1983), individuals form beliefs about risks and benefits in a way that reflects and reinforces their commitments to an idealized form of social order. According to the theory's "group" and "grid" typology, people can be placed on a two-dimensional scale of individualism/communitarianism and hierarchism/egalitarianism, as depicted in Figure 1. …