Article

How Personal Values Count in Misleading News Sharing with Moral Content

by Francesca D’Errico 1,*, Giuseppe Corbelli 2, Concetta Papapicco 1 and Marinella Paciello 2

1 Department of Education, Psychology and Communication, University of Bari ‘Aldo Moro’, 70121 Bari, Italy
2 Faculty of Psychology, Uninettuno University, 00186 Rome, Italy
* Author to whom correspondence should be addressed.
Submission received: 4 July 2022 / Revised: 16 August 2022 / Accepted: 18 August 2022 / Published: 24 August 2022
(This article belongs to the Special Issue The Psychology of Fake News)

Abstract: The present study investigates the personal factors underlying the online sharing of misleading news with moral content by observing the interaction between personal values, communication bias, credibility evaluations, and moral emotions. Specifically, we hypothesized that self-transcendence and conservation values may differently influence the sharing of misleading news depending on which moral domain is activated, and that such news is more likely to be shared when moral emotions and perceived credibility increase. In a sample of 132 participants (65% female), we tested SEMs on misleading news regarding violations in five different moral domains. The results suggest that self-transcendence values hinder the online sharing of misleading news, while conservation values promote it; moreover, news written with a less blatantly biased linguistic frame is consistently rated as more credible. Lastly, more credible and emotionally activating news is more likely to be shared online.

1. Introduction

With the advent of social media, misinformation has spread rapidly, representing a concrete threat to democratic processes in current digital societies. Indeed, misinformation can “emphasize divisions and erode the principles of shared trust that should unite societies” [1] (p. 81). Misinformation can be defined as false, misleading, or inaccurate content shared to produce a particular judgment in message recipients, irrespective of the veracity (or bias) of what is shared [2]. It is differentiated from “disinformation”, which is deliberately deceptive [3]. Although false, misleading, and distorted news is well recognized as one of the major concerns in contemporary societies [4], it remains unclear how different personal factors co-occur and interact to foster this deleterious phenomenon. In the present study, we focus on misinformation concerning particular topics, namely moral violations in different domains [5,6]. Specifically, we adopt an interactionist approach to investigate the interplay between individual psychological dimensions (i.e., personal values, affective and cognitive processes) and online communicative factors (i.e., moral contents and source biases) that leads to the online sharing of what we have termed “moral misleading news” (Figure 1). We believe that integrating the communicative and individual perspectives can help us better understand why, how, and by whom different kinds of “moral misleading news” may be shared.
Specifically, with regard to individual dimensions, we focus on “distal” pre-existing values [7] and on the “proximal” cognitive and affective processes [8,9] activated by moral misleading news. The literature has already suggested that the spread of false news may be due to goal-oriented motivations, such as partisan orientation [10]. However, most studies have focused mainly on political orientations as possible motivational determinants of: (a) distortion [11], (b) the spreading of disinformation [12], and (c) increasingly polarized and uncivil online discussions [13]. In line with this literature, we aim to understand the role played by the specific goals that orient individual cognitive processing over and above political orientations, namely personal values [14]. Personal values are strongly linked to the cognitive and emotional processes activated in a specific situation. Thus, distal values can consistently affect the proximal individual responses that, in turn, influence misleading news sharing. In particular, following the theoretical considerations of Jost and colleagues [15] regarding shared reality and conformity, moral values can be considered good predictors of misleading news sharing. As suggested by the literature on misinformation [16], we consider the cognitive and affective processes related to credibility and moral emotions to be particularly relevant as possible mediators between moral values and online sharing behavior.
With regard to online communicative factors, in the present study we consider the formal aspect of misleading news and its content separately. To consider the effects of formal information manipulation, we adopt a distinction between specific content framed with a high versus a low level of bias, as generally found in misleading news [17,18]. Conversely, to evaluate the effects of content on online sharing, we examine moral communications by relying on moral foundation theory [5,6] for two main reasons. First, in line with the literature, we observe that personal values and moral domains are strongly correlated [19]; thus, it is plausible that some content is more activating for some individuals than for others. Second, this more comprehensive theoretical framework allows us to test the activating function of sensitive news concerning moral and ethical issues. Moral topics might then trigger emotional responses and confirmatory reasoning in a consistent manner; again, we hypothesized that affective and cognitive processes act as mediators in the relationship between content and the online sharing of moral misleading news.
To verify this, based on this interactionist approach, we carried out a quasi-experimental study. The aim is to understand how two different communicative frames (low vs. high level of bias), jointly with personal values, affect the activation of moral beliefs and emotions in each of the five moral domains (Care, Fairness, Loyalty, Authority, and Purity); these beliefs and emotions, in turn, mediate the influence of both individual and communicative factors on the sharing of online misinformation. Hence, the research questions from which the research proceeded are:
  • RQ1: What is the effect of news framing on sharing behavior?
  • RQ2: What is the role of credibility and moral emotions in sharing moral news?
  • RQ3: How do personal values affect news sharing across the moral domains?

1.1. The Individual Dimensions of Misinformation

Psychological approaches to understanding misinformation processes have mainly shown how people fall victim to fallacious and misleading news when they rely on “lazy” cognitive processes [20] or on motivational factors, such as the desire for entertainment or the fear of missing out [21,22]. In particular, according to the motivated reasoning account [23], people use their reasoning skills to protect and support their identity, ideologies, and beliefs rather than to obtain authentic information [24]. Therefore, intentions and behaviors are strongly shaped by attitudes toward certain ideas or ideologies, prompting people to over-believe content consistent with their position. In contrast, they become skeptical when processing content that is inconsistent with it, a kind of “identity-protective cognition” [24]. In this light, to our knowledge, even if largely theorized [15], few works have experimentally tested the motivational role played by personal values.
A comprehensive definition of values is “trans-situational goals that vary in importance and serve as guiding principles in the life of a person or a group” [25]. As motivational determinants, they can help us understand how individuals differ in their perception, interpretation, and behavioral decisions when facing the same situation or stimulus. Consequently, values can also contribute to the credibility and sharing of fake news, following the “motivated reasoning theory” [26], which indicates that we give more credibility to something we feel is close to our principles. Values can be the basis for individual evaluations: objects are evaluated positively if they are in line with personal values and negatively if they hinder or threaten their attainment [7]. Moreover, since values can affect the beliefs people hold about how the world functions [7,27], they can affect the degree of credibility assigned to events and objects that are more or less consistent with one’s world view.
One of the most prominent researchers of personal values is Shalom Schwartz [25], who proposed a taxonomy based on the fundamental social and biological needs of human beings. Personal values can be mapped onto a circumplex model defined by two dimensions. One dimension captures the conflict between values that emphasize independence of thought and action and proneness to change (i.e., openness to change) and values that emphasize the preservation of group and social order and resistance to change (i.e., conservation values). The other dimension captures the conflict between values that emphasize concern for the welfare and interests of others (i.e., self-transcendence) and values that emphasize the pursuit of one’s own interests, personal success, and dominance over others (i.e., self-enhancement). Clearly, many values are moral values, but not all of them have an intrinsic moral nature. Moral values are related to welfare and fairness concerns (e.g., benevolence, patriotism, and traditions) and to two basic universal human needs: (1) the need to protect one’s own group from internal and external threats and (2) the need to foster reciprocal exchange [28]. More specifically, according to Schwartz’s theory [25], “Conservation” values (i.e., conformity, tradition, and security) have the function of creating the internal cohesion necessary to face possible threats by “binding” the group together, whereas “Self-transcendence” values (i.e., benevolence and universalism) have the function of promoting prosocial bonds between individuals, even with others outside the group. Framed in terms of Haidt’s domains [29,30], “Conservation” values concern Ingroup, Authority, and Purity (e.g., national security, obedience, conformism), while “Self-transcendence” includes the main values related to the Harm and Fairness foundations (e.g., social justice and the right to protection).
Concerning public opinion, several studies have attested that values orient attitudes toward policy views and electoral choices [31,32,33]. For example, conservation values predict support for right-wing political orientations, and self-transcendence values predict support for left-wing ones [28].
Motivational factors can be considered strictly associated with the emotional and cognitive factors involved in fake and misleading news credulity and sharing [34]. In particular, the most influential position is that emotions (regardless of their specific type) predict increased confidence in misleading news, lowering the ability to discern between true and false news [35]. Therefore, reliance on emotions increases trust in hoaxes: the more individuals rely on emotional activation, the more they perceive false or misleading information as accurate. This can explain why false news is deliberately designed to activate individuals’ emotional responses, leading it to appear more credible [35]. In this perspective, failures in identifying credible news would derive from a combination of a lack of analytical thinking and strong emotionality.
In addition, Pennycook and colleagues [36] highlighted how affective processes can play a pejorative role in credibility assessment, with heightened emotionality being associated with increased belief in fake news. Moreover, Pennycook and colleagues [36] state that emotion, both positive and negative, is predictive of fake news credulity and that it reduces the ability to discern real from fake news. Specifically, the authors studied the role in credibility of “momentary” emotions [36] (p. 4), those emotions that are activated while reading the news. The results suggest that momentary negative emotions diminish people’s ability to discern between real and fake news.
The findings on how emotional aspects lead to greater credulity among social media users, however, are based mainly on politically connoted news, since the assumption driving them refers precisely to an activation close to in-group favoritism [37] or the desire for conformity [15]. However, to fully understand the contribution of personal values, it can be useful to examine the role played by two crucial factors: the individual’s value orientation and the moral domain of the news. Concerning personal values, among the moral ones postulated in Schwartz’s model [25], namely self-transcendence and conservation, conformity seems to be the most crucial for the credibility processing of information [15].
Moreover, since the well-known literature on misinformation [17,38] has focused mostly on misleading news content framed with polarized political belonging (e.g., a conservative vs. liberal frame), it seems necessary to broaden the focus by concentrating attention on more inclusive moral domains, which, according to the studies of Feldman and colleagues [19], are closely related to personal values. Their findings suggested, in fact, that self-enhancement values are associated with generally lower importance for all moral foundations; self-transcendence is mainly associated with harm–care and fairness–reciprocity moral concerns; conservation is associated with authority–respect; while openness to change is associated with high moral relativism, seeing morality as more flexible depending on the context.

1.2. Communicative Dimensions of Misinformation

When we consider the communicative side of the literature on misinformation, we mainly encounter two aspects of the issue: content and linguistic form. As to the first, it is well known that content related to the political field typically induces polarization and, in terms of credibility, a certain congruence with the readers’ political orientation [38]. In the field of moral psychology and moral communication, however, five moral domains have been defined [6,29] that organize moral concerns and can activate automatic, relatively effortless, and rapid moral judgments on questions concerning people’s evaluation of well-being, justice, and group, religious, or institutional belonging. Specifically, Haidt and his collaborators [5,29,39] identified five moral domains: harm/care, related to human sensitivity to others’ needs and distress, with the function of activating human urges to care for and protect vulnerable individuals; fairness/reciprocity, related to human attentiveness toward equity and reciprocity behaviors that promote collaborative and altruistic social interactions; ingroup/loyalty, related to the human tendency to build cohesive coalitions that compete for restricted resources in order to obtain collective benefits; authority/respect, related to the tendency to generate social hierarchies within which authority is respected and seen as legitimate for maintaining social order; and, lastly, purity/sanctity, related to social practices preventing contaminants that threaten the survival of the group. The five domains cluster under two higher-order groupings: (1) the “individualizing” foundations (care and fairness), which focus on the rights and welfare of the individual; and (2) the “binding” foundations (loyalty, authority, and sanctity), which focus on the cohesion of the group [29,40].
More recently, social media studies have highlighted that moral contents associated with emotions can become viral. In particular, Brady and colleagues [9] proposed the MAD model (motivation, attention, and design), which indicates that: (i) people are motivated to share moral contents on the basis of their social identity (motivation); (ii) moral contents are better able to capture people’s attention and involve them than other types of content (attention); and (iii) social media design and affordances (anonymity or social feedback) facilitate and amplify moral content (design). These three factors lead to a greater probability of sharing moral content online, enabling so-called “moral contagion” [41].
The MAD model thus points out that the final decision to share news with moral content depends on the interaction between psychological factors and environmental factors in social networks. The factors considered concern group-based motivations, how moralized stimuli engage our attention, and how the design of social media amplifies these elements.
In this vein, according to the MAD model, it is possible to assume that moral contents activate moral emotions because they are associated with group identity. In other words, in the presence of moral content, emotions defined as moral (e.g., outrage) tend to strengthen membership in one’s group. The structural basis of this social identity [37], which organizes membership in the group, is represented by personal values. Thus, the motivation that leads people to process and share online content can be strictly linked to values, especially when misleading news describes moral violations.
As regards the role played by the formal and stylistic aspects of misleading news, Rashkin, Choi, Jang, Volkova, and Choi [42] found that misleading news more often includes exaggerations (e.g., superlatives), biases, subjectivity, and hedged language than verified news. Frequent biases in misleading news are sensationalism, high emotionality [17], the discrediting of those deemed responsible [18], and the so-called “factual bias” [43], which consists of describing one side of an issue instead of giving a complex and articulated account of the news [44].
Recent studies have highlighted that news sensationalism leads people to trust and fall for misinformation more than a neutral style does, especially when they feel a state of uncertainty and when they use alternative media to inform themselves [45]; others have instead highlighted how the effect of biased news can decrease when people are induced to evaluate news critically [46].
Hence, a suspect source might decrease the perceived credibility of misleading information. This is also confirmed by research on the “nudge” effect [47], which indicated that asking social media users to assess “the accuracy of headlines” activates a kind of “accuracy mindset” [44] that makes people less credulous and more critical toward biased headlines. Hence, we focused on subjectivity and sensationalism and investigated whether these patterns affect individuals’ credibility evaluations of online misinformation articles.
Sensationalism and the perception of credibility, however, do not fully explain users’ sharing of misleading news. More recent studies on sharing behavior [20], in fact, show that people are apparently willing to share content that they could have identified as inaccurate. In one such study, participants who were asked about the accuracy of a series of headlines rated the real headlines as much more accurate than the fake ones; but, when asked whether they would share the headlines, veracity had little impact on sharing intentions in the context of political headlines [20]. As a result, sharing intentions for fake headlines were much higher than evaluations of their truth (for example, more than 91%). Relatedly, the sharing of low-quality news content on Facebook is associated with ideological extremity; in these cases, political motivation prevails [48], and ideological concordance is a much stronger predictor of sharing than it is of belief [20]. In this regard, research has shown that people who share fake news on social networks are, for example, those with a populist ideological orientation, a strongly conservative ideological orientation, and an age of over 65 [49].

2. Method

2.1. Participants

Volunteers who agreed to take part in the quasi-experiment were recruited through the researchers’ two referring faculties (Psychology and Communication Studies) and completed one of the two questionnaires required by the procedure, for a total of 132 participants; of these, two responses were excluded from the analysis because they were incomplete (98.5% completion rate).
The final sample consisted of 45 males, 83 females, and 2 non-binary/other (younger people, 17–30: 47.7%, M = 21.5, SD = 3.1; young adults, 31–50: 23.1%, M = 32.0, SD = 4.1; adults, 50+: 29.2%, M = 53.3, SD = 6.7). After an appropriate explanation of the methods, timing, and procedures of the study, all students personally provided consent if they were of age, whereas consent was given by parents if the participant was underage at the time of the observation.

2.2. Procedure

Participants were asked to read the proposed stimulus accurately, i.e., a screenshot of a news item found online, and immediately afterward to answer self-report questions regarding their cognitive assessments of its veracity and their emotional state in relation to the news item they had just read, as well as their intention to share it on social media. This procedure was repeated five times (once for each news story, based on the five different contents), while each subject was randomly assigned to one of two groups associated with a different questionnaire. The first group (N = 57) encountered five news screenshots written in a dry style with a low level of bias, while the second group (N = 73) received the same news story contents presented with an emotionally activating communicative framing (i.e., a high linguistic bias). In both cases, completion of the full battery took 30 to 50 min. The experimental design was mixed, with the five moral contents as a within-subjects independent variable and the type of bias (subtle or blatant) manipulated between subjects. The presentation of the specific stimuli and items and the data collection took place through the two questionnaires corresponding to these two framing conditions.
Before administering the measures and during the data collection, the final purpose of the study was outlined and participants were informed that their participation in the study was voluntary, reminding them that they were completely free to refuse to participate or withdraw from the study at any time without consequence. None of the participants refused to participate or withdrew from the study, which was conducted between February and June 2021. All the procedures followed the Helsinki ethical principles and ethical codes of AIP (Italian Psychology Association) and the study was approved by the ethical committee of the university of one of the co-authors. Informed consent was obtained for all participants.

2.3. Measures

2.3.1. News Stimuli

For this study, 10 artificial screenshots of Instagram posts from a hypothetical online news page were created ad hoc on the basis of the moral violation scenarios validated by Clifford et al. [50]. These screenshots were reproduced as faithfully as possible in order to replicate the exchange of information through screenshots, which is widespread among young people [20,51,52]. The news items were designed to activate the content domain of each of the five moral foundations outlined by Haidt et al. [29], namely: Care, Fairness, Loyalty, Authority, and Purity. Because of the large individual differences in the salience of these basic human motivational modules, we decided to include all five misleading news stories in the quasi-experiment. Each of the five moral domains was rendered through two different screenshots, which varied in the linguistic modality of presentation of the news. In the first condition, the communicative framing style is less biased and judgmental; in the second condition, it is highly biased, emotionally activating, and judgmental, similar to what happens in widespread online misinformation outlets. Moral misleading news biases were coded 1 (subtle bias) or 2 (blatant bias). The image presented in the Instagram post is the same in the two alternatives and differs only across the five moral domains, in such a way as to simulate a real screenshot of a comparable news item found online.
As an example, the screenshots related to the moral domain of Care in the first and second linguistic modalities are presented in Appendix A (Figure A1 and Figure A2).

2.3.2. Intention to Share

The subject’s intention to share the news item presented with the screenshot was investigated through a direct single-item, closed-ended question, “Would you share this news item on social media?” The five response options from which the respondent could choose ranged from “Absolutely not” to “Absolutely yes”. This question was prompted immediately after the presentation of each of the five screenshots related to the five moral scenarios.

2.3.3. Cognitive Evaluation of the Veracity of the News Item

This ad hoc scale included 12 items aimed at assessing the cognitive evaluation of the piece of news contained in the screenshot. For each item, participants rated their agreement or disagreement with the presented evaluation on a 5-point scale (“very little” = 1; “very much” = 5). Two sample items of the scale are: “In your opinion, how credible is the news you read?” and “In your opinion, how truthful is the news you read?” Among the items to which the subjects responded, only those referring to the credibility subscale were selected for the present study, i.e., the perceived credibility, truthfulness, and reliability ratings of the news. Cronbach’s alphas for the credibility-related items were 0.86, 0.93, 0.93, 0.94, and 0.95 for the five moral domains of Care, Fairness, Loyalty, Authority, and Purity, respectively, showing an excellent average degree of reliability.
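As a reference for how such reliability coefficients are obtained, the following minimal sketch computes Cronbach’s alpha from a respondents-by-items score matrix. The rating data shown are purely illustrative, not the study’s actual responses.

```python
def variance(xs):
    """Sample variance (ddof = 1) of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """Cronbach's alpha for a list of respondents, each a list of k item scores.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(rows[0])
    item_vars = [variance([row[i] for row in rows]) for i in range(k)]
    total_var = variance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Illustrative 5-point ratings of three credibility items by five respondents
ratings = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [1, 2, 1],
    [3, 3, 4],
]
print(round(cronbach_alpha(ratings), 2))  # -> 0.96
```

Items that covary strongly across respondents, as in this toy matrix, push alpha toward 1; values around 0.9 and above are conventionally read as excellent internal consistency.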

2.3.4. Evaluation of Moral Emotions

This scale measured the emotional response (“What emotions did you feel while reading the news?”) following the presentation of the news screenshot stimulus, investigating a wide range of both positive and negative, activating and deactivating emotions. “Anger” and “enthusiasm” are sample items, to which participants responded via a 5-point Likert scale (“very little” = 1, “very much” = 5).
Many of these items had zero variability, receiving a score of 1 from all respondents, thus indicating the complete absence of the corresponding emotional reaction elicited by the scenario presented. Almost all of the emotions most frequently reported by the subjects were indeed moral emotions, defined by Haidt et al. [29] (p. 110) as those “linked to the interests or welfare either of society as a whole or at least of persons other than the judge or agent”. Consistent with the moral nature of the scenarios proposed, the emotions selected for the present study were, therefore, those that have been shown to provide the motivational drive to do what is considered right and avoid doing what is considered wrong [53]: disappointment, disgust, pity, anger, shame, fear, compassion, discouragement, concern, anxiety, embarrassment, sadness, and contempt. Cronbach’s reliability coefficients for the moral emotion scale for the five moral domains were 0.91 (Care), 0.91 (Fairness), 0.92 (Loyalty), 0.95 (Authority), and 0.93 (Purity), indicating in each case an excellent degree of reliability.

2.3.5. Value Dimensions of Self-Transcendence and Conservation

To assess the value salience of the respondents, the Italian translation of the original 40-item Portrait Values Questionnaire was used [54]. Given their high relevance to the scope of this research, from the original 10 empirically derived values arranged along the two orthogonal axes of the circumplex model [55], we took into account the two higher-order dimensions of Conservation (incorporating Tradition, Security, and Conformity) and Self-Transcendence (which includes Benevolence and Universalism). Cronbach’s alphas for the two subsets of values pertaining to Conservation and Self-Transcendence were 0.78 and 0.86, providing an acceptable and a good degree of reliability, respectively.

3. Results

Table 1 summarizes the descriptive statistics for the relevant variables, while Table 2 shows their correlation matrix. Regardless of the moral domain considered, the credibility of a news item was positively correlated with its sharing intention (Care: r = 0.467, p < 0.01; Fairness: r = 0.330, p < 0.01; Loyalty: r = 0.200, p < 0.05; Authority: r = 0.354, p < 0.01; Purity: r = 0.490, p < 0.01), and emotional activation also showed a significant positive correlation with sharing intention across all domains (Care: r = 0.461, p < 0.01; Fairness: r = 0.428, p < 0.01; Loyalty: r = 0.562, p < 0.01; Authority: r = 0.363, p < 0.01; Purity: r = 0.471, p < 0.01). Regarding the impact of individual distal dimensions, the interactions turned out to be more complex across the domains considered: an increase in news bias was significantly correlated with a decrease in credibility and emotional activation in all domains, except for the moral foundation of Loyalty, where the correlation with moral emotions was in the opposite direction (r = −0.175, p < 0.05), and Authority, where no significant relationship was found between moral emotional activation and news bias. For each of the five moral domains, path analyses were performed to examine the relationships among the cognitive-affective processes triggered by fake news that lead to sharing behaviors. Furthermore, in each model, conservation and self-transcendence were defined as latent dimensions using their reference values, namely tradition, security, and conformity in the case of conservation, and benevolence and universalism in the case of self-transcendence. In addition, due to the heterogeneous distribution of the sample, gender and generation were included in the models as control variables.
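For readers who wish to replicate this analytic setup, a per-domain path model of this kind can be expressed in lavaan/semopy-style model syntax. The sketch below is a reconstruction from the verbal description above, with illustrative placeholder variable names (e.g., `bias`, `credibility`, `moral_emotions`, `sharing`); the authors’ exact specification and software are not stated here.

```python
# Hypothetical specification of one per-domain path model, reconstructed from
# the text: latent value dimensions measured by their reference values,
# credibility and moral emotions as proximal mediators, and sharing intention
# as the outcome with gender/generation controls. All names are placeholders.
MODEL_DESC = """
conservation =~ tradition + security + conformity
self_transcendence =~ benevolence + universalism

credibility ~ bias + conservation + self_transcendence
moral_emotions ~ bias + conservation + self_transcendence

sharing ~ credibility + moral_emotions + conservation + self_transcendence + gender + generation
"""
```

With an SEM library such as semopy, a description like this could in principle be fitted via `Model(MODEL_DESC).fit(df)` on a data frame of observed scores, after which fit indices of the kind reported below (CFI, RMSEA, SRMR) can be computed.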
All five models showed an adequate fit to the data and different relationships among the variables under study.
In detail, with regard to the moral domain of care (χ2 (29, N = 130) = 35.504, p = 0.18; CFI = 0.98, RMSEA = 0.042 (90% CI = 0.000–0.083), p = 0.58, SRMR = 0.046), the results suggest that misleading news presented in a drier communicative style promotes both credulity and moral emotional reactions, which, in turn, influence the likelihood of sharing it. In addition, self-transcendence values promote moral emotional reactions (Figure 2).
Concerning the fairness domain (χ2 (29, N = 130) = 53.50, p = 0.003; CFI = 0.93, RMSEA = 0.08 (90% CI = 0.045–0.114), p = 0.07, SRMR = 0.052), it is possible to appreciate the same cognitive-affective paths, but here values also directly influence sharing behavior. Indeed, conservation values promote sharing behavior, whereas self-transcendence values hinder it (Figure 3).
In the case of loyalty (χ2 (29, N = 130) = 33.034, p = 0.276; CFI = 0.99, RMSEA = 0.033 (90% CI = 0.000–0.077), p = 0.69, SRMR = 0.04), it is interesting to note that news pieces written with an emotionally activating communicative framing foster only affective processes (i.e., moral emotions) that, in turn, positively affect sharing behavior. Moreover, as in the fairness domain, conservation values promote sharing behavior, whereas self-transcendence values prevent it (Figure 4).
With regard to the authority domain (χ2 (29, N = 130) = 31.07, p = 0.314; CFI = 0.99, RMSEA = 0.029 (90% CI = 0.000–0.075), p = 0.72, SRMR = 0.04), misleading news presented in a dry style fosters only credulity, which, in turn, affects sharing behavior. Sharing behavior is also positively related to moral emotions and negatively related to self-transcendence values. In this domain, conservation is positively related to both affective and cognitive processes (Figure 5).
With regard to the purity domain (χ2 (29, N = 130) = 31.239, p = 0.354; CFI = 0.99, RMSEA = 0.024 (90% CI = 0.000–0.073), p = 0.76, SRMR = 0.04), news screenshots presented in a subtly biased style foster both credulity and moral emotional reactions that, in turn, affect the probability of sharing them. As in the moral domain of authority, conservation is positively related to both affective and cognitive processes (Figure 6).
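With 29 degrees of freedom, the χ2 p-values for the five models can be sanity-checked numerically. A minimal, stdlib-only Python sketch (illustrative; the estimation software actually used by the authors is not specified in this section) integrates the chi-square density directly:

```python
import math

def chi2_sf(x: float, df: int, steps: int = 20_000) -> float:
    """Upper-tail p-value of a chi-square statistic, via composite
    Simpson integration of the density (Python stdlib only)."""
    k2 = df / 2.0
    log_norm = k2 * math.log(2.0) + math.lgamma(k2)

    def pdf(t: float) -> float:
        if t <= 0.0:
            return 0.0
        return math.exp((k2 - 1.0) * math.log(t) - t / 2.0 - log_norm)

    h = x / steps                      # steps is even, as Simpson requires
    s = pdf(0.0) + pdf(x)
    for i in range(1, steps):
        s += pdf(i * h) * (4 if i % 2 else 2)
    cdf = s * h / 3.0
    return max(0.0, 1.0 - cdf)

# Model chi-squares reported above (df = 29, N = 130):
for domain, chi2 in [("care", 35.504), ("fairness", 53.50),
                     ("loyalty", 33.034), ("authority", 31.07),
                     ("purity", 31.239)]:
    print(f"{domain}: p = {chi2_sf(chi2, 29):.3f}")
```

The computed values come out close to the p-values reported in the text, with only the fairness model's χ2 reaching significance.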
With regard to covariates, in the care and loyalty domains, credulity is significantly and negatively associated with generation (younger participants were more prone than older ones to believe the fake news). Moreover, in the care domain, credulity is also positively associated with gender (females were more prone to believe), whereas, in the purity domain, gender is negatively associated with moral emotions (males reacted more emotionally than females).

4. Discussion

The results of the present study on the intention to share misleading moral news show that specific moral communications activate differentiated cognitive and emotional processes. In general, the more people believe in the veracity of a news item, the more they become emotionally activated and, consequently, share it. This is especially true for less biased stimuli: a low-bias stimulus elicits a positive cognitive assessment of truthfulness and a subsequent activation of moral emotions, showing how tightly emotions and beliefs are interconnected.
In other words, unlike some previous work [20] that observed how emotions can be activated mainly in relation to the falseness of news, in our case, emotional activation parallels the cognitive evaluation of news veracity across almost all five moral domains. News sharing is thus influenced by a cognitive-emotional process that intervenes between the presentation of misleading moral news and the subsequent sharing behavior. Our results therefore suggest that news is not shared passively, but only if it is considered both credible and emotionally engaging. This may also be because, after reading the news, participants were asked to evaluate its veracity; this procedural request can activate a kind of “accuracy mindset” [44] that generally makes people less credulous and more critical toward biased news, as also demonstrated by the so-called “nudge research” [20]. In addition, our participants were selected on the basis of their linguistic and communicative background, being university students of psychology and communication science.
In four domains, i.e., care, fairness, authority, and purity, misleading news presented with a less biased framing fosters both credulity and moral emotional reactions, which, in turn, affect the probability of sharing it online. Only in the loyalty domain does emotional activation bypass the reader's ability to correctly discriminate the veracity of the specific news. This result is in line with studies on the importance of political motivation in the credibility of misleading news [20,23], where the moral domain of loyalty tends to emotionally activate readers, especially when the stimulus is strongly biased, consistent with a highly polarized setting. Indeed, in line with research on political dualism, our results suggest that tribal logic wins out over evaluative processing, since loyalty is the only domain in which participants tend to share news framed with a high level of bias.
With respect to moral domains, the literature [19] also highlights how they relate to the individual dimension of personal values: self-transcendence is mainly associated with harm/care and fairness/reciprocity concerns, while conservation is associated with authority/respect. Overall, the present results suggest that conservation and self-transcendence values exert opposite influences on the sharing process. High levels of self-transcendence can be considered a protective factor against the sharing of misleading news, except in the care domain: self-transcendence prevents the sharing of misleading news concerning the violation of formalized norms (regulations), group conventions, and respect for authority. By contrast, in the case of evident aggression or damage to a victim, people with high levels of self-transcendence show activated moral emotions and a greater likelihood of sharing the misleading news. This could be related to the importance of safeguarding the wellbeing of the alleged perpetrators of such violations before guilt is clearly acknowledged: people high in self-transcendence are cautious in attacking or blaming, except when there is a victim of aggression, as represented in the domain of care.
A higher level of conservation values fosters the sharing of almost all moral violations, except those pertaining to the moral domain of care, where no significant relationships emerge. Specifically, in the case of stimuli concerning the violation of the formal and informal rules that regulate behavior within one's own society, people high in conservation tend to share them regardless of the cognitive evaluation of truthfulness.
In the case of news stimuli about damaging behavior towards “earthly” authorities (e.g., the police) and “sacred” authorities (e.g., religious figures), people high in conservation are more likely to consider the low-bias misleading news credible and to become emotionally activated. Possibly, this is because people high in conservation feel the need to ensure social order: when it comes to maintaining the “hierarchical” order, they are more emotionally sensitive and cognitively permeable, so the sharing intention follows directly [15].

Concerning the limitations of the study, as this is pilot research, future studies will benefit from a larger and more diverse sample, including greater variability in participants' age (e.g., teenagers/young people). Cultural factors should be taken into account in future work as well.
Concerning the relation between values and online sharing, in some cases (particularly fairness and loyalty) we found a direct relation, independent of cognitive and emotional processes; this may be due to lower interest in those scenarios, and future studies will also collect physiological measurements and reaction times while the news is read. Furthermore, the results highlight that people with good linguistic and communicative knowledge can better recognize the strong biases associated with misleading moral news, granting it less credibility and emotional activation. This limitation could be addressed by replicating the present study across different cultures and across differentiated levels of linguistic or communicative knowledge, in order to obtain wider evidence of the generalizability of the present findings. Skills related to internet browsing and social network use, as well as the frequency of use of digital communication tools, can potentially affect the perceived credibility of the news; in future research using a similar methodology, it would therefore be desirable to assess the digital skill levels of the sample. Moreover, the posited models should be replicated on larger and gender-balanced samples, also considering potential age-related differences.
Another limitation, inherent in the level of control required by the quasi-experimental design, is that we presented a static screenshot of the news, freezing information extracted from a specific social network (i.e., Instagram) at a single moment of interaction. The undeniable importance of the unique affordances of the various social networks, as well as the presence of content recommendation methods and different automatic advertising models [56], prompts us to overcome this limitation in future studies by considering different contexts based on multiple social networks in increasingly realistic and interactive ways.
With respect to possible social and practical implications, these results suggest the importance of promoting educational programs, especially for youths and young adults, aimed at reinforcing linguistic competence and at recognizing one's own personal values and the associated potential vulnerability to the content of misleading news. Indeed, in the case of moral political communications, misleading news about violations of law, in-group interests, and national authorities could be more easily shared by people with high levels of conservation values.
Conservation values, such as security, tradition, and conformity, would promote motivated reasoning that confirms the credibility of this kind of news; moreover, they would make misleading news related to these domains more emotionally activating. This result is in line with other findings attesting that conservatives are more likely to believe false news [57]. In this research field, we argue for the importance of taking into account not only political orientation but also the moral content of biased news [58,59]. The interaction between communication content and individual values can exacerbate negative reactions by hindering resistance to misinformation; people may further reinforce their own conservative moral principles by believing emotionally overwhelming misleading moral news.

Author Contributions

Conceptualization, F.D. and M.P.; methodology, F.D. and M.P.; software, G.C.; data curation, G.C. and C.P.; writing—original draft preparation, all the authors; visualization, G.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Compagnia San Paolo (“Challenge for Europe”) under the Grant European project “STERHEOTYPES—Studying European Racial Hoaxes and sterEOTYPES” (CUP: B99C20000640007).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of the University of Bari (Ethics reference code: ET-22-01, approved 28 January 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data that support the findings of this study are not openly available due to privacy restrictions on human participant data; they are available from the corresponding author upon request via a controlled-access repository.

Acknowledgments

Special thanks to Laura Faragó for her valuable suggestions.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A. Stimuli Images

Figure A1. Example of stimulus for the domain of Care: blatantly biased condition.
English translation: “It never rains but it pours: disabled man attacked on the wild streets of the capital. It happened last night on the streets of Rome to a 16-year-old boy, sitting on the steps of the Church of Santa Maria Maggiore. After getting up to reach the subway, the boy walked slowly towards the station having one of his two legs amputated, holding his crutches. His uncertain walk attracted the attention of two young men evidently intoxicated, who started to mock him in a blatant way triggering the alarmed reaction of the boy who started to cry for help to the unconcerned bystanders. The two young people were out of control, probably at the end of an evening of drinking and taking illegal substances, since they continued to bother the passers-by. The disabled boy shortly afterward managed to change his route, thus moving away from the two aggressors’’.
Figure A2. Example of stimulus for the domain of Care: subtly biased condition.
English translation: “When disability becomes a burden: young people insult a disabled person. It happened last night on the streets of Rome to a 16-year-old boy, sitting on the steps of the Church of Santa Maria Maggiore. After getting up to reach the subway station, the boy walked slowly towards the station having one of his two legs amputated, holding his crutches. His uncertain walk attracted the attention of two young men who started to mock him, causing the boy to call for help. None of the passers-by took the disabled boy’s side, and he decided to change his direction and move away”.

References

1. Colomina, C.; Sanchez Margalef, H.; Youngs, R. The Impact of Disinformation on Democratic Processes and Human Rights in the World; PE 653.635; Directorate General for External Policies of the Union: Brussels, Belgium, 2021; Available online: https://www.europarl.europa.eu/RegData/etudes/STUD/2021/653635/EXPO_STU(2021)653635_EN.pdf (accessed on 1 September 2021).
2. Oyserman, D.; Dawson, A. Your fake news, our facts: Identity-based motivation shapes what we believe, share, and accept. In The Psychology of Fake News; Routledge: Amsterdam, The Netherlands, 2020; pp. 173–195.
3. Ireton, C.; Posetti, J. Journalism, Fake News & Disinformation: Handbook for Journalism Education and Training; Unesco Publishing: Paris, France, 2018.
4. Bakir, V.; McStay, A. Fake News and The Economy of Emotions. Digit. J. 2017, 6, 154–175.
5. Graham, J.; Haidt, J.; Koleva, S.; Motyl, M.; Iyer, R.; Wojcik, S.P.; Ditto, P.H. Moral Foundations Theory. Adv. Exp. Soc. Psychol. 2013, 47, 55–130.
6. Wheeler, M.A.; Laham, S.M. What We Talk about When We Talk about Morality. Personal. Soc. Psychol. Bull. 2016, 42, 1206–1216.
7. Schwartz, S.H. An overview of the Schwartz theory of basic values. Online Read. Psychol. Cult. 2012, 2, 1116.
8. Barlett, C.P.; Anderson, C.A. Examining Media Effects. In The International Encyclopedia of Media Studies; Blackwell: Hoboken, NJ, USA, 2012.
9. Brady, W.J.; Crockett, M.J.; Van Bavel, J.J. The MAD model of moral contagion: The role of motivation, attention, and design in the spread of moralized content online. Perspect. Psychol. Sci. 2020, 15, 978–1010.
10. Osmundsen, M.; Bor, A.; Vahlstrup, P.B.; Bechmann, A.; Petersen, M.B. Partisan polarization is the primary psychological motivation behind “fake news” sharing on Twitter. Am. Political Sci. Rev. 2020, 115, 999–1015.
11. Frischlich, L.; Hellmann, J.H.; Brinkschulte, F.; Becker, M.; Back, M.D. Right-wing authoritarianism, conspiracy mentality, and susceptibility to distorted alternative news. Soc. Influ. 2021, 16, 24–64.
12. Colliander, J. “This is fake news”: Investigating the role of conformity to other users’ views when commenting on and spreading disinformation in social media. Comput. Hum. Behav. 2019, 97, 202–215.
13. Anderson, A.A.; Yeo, S.K.; Brossard, D.; Scheufele, D.A.; Xenos, M.A. Toxic Talk: How Online Incivility Can Undermine Perceptions of Media. Int. J. Public Opin. Res. 2016, 30, 156–168.
14. Schwartz, S.H. Basic Personal Values and Political Orientations. In Improving Public Opinion Surveys; Princeton University Press: Princeton, NJ, USA, 2011.
15. Jost, J.T.; van der Linden, S.; Panagopoulos, C.; Hardin, C.D. Ideological asymmetries in conformity, desire for shared reality, and the spread of misinformation. Curr. Opin. Psychol. 2018, 23, 77–83.
16. Cheng, Y.; Chen, Z.F. Encountering misinformation online: Antecedents of trust and distrust and their impact on the intensity of Facebook use. Online Inf. Rev. 2021, 45, 372–388.
17. Vicario, M.D.; Quattrociocchi, W.; Scala, A.; Zollo, F. Polarization and Fake News. ACM Trans. Web 2019, 13, 1–22.
18. Lischka, J.A. A badge of honor? How the New York Times discredits President Trump’s fake news accusations. J. Stud. 2017, 20, 287–304.
19. Feldman, G. Personal values and moral foundations: Examining relations and joint prediction of moral variables. Soc. Psychol. Personal. Sci. 2021, 12, 676–686.
20. Pennycook, G.; Rand, D.G. Nudging social media sharing towards accuracy. PsyArXiv 2021, 700, 152–164.
21. Talwar, S.; Dhir, A.; Kaur, P.; Zafar, N.; Alrasheedy, M. Why do people share fake news? Associations between the dark side of social media use and fake news sharing behavior. J. Retail. Consum. Serv. 2019, 51, 72–82.
22. Tandon, A.; Dhir, A.; Islam, N.; Talwar, S.; Mäntymäki, M. Psychological and behavioral outcomes of social media-induced fear of missing out at the workplace. J. Bus. Res. 2021, 136, 186–197.
23. Kahan, D.M.; Peters, E.; Dawson, E.C.; Slovic, P. Motivated numeracy and enlightened self-government. Behav. Public Policy 2017, 1, 54–86.
24. Kahan, D.M. Ideology, motivated reasoning, and cognitive reflection. Judgm. Decis. Mak. 2013, 8, 407–424.
25. Schwartz, S.H. Universals in the content and structure of values: Theoretical advances and empirical tests in 20 countries. Adv. Exp. Soc. Psychol. 1992, 25, 1–65.
26. Kunda, Z. The case for motivated reasoning. Psychol. Bull. 1990, 108, 480.
27. Leung, K.; Bond, M.H. Social Axioms: A Model for Social Beliefs in Multicultural Perspective. Adv. Exp. Soc. Psychol. 2004, 36, 119–197.
28. Rathbun, B.C.; Kertzer, J.D.; Reifler, J.; Goren, P.; Scotto, T.J. Taking foreign policy personally: Personal values and foreign policy attitudes. Int. Stud. Q. 2016, 60, 124–137.
29. Haidt, J.; Graham, J.; Joseph, C. Above and below left–right: Ideological narratives and moral foundations. Psychol. Inq. 2009, 20, 110–119.
30. Graham, J.; Nosek, B.A.; Haidt, J.; Iyer, R.; Koleva, S.; Ditto, P.H. Mapping the moral domain. J. Personal. Soc. Psychol. 2011, 101, 366.
31. Caprara, G.V.; Barbaranelli, C.; Pastorelli, C.; Iafrate, C.; Beretta, M.; Steca, P.; Bandura, A. La misura del disimpegno morale nel contesto delle trasgressioni dell’agire quotidiano [The measurement of moral disengagement in the context of everyday transgressions]. G. Ital. Psicol. 2006, 33, 83–106.
32. Piurko, Y.; Schwartz, S.H.; Davidov, E. Basic personal values and the meaning of left-right political orientations in 20 countries. Political Psychol. 2011, 32, 537–561.
33. Schwartz, S.H.; Caprara, G.V.; Vecchione, M. Basic Personal Values, Core Political Values, and Voting: A Longitudinal Analysis. Political Psychol. 2010, 31, 421–452.
34. Zhang, X.; Ghorbani, A.A. An overview of online fake news: Characterization, detection, and discussion. Inf. Process. Manag. 2020, 57, 102025.
35. Martel, C.; Pennycook, G.; Rand, D. Reliance on emotion promotes belief in fake news. PsyArXiv 2019, 5.
36. Martel, C.; Pennycook, G.; Rand, D.G. Reliance on emotion promotes belief in fake news. Cogn. Res. Princ. Implic. 2020, 5, 47.
37. Tajfel, H. (Ed.) Social Identity and Intergroup Relations; Cambridge University Press: Cambridge, UK, 2010; Volume 7.
38. Pennycook, G.; Binnendyk, J.; Newton, C.; Rand, D.G. A practical guide to doing behavioral research on fake news and misinformation. Collabra Psychol. 2021, 7, 25293.
39. Haidt, J. Elevation and the positive psychology of morality. In Flourishing: Positive Psychology and the Life Well-Lived; Keyes, C.L., Haidt, J., Eds.; American Psychological Association: Washington, DC, USA, 2003.
40. Suhler, C.L.; Churchland, P. Can innate, modular “foundations” explain morality? Challenges for Haidt’s moral foundations theory. J. Cogn. Neurosci. 2011, 23, 2103–2116.
41. Eskine, K.J.; Novreske, A.; Richards, M. Moral contagion effects in everyday interpersonal encounters. J. Exp. Soc. Psychol. 2013, 49, 947–950.
42. Rashkin, H.; Choi, E.; Jang, J.Y.; Volkova, S.; Choi, Y. Truth of varying shades: Analyzing language in fake news and political fact-checking. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen, Denmark, 7–11 September 2017; pp. 2931–2937.
43. Bullock, J.G.; Gerber, A.S.; Hill, S.J.; Huber, G.A. Partisan Bias in Factual Beliefs about Politics; Working Paper No. w19080; National Bureau of Economic Research: Cambridge, MA, USA, 2013.
44. Litovsky, Y. (Mis)perception of bias in print media: How depth of content evaluation affects the perception of hostile bias in an objective news report. PLoS ONE 2021, 16, e0251355.
45. Staender, A.; Humprecht, E.; Esser, F.; Morosoli, S.; Van Aelst, P. Is sensationalist disinformation more effective? Three facilitating factors at the national, individual, and situational level. Digit. J. 2021, 1–21.
46. Schaewitz, L.; Kluck, J.P.; Klösters, L.; Krämer, N.C. When is disinformation (in)credible? Experimental findings on message characteristics and individual differences. Mass Commun. Soc. 2020, 23, 484–509.
47. Pennycook, G.; Epstein, Z.; Mosleh, M.; Arechar, A.A.; Eckles, D.; Rand, D.G. Shifting attention to accuracy can reduce misinformation online. Nature 2021, 592, 590–595.
48. Hopp, T. Fake news self-efficacy, fake news identification, and content sharing on Facebook. J. Inf. Technol. Politics 2021, 19, 229–252.
49. Lewandowsky, S.; Van Der Linden, S. Countering misinformation and fake news through inoculation and prebunking. Eur. Rev. Soc. Psychol. 2021, 32, 348–384.
50. Clifford, S.; Iyengar, V.; Cabeza, R.; Sinnott-Armstrong, W. Moral foundations vignettes: A standardized stimulus database of scenarios based on moral foundations theory. Behav. Res. Methods 2015, 47, 1178–1198.
51. Lottridge, D.; Bentley, F.R. Let’s hate together: How people share news in messaging, social, and public networks. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–27 April 2018; pp. 1–13.
52. Wang, L.; Fussell, S.R. More Than a Click: Exploring College Students’ Decision-Making Processes in Online News Sharing. Proc. ACM Hum.-Comput. Interact. 2020, 4, 1–20.
53. Kroll, J.; Egan, E. Psychiatry, moral worry, and moral emotions. J. Psychiatr. Pract. 2004, 6, 352–360.
54. Capanna, C.; Vecchione, M.; Schwartz, S.H. La misura dei valori. Un contributo alla validazione del Portrait Values Questionnaire su un campione italiano [The measurement of values. A contribution to the validation of the Portrait Values Questionnaire on an Italian sample]. Bollet. Psicol. Appl. 2005, 246, 29–41.
55. Schwartz, S.H. Are There Universal Aspects in the Structure and Contents of Human Values? J. Soc. Issues 1994, 50, 19–45.
56. Marwick, A.E. Why do people share fake news? A sociotechnical model of media effects. Georget. Law Technol. Rev. 2018, 2, 474–512.
57. Apuke, O.D.; Omar, B. Modelling the antecedent factors that affect online fake news sharing on COVID-19: The moderating role of fake news knowledge. Health Educ. Res. 2020, 35, 490–503.
58. D’Errico, F.; Papapicco, C.; Taulè, M. Racial Hoaxes, Stereotype and Prejudice. A psycho-linguistic analysis of Italian Fake news. J. Lang. Discrim. 2022, in press.
59. Papapicco, C.; Lamanna, I.; D’Errico, F. Adolescents’ Vulnerability to Fake News and to Racial Hoaxes: A Qualitative Analysis on Italian Sample. Multimodal Technol. Interact. 2022, 6, 20.
Figure 1. Theoretical framework.
Figure 2. Care.
Figure 3. Fairness.
Figure 4. Loyalty.
Figure 5. Authority.
Figure 6. Purity.
Table 1. Descriptive statistics.

Domain      Variable          Mean    SD      Sk       K
Care        Credibility       3.882   0.836   −1.272    2.376
            Moral emotions    3.252   0.905   −0.376   −0.482
            Online sharing    3.208   1.413   −0.376   −1.170
Fairness    Credibility       3.189   0.999   −0.410   −0.348
            Moral emotions    1.716   0.757    1.415    1.816
            Online sharing    2.155   1.234    0.815   −0.352
Loyalty     Credibility       2.610   1.210    0.133   −1.051
            Moral emotions    1.347   0.605    2.387    6.401
            Online sharing    1.492   0.926    2.198    4.831
Authority   Credibility       3.113   1.080   −0.320   −0.709
            Moral emotions    1.822   0.915    1.375    1.290
            Online sharing    2.100   1.275    0.836   −0.482
Purity      Credibility       2.352   1.174    0.395   −1.046
            Moral emotions    1.762   0.844    1.309    1.100
            Online sharing    2.023   1.279    0.905   −0.531
Table 2. Correlation matrix.

                  Proximal Dimensions     Framing     Personal Values
Moral Domain       CR         ME          BIASED      CONS       SELF-TR
Care        CR     —          —           0.255 **    0.230 **    0.158
            ME     0.473 **   —           0.230 **    0.302 **    0.376 **
            OS     0.467 **   0.461 **    0.200 *     0.127       0.049
Fairness    CR     —          —           0.341 **    0.254 **    0.141
            ME     0.359 **   —           0.187 *     0.241 **    0.190 *
            OS     0.330 **   0.428 **    0.154       0.247 **   −0.058
Loyalty     CR     —          —           0.174 *     0.125       0.041
            ME     0.162      —          −0.175 *     0.236 **    0.192 *
            OS     0.200 *    0.562 **   −0.018       0.206 *    −0.054
Authority   CR     —          —           0.426 **    0.340 **    0.237 **
            ME     0.491 **   —           0.081       0.347 **    0.232 **
            OS     0.354 **   0.363 **    0.175 *     0.225 *     0.003
Purity      CR     —          —           0.277 **    0.275 **    0.107
            ME     0.667 **   —           0.214 *     0.325 **    0.227 **
            OS     0.490 **   0.471 **    0.094       0.162      −0.022
* Correlation is significant at the 0.05 level (2-tailed); ** correlation is significant at the 0.01 level (2-tailed). CR = credibility; ME = moral emotions; OS = online sharing.
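As a cross-check, the two-tailed significance flags attached to the correlations can be reconstructed from r and the sample size via the t transform t = r·sqrt((n − 2)/(1 − r²)). The following stdlib-only Python sketch is illustrative, not the authors' code; n = 130 and the critical t values for df ≈ 128 are assumptions here.

```python
import math

# Approximate two-tailed critical t values for df ~= 128 (assumed sample size n = 130)
T_CRIT_05 = 1.98   # alpha = .05
T_CRIT_01 = 2.61   # alpha = .01

def star(r: float, n: int = 130) -> str:
    """Return '**', '*', or '' for a Pearson r under a two-tailed t test."""
    t = abs(r) * math.sqrt((n - 2) / (1.0 - r * r))
    if t > T_CRIT_01:
        return "**"
    if t > T_CRIT_05:
        return "*"
    return ""

# A few cells from the correlation matrix above:
print(star(0.467))   # Care: credibility with online sharing
print(star(-0.175))  # Loyalty: moral emotions with news bias
print(star(0.158))   # Care: credibility with self-transcendence
```

Correlations around |r| = 0.17 sit right at the .05 threshold for this sample size, which is consistent with the mix of starred and unstarred values of similar magnitude in the matrix.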
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

D’Errico, F.; Corbelli, G.; Papapicco, C.; Paciello, M. How Personal Values Count in Misleading News Sharing with Moral Content. Behav. Sci. 2022, 12, 302. https://0-doi-org.brum.beds.ac.uk/10.3390/bs12090302
