Article

Determinants of the Perceived Credibility of Rebuttals Concerning Health Misinformation

School of Economics and Management, Beijing University of Posts and Telecommunications, Beijing 100876, China
*
Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2021, 18(3), 1345; https://0-doi-org.brum.beds.ac.uk/10.3390/ijerph18031345
Submission received: 5 December 2020 / Revised: 28 January 2021 / Accepted: 29 January 2021 / Published: 2 February 2021

Abstract

Users provide and share information with broad audiences on different forms of social media; however, the accuracy of this information is questionable. The health information field is currently severely affected by misinformation, so addressing health misinformation is integral to enhancing public health. This research can help relevant practitioners (i.e., government officials, medical and health service personnel, and educators) find the most effective corrective interventions for governing health misinformation. We constructed a theoretical model of the credibility-oriented determinants of misinformation rebuttals based on the elaboration likelihood model. We collected 415 valid responses through a questionnaire survey and evaluated the research model with a partial least squares structural equation model. The results indicated that both perceived information quality and perceived source credibility can enhance perceived information credibility. Under some circumstances, the influence of information quality on information credibility may be more important than that of the information source. However, the cognitive conflict and knowledge self-confidence of information receivers weaken the influence of information quality on information credibility. In contrast, cognitive conflict can strengthen the influence of source credibility on information credibility. Further, perceived information quality can be affected by information usefulness, understandability, and relevance, while perceived source credibility can be affected by source expertise and authority.

1. Introduction

The existence and transmission of health misinformation have led to severe consequences to personal health, social media platform operation, and social stability [1,2]. For instance, when the Zika virus broke out, misinformation related to this virus attracted widespread attention on Facebook and was more popular than correct, reliable information [3]. The expenditure for launching unnecessary informational promotion activities to correct such misinformation grew significantly [4]. Furthermore, in the wake of the COVID-19 outbreak, more than 600 people died in Iran after they drank high levels of alcohol in the mistaken belief that it would protect them against the virus [5,6]. Hence, the public’s belief in misinformation leads to more dangerous consequences than ignorance [4].
In recent years, the Internet has become the main source of information, and with a surge in demand for health information, most people choose to browse the Internet to obtain it. According to a report published by the Pew Research Center, 72% of adults in the US have searched for at least one type of health information on the Internet [7]. Because of its convenience, the Internet meets the public’s demands for having face-to-face consultations with professional medical and nursing personnel [8]. Nonetheless, it is also responsible for the prevalence of health misinformation [9], because traditional quality-control mechanisms, such as professional editors, are excluded from the information-generation process. Particularly, the development of Web 2.0 has changed Internet users from passive information consumers to users who actively generate content on websites such as Weibo, Zhihu, and YouTube [4]. Misinformation is consequently widely transmitted and starts trending on social media platforms [10]. Notably, health information is one of the information sources that attracts the most attention but is most severely affected by misinformation [11].
UNESCO, in its working documents and reports, defines misinformation as false information disseminated by people who believe it to be true, usually with no apparent profit motive behind it [12,13]. The consensus of the scientific community provides a relatively clear distinction between true information and misinformation. Health misinformation, the focus of this study, is information contrary to the scientific community's cognitive consensus on a given phenomenon [14]. There are three modes of handling misinformation. The first mode is preemptive prevention, in which true information is transmitted to the public before they are exposed to misinformation; relevant studies therefore focus on the acceptance of health information [15]. The second mode relies on the withdrawal or deletion of misinformation. Some studies have prioritized identifying misinformation and discontinuing its dissemination in time [16]. Although this mode is conducive to mitigating the impact of misinformation, it does not eliminate it [17]. The third mode, explanation and correction, is often the most effective when individuals pursue accuracy-motivated goals in processing scientific information, and it is the focus of this study. Correction provides information about beliefs that individuals may hold, stemming from previous contact and communication [18].
To successfully correct users' cognitive errors, corrective health information must convince them. This study differs from past studies of information adoption, because correcting misinformation means that the original beliefs held by information recipients must be changed. Misinformation rebuttals are messages telling individuals to ignore or disbelieve previous information [18]. In other words, persuasive knowledge is required for information recipients experiencing high-level cognitive conflict. Additionally, professional knowledge is needed for such refutation of health misinformation to succeed, because it is more difficult for the public to cognitively process health information than other types of information. To our knowledge, no prior study has evaluated the impact of information recipients' cognitive conflict and knowledge self-confidence on the refutation of health misinformation.
This study aims to explore an effective way to correct health misinformation and eliminate its adverse effects. Although some information on social media can be misleading, or even deceptive, the more people feel they can trust health information, the more willing they are to accept it [19,20]. We constructed a theoretical model of the credibility-oriented determinants of misinformation rebuttals based on the elaboration likelihood model and used the partial least squares (PLS) method to verify it. The findings of this study complement previous research on social media and online health. In practice, this study can help relevant practitioners (i.e., government officials, platform managers, medical and health service personnel, and educators) find the most effective corrective interventions for governing health misinformation.

2. Research Model and Hypotheses

The rebuttal of misinformation falls into the category of persuasion information, which aims to correct the public's misunderstandings and provide knowledge about the truth of matters. The elaboration likelihood model (ELM), proposed by psychologists Petty and Cacioppo, is one of the most authoritative theories in the field of knowledge persuasion [21]. It has been widely used to describe how people process information and form their attitudes toward behaviors. According to the ELM, persuasion can be achieved through one or both of two routes: the central route and the peripheral route. The central route of persuasion focuses on information factors; in this route, the information recipient devotes substantial cognitive resources to the elaborate processing of information to form a perception of the contacted information [22]. In comparison, the peripheral route of persuasion focuses on factors peripheral to the message itself, such as the source and presentation of information; in this route, the recipient processes information at a low level of elaboration [23]. Both routes signify that one's attitudes take shape or vary according to intrinsic information processing capabilities [24]. Previous empirical studies have verified the application of the ELM in various fields. For example, the ELM and technology acceptance model (TAM) can be combined to study how knowledge-based employees evaluate information and accept advice [24]. Chung et al. studied the adoption of tourist information on social media through the ELM and explored the moderating role of social presence [25]. Tseng and Wang investigated the information adoption process on tourist websites regarding cognitive risks through an integrated model of the ELM and perceived usefulness [26].
According to the three main factors of the effectiveness of information transmission, the factors influencing the credibility of rebuttals can be divided into the information source, the information itself, and the information receiver. Based on the ELM, this study constructs a structural equation model for the relevant determinants. Specifically, information quality is used as the central route, while source credibility is regarded as the peripheral route. As for the recipient, we explore the moderating roles of the public's cognitive conflict and knowledge self-confidence on the central and peripheral routes.

2.1. Perceived Information Quality

Perceived information quality is defined as the perceived value and persuasive strength of information [24,27,28]. According to the application of the ELM to information adoption, information quality affects one's attitudes through the central route. Information quality affects the degree of perceived usefulness [24,27] and trust [21] in information, as well as users' attitudes and willingness [29]. High-quality information has a significant impact on the persuasion effect [30], and it plays a role in changing people's attitudes, even when they are concerned about privacy [22]. Conversely, low-quality, irrational, and non-persuasive information has no significant impact on recipients' attitudes [31]. Excessive advertisements and misleading health information on social media make it more difficult for users to identify whether information is true or false. Health information with high perceived quality plays an increasingly vital role in determining people's trust in health information [15]. The quality of Internet health information is generally affected by relevance, understandability (i.e., clarity and readability), adequacy (i.e., sufficiency, completeness, and necessity), and usefulness [29,32]. The following hypotheses are put forward based on existing studies:
Hypothesis 1 (H1):
Perceived information quality would have a positive relationship with the perceived credibility of rebuttals concerning health misinformation.
Hypothesis 1a (H1a):
Information relevance would have a positive relationship with perceived information quality.
Hypothesis 1b (H1b):
Information understandability would have a positive relationship with perceived information quality.
Hypothesis 1c (H1c):
Information adequacy would have a positive relationship with perceived information quality.
Hypothesis 1d (H1d):
Information usefulness would have a positive relationship with perceived information quality.

2.2. Perceived Source Credibility

Source credibility refers to the degree of credibility of the information sender as perceived by the information recipient [23]. Thus, it represents an attitude toward the information source and is independent of the information itself [33]. Questionnaire surveys and experimental studies have widely confirmed the impact of perceived source credibility on people's attitudes and information-acceptance behaviors. Perceived source credibility affects the adoption of tourist information from user-generated content on social media [25] and the evaluation of online health information [34]. Source credibility plays a vital role in improving users' experiences and enhancing their behavioral intentions in virtual communities [35]. Individuals are more inclined to believe information from a highly reliable source than from a source with low reliability [36]. Misinformation rebuttals on the Internet do not exist independently but are overshadowed by a large number of true and false information flows [37]. Understandably, source credibility allows people to handle information through the peripheral route rather than relying on complicated cognitive processing [21]. Many virtual communities infer the credibility of a knowledge source through user influence systems based on historical contributions and published records [38]. A user's authority is valid for judging the source credibility of microblog information [39]. The recipient's decision-making process is more strongly affected by the provider if the knowledge provider has a high level of professional knowledge [40]. The perceived source credibility of health information mainly depends on the expertise (i.e., competence, skill, and knowledge) and authority (i.e., reputation, status, and influence) of information publishers as perceived by the public. The following hypotheses are put forward based on the existing literature:
Hypothesis 2 (H2):
Perceived source credibility would have a positive relationship with the perceived credibility of health misinformation rebuttals.
Hypothesis 2a (H2a):
Source expertise would have a positive relationship with perceived source credibility.
Hypothesis 2b (H2b):
Source authority would have a positive relationship with perceived source credibility.

2.3. Moderating Effect of Cognitive Conflict

In the persuasion field, many studies have shown that people accept information more easily when it is consistent with what they consider to be correct. Consensus not only enhances users' trust in provided information but also effectively influences the recipient's opinions, attitudes, and beliefs [41]. After receiving new information, people immediately evaluate whether it is compatible with the logic of other facts and cognitive beliefs. If the information conflicts with their original perceptions, people are more likely to resist changing their original beliefs [42,43]. Therefore, conflict with original perceptions may make misinformation less likely to be successfully corrected.
If information is inconsistent with one's beliefs, it may trigger negative emotions. Further, examining information inconsistent with one's beliefs is not as smooth as evaluating information consistent with those beliefs. Typically, easily processed information feels more familiar and is more readily accepted; conversely, processing difficulty triggers negative feelings and urges people to examine the information more carefully [44,45]. This process requires more effort, motivation, and cognitive resources [4]. Consequently, such people may instead rely on evaluations of the transmitter's reliability. When the information received by consumers counters their preconceived perceptions, stronger correlations between emotional trust and behavioral intentions are formed [46]. The following hypotheses are proposed based on existing studies:
Hypothesis 3a (H3a):
Cognitive conflict would moderate the relationship between perceived information quality and perceived information credibility.
Hypothesis 3b (H3b):
Cognitive conflict would moderate the relationship between perceived source credibility and perceived information credibility.

2.4. Moderating Effect of Knowledge Self-Confidence

The impact of perceived information quality on one's attitudes varies across situations and is affected by personal abilities in specific circumstances [47]. Knowledge self-confidence refers to a self-assessment of the degree to which individuals think they understand relevant scientific knowledge [48,49]. Perceived information quality is a subjective evaluation of information content and depends on the individual's previous experience and professional knowledge [21]. Both one's knowledge and skills can be employed to process information. In some cases, the content of information is read, processed, and considered; in other cases, the content may be neglected entirely. Such differences may result from recipients' different interpretations of knowledge content [24,50]. In the peripheral route, impacts are mainly created through simple decision-making standards and cues such as reputation, charisma, or appeal [22]. Individuals may use such cues because they do not want to invest the necessary cognitive resources or cannot make the effort owing to limited capacities. By judging the authenticity of information through source credibility, users can avoid complicated cognitive processing of highly professional health information. Non-expert users are more inclined to rely on such peripheral cues (i.e., source credibility) [51,52]. Evaluating the credibility of health misinformation rebuttals requires more professional knowledge, so source credibility may be the most pivotal factor for non-experts evaluating such information [24]. Hence, the following hypotheses are proposed based on existing studies:
Hypothesis 4a (H4a):
Knowledge self-confidence moderates the relationship between perceived information quality and perceived information credibility.
Hypothesis 4b (H4b):
Knowledge self-confidence moderates the relationship between perceived source credibility and perceived information credibility.
Based on the analysis above, this study presents the research model illustrated in Figure 1.

3. Methods

This study used rebuttals of health misinformation on the Sina Weibo microblogging platform as its example. The PLS structural equation model was used to verify the hypothesized model. The model is specified as a first-order reflective measurement model.
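As a rough numeric illustration of this two-stage logic, the sketch below approximates construct scores as unweighted item means and estimates structural paths by least squares on standardized scores. Real PLS-SEM instead estimates indicator weights iteratively and assesses significance by bootstrapping; all data here are simulated, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 415  # sample size matching the study

# Hypothetical 7-point Likert responses for two constructs (3 items each).
quality_items = rng.integers(1, 8, size=(n, 3)).astype(float)
source_items = rng.integers(1, 8, size=(n, 3)).astype(float)

# Stage 1: proxy construct scores as item means (a simplification; PLS-SEM
# estimates indicator weights instead of using equal weights).
quality = quality_items.mean(axis=1)
source = source_items.mean(axis=1)
# Assumed data-generating process for the outcome construct.
credibility = 0.4 * quality + 0.3 * source + rng.normal(0.0, 1.0, n)

# Stage 2: estimate structural path coefficients by OLS on standardized scores.
def standardize(x):
    return (x - x.mean()) / x.std()

X = np.column_stack([standardize(quality), standardize(source)])
y = standardize(credibility)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"quality path = {beta[0]:.3f}, source path = {beta[1]:.3f}")
```

With the simulated effects above, both estimated paths come out positive, mirroring the direction of H1 and H2.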
The questionnaire comprised three parts. The first part collected respondents' personal information and probed their original understanding of one type of health information, that is, whether they held a cognitive error. We asked participants, “Do you think bone soup can supplement calcium?” Respondents answering “Yes” (a cognitive error) were screened into the second and third parts.
In the second part, a situational questionnaire was used to describe a specific response situation to the participants through pictures and texts. This allowed participants to imagine themselves in the situation so that their responses could subsequently be measured. This method can reduce the influence of biases caused by factors such as memory and comprehension. Figure 2 shows the health misinformation rebuttal presented to the respondents. We asked respondents to answer questions based on their actual perceptions of the experimental situation. For example, respondents' perceptions of source credibility were mainly their perceptions of the credibility of Sina Weibo and DX Doctor, and this differed between respondents.
The third part included the measurement items for each variable. The scale was mainly derived from mature measurement scales in the existing literature. The initial scale for this study's situation was designed based on the characteristics of health misinformation rebuttals and was refined through pre-investigation procedures (see Table 1). Items were scored from one to seven, ranging from completely negative to completely positive.
For data collection, we utilized So Jump, an online questionnaire survey platform that randomly sends the questionnaire to users, and eventually collected 415 valid questionnaires from 22 November to 3 December 2019. To ensure the quality of the questionnaire, we paid the respondents. There were 166 men (40%) and 249 women (60%). Regarding the age distribution, most respondents (i.e., 384) were aged between 18 and 40 years, accounting for 92.53% of the total sample size. Most of the respondents had bachelor’s degrees and belonged to various industries. The demographics for the research sample are presented in Table 2.

4. Data Analysis and Results

4.1. Non-Response Bias

Non-response bias arises when some respondents' failure to answer the questionnaire, for various reasons, biases the research results. Armstrong and Overton (1977) argued that late responders are more likely to resemble non-responders than early responders [55]. This study compared whether there were significant differences in occupation and education between early respondents (the 207 respondents who completed the questionnaire first) and later respondents (the 208 respondents who completed it later). Independent-sample t-test results showed no significant difference between the early- and late-stage respondents in occupation or education (p > 0.05) [56]. This indicates that non-response bias in this study was not substantial and could be ignored.
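The early-versus-late comparison can be sketched as an independent-sample t-test. The group data below are simulated stand-ins for the study's coded occupation and education responses:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical coded education levels for the 207 early and 208 late responders.
early = rng.integers(1, 6, size=207).astype(float)
late = rng.integers(1, 6, size=208).astype(float)

# Independent-sample t-test comparing the two responder groups.
t_stat, p_value = stats.ttest_ind(early, late)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# A p-value above 0.05 suggests the groups do not differ significantly on this
# variable, so non-response bias is unlikely to be a concern for it.
```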

4.2. Common Method Bias

When all data come from the same questionnaire, there may be common method bias, which affects the validity of the study. This study examined the potential existence of common method bias through several procedures. First, Harman's single-factor test was used. The results showed that the first (largest) factor accounted for 35.883% of the variance, no single factor explained more than 40% of the variance, and all factors together explained 73.358% of the variance [57]. Then, the marker variable method was used to add a variable theoretically unrelated to the other latent variables to the model [58,59]. The test showed that the marker variable had no significant influence on the variables in the original model. Therefore, common method bias was not a key issue in this study.
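Harman's single-factor test loads all questionnaire items into one unrotated factor solution and checks the variance share of the largest factor. The sketch below illustrates the computation via the eigenvalues of the item covariance matrix; the items are simulated (and uncorrelated, so the share comes out small), whereas the study ran the test on its real scale data:

```python
import numpy as np

rng = np.random.default_rng(1)
items = rng.normal(size=(415, 20))  # hypothetical responses to 20 items

# Eigen-decomposition of the item covariance matrix; the largest eigenvalue's
# share of the trace approximates the variance explained by the first factor.
cov = np.cov(items, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(cov))[::-1]
first_factor_share = eigenvalues[0] / eigenvalues.sum()
print(f"First factor explains {first_factor_share:.1%} of the variance")
# Common method bias is typically flagged when one factor dominates (> 40-50%).
```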

4.3. Assessment of Reliability and Validity

We tested the indicator reliability, convergent validity, and discriminant validity to ensure the measurement results were reliable and valid. Reliability was verified through Cronbach’s alpha (CA) and composite reliability (CR). As shown in Table 3, CA and CR were both larger than 0.7, denoting high reliability of the data [60].
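Cronbach's alpha can be computed directly from the item variance-covariance structure. A minimal sketch on simulated three-item data (the item model here is assumed, not taken from the study):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) matrix for a single construct."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(7)
# Simulate 3 items driven by one common true score plus item-specific noise.
true_score = rng.normal(size=(415, 1))
items = true_score + 0.5 * rng.normal(size=(415, 3))

alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.3f}")  # values above 0.7 indicate acceptable reliability
```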
Convergent validity was evaluated through item loadings, CR, and average variance extracted (AVE). As illustrated in Table 3 and Table 4, the values of item loadings and CR were larger than 0.7, and the AVE values were larger than 0.5, meaning the data had satisfactory convergent validity [61].
Discriminant validity was appraised by comparing the square root of the AVE of each construct to the inter-construct correlations, and by comparing the item loadings to the cross-loadings. Moreover, the heterotrait-monotrait ratio (HTMT) should usually be no more than 0.85; when constructs are conceptually similar, the HTMT threshold can be relaxed to 0.90 [62]. Table 5 shows that the square root of the AVE of each construct was greater than the inter-construct correlations. Similarly, Table 4 depicts that all item loadings were higher on their own factor than on any other factor. Table 6 shows that most HTMT values were no more than 0.85, and all were no more than 0.90. These results confirmed the discriminant validity.
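AVE and composite reliability follow directly from standardized indicator loadings, and the Fornell-Larcker criterion then compares the square root of the AVE with inter-construct correlations. The loadings and the example correlation below are hypothetical, chosen only to illustrate the formulas:

```python
import numpy as np

def ave(loadings):
    """Average variance extracted: mean of squared standardized loadings."""
    l = np.asarray(loadings, dtype=float)
    return float((l ** 2).mean())

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    l = np.asarray(loadings, dtype=float)
    num = l.sum() ** 2
    return float(num / (num + (1 - l ** 2).sum()))

loadings = [0.82, 0.85, 0.79]  # hypothetical standardized loadings
ave_val = ave(loadings)
cr_val = composite_reliability(loadings)
print(f"AVE = {ave_val:.3f}, CR = {cr_val:.3f}")

# Fornell-Larcker check: sqrt(AVE) should exceed the construct's correlations
# with every other construct (0.55 is an assumed example correlation).
assert np.sqrt(ave_val) > 0.55
```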

4.4. Assessment of the Structural Model

The results of the hypotheses based on the t-values, confidence intervals, and f-squared values are presented in Table 7. The path coefficients and explained variance of the structural model are shown in Figure 3. The Q² values of the perceived credibility of rebuttals, perceived information quality, and perceived source credibility were 0.469, 0.334, and 0.442, respectively. The corresponding R² values were 0.614, 0.499, and 0.566. The structural model thus showed high predictive accuracy. Perceived information quality (β = 0.444, p < 0.001) on the central route and perceived source credibility (β = 0.306, p < 0.001) on the peripheral route significantly affected the perceived credibility of the rebuttals of health misinformation. Hypotheses 1 and 2 were thus supported.
Further, information relevance (β = 0.354, p < 0.001), understandability (β = 0.150, p < 0.01), and usefulness (β = 0.330, p < 0.01) all significantly affected perceived information quality; however, the impact of information adequacy remained insignificant (β = 0.032, p = 0.617). Thus, Hypotheses 1a, 1b, and 1d were supported, while Hypothesis 1c was rejected. Additionally, information source expertise (β = 0.495, p < 0.001) and authority (β = 0.308, p < 0.001) had a significant impact on perceived source credibility; therefore, Hypotheses 2a and 2b were supported. Cognitive conflict negatively moderated the impact of perceived information quality on perceived information credibility (β = −0.089, p < 0.05) and positively moderated the impact of perceived source credibility on perceived information credibility (β = 0.135, p < 0.01). Hypotheses 3a and 3b were consequently supported. Moreover, knowledge self-confidence played a negative moderating role in the relationship between perceived information quality and perceived information credibility (β = −0.140, p < 0.01). However, its moderating role in the relationship between source credibility and perceived information credibility was insignificant (β = 0.029, p = 0.581), suggesting that Hypothesis 4a was valid, while Hypothesis 4b was invalid.
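Moderation effects of this kind can be illustrated with a simple regression containing a product term. The sketch below simulates a negative interaction like the one between cognitive conflict and the quality path; the coefficients and data are invented, not the study's:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 415  # same sample size as the study, data simulated

quality = rng.normal(size=n)   # standardized perceived information quality
conflict = rng.normal(size=n)  # standardized cognitive conflict (moderator)
# Assumed data-generating process: high conflict weakens the quality effect.
credibility = 0.45 * quality - 0.2 * quality * conflict + rng.normal(0.0, 1.0, n)

# Regression with an interaction term: y ~ 1 + quality + conflict + quality*conflict
X = np.column_stack([np.ones(n), quality, conflict, quality * conflict])
beta, *_ = np.linalg.lstsq(X, credibility, rcond=None)
print(f"quality main effect = {beta[1]:.3f}, interaction = {beta[3]:.3f}")
# A significantly negative interaction coefficient mirrors the finding that
# cognitive conflict dampens the quality -> credibility path.
```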

5. Discussion

The perceived credibility of health misinformation rebuttals can be improved through two aspects: the information content itself and the information source. In addition to having credibility that is more widely recognized by the public, authoritative experts have a stronger ability to produce high-quality debunking information. Under some circumstances, the influence of the information itself on information credibility may be more important than that of the information source. This may especially hold when the harmful effects of the health misinformation are relatively weak (not enough to attract the attention of authorities and many experts), when the science involved in correcting the health information is relatively simple, or when the information is difficult to verify through practice (e.g., measuring the calcium content of bone soup). Cho et al. (2011) found that information from different sources may have the same influence, for example, whether the recipient is told that the information comes from research funded by “ExxonMobil” or from “people like you” [63]. These findings suggest that source reliability factors may sometimes be overlooked. Additionally, the crux of the information is often easier to remember than its source, and compelling stories from unreliable sources may be recalled or accepted long after the sources are forgotten [4]. However, it is difficult to accurately judge whether health information is credible based only on the information content itself [64]. Focusing on the information source can reduce the sharing of false information [65]. Nonetheless, the combined effect of information quality and source reliability can enhance information credibility to a greater extent. Caulfield recommended that accurate information of value to the public must be shared and called on scientists to participate in science-related communication on social media [66].
Information quality can be improved by enhancing information relevance (i.e., meeting the audience's demands and attracting the public's interest), understandability (i.e., ensuring rational logic and high readability), and usefulness (i.e., helping the public solve problems practically and feasibly) [29,32]. Public demand for information adequacy is not high. Information overload, like information scarcity, can be a hindrance; therefore, a full and comprehensive exposition of information is not necessary in some cases. People should take action to disseminate high-quality information globally that is accurate, easy to digest, engaging, and easy to share on mobile devices. Information must be customized to the recipient, since different people can perceive the same things in dramatically different ways. Encouraging individuals with large numbers of followers to share corrective or high-quality information, and encouraging scientists to communicate more with the public on social media platforms, may be effective strategies for combating false health information online.
The overwhelming information deluge on social media has affected the public's understanding, so it is particularly important to make use of expert resources to ensure that high-quality misinformation rebuttals stand out. The credibility of information sources often enhances the persuasiveness of communication. Ideally, the public should maintain a high level of trust in reliable sources, while trust in unreliable sources should be reduced. In practice, however, it is often difficult for the public to determine a source's reliability. When people are confused about who can provide accurate information, it would be helpful to provide users with clearer indicators of a source's reliability. Therefore, social media platforms could rate information sources, for example through expert ratings of articles and user ratings of articles or information sources. Messages are popular mainly because influencers share them with their audiences [67]. Communicators with high credibility should pay attention to the authenticity of health information before disseminating it to their followers, as individuals and businesses with a large social media audience have a greater responsibility to verify the accuracy of any health information they share. They should refer to points from real medical reports and authoritative experts, forward only information from trusted health knowledge providers, and make full use of the authority effect and platform effect to improve information credibility.
Although the information receiver's cognitive conflict has a significant moderating role in both the information quality and source credibility paths, the moderating effects are completely different. When cognitive conflict is high, the influence of information quality on perceived information credibility is hampered, while the effect of source credibility on perceived information credibility is enhanced. People evaluate the logical compatibility of the information they receive with other facts and beliefs. Once information is accepted, it is highly impervious to change. From the perspective of cognitive consistency, this resistance stems from the subsequent inconsistencies that would result from admitting that previously accepted information is false. Thus, conflict with existing knowledge reduces the likelihood that misinformation will be successfully corrected. If there is insufficient evidence that the original perception is wrong, the easiest way to resolve this conflict may be to revert to the original idea and ignore the corrective input.
In general, people tend to hold on to what they already know. Changing perceptions entails additional motivation and cognitive resources, and if the topic is not of interest to the public, it is very difficult to change predominant misconceptions [4]. Therefore, for receivers of information with high cognitive conflict, only high-quality persuasive arguments, such as solid evidence and sound logic, can convince them to change their original attitudes and beliefs. At this point, a person or medium that is highly trusted by the message receiver can more easily disprove the misinformation. Evidently, inconsistent information can trigger negative emotions and increase the difficulty the receiver faces in processing the information. Nonetheless, trust in an information sender can supply positive emotions without any effort to process the information, and it can be easily gauged through the peripheral cue of source credibility.
The knowledge self-confidence of information receivers does not significantly moderate the relationship between source credibility and perceived information credibility, but it weakens the influence of information quality on perceived information credibility. Moreover, receivers with higher knowledge self-confidence need higher-quality persuasive arguments before they will change their original beliefs. This phenomenon may be attributable to the professionalism of health information. An audience with higher knowledge self-confidence trusts its original cognition or professional level but does not exclude information that it believes is inaccurate. Reliance on false knowledge is not the same as ignorance, which is a lack of relevant knowledge. Ignorance can have significant adverse effects on decisions, but these effects may not be as severe as those of trust in false knowledge. When people lack knowledge, they often rely on simple heuristics to make decisions, and they generally have relatively low confidence in decisions based solely on heuristics [68,69]. In other words, ignorance rarely leads to strong support for an idea, but when confidence in one's own knowledge is high, such support is often powerful. For example, the individuals who most strongly oppose the scientific evidence for climate change are usually those who consider themselves experts on the subject [70]. Pre-existing scientific knowledge may influence the interpretation of newly received scientific knowledge [4]. When individuals are confident in their knowledge but cannot reconcile new information with it, they may be more difficult to persuade. This may be because an audience with high knowledge self-confidence has higher requirements regarding scientific explanations and information quality.
The public should be aware of the limitations of its cognition, and confidence in existing knowledge should not become an obstacle to accepting new ideas. Social media platforms should pay close attention to the Internet "water army", use strict measures to prevent misinformation from spreading, and increase the speed at which misinformation rebuttals spread by changing how information flows are presented. With these changes, the public could receive correct information before being exposed to misinformation.

6. Conclusions

To address health misinformation and enhance public health, the results of this study suggest that the perceived credibility of health misinformation rebuttals can be improved by enhancing information quality and source credibility. Under some circumstances, information quality has a stronger influence on information credibility than the information source. Information quality can be improved by enhancing information relevance, understandability, and usefulness; source credibility can be improved by enhancing source expertise and authority. However, the cognitive conflict and knowledge self-confidence of information receivers weaken the influence of information quality on information credibility, whereas cognitive conflict strengthens the influence of source credibility on information credibility. The public should be aware of the limitations of its cognition. Governments, platform managers, medical and health service personnel, and educators should combine information quality and source credibility to enhance information credibility.
This study constructed a theoretical model of the factors influencing the perceived credibility of health misinformation rebuttals on social media. First, it analyzed the influencing factors from three perspectives: the information, the information source, and the information receiver. This not only enriches the application of ELM theory but also expands research on the governance and mitigation of misinformation on social media. Second, information quality and source credibility are cardinal to changing the public's perception. As correcting existing misinformation is more difficult than conveying new knowledge, the model in this study is a theoretical attempt tailored specifically to refuting health misinformation on social media. Lastly, the target audience of health misinformation rebuttals has high cognitive conflict and widely varying knowledge self-confidence. This study focused on the moderating roles of knowledge self-confidence and cognitive conflict on the correction paths. It complements the persuasion literature and is of great significance for understanding the public's attitudes and changing its behavior.
In practice, this study is significant for governments, platform managers, medical and health service personnel, and educators. First, information quality and source credibility play a primary role in changing the public's beliefs. Information publishers should provide high-quality, understandable, useful, and convincing information. In addition, the information should be transmitted by highly reliable transmitters, such as the government or authoritative medical institutions. This will help enhance the public's perception of information credibility. Second, the existence of cognitive conflict increases the difficulty of refuting misinformation. Rebuttals delivered by credible communicators, such as governments and authoritative medical experts trusted by information receivers, can achieve twice the result with half the effort. Social media platforms that share health information should control information quality and identify and filter unhealthy information in a timely manner. Moreover, credit ratings and similar markers can be used to distinguish the credibility of information publishers and reduce users' cognitive processing burden. Lastly, users understand professional health information to different degrees, and users with high knowledge self-confidence are the group whose attitudes are the most difficult to change. On the one hand, social media platforms can present correct information before users are exposed to health misinformation, for example by changing the order in which information is presented. On the other hand, they need to push high-quality misinformation rebuttals from different sources more frequently to hard-to-persuade groups.
This research has some limitations that future studies should address. First, it examined representative factors for the information, the information source, and the receiver, yet the effect of correcting health misinformation is influenced by numerous factors. More effort should therefore be made to discover other factors that improve the model; for example, the model can be combined with psychological factors to explore the psychological process underlying the public's acceptance or rejection of belief change, and with sociological factors to explore solutions to the echo-chamber phenomenon. Second, we studied a single scenario (i.e., "bone soup, no calcium") through a questionnaire survey. The results were obtained under the specific setup of the survey, and the applicability of the suggestions to other scenarios needs further consideration. Hence, other scenarios can be designed to study health misinformation at different risk levels or in different social and cultural contexts. Further, behavioral experiments can be conducted to manipulate signals of source credibility, social identity, and the perceived importance of the information itself.

Author Contributions

Conceptualization, Y.S. and B.Z.; methodology, Y.S. and B.Z.; software, Y.S.; validation, Y.S. and B.Z.; formal analysis, Y.S.; investigation, Y.S.; resources, Y.S.; data curation, Y.S.; writing—original draft preparation, Y.S.; writing—review and editing, Y.S. and B.Z.; visualization, Y.S.; supervision, B.Z.; project administration, B.Z.; funding acquisition, B.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Major Program of National Fund of Philosophy and Social Science of China, grant number 18VZL010.

Institutional Review Board Statement

Ethical review and approval were waived for this study because it does not involve moral or ethical issues and falls within the field of economics and management.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Sabbagh, C.; Boyland, E.; Hankey, C.; Parrett, A. Analysing Credibility of UK Social Media Influencers’ Weight-Management Blogs: A Pilot Study. Int. J. Environ. Res. Public Health 2020, 17, 9022. [Google Scholar] [CrossRef]
  2. Li, Y.; Twersky, S.; Ignace, K.; Zhao, M.; Purandare, R.; Bennett-Jones, B.; Weaver, S.R. Constructing and communicating COVID-19 stigma on Twitter: A content analysis of Tweets during the early stage of the COVID-19 outbreak. Int. J. Environ. Res. Public Health 2020, 17, 6847. [Google Scholar] [CrossRef]
  3. Sharma, M.; Yadav, K.; Yadav, N.; Ferdinand, K.C. Zika virus pandemic: Analysis of Facebook as a social media health information platform. Am. J. Infect. Control. 2017, 45, 301–302. [Google Scholar] [CrossRef]
  4. Lewandowsky, S.; Ecker, U.K.H.; Seifert, C.M.; Schwarz, N.; Cook, J. Misinformation and its correction: Continued influence and successful debiasing. Psychol. Sci. Public Int. 2012, 13, 106–131. [Google Scholar] [CrossRef]
  5. Trew, B. Coronavirus: Hundreds Dead in Iran from Drinking Methanol Amid Fake Reports It Cures Disease. Independent. Available online: https://www.independent.co.uk/news/world/middle-east/iran-coronavirus-methanol-drink-cure-deaths-fake-a9429956.html (accessed on 29 April 2020).
  6. Duplaga, M. The determinants of conspiracy beliefs related to the COVID-19 pandemic in a nationally representative sample of Internet users. Int. J. Environ. Res. Public Health 2020, 17, 7818. [Google Scholar] [CrossRef] [PubMed]
  7. Pew Internet Research. The Social Life of Health Information. Available online: https://www.pewresearch.org/fact-tank/2014/01/15/the-social-life-of-health-information/ (accessed on 15 January 2014).
  8. Yun, G.W.; Morin, D.; Park, S.; Joa, C.Y.; Labbe, B.; Lim, J.; Lee, S.; Hyun, D. Social media and flu: Media twitter accounts as agenda setters. Int. J. Med. Inform. 2016, 91, 67–73. [Google Scholar] [CrossRef] [PubMed]
  9. Adams, S.A. Revisiting the online health information reliability debate in the wake of “web 2.0”: An inter-disciplinary literature and website review. Int. J. Med. Inform. 2010, 79, 391–400. [Google Scholar] [CrossRef] [PubMed]
  10. Balmas, M. When fake news becomes real: Combined exposure to multiple news sources and political attitudes of inefficacy, alienation, and cynicism. Commun. Res. 2014, 41, 430–454. [Google Scholar] [CrossRef]
  11. Le, H.T.; Nguyen, D.N.; Beydoun, A.S.; Le, X.T.T.; Nguyen, T.T.; Pham, Q.T.; Ta, N.T.K.; Nguyen, Q.T.; Nguyen, A.N.; Hoang, M.T.; et al. Demand for Health Information on COVID-19 among Vietnamese. Int. J. Environ. Res. Public Health 2020, 17, 4377. [Google Scholar] [CrossRef]
  12. Salaverría, R.; Buslón, N.; López-Pan, F.; León, B.; Erviti, M.C. Disinformation in times of pandemic: Typology of hoaxes on Covid-19. El Prof. De La Inf. 2020, 29, e290315. [Google Scholar]
  13. UNESCO. Journalism, ‘Fake News’ & Disinformation: Handbook for Journalism Education and Training; Unesco Publishing: Paris, France, 2018; Available online: https://en.unesco.org/sites/default/files/journalism_fake_news_disinformation_print_friendly_0_0.pdf (accessed on 3 September 2019).
  14. Swire-Thompson, B.; Lazer, D. Public Health and Online Misinformation: Challenges and Recommendations. Annu. Rev. Public Health 2020, 41, 433–451. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Chou, C.H.; Wang, Y.S.; Tang, T.I. Exploring the determinants of knowledge adoption in virtual communities: A social influence perspective. Int. J. Inform. Manag. 2015, 35, 364–376. [Google Scholar] [CrossRef]
  16. Krittanawong, C.; Narasimhan, B.; Virk, H.U.H.; Narasimhan, H.; Tang, W.H.W. Misinformation dissemination in twitter in the covid-19 era. Am. J. Med. 2020, 133, 1367–1369. [Google Scholar] [CrossRef]
  17. Ecker, U.K.H.; Lewandowsky, S.; Swire, B.; Chang, D. Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychon. Bull. Rev. 2011, 18, 570–578. [Google Scholar] [CrossRef] [PubMed]
  18. Bolsen, T.; Druckman, J.N. Counteracting the Politicization of Science. J. Commun. 2015, 65, 745–769. [Google Scholar] [CrossRef]
  19. Lim, S.H.; Kim, D. The role of trust in the use of health infomediaries among university students. Inform. Health Soc. Care 2012, 37, 92–105. [Google Scholar] [CrossRef]
  20. Karlova, N.; Fisher, K.E. A social diffusion model of misinformation and disinformation for understanding human information behaviour. Inf. Res. 2013, 18, 1–17. [Google Scholar]
  21. Huo, C.; Ma, F.; Qiu, Y.; Wang, Y. Exploring the determinants of health knowledge adoption in social media: An intention-behavior-gap perspective. Inform. Dev. 2018, 34, 346–363. [Google Scholar]
  22. Angst, C.M.; Agarwal, R. Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion. MIS Quart. 2009, 33, 339–370. [Google Scholar] [CrossRef] [Green Version]
  23. Petty, R.E.; Schumann, C.D. Central and peripheral routes to advertising effectiveness: The moderating role of involvement. J. Consum. Res. 1983, 10, 135–146. [Google Scholar] [CrossRef] [Green Version]
  24. Sussman, S.W.; Siegal, W.S. Informational influence in organizations: An integrated approach to knowledge adoption. Inform. Syst. Res. 2003, 14, 47–65. [Google Scholar] [CrossRef] [Green Version]
  25. Chung, N.; Han, H.; Koo, C. Adoption of travel information in user-generated content on social media: The moderating effect of social presence. Behav. Inform. Technol. 2015, 34, 902–919. [Google Scholar] [CrossRef]
  26. Tseng, S.Y.; Wang, C.N. Perceived risk influence on dual-route information adoption processes on travel websites. J. Bus. Res. 2016, 69, 2289–2296. [Google Scholar] [CrossRef]
  27. Bhattacherjee, A.; Sanford, C. Influence processes for information technology acceptance: An elaboration likelihood model. MIS Quart. 2006, 30, 805–825. [Google Scholar] [CrossRef] [Green Version]
  28. Yoo, D.K.; Vonderembse, M.A.; Ragu-Nathan, T.S. Knowledge quality: Antecedents and consequence in project teams. J. Know. Manag. 2011, 15, 329–343. [Google Scholar]
  29. Zahedi, F.; Song, J. Dynamics of trust revision: Using health infomediaries. J. Manag. Inform. Syst. 2008, 24, 225–248. [Google Scholar] [CrossRef]
  30. Mak, B.; Schmitt, B.H.; Lyytinen, K. User participation in knowledge update of expert systems. Inform. Manag. 1997, 32, 55–63. [Google Scholar] [CrossRef]
  31. Luo, C.; Luo, X.; Schatzberg, L.; Sia, C.L. Impact of informational factors on online recommendation credibility: The moderating role of source credibility. Decis. Support Syst. 2013, 56, 92–102. [Google Scholar] [CrossRef]
  32. Laugesen, J.; Hassanein, K.; Yuan, Y. The impact of Internet health information on patient compliance: A research model and an empirical study. J. Med. Internet Res. 2015, 17, e143. [Google Scholar] [CrossRef] [Green Version]
  33. Gunther, A.C. Biased press or biased public: Attitudes toward media coverage of social groups. Public Opin. Quart. 1992, 56, 147–167. [Google Scholar] [CrossRef]
  34. Kim, H.; Park, S.Y.; Bozeman, I. Online health information search and evaluation: Observations and semi-structured interviews with college students and maternal health experts. Health Inf. Libr. J. 2011, 28, 188–199. [Google Scholar] [CrossRef] [PubMed]
  35. Hsu, H.Y.; Tsou, H.T. Understanding customer experiences in online blog environments. Int. J. Inf. Manag. 2011, 31, 510–523. [Google Scholar] [CrossRef]
  36. Zhang, W.; Watts, S.A. Capitalizing on content: Information adoption in two online communities. J. Assoc. Inf. Syst. 2008, 9, 73–94. [Google Scholar]
  37. Tormala, Z.L.; Clarkson, J.J. Assimilation and contrast in persuasion: The effects of source credibility in multiple message situations. Pers. Soc. Psychol. B. 2007, 33, 559–571. [Google Scholar] [CrossRef] [PubMed]
  38. Cheung, M.Y.; Luo, C.; Sia, C.L.; Chen, H. Credibility of electronic word-of-mouth: Informational and normative determinants of on-line consumer recommendations. Int. J. Electron. Commer. 2009, 13, 9–38. [Google Scholar] [CrossRef]
  39. Zhang, L.; Peng, T.Q.; Zhang, Y.P.; Wang, X.H.; Zhu, J.J.H. Content or context: Which matters more in information processing on microblogging sites. Comput. Hum. Behav. 2014, 31, 242–249. [Google Scholar] [CrossRef]
  40. Chu, S.; Kim, Y. Determinants of consumer engagement in electronic word-of-mouth (eWOM) in social networking sites. Int. J. Advert. 2011, 30, 47–75. [Google Scholar] [CrossRef]
  41. Flanagin, A.J.; Metzger, M.J. Trusting expert-versus user-generated ratings online: The role of information volume, valence, and consumer characteristics. Comput. Hum. Behav. 2013, 29, 1626–1634. [Google Scholar] [CrossRef]
  42. Kappes, A.; Harvey, A.H.; Lohrenz, T.; Montague, P.R.; Sharot, T. Confirmation bias in the utilization of others’ opinion strength. Nat. Neurosci. 2020, 23, 130–137. [Google Scholar] [CrossRef]
  43. Thornhill, C.; Meeus, Q.; Peperkamp, J.; Berendt, B. A digital nudge to counter confirmation bias. Front. Big Data 2019, 2, 11. [Google Scholar]
  44. Schwarz, N.; Sanna, L.J.; Skurnik, I.; Yoon, C. Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Adv. Exp. Soc. Psychol. 2007, 39, 127–161. [Google Scholar]
  45. Song, H.; Schwarz, N. Fluency and the detection of distortions: Low processing fluency attenuates the Moses illusion. Soc. Cogn. 2008, 26, 791–799. [Google Scholar] [CrossRef]
  46. Zhang, K.Z.K.; Cheung, C.M.K.; Lee, M.K.O. Examining the moderating effect of inconsistent reviews and its gender differences on consumers’ online shopping decision. Int. J. Inf. Manag. 2014, 34, 89–98. [Google Scholar] [CrossRef]
  47. Kruglanski, A.W.; Thompson, E.P. Persuasion by a single route: A view from the unimodal. Psychol. Inq. 1999, 10, 83–109. [Google Scholar] [CrossRef]
  48. Jiang, S.; Beaudoin, C.E. Health literacy and the internet: An exploratory study on the 2013 HINTS survey. Comput. Hum. Behav. 2016, 58, 240–248. [Google Scholar] [CrossRef]
  49. Tracey, J.; Arroll, B.; Barham, P.; Richmond, D. The validity of general practitioners’ self assessment of knowledge: Cross sectional study. BMJ Clin. Res. 1997, 315, 1426–1428. [Google Scholar] [CrossRef] [Green Version]
  50. Chaiken, S.; Eagly, A.H. Communication modality as a determinant of message persuasiveness and message comprehensibility. J. Pers. Soc. Psychol. 1976, 34, 605–614. [Google Scholar] [CrossRef]
  51. Lord, K.R.; Lee, M.S.; Sauer, P.L. The combined influence hypothesis: Central and peripheral antecedents of attitude toward the Ad. J. Advert. 1995, 24, 73–85. [Google Scholar] [CrossRef]
  52. Petty, R.E.; Cacioppo, J.T.; Goldman, R. Personal involvement as a determinant of argument-based persuasion. J. Pers. Soc. Psychol. 1981, 41, 847–855. [Google Scholar] [CrossRef]
  53. Wu, P.; Wang, Y. The influences of electronic word-of-mouth message appeal and message source credibility on brand attitude. Asia Pac. J. Market. Logist. 2011, 23, 448–472. [Google Scholar] [CrossRef] [Green Version]
  54. Langfred, C.W. The downside of self-management: A longitudinal study of the effects of conflict on trust, autonomy, and task interdependence in self-managing teams. Acad. Manag. J. 2007, 50, 885–900. [Google Scholar] [CrossRef] [Green Version]
  55. Armstrong, J.S.; Overton, T.S. Estimating nonresponse bias in mail surveys. J. Mark. Res. 1977, 14, 396–402. [Google Scholar] [CrossRef] [Green Version]
  56. Park, T.; Ryu, D. Drivers of technology commercialization and performance in SMEs. Manag. Decis. 2015, 53, 338–353. [Google Scholar] [CrossRef]
  57. Farivar, S.; Turel, O.; Yuan, Y. A trust-risk perspective on social commerce use: An examination of the biasing role of habit. Internet Res. 2017, 27, 586–607. [Google Scholar] [CrossRef]
  58. Chin, W.W.; Thatcher, J.B.; Wright, R.T. Assessing common method bias: Problems with the ULMC technique. MIS Quart. 2012, 36, 1003–1019. [Google Scholar] [CrossRef] [Green Version]
  59. Shiau, W.L.; Yuan, Y.; Pu, X.; Ray, S.; Chen, C.C. Understanding fintech continuance: Perspectives from self-efficacy and ECT-IS theories. Ind. Manag. Data Syst. 2020, 120, 1659–1689. [Google Scholar] [CrossRef]
  60. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  61. Falk, A.; Kosfeld, M. The hidden costs of control. Am. Econ. Rev. 2006, 96, 1611–1630. [Google Scholar]
  62. Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef] [Green Version]
  63. Cho, C.H.; Martens, M.L.; Kim, H.; Rodrigue, M. Astroturfing global warming: It isn’t always greener on the other side of the fence. J. Bus. Ethics 2011, 104, 571–587. [Google Scholar] [CrossRef]
  64. Del Vicario, M.; Bessi, A.; Zollo, F.; Petroni, F.; Scala, A.; Caldarelli, G.; Stanley, H.E.; Quattrociocchi, W. The spreading of misinformation online. Proc. Natl. Acad. Sci. USA 2016, 113, 554–559. [Google Scholar] [CrossRef] [Green Version]
  65. Kim, A.; Dennis, A.R. Says who? The effects of presentation format and source rating on fake news in social media. MIS Quart. 2019, 43, 1025–1039. [Google Scholar] [CrossRef]
  66. Caulfield, T. Pseudoscience and COVID-19—We’ve had enough already. Nature. 2020. Available online: https://0-www-nature-com.brum.beds.ac.uk/articles/d41586-020-01266-z (accessed on 27 April 2020).
  67. Goel, S.; Anderson, A.; Hofman, J.; Watts, D.J. The structural virality of online diffusion. Manag. Sci. 2016, 62, 180–196. [Google Scholar]
  68. Glöckner, A.; Bröder, A. Processing information and additional cues: A model-based analysis of choice, confidence, and response time. Judgm. Decis. Mak. 2011, 6, 23–42. [Google Scholar]
  69. Neys, W.D.; Cromheeke, S.; Osman, M. Biased but in doubt: Conflict and decision confidence. PLoS ONE 2011, 6, e15954. [Google Scholar]
  70. Leiserowitz, A.; Maibach, E.; Roser-Renouf, C.; Hmielowski, J.D. Politics and Global Warming: Democrats, Republicans, Independents, and the Tea Party; Yale Project on Climate Change Communication, Yale University and George Mason University: New Haven, CT, USA. Available online: http://environment.yale.edu/climate/files/politicsglobalwarming2011.pdf (accessed on 5 June 2019).
Figure 1. Research model.
Figure 2. Health misinformation rebuttal.
Figure 3. Results of the research model.
Table 1. Measurement items and sources.
Construct (source reference) and measurement items:

Information Relevance
- The information applies to my needs.
- The information is relevant to me.
- How much are you interested in the information?

Information Understandability [32]
- The information is clear in meaning.
- I think the information is easy to read.
- I think the information is understandable.

Information Adequacy
- I think the health message provides complete information.
- I think the health message provides adequate information.
- I think the health message provides sufficient information.

Information Usefulness
- I think the information is informative.
- I think the information is helpful.
- I think the information is useful.

Perceived Information Quality [15]
- I think the information has high quality.
- I think the information is valuable.
- I think the information is meaningful.

Source Expertise [53]
- I think the information provider is an expert on this topic.
- I think the information publisher is familiar with related knowledge.
- I think the information publisher has the qualifications for publishing speeches about the topic.

Source Authority [21]
- I think the information publisher is influential.
- I think the information publisher is reputed.
- I think the information publisher is authoritative.

Perceived Source Credibility [53]
- I think the source of the information is reliable.
- I think the source of the information is dependable.
- I think the source of the information is trustworthy.

Cognitive Conflict [15,54]
- This is the degree to which the information differs from what you already know.
- This is the degree to which this information conflicts with your prior knowledge.
- This is the degree to which this information is inconsistent with your original perception.

Knowledge Self-confidence [24]
- How much do you know about related knowledge?
- How much are you familiar with related knowledge?
- How quickly do you grasp related knowledge?

Perceived Credibility of Health Misinformation Rebuttals [21]
- I think the information is credible.
- I think the information is authentic.
- I think the information is believable.
Table 2. Demographic information of study participants.
Demographic Category | Number (N = 415) | Percentage (%)
Gender: Men | 166 | 40
Gender: Women | 249 | 60
Age (years): <18 | 4 | 0.96
Age (years): 18–25 | 103 | 24.82
Age (years): 26–30 | 126 | 30.06
Age (years): 31–40 | 155 | 37.35
Age (years): >40 | 23 | 6.51
Education: High school or below | 8 | 1.93
Education: Bachelor's degree | 379 | 91.32
Education: Master's degree | 26 | 6.27
Education: Doctoral degree | 2 | 0.48
Table 3. Cronbach’s alpha (CA), composite reliability (CR), and average variance extracted (AVE).
Constructs | CA | rho_A | CR | AVE
Information Relevance | 0.747 | 0.763 | 0.856 | 0.665
Information Understandability | 0.768 | 0.786 | 0.865 | 0.681
Information Adequacy | 0.857 | 0.859 | 0.913 | 0.777
Information Usefulness | 0.848 | 0.852 | 0.908 | 0.766
Perceived Information Quality | 0.800 | 0.802 | 0.882 | 0.714
Source Expertise | 0.799 | 0.804 | 0.882 | 0.713
Source Authority | 0.775 | 0.814 | 0.866 | 0.683
Perceived Source Credibility | 0.895 | 0.896 | 0.935 | 0.827
Cognitive Conflict | 0.913 | 0.920 | 0.945 | 0.852
Knowledge Self-confidence | 0.880 | 0.893 | 0.926 | 0.807
Perceived Credibility of Health Misinformation Rebuttals | 0.885 | 0.887 | 0.929 | 0.813
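As a cross-check, the CR and AVE values in Table 3 follow from the standardized outer loadings in Table 4 via the conventional formulas. A minimal sketch (not the authors' code) for the Perceived Source Credibility construct:

```python
# Composite reliability (CR) and average variance extracted (AVE) from
# standardized loadings; PSC1-PSC3 loadings are taken from Table 4.
loadings = [0.902, 0.919, 0.907]

sum_l = sum(loadings)
sum_l2 = sum(l * l for l in loadings)
error_var = sum(1 - l * l for l in loadings)  # indicator error variances

cr = sum_l ** 2 / (sum_l ** 2 + error_var)
ave = sum_l2 / len(loadings)

print(round(cr, 3), round(ave, 3))  # 0.935 0.827, matching Table 3
```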
Table 4. Item factor loadings and cross-loadings.
Item | IR | IUN | IA | IUS | PIQ | SE | SA | PSC | CC | KC | PIC
IR1 | 0.842 | 0.304 | 0.331 | 0.403 | 0.508 | 0.304 | 0.358 | 0.451 | −0.120 | 0.026 | 0.442
IR2 | 0.863 | 0.307 | 0.460 | 0.489 | 0.533 | 0.367 | 0.375 | 0.513 | −0.128 | 0.093 | 0.469
IR3 | 0.736 | 0.303 | 0.310 | 0.394 | 0.413 | 0.472 | 0.401 | 0.438 | 0.041 | −0.011 | 0.253
IUN1 | 0.407 | 0.849 | 0.475 | 0.499 | 0.454 | 0.404 | 0.367 | 0.378 | 0.019 | −0.100 | 0.250
IUN2 | 0.249 | 0.844 | 0.329 | 0.456 | 0.383 | 0.349 | 0.367 | 0.306 | −0.045 | −0.104 | 0.230
IUN3 | 0.240 | 0.781 | 0.274 | 0.383 | 0.326 | 0.266 | 0.224 | 0.202 | −0.086 | 0.022 | 0.222
IA1 | 0.364 | 0.407 | 0.862 | 0.398 | 0.345 | 0.292 | 0.359 | 0.403 | −0.073 | 0.070 | 0.295
IA2 | 0.426 | 0.395 | 0.884 | 0.467 | 0.377 | 0.439 | 0.392 | 0.510 | −0.055 | 0.112 | 0.367
IA3 | 0.409 | 0.381 | 0.899 | 0.382 | 0.377 | 0.348 | 0.365 | 0.409 | −0.072 | 0.085 | 0.335
IUS1 | 0.463 | 0.497 | 0.444 | 0.893 | 0.543 | 0.648 | 0.549 | 0.686 | −0.159 | 0.070 | 0.480
IUS2 | 0.518 | 0.466 | 0.421 | 0.888 | 0.568 | 0.639 | 0.567 | 0.650 | −0.093 | 0.048 | 0.459
IUS3 | 0.397 | 0.472 | 0.372 | 0.846 | 0.497 | 0.569 | 0.462 | 0.612 | −0.117 | 0.007 | 0.420
PIQ1 | 0.538 | 0.376 | 0.409 | 0.528 | 0.834 | 0.531 | 0.514 | 0.598 | −0.214 | 0.122 | 0.665
PIQ2 | 0.482 | 0.413 | 0.303 | 0.536 | 0.860 | 0.429 | 0.439 | 0.529 | −0.210 | 0.053 | 0.587
PIQ3 | 0.491 | 0.421 | 0.336 | 0.487 | 0.840 | 0.384 | 0.437 | 0.475 | −0.212 | 0.075 | 0.517
SE1 | 0.438 | 0.265 | 0.306 | 0.589 | 0.410 | 0.876 | 0.625 | 0.650 | −0.043 | 0.078 | 0.521
SE2 | 0.342 | 0.456 | 0.276 | 0.651 | 0.468 | 0.823 | 0.628 | 0.559 | −0.073 | −0.027 | 0.418
SE3 | 0.374 | 0.356 | 0.451 | 0.561 | 0.484 | 0.833 | 0.647 | 0.624 | −0.087 | −0.026 | 0.408
SA1 | 0.295 | 0.298 | 0.272 | 0.438 | 0.414 | 0.572 | 0.815 | 0.467 | −0.146 | 0.016 | 0.344
SA2 | 0.304 | 0.377 | 0.272 | 0.455 | 0.475 | 0.586 | 0.809 | 0.467 | −0.075 | 0.023 | 0.432
SA3 | 0.492 | 0.315 | 0.455 | 0.574 | 0.475 | 0.682 | 0.855 | 0.694 | −0.003 | −0.058 | 0.486
PSC1 | 0.541 | 0.346 | 0.448 | 0.663 | 0.599 | 0.690 | 0.651 | 0.902 | −0.174 | 0.047 | 0.569
PSC2 | 0.494 | 0.315 | 0.429 | 0.666 | 0.538 | 0.631 | 0.584 | 0.919 | −0.190 | 0.082 | 0.555
PSC3 | 0.527 | 0.342 | 0.487 | 0.694 | 0.595 | 0.657 | 0.615 | 0.907 | −0.220 | 0.096 | 0.606
CC1 | −0.085 | −0.026 | −0.079 | −0.117 | −0.247 | −0.064 | −0.087 | −0.183 | 0.911 | −0.409 | −0.334
CC2 | −0.097 | −0.026 | −0.072 | −0.144 | −0.254 | −0.080 | −0.065 | −0.228 | 0.938 | −0.400 | −0.345
CC3 | −0.078 | −0.057 | −0.056 | −0.126 | −0.188 | −0.076 | −0.064 | −0.177 | 0.921 | −0.399 | −0.291
KC1 | 0.003 | −0.088 | 0.067 | −0.001 | 0.086 | −0.026 | −0.013 | −0.002 | −0.418 | 0.918 | 0.220
KC2 | 0.078 | −0.034 | 0.087 | 0.085 | 0.098 | 0.033 | −0.034 | 0.085 | −0.375 | 0.926 | 0.206
KC3 | 0.056 | −0.100 | 0.129 | 0.052 | 0.086 | 0.033 | 0.005 | 0.157 | −0.383 | 0.848 | 0.174
PIC1 | 0.455 | 0.268 | 0.313 | 0.536 | 0.664 | 0.498 | 0.478 | 0.607 | −0.325 | 0.237 | 0.905
PIC2 | 0.453 | 0.244 | 0.339 | 0.426 | 0.596 | 0.434 | 0.433 | 0.532 | −0.329 | 0.216 | 0.903
PIC3 | 0.406 | 0.257 | 0.373 | 0.435 | 0.639 | 0.510 | 0.488 | 0.575 | −0.299 | 0.153 | 0.897
Note: IR, Information relevance; IUN, Information understandability; IA, Information adequacy; IUS, Information usefulness; PIQ, Perceived information quality; SE, Source expertise; SA, Source authority; PSC, Perceived source credibility; CC, Cognitive conflict; KC, Knowledge self-confidence; PIC, Perceived information credibility.
Table 5. Latent variable correlations.
 | IR | IUN | IA | IUS | PIQ | SE | SA | PSC | CC | KC | PIC
IR | 0.815
IUN | 0.372 | 0.825
IA | 0.454 | 0.447 | 0.882
IUS | 0.527 | 0.546 | 0.472 | 0.875
PIQ | 0.598 | 0.476 | 0.416 | 0.614 | 0.845
SE | 0.457 | 0.419 | 0.410 | 0.708 | 0.535 | 0.844
SA | 0.459 | 0.395 | 0.422 | 0.603 | 0.551 | 0.750 | 0.826
PSC | 0.573 | 0.368 | 0.501 | 0.742 | 0.636 | 0.726 | 0.679 | 0.909
CC | −0.095 | −0.038 | −0.075 | −0.140 | −0.251 | −0.079 | −0.078 | −0.214 | 0.923
KC | 0.049 | −0.081 | 0.102 | 0.049 | 0.100 | 0.012 | −0.016 | 0.082 | −0.436 | 0.898
PIC | 0.486 | 0.285 | 0.378 | 0.518 | 0.703 | 0.534 | 0.518 | 0.635 | −0.352 | 0.224 | 0.902
Note: IR, Information relevance; IUN, Information understandability; IA, Information adequacy; IUS, Information usefulness; PIQ, Perceived information quality; SE, Source expertise; SA, Source authority; PSC, Perceived source credibility; CC, Cognitive conflict; KC, Knowledge self-confidence; PIC, Perceived information credibility.
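Table 5 reflects the Fornell-Larcker criterion [60]: each diagonal entry is the square root of the construct's AVE from Table 3 and should exceed that construct's correlations with all other constructs. A minimal check (not the authors' code) for Perceived Source Credibility, using the published values:

```python
import math

# AVE of PSC from Table 3 and PSC's off-diagonal correlations from Table 5.
ave_psc = 0.827
correlations = [0.573, 0.368, 0.501, 0.742, 0.636, 0.726, 0.679,
                -0.214, 0.082, 0.635]

sqrt_ave = math.sqrt(ave_psc)
print(round(sqrt_ave, 3))  # 0.909, the diagonal entry for PSC in Table 5

# Discriminant validity holds if the diagonal exceeds every |correlation|.
assert all(abs(r) < sqrt_ave for r in correlations)
```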
Table 6. Value of the heterotrait-monotrait ratio.
 | IR | IUN | IA | IUS | PIQ | SE | SA | PSC | CC | KC | PIC
IR |
IUN | 0.479
IA | 0.605 | 0.617
IUS | 0.658 | 0.670 | 0.613
PIQ | 0.767 | 0.599 | 0.534 | 0.742
SE | 0.670 | 0.526 | 0.651 | 0.858 | 0.737
SA | 0.482 | 0.507 | 0.491 | 0.661 | 0.633 | 0.861
PSC | 0.700 | 0.430 | 0.624 | 0.851 | 0.745 | 0.884 | 0.685
CC | 0.148 | 0.074 | 0.091 | 0.159 | 0.291 | 0.099 | 0.130 | 0.235
KC | 0.081 | 0.120 | 0.113 | 0.077 | 0.117 | 0.084 | 0.054 | 0.107 | 0.486
PIC | 0.585 | 0.343 | 0.441 | 0.595 | 0.828 | 0.656 | 0.535 | 0.711 | 0.390 | 0.252
Note: IR, Information relevance; IUN, Information understandability; IA, Information adequacy; IUS, Information usefulness; PIQ, Perceived information quality; SE, Source expertise; SA, Source authority; PSC, Perceived source credibility; CC, Cognitive conflict; KC, Knowledge self-confidence; PIC, Perceived information credibility.
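Table 6 reports the heterotrait-monotrait ratio (HTMT) proposed by Henseler et al. [62]: the mean of the between-construct item correlations divided by the geometric mean of the average within-construct item correlations. The sketch below illustrates the computation for one construct pair using hypothetical item-level correlations (the study's actual HTMT values are those in the table):

```python
import math

# Hypothetical item-level correlations for two three-item constructs A and B;
# invented for illustration only.
within_a = [0.70, 0.68, 0.72]   # correlations among A's items (monotrait)
within_b = [0.66, 0.64, 0.69]   # correlations among B's items (monotrait)
between = [0.41, 0.39, 0.44,    # correlations of each A item with each B item
           0.38, 0.42, 0.40,    # (heterotrait-heteromethod), 3 x 3 = 9 values
           0.37, 0.43, 0.36]

mean_between = sum(between) / len(between)
mean_within_a = sum(within_a) / len(within_a)
mean_within_b = sum(within_b) / len(within_b)

htmt = mean_between / math.sqrt(mean_within_a * mean_within_b)
print(round(htmt, 3))  # 0.587, comfortably below the common 0.85 threshold
```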
Table 7. Results of the hypotheses testing.
Hypothesis | t | 2.5% CI | 97.5% CI | f²
H1 | 10.485 | 0.363 | 0.523 | 0.291
H1a | 6.559 | 0.242 | 0.463 | 0.169
H1b | 2.820 | 0.060 | 0.261 | 0.029
H1c | 0.243 | −0.124 | 0.152 | 0.000
H1d | 3.430 | 0.126 | 0.512 | 0.122
H2 | 6.632 | 0.212 | 0.399 | 0.136
H2a | 14.420 | 0.569 | 0.752 | 0.560
H2b | 2.206 | 0.023 | 0.228 | 0.018
H3a | 3.897 | −0.270 | −0.089 | 0.032
H3b | 5.556 | 0.122 | 0.254 | 0.046
H4a | 2.811 | −0.232 | −0.047 | 0.020
H4b | 0.573 | −0.073 | 0.120 | 0.001
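The f² column in Table 7 is Cohen's effect size for a structural path, f² = (R²_included − R²_excluded)/(1 − R²_included). A worked example with hypothetical R² values (the model's actual R² values are not restated in this section):

```python
# Hypothetical R-squared values for illustration; see Table 7 for the
# study's actual f-squared effect sizes.
r2_included = 0.55  # R² of the endogenous construct with the predictor included
r2_excluded = 0.42  # R² with that predictor omitted

f2 = (r2_included - r2_excluded) / (1 - r2_included)
print(round(f2, 3))  # 0.289, a medium-to-large effect by common benchmarks
```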
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Sui, Y.; Zhang, B. Determinants of the Perceived Credibility of Rebuttals Concerning Health Misinformation. Int. J. Environ. Res. Public Health 2021, 18, 1345. https://0-doi-org.brum.beds.ac.uk/10.3390/ijerph18031345

