Online Vaccine Information in a Knowledge Exchange Social Website (KESW)


Background: The potential for misinformation on user-controlled Knowledge Exchange Social Websites (KESWs) is concerning since it can actively influence Internet users’ knowledge, attitudes, and behaviors related to childhood vaccinations. Objective: The present study examines the accuracy and predictors of health information posted to a Knowledge Exchange Social Website (KESW). Methods: A sample of 480 answers to childhood vaccination questions was retrieved and rated for accuracy. Multiple logistic regression modeling was used to examine whether answer characteristics (best answer, professional background, statistical information, source disclosure, online link, word count, vaccine stance, and tone) predict accuracy. Results: Overall, only 56.2% of the posted answers were rated as “accurate.” Accuracy varied by topic, with between 52.8% and 64.3% of answers rated as accurate. When Yahoo Answers’ “best answers” were examined, only 49.2% were rated as accurate compared to 57.7% of all other answers, a finding attributed to widespread nominations of vaccine misinformation as “best answers” for questions addressing the side effects of vaccines. For all other types of questions, “best answers” were more likely to be accurate. Regression modeling revealed that discussions of personal choices regarding childhood vaccinations predicted the accuracy of posted answers, with those who mentioned vaccinating their own children proving more likely to communicate accurate vaccine information, and those expressing vaccine hesitancy proving more likely to share factually inaccurate statements about vaccines. Conclusion: The high prevalence of misinformation on KESWs suggests that these websites may serve as a vector for spreading vaccine misperceptions. Further research is needed to assess the impact of various KESWs and to develop effective, coordinated responses by public health agencies.

Share and Cite:

Gorman, F. , Yadegarians, D. , Meng, L. , Gorman, N. and Johnston, E. (2020) Online Vaccine Information in a Knowledge Exchange Social Website (KESW). Open Journal of Preventive Medicine, 10, 151-167. doi: 10.4236/ojpm.2020.106011.

1. Introduction

Increasingly, the Internet is the public’s first destination when faced with health questions [1] [2], with over a third of adults in the United States reporting going online to self-diagnose a medical condition and over two-thirds seeking more general health information online [3]. The impetuses driving online health information seeking are manifold and include benefits such as the availability of immediate answers [2] [4], social support [4] [5] [6], and opportunities to bypass traditional face-to-face meetings with health practitioners [4].

Regarding immediacy, while wait times for face-to-face appointments with health practitioners vary widely, research suggests that wait times in the US are often considerable, with estimates ranging from 24 - 30 days for first-time patients [7] [8]. In contrast, the Internet contains a wealth of static webpages that can be accessed immediately and interactive forums that yield near-instantaneous responses from other users [2]. Opportunities for near-instantaneous health answers may be particularly enticing for individuals experiencing anxiety due to risky health behaviors or unexpected symptoms. For example, reading that there are benign explanations for sudden angina may offer health information seekers immediate relief that could otherwise be delayed until an appointment can be scheduled with a physician.

Research has also shown that online communities built around health conditions can serve multiple supportive roles including supplying health information, providing an outlet for cathartic release, and offering social support [5] [6] [9] [10]. Not only can online communities provide health information seekers with answers to their direct queries, but research has also shown that health information seekers value access to broader accounts of other patients’ experiences [10]. While users may initially approach online forums with the intention of seeking health information, hearing the experiences of others with the same condition and receiving messages of support from forum users may provide a sense of social support unlike that experienced solely in face-to-face appointments with healthcare practitioners.

Several barriers inherent to traditional face-to-face meetings with healthcare practitioners may also explain increasing rates of online health information seeking. For instance, some health information seekers face fiscal barriers including lack of health insurance or funds to cover copayments [11] [12]. As of 2017, the US Census Bureau reported that 8.8% of the population experienced a lack of health insurance in the past year, with uninsured rates reaching as high as 16.1% in some subpopulations [13]. Further, estimates suggest up to 24.2% - 34.9% of adults in the United States experience underinsurance that may interfere with access to healthcare [12]. In these instances, health information seekers may perceive the Internet as their only means of obtaining high-quality health information.

For other health information seekers, face-to-face meetings with healthcare practitioners represent a stressful encounter in which they expect their health beliefs and practices to be challenged. For instance, parents who are opposed to childhood vaccinations may find seeking health information on anti-vaccination forums to be less stressful and confrontational than appointments with pediatricians who may challenge their vaccine beliefs. The sheer volume of health information posted online means that adherents of nearly any health belief, whether grounded in evidence-based practice or not, can find resources justifying their current health beliefs and practices.

While some health information seekers go online to avoid confrontations with health care practitioners, others go online to avoid embarrassment. The inherent anonymity of online health information seeking may be especially appealing to individuals who have engaged in taboo health behaviors or who feel that there is a societal stigma attached to their health condition [9] [14] [15], as previous research has shown that societal taboos can delay physician visits and detection of health conditions [16]. Even in the absence of social stigmas and taboos, fear of simple nonverbal cues of disapproval or judgment from healthcare practitioners may be sufficient to serve as a barrier to face-to-face appointments [9].

Risks of Online Health Information Seeking

Despite the allure of the Internet as a source of health information, research has documented several risks inherent to this medium, including the presence of widespread misinformation [17]. Past research has described various forms of online misinformation ranging from outright factual errors and deception to more subtle issues like omissions of details and the presence of outdated information [17].

The presence of misinformation online is concerning, as research has shown that such information shapes individuals’ knowledge, attitudes, and health behaviors [18]. With health information seekers increasingly turning toward the Internet as their first source of medical advice [1] [2], this misinformation poses special risks. For example, research has shown that exposure to online health information has changed the dynamic between patients and healthcare professionals during face-to-face visits [19] [20]. As a result of exposure to misinformation, patients are increasingly challenging their healthcare providers’ medical advice while physicians are reporting having to dedicate more of their limited time with patients to addressing concerns [21].

The presence of widespread misinformation online is especially insidious for individuals with low health literacy and e-health literacy skills, as these individuals may lack the ability or motivation to critically evaluate the veracity of the information that they encounter [22] [23] [24] [25]. Not only are these individuals at higher risk of engaging with inaccurate health information online, but efforts to improve consumers’ ability to navigate online health information have also found that the efficacy of such interventions is moderated by individuals’ health literacy skills, with only those with already high health literacy benefitting [26].

The risks posed by widespread misinformation online and low health literacy levels are further exacerbated by health information seekers’ often idiosyncratic, potentially biased search strategies [27]. For instance, while 77% of health information seekers reported that their hunt for health information began at a search engine such as Google, Bing, or Yahoo [3], only about 16% of health information seeking consumers report a preference for searching for credible websites such as government sponsored or academic websites [28].

The sheer volume of information online, presence of widespread misinformation, and preferences for accessing less credible websites together also facilitate confirmation biases, in which a person seeks or favors information that confirms their preexisting beliefs [29]. Among health information seekers, confirmation biases lead users to selectively seek out and access only health information outlets that mirror their own preexisting beliefs [30] [31]. By sheltering themselves from information that could challenge their beliefs, health information seekers may reinforce their preexisting misperceptions and make healthcare decisions based on poor or limited information.

Knowledge Exchange Social Websites (KESWs)

One health information channel warranting special attention is Knowledge Exchange Social Websites (KESWs) since the content is entirely user-generated. KESWs are classified as Web 2.0 websites, a category that also includes social media and message boards and is defined by users’ ability to engage with the website simultaneously as both contributors and consumers of content [32]. KESWs differ from other Web 2.0 sites in that their structure is strictly intended for the asking and answering of questions. Unlike message boards, KESWs often span many topics and frequently allow users to subjectively rate other users’ responses through systems of upvoting and downvoting. Similarly, some KESWs provide an option to identify “best answers” on the basis of upvotes, downvotes, or selection by the original question poster.

The format of KESWs is concerning, as the presence of user-generated questions and answers creates opportunities for the spread of misinformation above and beyond the risks posed in many other online health channels. For instance, because any KESW user can answer posted medical questions regardless of their actual knowledge, experience, or credentials, there are concerns about the accuracy of medical advice on these sites [22]. The impact of misinformation on KESWs is particularly concerning, as it can be magnified in two unique ways. First, the systems of upvoting and downvoting respondents’ answers mean that medical question posters’ own misperceptions, ignorance, or biases may be reflected in how answers are arranged. By using KESWs’ upvoting, downvoting, and best answer features, question posters can not only seek out answers that reinforce their own beliefs, but also move those answers to the top of the page, ensuring that they are the first answers seen and read by other KESW viewers. Second, the inherent anonymity of KESWs enables respondents to provide falsified credentials to lend credibility to their answers. In a system of anonymous question and answer posters, there is simply no way to confirm whether respondents actually are doctors, nurses, or other medical professionals, though such claims may lend respondents’ answers a patina of credibility regardless of their veracity.

While health questions are common across question-and-answer format websites [33] like KESWs, little is known about the veracity of the health advice currently being disseminated through this channel.

Childhood Vaccination & Knowledge Exchange Social Websites

Given the potential for KESWs to disseminate incomplete or inaccurate health information, and the sparse research regarding the veracity of posted answers, it may be edifying to explore the accuracy of information being posted on timely and polarizing health issues. Childhood vaccinations, as a contentious health topic plagued by misinformation, may serve as a useful entry point for exploring KESWs as a channel for health information.

Despite recent outbreaks of largely vaccine-preventable diseases, vaccine hesitancy in the US remains high. Notably, the 2014-2015 outbreak of measles, an infectious disease declared eliminated in the US in 2000, was linked to initial exposure from an unvaccinated individual at an amusement park in California [34]. Vaccination status was documented for 34 of the 59 cases, and of those 34, more than half were unvaccinated [34]. This occurrence brought the topic of childhood vaccinations back into the public eye and continues to spark discussions about the urgency of vaccination. Despite the high visibility of the 2014-2015 outbreak, measles outbreaks have continued, with 118 cases of measles reported from 15 states in 2017 [35].

Vaccine hesitancy is at least partially fueled by misinformation, which is particularly widespread in online settings [36]. Even brief exposure to online misinformation can influence perceptions of vaccine risks and result in poorly informed decisions [37] [38]. Of particular importance in the discussion of vaccinations and the accuracy of health information is the role of vaccine hesitant individuals. The anti-vaccination movement originated with the now retracted and debunked study published by Andrew Wakefield in 1998, which suggested a causal link between the MMR vaccine and autism. This study fueled the autism-vaccine debate, which has shaped parents’ attitudes and remains a concern in today’s society [39]. In fact, the recent measles outbreak among Somali immigrants in Minnesota was fueled by targeted efforts of the vaccine hesitant community. Prior to 2008, the Somali community had a high rate of vaccine coverage but became a target of the anti-vaccination movement, which fostered the inaccurate fear that the MMR vaccine causes autism [40].

Furthermore, the Internet is increasingly being used to obtain information about vaccines, and the media continues to disseminate fear and misleading information on this topic. In online communities with greater freedom of speech and unmoderated content, such as a KESW, viewpoints that link autism to vaccines dominate [41]. Not only does the online search term “vaccination” yield more anti-vaccination or vaccine-critical websites [42] [43], but parents who search for vaccine information on the Internet are also more likely to hold lower perceptions of vaccine safety and effectiveness [44]. Consequently, Internet users tend to seek information about possible vaccine side effects [45], which has the potential to shape consumers’ attitudes given the prevalence of vaccine-critical websites, despite such sites being evaluated as lower quality [46].

The present study seeks to describe the types of childhood vaccine questions and answers being posted to a KESW and to rate the accuracy of the information posted in order to inform future online public health interventions targeting online media. This study serves as a first step by quantifying the extent and nature of misinformation in the under-studied and high-risk setting of KESWs.

2. Methods

Data collection

The decision was made to focus the present study on a single KESW. Of the KESWs reviewed, Yahoo Answers was selected on the basis of its popularity, the prevalence of posted vaccine questions, and features enabling the retrieval of questions. In 2016 Yahoo was ranked as the third most popular multi-platform web property in the United States, with 204 million unique visitors in a single month [47].

During Spring 2015, a quota sample of 220 posts with the keyword “children vaccinations” was extracted from Yahoo Answers for analysis (see Figure 1). The decision was made a priori to extract the first 220 posts in order to build a pool of approximately 500 total answers to examine. The estimate of the number of posts to extract was based on the lead authors’ previous experience [48]. Upon initial review, 153 (69.55%) posted questions were excluded as they either: 1) solicited subjective responses that could not be rated for accuracy, 2) asked policy questions that could not be assessed for accuracy due to variance in vaccine policies worldwide and no way to discern respondents’ locations, or 3) were not directly about childhood vaccinations (see Table 2 for an example of an excluded question). Review of the remaining 67 posted questions revealed that 4 posts actually comprised 2 distinct questions each. Each of the distinct questions in these 4 posts was examined independently, resulting in a total list of 71 questions about childhood vaccinations (see Figure 1).

A total of 480 answers were given to the final 71 questions. Questions received between 1 and 11 answers (M = 6.76, SD = 3.04), with only 2 questions (2.82%) receiving a single answer, and 10 being the most common number of responses observed.

Figure 1. Flow chart of dataset development. This figure illustrates the exclusion criteria utilized to reduce the initial 220 posts to the 71 unique questions and 347 posted answers included for analysis.

Table 1. Answer characteristics examined.

In addition to questions and answers, eight accompanying data points were extracted from each answer (see Table 1).

Answer Accuracy

In order to evaluate the accuracy of each posted answer, answers were coded into one of four categories (see Table 2). Due to the potentially severe consequences of disseminating misinformation about vaccinations, it was decided to forgo a “partially accurate” category and instead rate answers as inaccurate if any errors were included, regardless of how much correct information they may have also contained.

Table 2. Accuracy coding schema.

In an adaptation of the methodology employed by Buhi, Daley, Oberne, Smith, Schneider, and Fuhrmann (2010), the accuracy of each answer was assessed independently by two trained research assistants with the support of a physician. The two research assistants arrived at the same accuracy rating for 58.86% of the answers. When accuracy ratings differed, a physician provided an expert opinion as the tiebreaker.
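To make the agreement figure above concrete, inter-rater reliability for this kind of coding is often summarized as raw percent agreement alongside a chance-corrected statistic such as Cohen’s kappa. The sketch below is our illustration only, not the authors’ code; the ratings are hypothetical, not the study’s data:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of items on which both raters assigned the same code."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Chance-corrected agreement: (observed - expected) / (1 - expected)."""
    n = len(r1)
    po = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # Expected chance agreement from each rater's marginal category frequencies
    pe = sum((c1[c] / n) * (c2[c] / n) for c in set(r1) | set(r2))
    return (po - pe) / (1 - pe)

# Hypothetical ratings on a four-category schema (not the study's data)
rater1 = ["accurate", "inaccurate", "accurate", "subjective", "accurate"]
rater2 = ["accurate", "accurate", "accurate", "subjective", "inaccurate"]
print(percent_agreement(rater1, rater2))          # 0.6
print(round(cohens_kappa(rater1, rater2), 3))     # 0.286
```

Because kappa discounts agreement expected by chance, it is typically lower than raw percent agreement, as in this toy example.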

3. Data Analysis

All quantitative analyses were conducted by one of the authors using SPSS version 26.0 [49].

A thematic analysis was conducted in order to establish a codebook of the types of questions being asked about childhood vaccinations on Yahoo Answers [50]. In the first stage, two of the authors read through the entirety of the set of questions in order to familiarize themselves with the data. Following the read-through, both readers independently developed a set of emergent themes to organize the types of questions asked. These emergent themes were then shared with the full research team who helped to reconcile differences in the two authors’ coding schemes and arrive at a final coding scheme to categorize questions into one of five categories (see Table 3).

Simple descriptive statistics (frequency and valid percent) and bar charts were employed to examine the types of childhood vaccination questions being asked, the accuracy of answers to these questions, and the role of answers voted “Best Answer.”

Table 3. Final coding scheme for types of questions posted.

Multiple logistic regression modeling was used to examine whether answer characteristics (best answer, professional background, statistical information, source disclosed, online link, word count, vaccine stance, and tone) predict accuracy (re-coded to a dichotomous accurate vs. inaccurate). Answers that fundamentally failed to address the question asked (i.e., subjective, policy, or unanswered) were excluded from the logistic regression model, as readers looking for an answer to a health question could reasonably be expected to disregard these answers. As there were no a priori predictions regarding which variables would emerge as significant predictors of answers’ accuracy, all predictors were force-entered together in a single block.
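As a rough sketch of this modeling step (our Python illustration, not the study’s SPSS procedure; the predictors and data below are synthetic), a logistic model with all predictors force-entered in a single block can be fit by gradient descent and its coefficients exponentiated into odds ratios:

```python
import math
import random

def sigmoid(z):
    """Logistic function mapping log-odds to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Force-enter every column of X at once; returns [intercept, b1, ..., bp]."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)  # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            err = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))) - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

# Synthetic binary predictors (hypothetical): x1 = "mentions vaccinating own
# children", x2 = "includes a link"; y = 1 if the answer is accurate.
random.seed(0)
X = [[random.randint(0, 1), random.randint(0, 1)] for _ in range(200)]
y = [1 if random.random() < sigmoid(-0.5 + 1.5 * x1 - 0.2 * x2) else 0
     for x1, x2 in X]
w = fit_logistic(X, y)
print("odds ratio for x1:", round(math.exp(w[1]), 2))  # should exceed 1
```

Force entry (SPSS’s “Enter” method) means every predictor is included simultaneously rather than added stepwise; exponentiating a fitted coefficient yields the odds ratio reported for each answer characteristic.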

4. Results

Types of Childhood Vaccine Questions Asked

Figure 2 illustrates the frequency of the four types of questions observed. Concerns regarding the adverse effects of vaccinations were common, with the two most frequent types of questions focusing entirely or at least partially on adverse reactions. In contrast, very few questions focused exclusively on the risks posed by foregoing vaccinations.

Accuracy of Childhood Vaccine Answers

Of the answers that could be objectively rated as accurate or inaccurate, 56.2% overall were rated as accurate (i.e., answering the question asked and containing no factual errors; see Table 2), though this varied by the type of question asked (see Figure 3).

When Yahoo Answers’ “best answers” were examined, an unexpected pattern of data emerged. While answers voted the “best answer” were more likely to be accurate for questions about vaccine schedules, the risks of not vaccinating, and the risks of not vaccinating versus the risks of adverse reactions, the differences between the accuracy of “best answers” versus other answers were modest. However, a pronounced 20.9% difference was observed among questions focusing on adverse reactions, with “best answers” being less likely to be accurate (see Figure 4).

Predictors of Answer Accuracy

Logistic regression modeling was conducted to examine whether several characteristics of posted answers (Best Answer, Professional Background, Statistics, Source, Link, Word Count, Vaccine Stance, and Tone; see Table 1) predict the answers’ accuracy.

Figure 2. Types of vaccine questions asked. This figure shows the proportion of posted questions devoted to each of the four types of questions identified during thematic analysis.

Figure 3. Accuracy of answers by question type. This figure summarizes the proportion of questions that were answered accurately overall and individually for each of the four types of questions identified during thematic analysis.

Ultimately, Best Answer, Professional Background, Statistics, Source, Link, Word Count, Vaccine Stance, and Tone together were found to serve as a statistically significant predictor of answers’ accuracy (χ2(9) = 40.11, p < 0.001; Cox & Snell R2 = 0.11; Nagelkerke R2 = 0.15). After controlling for the other predictors in the model, only Vaccine Stance emerged as a statistically significant predictor of answers’ accuracy, with respondents who reported not vaccinating their children, under-vaccinating their children, or vaccinating their children on an alternative schedule being between 0.17 - 0.22 times as likely to answer each question accurately. On the other hand, those who reported vaccinating their own children were 2.35 times as likely to provide completely accurate answers to posted questions (see Table 4).
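To unpack the odds ratios above: an odds ratio is an exponentiated logistic regression coefficient, so an OR below 1 (such as the 0.17 - 0.22 reported for vaccine-hesitant respondents) corresponds to a negative log-odds coefficient, while an OR above 1 (the 2.35 for vaccinating respondents) corresponds to a positive one. A quick arithmetic check (our illustration):

```python
import math

# beta = ln(OR): an OR below 1 gives beta < 0 (lower odds of an accurate answer)
for odds_ratio in (0.17, 0.22, 2.35):
    beta = math.log(odds_ratio)
    print(f"OR {odds_ratio}: log-odds coefficient beta = {beta:.2f}")
# The three betas are approximately -1.77, -1.51, and 0.85.
```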

Figure 4. Accuracy of answers by question type and “best answer.” This figure summarizes the proportion of questions that were answered accurately overall and individually for each of the four types of questions identified during thematic analysis separately for answers marked “best answer” and for all other answers.

Table 4. Summary of logistic regression modeling of KESW answer characteristics on answer accuracy (n = 339).

Note. Overall model statistics: χ2(9) = 40.11, p < 0.001; Cox & Snell R2 = 0.11; Nagelkerke R2 = 0.15. *Reference category: Vaccine stance not mentioned.

5. Discussion

Overall, the accuracy of health information regarding childhood vaccinations on Yahoo Answers is troubling, with only slightly over half of the answers examined being rated as accurate. This is concerning given that the questions examined received, on average, more than six replies. If question posters set out to find information that reinforces their preexisting beliefs, these data suggest that they will likely receive at least some answers that align with their own beliefs, regardless of the accuracy of those beliefs. When examined separately by theme, answers’ accuracy varied relatively little from topic to topic.

Regarding the themes examined, KESW users appeared to focus primarily on concerns about potential adverse reactions to vaccines. In fact, not only were adverse reactions to vaccines the most frequently asked type of question, but these questions also appeared more than 6 times as often as questions about the risks posed by foregoing vaccinations. This mirrors findings in previous literature regarding Internet users’ health information seeking [45] [51].

Interestingly, questions about vaccine side effects were not only the most frequent type asked, but also were the least likely to be accurately answered. In fact, this was the only theme in which “best answers” were less likely to be accurate than other posted answers. Users not only frequently visited this KESW with questions about the risks of vaccines, but also showed a preference for answers with factual inaccuracies. This is important to note because Internet users are likely to examine “best answers” first and hold these answers in higher regard. Against a backdrop of rising vaccine hesitancy, recent outbreaks of vaccine-preventable diseases, and the increasing prominence of the Internet as a first, and sometimes only, source of health information, any website facilitating the transmission of misinformation about vaccines should be a cause for concern.

The proliferation of questions about vaccine side effects and the tendency to select inaccurate responses as “best answers” suggest that KESWs may be disproportionately attracting vaccine hesitant parents. This has the potential to create an unfortunate feedback loop: as KESWs become populated with inaccurate information, they attract audiences seeking out information to support their own misconceptions. This audience, in turn, exacerbates the problem by marking inaccurate responses to their own queries as “best answers.”

The regression modeling examined in this study highlighted the importance of KESW users’ stated vaccine stance. The data are largely as expected, with those most in favor of vaccination (as reflected by anecdotally vaccinating their own children) providing more accurate information about vaccines and those expressing vaccine hesitancy (by refusing vaccinations for their children altogether, refusing specific vaccinations, or preferring an alternative timeline to space out vaccinations) proving more likely to make factually inaccurate claims about vaccines. However, it is interesting to note that relatively little difference was observed in the accuracy of statements between those who refuse childhood vaccinations altogether and those refusing just specific vaccines or preferring alternative childhood vaccine schedules. Regardless of the degree of vaccine hesitancy users expressed, their likelihood of making factually inaccurate claims about vaccines was remarkably similar.

While these data offer some preliminary insights into the landscape of childhood vaccine questions and answers on KESWs, several limitations warrant consideration. For instance, as the present study was correlational in design, not only can causal claims not be established, but the study is also not sensitive to trends over time. In the wake of news stories highlighting disease outbreaks, celebrities advocating for alternative health practices, and the continued, diverse efforts of health agencies to promote best practices, it seems reasonable to expect some degree of fluctuation in both the types of queries posted to KESWs and the accuracy of the responses they receive. Future research utilizing longitudinal designs may be warranted to better understand overall trends in the accuracy of KESW-generated answers, as well as the degree to which the content of these websites is dependent on time-sensitive, outside factors such as major news headlines and disease outbreaks.

This study is similarly limited by its focus on a single KESW. The findings in this study may be at least partially attributable to the demographics of the users of this specific KESW. As a result, it is possible that the findings presented here may not generalize to other KESW platforms such as Reddit, which may draw different audiences. Direct comparisons of the content on different KESWs may be challenging due to differing features of those sites. For instance, Yahoo Answers’ “best answer” feature highlights just a single answer and is utilized by just the person who posted the question, while sites like Reddit use a system of “upvotes” and “downvotes” in which every single response to a post can be promoted or suppressed by all the visitors to the thread. Despite structural differences like these, there may be value in comparing the accuracy of vaccine information disseminated through various KESWs. Similarly, future research exploring how users’ demography and KESWs’ structures influence the accuracy of posted health information may be informative.

As a final consideration, while the present study found a considerable amount of inaccurate information posted in response to questions about childhood vaccination, relatively little is known about how health information consumers utilize the information on KESWs. For instance, the relative weight users give to “best answers,” whether users preferentially scan replies for answers that reinforce their own beliefs, and differences in the information seeking strategies of question posters versus those who simply read others’ questions and replies are not entirely understood. Future research exploring the health information seeking strategies of KESW users may provide important context regarding the actual risks posed by posted misinformation.

6. Conclusion

Ultimately, the accuracy of childhood vaccine information posted on Yahoo Answers was low, illustrating the potential risks of relying on KESWs as a source of health information. While further research is needed to explore the role of KESWs as a source of health information and potential foundation for health behaviors, the presence of misinformation about vaccines online is hardly novel. The present study serves to highlight the importance of considering KESWs as one part of the broader discussion of how health professionals can address misperceptions about vaccines and the proliferation of misinformation online. Public health professionals may consider increasing their health education outreach online and incorporating evidence-based strategies that target information seekers with low health literacy.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

