For-profit Instagram Accounts Posting False Health Information

Inaccurate information regarding hepatitis B was present in nearly one out of every four popular Instagram posts about the disease, and these posts were much more likely than accurate ones to originate from for-profit accounts or accounts that offered a product or service.

These findings come from a new study presented at the American College of Gastroenterology’s 2022 Annual Scientific Meeting in Charlotte, NC.

According to presenter Zachary C. Warner, MD, MPH, an internal medicine resident at the University of Arizona, “Users that generate hepatitis B misinformation also have more reach with a higher number of followers and engagement with more likes than users who do not promote misinformation.” People with chronic illnesses that lack straightforward therapies are probably more susceptible to for-profit users and inaccurate health information online.

Patients are turning to social media and other user-generated websites for health information and support, according to Warner, even as false information and scepticism about evidence-based medicine have become more pervasive online.

While these sites can give patients access to social support and knowledge they might not otherwise have, he cautioned that medical content on social media is unregulated.

Unreliable Information

Although the effects of exposure to online misinformation have not been thoroughly researched, negative impacts are possible.

Adoption of “unproven therapies” and “symptom management” may put patients at greater risk for unfavourable health outcomes and financial strain, Warner said. “It is unlikely that health insurance will pay for unproven treatments and symptom management techniques, potentially leaving patients with substantial out-of-pocket expenses.”

For their Instagram search, Warner and his team examined a one-month snapshot from December 2021. They searched for publicly accessible posts containing the terms “hepatitis B” and “hep B.” After removing duplicates from the top 55 posts for each term, they coded the remaining 103 posts with a validated tool for assessing misinformation. The tool’s factors included engagement metrics such as likes and comments, user characteristics such as follower counts, and claims judged false by medical professionals.

The researchers then examined the posts’ inaccuracy and profit motives. Twenty-three percent of the posts contained inaccurate information about hepatitis B or its treatment. These posts received an average of 1,599 likes, higher than the average of 970 likes for posts that described hepatitis B accurately. Accounts with inaccurate hepatitis B posts also followed more accounts on average (1,127) than accounts with accurate posts (889). However, the accounts spreading false information had only about a third as many followers on average (22,920) as those sharing accurate information (70,442).

Warner added, “We believe it is prudent to maintain a heavy level of scepticism for information that promises results that are ‘too good to be true,’ uses anecdotes as support, or is experimental. We advise using the CRAAP test, which directs people in assessing the reliability of health information sources.”

Does It Pass the CRAAP Test?

1 Currency: how up to date the information is
2 Relevance: its appropriateness to your needs
3 Authority: the legitimacy of the source
4 Accuracy: the correctness of the content
5 Purpose: the reason the source exists

In their investigation, the researchers found that just under one-third (30%) of the hepatitis B posts referenced a conspiracy theory, and a comparable percentage (29%) came from for-profit accounts. Additionally, 34% of the posts were published by accounts that used Instagram to sell goods or services.

Overall, posts containing inaccurate information were more than three times as likely to come from for-profit accounts (47%) as posts with accurate information (14%). A similar gap appeared for accounts selling a product or service: 43% of inaccurate posts came from such accounts, compared with 13% of accurate posts.

David Gorski, MD, PhD, a professor of surgery at the Wayne State University School of Medicine, did not find these results surprising.

Although false health information is frequently motivated by ideology and belief, he said, it is virtually always also motivated by the financial interests of practitioners who recommend treatments based on that misinformation.

Gorski explained that most quacks “do believe in the quackery that they offer, and believers are significantly more effective marketers than grifters who know that what they are selling is quackery.”

Warner added, “We strongly emphasise that patients do their best to assess the possible drivers behind the individuals or organisations who create the health information viewed online, particularly on social media sites.” He also suggested that clinicians and health organisations openly engage in online and offline discussions about false information.

Hard-core denialists are virtually always unreachable and unteachable, and it is generally ineffective to try to change their minds, according to Gorski. “People who are undecided and unsure can be reached, though. They should be the focus of our teaching efforts rather than the quack medicine sellers.”
