The internet is rife with medical misinformation, from conspiracy theories to unproven, and sometimes dangerous, treatment methods. Experts say this flood of false information can undermine trust in science and medicine. At the same time, the internet remains one of the best tools for finding accurate information. Chances are your patients are looking for health information online, whether from a reputable news organization or from Facebook.
As apps have become integral to everyday life over the last few years, the audience for online health information, and misinformation, has grown. A study from the Pew Internet & American Life Project found that 80% of internet users, or about 93 million Americans, had searched for a health-related topic online. That's a considerable jump from 2001, when the project found that just 62% of internet users were searching for health information online.
As we gear up for winter and the eventual COVID-19 vaccine, battling medical misinformation has taken on a whole new meaning.
Coronavirus Patients Go Online
COVID-19 is still a new disease, and doctors don't always have the answers patients are looking for. Many of those who have been infected report symptoms that persist long after the worst of the illness has passed. For lack of a better name, this group has become known as the "COVID long-haulers." Because doctors are still studying the long-term effects of the disease, these patients often go online to share their experiences and learn from others'.
That was true for 36-year-old Matthew Long-Middleton, who got sick with the virus on March 12. After recurring bouts of fatigue, chest discomfort, muscle weakness, and fever, he started using Body Politic's Slack-based support group to learn more about the long-term effects of the disease.
“I had no idea where this road leads, and so I was looking for support and other theories and some places where people were going through a similar thing, including the uncertainty, and also the thing of like, we have to figure this out for ourselves,” he recalled.
However, these groups can sometimes do more harm than good. “You want to find hope, but you don’t want the hope to lead you down a path that hurts you,” Long-Middleton said.
Vanessa Cruz, a mother of two, has also been experiencing symptoms, including fatigue, fever, and confusion, since March. She turned to the Facebook group "have it/had it" to get in touch with other people who have had the disease. For her, it wasn't just about learning the facts about COVID-19; it was about connecting with people who understand where she's coming from. Without traditional support groups, suffering from a new disease can be isolating.
She said, “It’s really become like a second family to me and being able to help everybody is a positive thing that comes out of all this negativity we’re experiencing right now.”
Unfortunately, the group, which now has over 30,000 members, has also become a magnet for misinformation in recent months, so Cruz volunteered to start fact-checking the page. Some posts have advocated a common tapeworm medication used in India that is not FDA-approved. Others call for the use of hydroxychloroquine, which has not been proven effective in treating COVID-19.
Cruz is doing her best to keep the group factual. “It’s like you really don’t know what to question, what to ask for, how to reach for help. Instead of doing that, they just, they write up their story, basically, and they share it with everybody,” she commented.
Fighting Back Against Misinformation
As the Facebook group has grown, moderators have appointed a 17-person team of fact-checkers, including two nurses and a biologist, to rid the group of false information. They examine every post that goes up on the group's page to make sure it is medically accurate.
However, simply removing these posts may not be enough. Some images and shares may get thousands of views before moderators eventually delete them.
As Elizabeth Glowacki, a health communication researcher at Northeastern University, puts it, “Even if we’re not actively seeking information, we encounter these kinds of messages on social media, and because of this repeated exposure, there’s more likelihood that it’s going to seep into our thinking and perhaps even change the way that we view certain issues, even if there’s no real merit or credibility.”
Recent analyses have found that Facebook posts containing false or misleading health information received four times as many views as posts from official organizations such as the World Health Organization.
Fadi Quran, campaign director of Avaaz, a human rights group that studies disinformation campaigns, says that's because Facebook's algorithms are inherently flawed. The social media company says it's doing more to track and prevent the spread of misinformation, but Quran says moderators tend to focus on the sensationalized posts that get the most clicks. That means less-popular health-related posts containing misinformation can easily fly under the radar.
Until Facebook changes its policies, the most we can do is keep an eye out for medical misinformation and report it before it spreads. Keep these ideas in mind as you talk to your patients about the latest healthcare information.