
There's been a lot in the news lately about how online social media networks facilitate the spread of antivaccine misinformation and what, if anything, can be done about it. The last times I wrote about online health misinformation were in March, when I noted that streaming services like Amazon Prime and social media platforms (e.g., YouTube and Facebook) were trying to remove or demonetize antivaccine content, and when I described an attempt by physicians to organize online to defend pro-science advocates targeted by antivaccine swarms. A fair amount has occurred in the intervening three months or so, and, in fact, reporters have interviewed me for stories they were working on about this. The most recent of these stories was published in The Wall Street Journal right before the 4th of July by Daniela Hernandez and Robert McMillan and entitled "Facebook, YouTube overrun with bogus cancer-treatment claims". Ironically, I don't have a subscription to the WSJ and ultimately had someone email me the text, which is why I'll be a bit more liberal than usual in quoting the article. The story serves as a good jumping-off point to update our readers on what has been going on.

The Wall Street Journal on cancer quackery on social media

In its story, the WSJ starts by noting:

Facebook Inc. and YouTube are being flooded with scientifically dubious and potentially harmful information about alternative cancer treatments, which sometimes gets viewed millions of times, a Wall Street Journal examination found.

Now, the companies say they are taking steps to curb such accounts. Facebook last month changed its News Feed algorithms to reduce promotion of posts promising miracle cures or flogging health services, a move that will reduce the number of times they pop up in user feeds, the company says. Some of the affected posts involve a supplement salesman who promotes baking-soda injections as part of cancer treatment.

“Misleading health content is particularly bad for our community,” Facebook said in a blog post set to publish Tuesday announcing the moves.

I'll get to Facebook's announcement, which was published just before the holiday last week, in a moment. First, let's note what the article says social media companies have done thus far this year:

YouTube, which has guidelines that don’t allow videos that can result in immediate harm, considers medical misinformation especially concerning, a spokesman said. Videos conveying inaccurate medical information are among the 8.3 million videos the company says it has removed during the first three months of this year for violating its policies. The YouTube spokesman said that while the company’s systems aren’t perfect, YouTube’s results for searches for cancer information have improved.

Earlier this year, Facebook said it would crack down on false criticism of vaccines spread by skeptics, an effort that the company has acknowledged has a long way to go. YouTube also changed its algorithms to play down results for antivaccination content. And Pinterest has stopped surfacing vaccination-related search results because most cautioned against vaccines.

I had heard about Pinterest's action, although I hadn't known that the problem was so bad that vaccine-related searches turned up way more antivaccine content than science-based content. In any event, as I noted in March, YouTube is indeed a wretched hive of scum and quackery when it comes to health misinformation. Has it gotten any better? At the time, Google, which owns YouTube, promised to deprioritize antivaccine content and cancer quackery in its search results and to ban people and companies posting such content from monetizing it through its ad-serving system. Indeed, a number of prominent purveyors of medical misinformation started complaining loud and long about this, which is not surprising given that running YouTube ads on videos hosted on the platform can generate a considerable amount of income when the videos are viewed thousands, or even millions, of times. Add to that the fact that YouTube lets content creators host their videos on its servers for free, freeing them from hosting and bandwidth charges (which, if you're streaming HD video, can become quite expensive rather quickly), and you can see how YouTube made the bar for publishing video online very low and the potential for profit for popular content creators very high. So if YouTube really has done what it said in March it would do, you can easily see how that would impact the bottom line of those whose business model uses YouTube to promote their wares.

If the WSJ is correct, the results are, at best, mixed:

As of Monday, YouTube videos viewed millions of times were among the postings advocating the use of a cell-killing, or necrotizing, ointment called black salve to treat skin cancer. Use of the ointment can inadvertently burn or kill healthy skin, and doesn’t remove cancerous growths beneath the skin, as is claimed in some videos, said David Gorski, a professor of surgery at Wayne State University School of Medicine in Detroit who edits the blog Science-Based Medicine. The wounds could also lead to infection.

According to Dr. Gorski, misinformation about cancer on the internet is as much of a public-health issue as antivaccine misinformation. “It’s hard to argue which one is the worst,” he said.

It’s hard to get these concepts across when word count is so limited, but basically black salve burns. It’s an escharotic. So, yes, it can at times eliminate skin cancers, but the price is a far less precise extirpation and far more collateral damage to the surrounding tissue than a surgical excision would entail. (I guess ugly, disfiguring burns are better than a surgical scar as long as the burns were “natural”.) Also, claims that black salve can “draw out” cancerous tumors are utter nonsense, without scientific justification.

Just for yucks, I did some searches on YouTube yesterday. The results were not promising when I just searched for "cancer". The first hit was Nutritional Chemotherapy: 4 Ways to Give Cancer a Biological Karate Chop by The People's Chemist, which is every bit as pseudoscientific as the title suggests. The next two videos were fairly typical cancer videos, but the fourth was by naturopath Paul Anderson, Epigenetics, Nutrigenomics & Cancer: MASTERCLASS: Dr. Paul Anderson. I've discussed before how pseudoscientific this one is; basically, Not-a-Doctor Anderson is a "naturopathic oncology" quack. Next up (#5) was a video entitled Murdered for Curing Cancer: The Story of Dr. Max Gerson w/ Dr. Patrick Vickers, and slightly further down was the video in which Chris Wark interviewed Mark Simon about his NORI protocol. I discussed the NORI protocol in May, and let's just say that it isn't particularly science-based, claiming as it does that a fruit-based diet can cure cancer. Then there was a video touting a new reality series about patients who treated their cancers "naturally" and survived, featuring our favorite cancer quack, Dr. Stanislaw Burzynski, who's been peddling his unproven cancer "cure" for over 40 years now. I was not impressed.

I realize that Google personalizes its search results based on cookies and your previous searches, but let’s just say that YouTube still has some work to do in this sphere.

So does Facebook, it would appear:

Often the videos and social media postings are connected with online businesses seeking to generate sales of books, supplements and unproven products.

A Facebook page with more than 60,000 likes promotes baking-soda injections and juicing regimens to treat cancer sold by a supplement salesman named Robert O. Young. Mr. Young was convicted in a San Diego County court in 2016 for practicing medicine without a license.

Gina Darvas, the San Diego deputy district attorney who prosecuted him, said that Mr. Young used the internet—in particular YouTube and Facebook—to earn as much as $5 million a year before his conviction.

Mr. Young has multiple Facebook pages currently. He has a personal page and he and his affiliate run others dedicated to selling products and services, internationally and domestically. The pages contain posts with embedded videos and links to YouTube.

So Facebook and YouTube were somehow able to get rid of Mike Adams of Natural News, but Facebook still grants Robert O. Young accounts to promote his quacky business, even after he was convicted of practicing medicine without a license, served prison time for it, and had one of his victims secure a $105 million judgment against him? Again, this is not promising.

Not promising at all:

Weeks after getting out of jail in November 2017, Mr. Young was back on Facebook. His main page gets frequent updates with posts selling his discredited cancer and dietary theories, plus services and products. Videos on an account featuring Mr. Young have earned more than 900,000 views, according to an analysis by social-media intelligence firm Storyful, which is owned by News Corp., The Wall Street Journal's parent company.

In some cases, YouTube only demonetized a channel when reporters asked about it. For example, Chris Beat Cancer:

Purveyors of medical misinformation sometimes buy ads to promote their wares. The more popular ones on YouTube can serve ads with their videos and generate income. Ads for well-known brands, including pharmaceutical companies and auto makers, preceded videos for a channel named Chris Beat Cancer before the Journal asked about the channel and YouTube took away the advertising. The channel, which has 125,000 subscribers, plays down the benefits of preventive screenings like mammograms, and promotes non-validated tests, including so-called thermography.

Thermography is, of course, unproven. It's never been validated as being "better than mammography," as its sellers claim it is. Unfortunately, it persists despite attempts by the US and Canada to crack down on the false and exaggerated advertising claims made for it, and the FDA still only reviews thermography claims on a case-by-case basis.

As for Chris of "Chris Beat Cancer" fame, he is a man who, around 15 years ago, was diagnosed at a very young age with stage III colon cancer. He underwent appropriate surgery for his cancer; I've discussed his case in depth before. Suffice it to say that the surgery cured him, his attempts to claim that surgery can't cure stage III colon cancer notwithstanding. Nonetheless, he attributes his survival to the quackery he pursued after his surgery rather than to his good fortune of being someone for whom surgery was sufficient to take care of his cancer. Since then, he's built quite the lucrative "alternative health" online business and has been busily promoting dubious alternative cancer cure testimonials and cancer quackery like the NORI protocol.

In other words, social media companies are trying, but they’re still inadvertently facilitating the promotion of a lot of quackery.

The Washington Post weighs in

Apparently there's enough interest in this topic that, about a week before the WSJ story, The Washington Post published a news report by Abby Ohlheiser, "They turn to Facebook and YouTube to find a cure for cancer — and get sucked into a world of bogus medicine". It begins with an anecdote:

Mari pressed kale leaves through the juicer, preparing the smoothie that she believed had saved her life.

“I’m a cancer-killer, girl,” Mari told her niece, who stood next to her in the kitchen. The pair were filming themselves for a YouTube video.

Mari said she was in remission from a dangerous form of cancer, and the video was meant as a testimony to what she believed was the power of the “lemon ginger blast.” In went some cucumber, some apple, some bok choy, a whole habanero pepper.

While she pressed, she preached.

“I’m telling you, it’s anti-cancer,” Mari said. “It’ll kill your cancer cells.”

The video, first uploaded in 2016, remains on YouTube, but there’s an “important update” attached to the video’s description. It was written by Liz, the niece, a year later.

Mari’s cancer had returned, the note said, and she had died.

I embedded that very video in a post I did about true believers and scammers in alternative medicine a year and a half ago, and, yes, the video is still there on YouTube. The patient was Mari Lopez, and the woman interviewing her was her niece, Liz Johnson. Lopez represented a typical alternative cancer cure testimonial: she had been diagnosed with stage II breast cancer at age 37 and underwent conventional surgery and adjuvant treatment. Thirteen years after her original treatment, her cancer recurred in her lungs, liver, and "everywhere," as she put it in the video. The diet the two were promoting was Robert O. Young's, combined with treatments from a cancer quack named Alfredo Bowman, who went by the pseudonym Dr. Sebi and promoted an intervention he called Dr. Sebi's Cell Food.

Here's what the reporter found a couple of months ago, using incognito mode to prevent cookies and previous searches from skewing the results:

As recently as late April, searching “cure for cancer” in YouTube (turning on “incognito mode” so that my prior search history wouldn’t skew the results) surfaced several troubling results: The sixth video, with more than 1.4 million views, claimed that baking soda could cure cancer. The eighth was an interview with self-described cancer expert Leonard Coldwell, in which Coldwell explains that every cancer can be cured in weeks with a special diet that “alkalizes” the body, a claim that has been debunked by scientists. The video has more than 7 million views. (In an emailed statement to The Washington Post, a spokeswoman for Coldwell identifying herself as “Danielle” claimed that Coldwell, who no longer treats patients, had the “Highest Cancer Patient Cure Rate in the world,” and boasted that Coldwell remained popular despite being “the most blocked Cancer Patient Expert in the world.”)

Coldwell is among the quackiest and worst of cancer quacks. He claims that every cancer can be cured in 2-16 weeks, that "alkalizing" the body is the way to "heal cancer," and that medical doctors have among the shortest lifespans, none of which is true.

Chris Wark also figured prominently in The Washington Post article. His response to the observation that he had surgery and that the surgery is what cured his cancer was to change the subject:

“Attempts to discredit me because I had surgery give far too much weight to my personal story, and miss the larger message. . . . People have healed all types and stages of cancer holistically (against the odds),” Wark said in a statement. “As a patient advocate, I am highly critical of the cancer industry and pharmaceutical industry,” he added, before saying that “I do not tell patients not to do the treatment.”

Surgery was the recommended primary treatment for Wark’s cancer, Gorski said. Chemotherapy is a secondary measure, meant to help prevent the cancer from coming back. Wark’s decision to forgo the post-surgery chemo was a risk, but by then the odds were in his favor.

Recall that, previously, his response was that surgery never cures stage III colon cancer. With that argument untenable, Mr. Wark retreats into making it sound as though he never claimed that his survival was due to his embrace of alternative medicine and invokes the time-dishonored ploy of plausible deniability favored by cancer quacks: "I never told anyone not to undergo conventional treatment."

What tech companies are doing: Google

A few weeks ago, I noticed a disturbance in the antivaccine and quack crankosphere. It began with Kelly Brogan, whom you might remember as the "holistic psychiatrist" who spoke at the In Goop Health Summit last year and who is also known for her rabidly antivaccine views and for selling a large variety of quackery. It came in the form of a Facebook post:

Brogan also noted that Joe Mercola was seeing similar results:

Mercola was complaining that search traffic to his website (which had been highly trafficked in 2012) had plummeted 99%:

Google traffic to Mercola.com has plummeted by about 99% over the past few weeks. The reason? Google's June 2019 broad core update, which took effect June 3, removed most Mercola.com pages from its search results. As reported by Telaposts.com:

“The June 2019 Google Broad Core Algorithm Update impacted the rankings of websites in Google’s Search Engine Results Pages. Several aspects of the algorithm were changed which caused some sites to gain visibility and others to lose visibility.

Generally speaking, sites negatively impacted will see a drop in rankings for many or all of important keywords or key phrases which they used to rank well for … The June 2019 Google Broad Core Algorithm Update impacted sites across the web, however, I am personally seeing the most impact on News and Health sites.”

Continuing his complaint:

Now, any time you enter a health-related search word into Google, such as “heart disease” or “Type 2 diabetes,” you will not find Mercola.com articles in the search results. The only way to locate any of my articles at this point is by searching for “Mercola.com heart disease,” or “Mercola.com Type 2 diabetes.” Even skipping the “.com” will minimize your search results, and oftentimes the only pages you’ll get are blogs, not my full peer-reviewed articles. Negative press by skeptics has also been upgraded, which means if you simply type in my name none of my articles will come but what you will find are a deluge of negative articles voicing critiques against me in your searches. Try entering my name in Yahoo or Bing and you will see completely different results.

So what had happened? Not willing to trust Mercola's biased presentation of information, I did a bit of digging at the time. Before I get to that, I'll note that one of the biggest problems with Google's algorithm is that popular ≠ high quality. Having a lot of incoming links from popular sites (even reputable sites) is not necessarily an indication of high quality. That's why Google now hires quality raters to evaluate the quality of websites. In May, Google issued an update to its quality rating guidelines.

To understand search engine results, you first need to know that Google uses two acronyms to describe what it's looking for in terms of quality webpage measurements: E-A-T and YMYL. E-A-T means "Expertise, Authoritativeness, Trustworthiness." High-quality pages have a high level of E-A-T, while low-quality pages don't. So how does Google measure E-A-T? There are a number of metrics, but one is particularly important:

In order to be deemed high-quality, Google states that “websites need enough expertise to be authoritative and trustworthy on their topic.” It’s worth keeping in mind, however, that what comprises “Expert” content can vary depending upon a page’s type and purpose. For example, while high-level medical advice needs to be written by an accredited doctor in order to be considered “Expert” content, general information supplied on medical support forums can be considered “Expert” even if it’s been written by a layperson. Some topics inherently require less formal levels of expertise and, for these pages, Google is predominantly looking at how helpful, detailed, and useful the information provided is.

Of course, Mercola is a DO; so, as a doctor, he was considered an expert on medical topics by Google. The second aspect of Google's rankings is known as YMYL, which stands for "Your Money or Your Life." Basically, YMYL is a quality designation for pages that involve your money or your life, i.e., usually financial transactions or medical advice:

YMYL stands for “Your Money or Your Life” pages and are comprised of pages that are important enough that, were they low-quality, they could have a potential negative impact on a person’s life, income, or happiness. As a general rule, the pages that Google requires to be written by experts are known as YMYL pages. Google thinks of the following categories as examples of YMYL pages:

  • Shopping or financial transaction pages
  • Pages that offer financial information, for example, investment or tax information
  • Pages that offer medical information about specific diseases or conditions or mental health
  • Pages that offer legal information about topics like child support, divorce, creating a will, becoming a citizen, etc.
  • Any page that has the potential to be dangerous or detrimental if it possessed low levels of E-A-T (car repair and maintenance, for example)

When it comes to these pages, Google has incredibly high page quality rating standards. This is Google’s effort to protect Google users from low-quality complex content that doesn’t possess the needed levels of E-A-T.

Basically, for pages that aren't YMYL, Google doesn't consider expertise to be as critical as it does for pages that are. Google's own guidelines take this into account, although I truly cringed when I saw one section of the Google FAQ cited by Jennifer Slegg in her article on the May update to the page quality rating guidelines. You'll see why in a minute. But first, I'll note that Mercola quotes this section rather deceptively, mixing up commentary by Jennifer Slegg on TheSEMPost with actual excerpts from the Google FAQs. Here's how Mercola does it:

There has been a lot of talk about author expertise when it comes to the quality rater guidelines … This section has been changed substantially … [I]f the purpose of the page is harmful, then expertise doesn’t matter. It should be rated Lowest!”

And here’s how it actually read, first the part by Jennifer Slegg:

There has been a lot of talk about author expertise when it comes to the quality rater guidelines, particularly with how site owners and authors can showcase their expertise. This section has been changed substantially to address this a bit more from Google’s perspective. Previously, it was implied that all content creators should have expertise. But they have lessened this slightly, for topics that don’t fall into YMYL pages.

And here’s the section from Google’s FAQ:

Pretty much any topic has some form of expert, but E-A-T is especially important for YMYL pages. For most page purposes and topics, you can find experts even when the field itself is niche or non-mainstream. For example, there are expert alternative medicine websites with leading practitioners of acupuncture, herbal therapies, etc. There are also pages about alternative medicine written by people with no expertise or experience. E-A-T should distinguish between these two scenarios. One final note: if the purpose of the page is harmful, then expertise doesn't matter. It should be rated Lowest!

Notice how Mercola used ellipses to stitch together Slegg's commentary with the last sentence of the above answer from Google's FAQ. Of course, the example that Google uses in its FAQ is indeed cringeworthy, because "experts" in alternative medicine are quacks, and allowing quackery to rank highly on YMYL topics goes against Google's own policy, given that quackery by definition has low E-A-T and is by definition harmful. What probably torpedoed Mercola's and Brogan's sites under Google's guidelines is that last sentence about how harmful content should always be rated Lowest. As Slegg notes, that part was not text that had been added or changed in the May update, but Mercola deceptively stitched together bits of text to make it seem as though it was. What might be different is that Google is now actually enforcing that guideline for antivaccine content, likely goaded by the light the current measles outbreak is shining on social media and search engines.

Joe Mercola, of course, views this as a huge conspiracy on Google's part. He devotes a fair amount of verbiage to bragging about how his content used to show up near the top of Google search results, about his expertise as a physician, about how he even created a peer review panel of medical and scientific experts that reviews, edits, and approves most of his articles before they're published, and about how his articles are "fully referenced, most containing dozens of references to studies published in the peer-reviewed scientific literature." He then laments that "none of this now matters, as the very fact that the information I present typically contradicts industry propaganda places me in the lowest possible rating category." No, Dr. Mercola, your information glorifies quackery, as when you promoted cancer quack Tullio Simoncini, who thinks that all cancer is a fungus and that baking soda is the cure. I kid you not. Mercola's been promoting quackery for 22 years now. His content has always been low quality: quackery disguised as real medical advice.

There’s also another reason why Mercola.com results were deprioritized. Here it is from Google itself:

Basically, as explained on Search Engine Roundtable, Google is also instituting a change that will restrict search results to no more than two listings from the same domain for most searches. The intent behind the change is to show more diverse results from different domain names, and Google will generally treat subdomains as part of the main domain. This change, too, could easily have affected various quack websites. Indeed, Telapost listed Mercola.com as one of the biggest losers after the early June algorithm update, along with DrAxe.com, which no longer ranks highly for searches for "keto diet". (Interestingly, the Daily Mail was also a big loser.)
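Just to make that concrete, here's a minimal sketch, in Python, of what such a domain diversity rule might look like. The function names, the two-per-site cutoff, and the example URLs are mine, purely for illustration; this is emphatically not Google's actual code:

    from urllib.parse import urlparse

    def site_key(url):
        # Crude stand-in for Google's notion of a "site": strip "www." and keep
        # the last two labels, so articles.mercola.com -> mercola.com.
        # (Real code would need a public-suffix list for domains like .co.uk.)
        host = (urlparse(url).hostname or "").lower()
        if host.startswith("www."):
            host = host[4:]
        return ".".join(host.split(".")[-2:])

    def diversify(ranked_urls, max_per_site=2):
        # Keep results in their original ranked order, but allow at most
        # max_per_site results from any one site, subdomains included.
        counts = {}
        kept = []
        for url in ranked_urls:
            key = site_key(url)
            if counts.get(key, 0) < max_per_site:
                counts[key] = counts.get(key, 0) + 1
                kept.append(url)
        return kept

    results = [
        "https://articles.mercola.com/heart-disease-1",
        "https://www.mercola.com/heart-disease-2",
        "https://articles.mercola.com/heart-disease-3",  # third hit from the same site: dropped
        "https://www.cdc.gov/heartdisease/",
    ]
    print(diversify(results))  # keeps the first two Mercola results plus the CDC page

You can see how a site that used to occupy half of the first page for a given health search would suddenly be limited to a couple of entries, no matter how many of its pages matched.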

Another thing that irritates Mercola is that Google now instructs its quality raters to use Wikipedia to help evaluate the expertise and trustworthiness of websites and their authors. Of course, I've had my issues with Wikipedia, but Mercola's anti-Wikipedia rant is just beyond the pale. I've had my issues with Google, too, but in this case I'm glad it's finally trying to deprioritize antivaccine content and quackery.

Predictably, Mike Adams wasn't pleased by this news (it hurt his website traffic, too) and spun it as, you guessed it, a conspiracy theory, predicting that Google will block access to "natural health sites" at the browser level in Chrome in 2020. Of course, although Chrome is popular, there are plenty of other browsers, so I'm not sure to what end Google would do that. Be that as it may, Google still has a long way to go in cleaning up its act with respect to facilitating the spread of dangerous medical misinformation.

What tech companies are doing: Facebook

In June, all-purpose health scammer and conspiracy theory creator Mike Adams was banned from Facebook, and he reacted with predictable histrionics to this interruption of his grift by what he characterized as social media blacklisting. Then, last week, I was made aware of this press release from Facebook:

In our ongoing efforts to improve the quality of information in News Feed, we consider ranking changes based on how they affect people, publishers and our community as a whole. We know that people don’t like posts that are sensational or spammy, and misleading health content is particularly bad for our community. So, last month we made two ranking updates to reduce (1) posts with exaggerated or sensational health claims and (2) posts attempting to sell products or services based on health-related claims.

  • For the first update, we consider if a post about health exaggerates or misleads — for example, making a sensational claim about a miracle cure.
  • For the second update, we consider if a post promotes a product or service based on a health-related claim — for example, promoting a medication or pill claiming to help you lose weight.

We handled this in a similar way to how we’ve previously reduced low-quality content like clickbait: by identifying phrases that were commonly used in these posts to predict which posts might include sensational health claims or promotion of products with health-related claims, and then showing these lower in News Feed.

And:

Posts with sensational health claims or solicitation using health-related claims will have reduced distribution. Pages should avoid posts about health that exaggerate or mislead people and posts that try to sell products using health-related claims. If a Page stops posting this content, their posts will no longer be affected by this change.

One of the greatest changes I've witnessed in my nearly 30 years online has been the rise of social media platforms. When I first started online, it was basically BBSes, email, and Usenet. Later, by the mid-1990s, there were websites (most of which had no commenting sections), and I didn't get into blogs until the early 2000s. These days, various messaging apps and platforms appear to be supplanting email, and the vast majority of people too young to have been online 15 or more years ago have no clue what Usenet was. (Does anyone even still use it?) Basically, you can think of Usenet as Reddit-like social media before there was social media. It was (is) a massive worldwide collection of discussion forums. What was very different from what we have now is that Usenet was decentralized, without a dedicated central server and administrator, and pretty much uncontrolled by anyone other than Internet service providers, who decided which subset of the 100,000+ newsgroups they'd allow their users to access and how much storage space they would devote to each newsgroup. Oh, sure, people could set up moderated newsgroups, in which posts had to be approved by a moderator, but most of Usenet was the Wild West. In contrast, today social media is centralized and controlled by a few companies: Facebook, Twitter, Google (which owns YouTube and, of course, controls the vast majority of the search engine business all of us depend on to find information online), and a handful of other, lesser players, plus the comment sections of various websites and blogs (which, increasingly, run on software from Facebook or other players like Disqus, with the sites themselves tending to run on WordPress or a couple of other platforms) and some specialized web-based discussion forums.

There have been several consequences of this centralization of social media. One is that it's become much easier for people to post content that can rack up thousands (or millions) of views. With Facebook and YouTube, for instance, you can post video, image, or sound files for free and don't have to worry about hosting your own site or paying for your own bandwidth. Apple and other services let you post audio files for podcasts for free. Even better, YouTube and Facebook provide ways for you to monetize your content by running ads, with the company getting its cut, of course. Another consequence is that clicks mean everything, because monetization depends on getting people to read, listen to, or watch your online content. In addition, because these platforms make it far easier to share media than ever before, it's very easy for information (and misinformation) to "go viral" and spread exponentially as more and more people share and reshare it. Old-timers might remember how complicated it was to share binary files on Usenet. (Anyone remember uuencode?) The binary file had to be encoded into ASCII, and then you had to have a program to decode the ASCII back to binary to retrieve the file. (Most of these files were pictures or sound files; video formats were not well standardized yet, and video files were just too massive.) It was worse than that, though. Because of character limits, the ASCII-encoded binary file often had to be split into many Usenet posts and then reassembled. Fun times, indeed.
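For readers too young to have suffered through it, here's a rough sketch, in Python, of what that old multipart workflow looked like. The helper names and the part size are my own inventions, just to illustrate the general idea of uuencoding a file into ASCII lines and splitting it into numbered posts:

    import binascii

    def uuencode_lines(path):
        # Encode a binary file into uuencoded ASCII lines (45 raw bytes per line),
        # omitting the "begin"/"end" header and trailer of the full uuencode format.
        lines = []
        with open(path, "rb") as f:
            while True:
                chunk = f.read(45)
                if not chunk:
                    break
                lines.append(binascii.b2a_uu(chunk).decode("ascii"))
        return lines

    def split_into_posts(lines, lines_per_post=500):
        # Split the encoded lines into numbered parts, the way a big file would be
        # posted as "myfile.jpg (1/7)", "myfile.jpg (2/7)", and so on.
        parts = [lines[i:i + lines_per_post] for i in range(0, len(lines), lines_per_post)]
        return [(f"({n}/{len(parts)})", "".join(part)) for n, part in enumerate(parts, start=1)]

    # Reassembly on the other end meant collecting every part, concatenating them
    # in order, and running the result back through a uudecoder
    # (binascii.a2b_uu, line by line).

Compare that to clicking "share" on Facebook, and you can see why misinformation travels so much faster now.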

Of course, the huge problem that's arisen is that the ease with which media, be it text, images, sound, or video, can be shared and monetized has been a boon for quacks and antivaxers, who routinely use Facebook, Twitter, YouTube, and other social media to spread their health misinformation and hawk their quackery while monetizing their content, not to mention to harass their opponents. (It's not just health misinformation, as the rise of Alex Jones and his ilk demonstrated.) Add to that the way Google has traditionally worked: ranking websites by the number and reputation of their incoming links. It was basically a popularity and usefulness contest, with the most popular content (as judged by whatever metrics Google uses to determine usefulness) showing up on the first page. As a result, a whole lot of quack and antivaccine websites showed up far too high in Google search results for a whole lot of health topics, including vaccines; that is, at least until Google tweaked its algorithm a month ago and started enforcing the quality guidelines it had written for its human raters.
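For those wondering what ranking by the number and reputation of incoming links looks like in practice, here's a toy, deliberately simplified sketch of the PageRank-style idea that originally underpinned Google's results. The site names are made up, and this is in no way Google's actual algorithm; the point is just that a page scores higher when more pages link to it and when those linking pages score well themselves, which is exactly how link popularity can masquerade as quality:

    def link_rank(links, damping=0.85, iterations=50):
        # Toy PageRank: "links" maps each page to the pages it links to.
        # A page's score is boosted by incoming links from high-scoring pages.
        pages = set(links) | {p for targets in links.values() for p in targets}
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, targets in links.items():
                if not targets:
                    continue
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            rank = new_rank
        return rank

    # Hypothetical link graph: several popular sites linking to a quack site
    # push it up the rankings regardless of the quality of its content.
    web = {
        "popular-blog.example": ["quackery.example"],
        "forum.example": ["quackery.example", "cdc.gov"],
        "news.example": ["quackery.example"],
        "cdc.gov": [],
        "quackery.example": [],
    }
    for page, score in sorted(link_rank(web).items(), key=lambda kv: -kv[1]):
        print(f"{score:.3f}  {page}")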

Which brings me back to the Facebook announcement. All I could think was: How is Facebook going to implement this? Its announcement says that it will identify phrases that are commonly used in posts promoting health misinformation, use them to predict which posts might include sensational health claims or promotion of products based on health-related claims, and then rank those posts lower in the News Feed. It all sounds good, but how? For a system like this to work, you either have to know these phrases already, in which case I'd wonder who is telling Facebook's engineers and coders what those phrases are, or you have to have a collection of quack and antivax websites that Facebook's engineers can analyze to identify phrases that are much more common on such sites. Either way, Facebook needs people to do this, and those people need to be experts in distinguishing reliable from unreliable health information. Does it have these people? If so, who are they?
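Facebook hasn't said how it built its phrase list, but a minimal sketch of the general approach its announcement describes might look something like the following. The phrases and the demotion factor here are placeholders I invented purely for illustration; the real system is presumably far more sophisticated:

    # Hypothetical red-flag phrases; Facebook has not published its actual list.
    SENSATIONAL_PHRASES = ["miracle cure", "cures all cancer", "doctors don't want you to know"]
    SALES_PHRASES = ["buy now", "order today", "special discount"]

    def demotion_factor(post_text, demote=0.2):
        # Return a multiplier for a post's feed-ranking score: 1.0 leaves it
        # alone, while smaller values push it lower in the News Feed.
        text = post_text.lower()
        hits = sum(phrase in text for phrase in SENSATIONAL_PHRASES + SALES_PHRASES)
        return demote if hits else 1.0

    def rerank(posts):
        # posts: list of (base_score, text) pairs; returns the texts sorted by
        # their demotion-adjusted scores, highest first.
        adjusted = [(score * demotion_factor(text), text) for score, text in posts]
        return [text for _, text in sorted(adjusted, reverse=True)]

    feed = [
        (0.9, "This miracle cure for cancer is what doctors don't want you to know! Buy now."),
        (0.5, "New study on colon cancer screening published this week."),
    ]
    print(rerank(feed))  # the sensational post now ranks below the mundane one

Notice, though, that a phrase list like this is only as good as the people who curate it, which is exactly my question.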

The thing is, health care professionals who are expert at identifying dubious health claims make up a pretty small percentage of the total population of health care professionals. The percentage of physicians, for instance, who are skeptics able to identify quack websites is fairly low. Of course, it's likely that Facebook is only going after the most egregious examples, the sort of content that pretty much any physician or nurse should be able to identify, which is helpful but would still leave a lot of less obvious health misinformation on its platform. Maybe that's enough. Maybe it's the best that can be done.

Of course, as algorithm-obsessed as Facebook is, it wouldn't surprise me if it is trying to do this with AI alone, without much in the way of input from knowledgeable medical professionals. It could also just be relying on users to flag pages, links, and websites, which would be unlikely to work very well, particularly given that I've seen no way to flag a page or post specifically for promoting dangerous medical misinformation. Facebook should really add that.

Predictably, another cancer quack, Ty Bollinger, was none too happy with The Washington Post article and what tech companies are doing, referring to it as the “War on Truth“:

The truth of the matter is that “Big Pharma” is behind the push to scrub the internet of all content that threatens their huge profits or reveals the very real risks of using their vaccines, drugs, and chemotherapy agents. And now, we have Facebook, Pinterest, Google, YouTube, and Twitter going “all-in” with the pharmaceutical giants and agreeing to become the “gatekeepers” and engage in coordinated censorship of natural health information.

Or maybe they’re just trying not to let their platforms be used to spread misinformation.

A major problem left unaddressed

Unfortunately, all the tweaks that Google, Facebook, and the other tech and social media companies are making to their algorithms ignore one huge problem, although, admittedly, this problem is unique to Facebook. Ohlheiser nailed it in her Washington Post article:

On Facebook, I easily found groups devoted to sharing “natural” cures for cancer, where people who have cancer diagnoses, or care for someone who does, asked other group members for ideas for how to cure it. “Cancer Cures & Natural Healing Research Group” has just under 100,000 members. I joined the closed group in February, identifying myself as a Washington Post journalist to the administrators.

The administrator was willing to exchange private messages but then blocked Ohlheiser, cutting off her access to the group. Then:

Facebook’s algorithms then began suggesting other groups I might like to join: “Alternative Cancer Treatments” (7,000 members), “Colloidal Silver Success Stories” (9,000 members) and “Natural healing + foods” (more than 100,000 members). I requested access to some of those groups, too, and several admitted me. People in the groups would ask one another for cancer-fighting advice. Some would be told to use baking soda or frankincense.

Rather than remove the groups, Facebook’s strategy to limit health misinformation centers on making it harder to join them unknowingly. Facebook said in an emailed statement that it “will alert group members by showing Related Articles” for any post already deemed false by Facebook’s third-party fact-checkers, for instance.

This is where Facebook falls short, and massively so. One of the most powerful ways that Facebook promotes the spread of medical misinformation is by giving quacks and activists the ability to create groups where like-minded individuals form a natural constituency (and great marks) for cancer quacks and antivaccine grifters. Even leaving aside the selling of products, these groups form self-reinforcing echo chambers and online spaces that encourage parents not to vaccinate and cancer patients to abandon conventional therapy. Parents who have concerns about vaccines find antivaccine groups, join them, and become vaccine-averse or even outright antivaccine. Cancer patients scared of chemotherapy find alternative cancer therapy groups, join them, and end up deciding that they will be able to cure their cancer naturally, often launching crowdfunding campaigns to pay for it. The same happens for many other forms of medical misinformation. In addition, these groups form excellent platforms from which to organize attacks on those trying to combat medical misinformation. Physicians are swarmed on Facebook and Twitter; their ratings on various physician rating sites are flooded with bogus negative reviews; their Facebook pages are brigaded. It's a problem that those pushing back against misinformation have only recently begun to try to address.

As for getting rid of these groups, critics will argue (going back to the example of BBSes, Usenet, and web-based discussion forums) that such groups have always existed, and that's true. But they've never before existed in one centralized location, on a platform that makes them so easy to join, with recommendation algorithms that entice people by suggesting many similar groups once they've joined one. As long as these groups remain available on Facebook, all the tweaks Google, YouTube, Facebook, and other social media platforms are making to their search algorithms, and all their attempts to demonetize quack content, are likely to be far less effective than they could be.

[Note: I will be speaking at NECSS next weekend, with a talk on Friday and panels over the weekend. As a result, I do not know whether I will have time to produce a post for next Monday or not. If I don’t, I’ll make sure there’s a guest post. If you’re attending NECSS, feel free to come up and say hi.]
