National Whistleblower Center Updates Complaint that Facebook Auto-Generates ‘Business Pages’ that Benefit Middle East Extremists and White Supremacists


Facebook just doesn’t seem to be able to get its priorities in order. The social media company that once suspended CARDINAL NEWS for sharing an ABC 7 Chicago news article with a picture of a Portillo’s hot dog now faces an ongoing accusation that it is creating dozens of auto-generated “business pages” that benefit Middle East extremists and white supremacists, and that could be used as a tool for networking and recruitment.

Portillo’s Hot Dog image that caused a Facebook suspension on July 16, 2018 — Wondering what Facebook thought it was? See more …

A whistleblower’s complaint claims that Facebook has inadvertently provided the two categories of extremist groups with a networking and recruitment tool by producing these auto-generated pages.

On Wednesday, September 18, 2019, U.S. senators on the Committee on Commerce, Science, and Transportation were scheduled to question representatives from social media companies, including Monika Bickert, Facebook’s head of product policy and counterterrorism. This is presumably the same person who oversees the moderators who incorrectly and capriciously suspend ordinary Americans who prefer to exercise their right to free speech on Facebook, while Middle East extremists and white supremacists apparently benefit from using Facebook’s platform.

This week, an update to a complaint filed with the Securities and Exchange Commission is in the pipeline from the National Whistleblower Center. The updated filing is expected to identify almost 200 auto-generated pages — some for businesses, others for schools or other categories — that directly reference the Islamic State group. Dozens of other pages represent al-Qaida and other known extremist groups. One page — categorized as a “political ideology” — is titled “I love Islamic state,” complete with an Islamic State logo inside the outlines of Facebook’s famous thumbs-up icon. CARDINAL NEWS would not dare include the logo, because the hyperlocal news website, based in the United States of America (northwest of Chicago), would probably be at risk of being suspended for sharing the image of the Islamic State logo.

How did you miss that one, Facebook?

The updated filing found that users’ pages promoting extremist groups remain easy to find with simple searches using their names. For example, a page discovered for “Mohammed Atta” (the hijacker-pilot extraordinaire of American Airlines Flight 11, who crashed the passenger jet into the North Tower of the World Trade Center) included an iconic photo of the terrorist hijacker. The page lists the user’s workplace as “Al Qaidah” and his education as “University Master Bin Laden” and “School Terrorist Afghanistan.”

Auto-generated pages are not like normal Facebook pages. People can’t comment or post on auto-generated pages. The obvious way that these pages might promote networking and recruitment is by inspiration. However, the auto-generated pages may be helping the extremist groups in other ways, because Facebook allows users to like the pages, which potentially provides a list of sympathizers, wannabes, and pre-activated zombies that recruiters can collect and contact later.

The updated complaint scrutinizes this particular function, which scrapes employment information from users’ pages to create pages for businesses, and which can lead to recruitment of users who have liked any of the auto-generated pages. God help the poor innocent infidel who made a poor choice and just liked the page to keep track of it — or liked it as a joke (you’re on the NSA ‘list’ now). Hmmm, of course this might make the conspiracy theorists amongst us wonder if this isn’t just a US government setup anyway. If the NSA and Facebook are in cahoots, they might also be studying the list of people who liked the extremist-related auto-generated pages.

But back to policy. Facebook claims it deletes any pages that violate its policies. Apparently, Facebook has been quite incompetent at separating unsafe hot dogs from safe hot dogs, and still can’t efficiently recognize an extremist group. After all, it seems as though Facebook’s algorithm, or whatever incompetent techniques it uses, should have an easier time finding the Islamic State (IS) logo than a dangerous hot dog.

Ya Think?
“Right now, a lot of our A.I. systems make decisions in ways that people don’t really understand.”

— Mark Zuckerberg, Senate Hearing April 10, 2018

Facebook claims it has been working to limit the spread of extremist material on the platform, but so far with mixed results. The updated whistleblower report claims that it took Facebook more than six weeks to remove pages that were identified in the whistleblower’s initial report. Actually, the pages were deleted on June 25, 2019 — the day before Monika Bickert was facing a congressional hearing.

Facebook announced in February and March 2019 that it had expanded its definition of prohibited content to include U.S. white nationalist and white separatist material as well as material from international extremist groups. Facebook claims it has banned 200 white supremacist organizations and removed 26 million pieces of content related to global extremist groups like IS and al-Qaida (or do you say al-Qaeda?). In May 2019, Facebook banned several controversial figures, including Alex Jones, Milo Yiannopoulos, Laura Loomer and Louis Farrakhan, for violating community standards on hate speech and for promoting violence.

Facebook also recently expanded its definition of terrorism to include not just acts of violence intended to achieve a political or ideological objective, but also attempts at violence that target civilians with the intent to coerce and intimidate.

“Facebook would like us to believe that its magical algorithms are somehow scrubbing its website of extremist content. Yet those very same algorithms are auto-generating pages with titles like ‘I Love Islamic State,’ which are ideal for terrorists to use for networking and recruiting.”

— John Kostyack, Executive Director National Whistleblower Center

Facebook seems to be exaggerating the competence and effectiveness of its ‘magical’ algorithms.

See also …
Transcript of Mark Zuckerberg’s Senate hearing
