Michael Taifour
16 min read · Sep 22, 2021


Watch it on YouTube https://youtu.be/NFzi_bi8o1Y

“We created the machine and can’t control the machine!”

That’s what Facebook leaders say about their own site, Australia’s Business Insider wrote on September 17. In a meeting that took place in early September, Facebook leaders reportedly discussed the site’s future impact on a planet where nearly half the people alive use its product. That concern has reportedly become a point of contention among the company’s leadership. Facebook remains the world’s largest social network, with nearly 3 billion users.

The concerns were raised amid a shocking report published by the Wall Street Journal last week, which contended that Facebook has become a recruitment platform for Mexican drug cartels and Middle Eastern human traffickers.

Facebook’s other platform, Instagram, was criticized for its effect on teenage users, including increased levels of anxiety and depression, and even suicide. According to the Guardian, a 16-year-old girl in Malaysia reportedly killed herself in May 2019 after posting a poll on her Instagram account asking followers whether she should die, with 69% of respondents voting that she should.

What’s even more horrifying, according to the Wall Street Journal, is that drug cartels and human traffickers are said to be recruiting workers on Facebook. The site is said to allow users to post videos of murders, incitements to violence, government threats against pro-democracy campaigners, and advertisements for human trafficking. It is also said to have allowed Mexican drug cartels to openly recruit minors and women to work as hitmen, that is, to carry out assassinations.

Allegedly, Facebook didn’t do enough to fully remove the Mexican cartels’ posts from its sites. If this story were true, it would be not just scandalous but shocking and outrageous.

Here’s how the events of this incredible and fascinating story unfolded as of early 2021 in what the Wall Street Journal referred to as “The Facebook Files”.

In January 2021, a former cop turned Facebook investigator posted a memo to all staff on the company’s internal message board. It began, “Happy 2021 to everyone!!” and then made a shocking revelation: a Mexican drug cartel was using Facebook to recruit, train, and pay hitmen.

Despite the scandalous exposure, Facebook reportedly did hardly anything to stop the cartel from posting on Facebook and on Instagram, its photo-sharing site.

The Wall Street Journal, which looked inside Facebook’s internal files, found that human traffickers in the Middle East used the site to lure women into abusive employment situations in which they were treated like slaves or forced to perform sex work. The confidential documents are said to expose the company’s shortcomings in areas including rules that favor elites, teen mental health, and efforts to manage its algorithm.

What’s more…

Armed groups in Ethiopia are using Facebook to incite violence against ethnic minorities. The newspaper’s investigations also unveiled other violations including organ selling, pornography, and government action against political dissent. Documents obtained by the newspaper showed that many of these groups are operating openly.

To this day, Facebook is said to be unable to fix the systems that allow offenders to repeat their bad behavior. Instead, Facebook is said to be allowing authoritarian governments to use the platform within their borders, the documents obtained by the newspaper revealed.

Facebook treats harm in developing countries as “simply the cost of doing business” in those places, said Brian Boland, a former Facebook vice president. “Facebook has focused its safety efforts on wealthier markets with powerful governments and media institutions,” he added.

Photo by Alexander Shatov on Unsplash

GOLD-PLATED GUNS AND BLOODY CRIME SCENES

The developing world already has hundreds of millions more Facebook users than the U.S. — more than 90% of monthly users are now outside the U.S. and Canada. With growth largely stalled there and in Europe, nearly all of Facebook’s new users are coming from developing countries, where Facebook is the main online communication channel and source of news.

Facebook is rapidly expanding into such countries, planning for technology such as satellite internet and expanded Wi-Fi to bring users online, including in poor areas of Indonesia that one document, obtained by the Wall Street Journal, described as “slums.”

The documents reviewed by the Journal are reports from employees who are studying the use of Facebook around the world, including human exploitation and other abuses of the platform. They write about their embarrassment and frustration, citing decisions that allow users to post videos of murders, incitements to violence, government threats against pro-democracy campaigners, and advertisements for human trafficking.

The material is part of extensive company communications reviewed by the Journal that offer unparalleled detail about the company’s shortcomings in areas including rules that favor elites, teen mental health, and efforts to manage its algorithm.

The employee who identified the Mexican drug cartel is a former police officer and cybercrime expert hired in 2018 as part of a new investigation team focused largely on “at-risk countries,” where the rule of law is fragile and violence is common.

That year, hate speech in Myanmar proliferated across Facebook’s platforms, and the company has acknowledged it didn’t do enough to stop incitements to violence against the minority Rohingya population, which the U.S. said was the victim of ethnic cleansing.

An internal Facebook report from March said actors including some states were frequently on the platform promoting violence, exacerbating ethnic divides, and delegitimizing social institutions.

The ex-cop and his team untangled the Jalisco New Generation Cartel’s online network by examining posts on Facebook and Instagram, as well as private messages on those platforms, according to the documents obtained by the Wall Street Journal. The former officer declined to comment on his findings, and Facebook declined to make him available for an interview with the Journal.

The team identified key individuals, tracked payments they made to hitmen, and discovered how they were recruiting poor teenagers to attend hit-man training camps. Facebook messages showed recruiters warning young would-be hires “about being seriously beaten or killed by the cartel if they try to leave the training camp,” the former officer wrote.

The cartel, which law-enforcement officials say is the biggest criminal drug threat to the U.S., didn’t hide its activity. It had multiple Facebook pages with photos of gold-plated guns and bloody crime scenes, the documents show.

Photo by Karsten Winegeart on Unsplash

SLAVE MARKETS ON INSTAGRAM

The Facebook pages were posted under the name “CJNG,” widely known as the shorthand for Cártel Jalisco Nueva Generación, even though the company had internally labeled the cartel one of the “Dangerous Individuals and Organizations” whose pages should have been automatically removed from the platform under Facebook policy. The Wall Street Journal’s investigation showed that CJNG associates were openly recruiting people, including minors and women, to work as hitmen for the cartel.

Yet, Facebook didn’t fully remove the cartel from its sites.

The investigation team asked another Facebook unit tasked with coordinating different divisions to look at ways to make sure a ban on the cartel could be enforced. That wasn’t done effectively either, according to the documents, because the team assigned the job didn’t follow up.

On January 13, nine days after the report was circulated internally, the first post appeared on a new CJNG Instagram account: a video of a person with a gold pistol shooting a young man in the head as blood spurted from his neck. The next post showed a photo of a beaten man tied to a chair; the one after showed a trash bag full of severed hands.

The page, along with other Instagram and Facebook pages advertising the cartel, remained active for at least five months before being taken down. Since then, new pages have appeared under the CJNG name featuring guns and beheadings. Facebook commits fewer resources to stop harm overseas than in the U.S., the documents showed.

In 2020, Facebook employees and contractors spent more than 3.2 million hours searching out and labeling or, in some cases, taking down information the company concluded was false or misleading, the documents obtained by the Wall Street Journal showed. Only 13% of those hours were spent on content from outside the U.S. The company spent almost three times as many hours outside the U.S. working on “brand safety,” such as making sure ads don’t appear alongside content advertisers may find objectionable.

The investigation team spent more than a year documenting a bustling human-trafficking trade in the Middle East taking place on its services. On Facebook and Instagram, unscrupulous employment agencies advertised workers they could supply under coercive terms, using their photos and describing their skills and personal details, the Wall Street Journal wrote on September 16th.

The company took down some offending pages but took only limited action to try to shut down the activity until Apple Inc. threatened to remove Facebook’s products from the App Store unless it cracked down on the practice. The threat was in response to a BBC story on maids for sale.

A BBC investigative report published on 31 October 2019 exposed slave markets found on Instagram. The undercover investigation exposed the buying and selling of domestic workers in the Gulf region to the highest bidder for a few thousand dollars.

Some of the trade has been carried out on Facebook-owned Instagram, where posts have been promoted via algorithm-boosted hashtags, and sales negotiated via private messages, the BBC reported.

Other listings have been promoted in apps approved and provided by Google Play and Apple’s App Store, the BBC said. “What they are doing is promoting an online slave market,” said Urmila Bhoola, the UN’s special rapporteur on contemporary forms of slavery.

One man, a policeman looking to offload his worker, said: “Trust me she’s very nice, she laughs and has a smiley face. Even if you keep her up till 5 am she won’t complain.” The BBC team used this to show how domestic workers are treated as a commodity in the oil-rich Gulf countries.

According to the BBC, some buy a maid in Kuwait for, say, $2,000 and then sell her on for $3,300. In one case, the BBC team was offered a 16-year-old girl, whom it called Fatou to protect her identity. Kuwait’s laws say that domestic workers must be over 21. Her seller’s pitch included the facts that she had given Fatou no time off, had taken away her passport and phone, and had never allowed her to leave the house alone. This is a quintessential example of modern slavery.

This online slave market is not just happening in Kuwait. In Saudi Arabia, the investigation found hundreds of women being sold on Haraj, another popular commodity app. There were hundreds more on Instagram. Since the BBC report surfaced, Apple said it “strictly prohibited” the promotion of human trafficking and child exploitation in apps made available on its marketplace.

In an internal summary about the episode, a Facebook researcher wrote: “Was this issue known to Facebook before BBC inquiry and Apple escalation?” The next paragraph begins: “Yes.”

Although Mark Zuckerberg said in a 2017 mission statement that “giving people a voice is a principle our community has been committed to since we began,” Facebook restricted the ability of users in Vietnam to see the posts of Bui Van Thuan, a prominent critic of Vietnam’s authoritarian government, for nine months beginning last year.

A former Facebook employee who worked in Asia was quoted by the Wall Street Journal as saying that Facebook is aware the Vietnamese government is using the platform to silence dissidents, but that it tolerates the abuse because Vietnam is a fast-growing advertising market.

According to the Journal, Facebook’s team of human-exploitation investigators gathered evidence of human trafficking. In addition to the former police officer, the team included a Polish financial expert who previously investigated trafficking finances at HSBC bank and a Moroccan refugee expert who formerly worked at the United Nations High Commissioner for Refugees.

By looking across Facebook products, they found criminal networks recruiting people from poor countries, coordinating their travel, and putting them into domestic servitude or into forced sex work in the United Arab Emirates and other Persian Gulf countries. Facebook products facilitated each step, and the investigators followed communications across platforms to identify perpetrators and victims.

Facebook in 2018 didn’t have a protocol for dealing with recruiting posts for domestic servitude. In March 2018, employees found Instagram profiles dedicated to trafficking domestic servants in Saudi Arabia. An internal memo says they were allowed to remain on the site because the company’s policies “did not acknowledge the violation.”

The investigation team identified multiple trafficking groups in operation, including one with at least 20 victims, and organizers who spent at least $152,000 on Facebook ads for massage parlors.

In a memo, the Polish trafficking expert wrote that 18 months after it first identified the problem, Facebook hadn’t implemented systems to find and remove the trafficking posts.

The BBC and Apple flagged concerns in 2019. With the threat posing “potentially severe consequences to the business,” the trafficking expert wrote, Facebook began moving faster. A proactive sweep using the investigation team’s prior research found more than 300,000 instances of potential violations and disabled more than 1,000 accounts.

The investigation team also struggled to curb sex trafficking. In 2019, they discovered a prostitution ring operating out of massage parlors in the U.S.

Facebook discovered a much larger ring that used the site to recruit women from Thailand and other countries. They were held captive, denied access to food, and forced to perform sex acts in Dubai massage parlors, according to an internal investigation report. The investigation found traffickers bribed the local police to look away, according to the report.

Photo by Timothy Hales Bennett on Unsplash

AVERT DISASTER, APOLOGIZE AND KEEP GROWING

In another heart-shattering story, the Wall Street Journal recounted events that unfolded last January, when Patricia Wanja Kimani, a 28-year-old tutor and freelance writer in Nairobi, saw a recruitment post on Facebook that promised free airfare and visas, even though Facebook has banned employment ads touting free travel and visa expenses, according to the documents.

She said she was promised $300 a month to work for a cleaning service in Riyadh. At the Nairobi airport, the recruiter gave her a contract to sign. It said she would receive 10% less pay than she was promised, and that only the employer could terminate the contract. If Kimani wanted to quit, she would lose her visa and be in Saudi Arabia illegally. Kimani told the recruiter that she was backing out.

The recruiter responded that since her contract had already been sold to an employer, the agency would have to reimburse the employer if she backed out. Kimani would have to pay the agency to make up for that. She didn’t have any money, so she flew to Riyadh. The agency kept her passport.

She worked in a home where a woman called her a dog. She slept in a storage room without air conditioning. The house’s locked courtyard and high walls made leaving impossible. She worked from 5 a.m. until dusk, cleaning, completely cut off from the rest of the world. She got sick and wasn’t allowed treatment. She also wasn’t paid.

After two months, she told the agency she wanted to return to Kenya. They said she could pay them $2,000 to buy herself out of the contract. She didn’t have the money, and she posted about her plight on Facebook. She named the employment agency, which pulled her from the job and left her at a deportation center.

She said there were other Kenyan women there and that one had marks from chains on her wrists and ankles. Eventually, her Facebook posts were forwarded to an official at the International Organization for Migration, a U.N. body, which helped negotiate her release and return to Kenya in July.

Despite all these accounts, Facebook keeps researching its own harms and burying the findings.

Facebook knew that teen girls on Instagram reported in large numbers that the app was hurting their body image and mental health. It knew that its content moderation systems suffered from an indefensible double standard in which celebrities were treated far differently than the average user. It knew that a 2018 change to its news feed software, intended to promote “meaningful interactions,” ended up promoting outrageous and divisive political content.

Facebook knew all of those things because they were findings from its own internal research teams. But it didn’t tell anyone. In some cases, its executives even made public statements at odds with the findings, according to MIT Technology Review.

The respected publication pointed to a deeper issue at Facebook: the world’s largest social network employs teams of people to study its own ugly underbelly, only to ignore, downplay, and suppress the results of their research when they prove awkward or troubling.

Only Facebook knows the extent of its misinformation problem. And it’s not sharing, even with the White House, MIT Technology Review said in a report published on September 16.

For instance, the New York Times reported in 2018 that Facebook’s security team had uncovered evidence of Russian interference ahead of the 2016 U.S. election, but that Chief Operating Officer Sheryl Sandberg and Vice President of Global Public Policy Joel Kaplan had opted to keep it secret for fear of the political fallout. In February 2020, The Washington Post reported that an internal investigation following the 2016 election, called “Project P,” had identified a slew of accounts that had peddled viral fake news stories in the run-up to Donald Trump’s victory, but only a few were disabled after Kaplan warned of conservative backlash.

In September 2020, BuzzFeed obtained a memo written by former Facebook data scientist Sophie Zhang, making the case that the company habitually ignored or delayed action on fake accounts interfering in elections around the world. In July 2021, MIT Technology Review detailed how the company pulled the plug on efforts by its artificial intelligence team to address misinformation, out of concern that they would hurt user engagement and growth. Just last month, the company admitted that it had shelved a planned transparency report showing that its most shared link over a three-month period was an article casting doubt on the safety of coronavirus vaccines.

Facebook said the post casting doubt on the Covid-19 vaccine was the most popular on the platform from January through March.

This is in line with Facebook’s strategy, which is to avert disaster, apologize and keep growing.

In the run-up to the 2020 election, the most highly contested in US history, Facebook’s most popular pages for Christian and Black American content were being run by Eastern European troll farms, MIT Technology Review said on September 16. These pages were part of a larger network that collectively reached nearly half of all Americans, according to an internal company report, and achieved that reach not through user choice but primarily as a result of Facebook’s own platform design and engagement-hungry algorithm.

The report, written in October 2019 and obtained by MIT Technology Review from a former Facebook employee not involved in researching it, found that after the 2016 election, Facebook failed to prioritize fundamental changes to how its platform promotes and distributes information. The company instead pursued a whack-a-mole strategy that involved monitoring and quashing the activity of bad actors when they engaged in political discourse and adding some guardrails that prevented “the worst of the worst.”

But this approach did little to stem the underlying problem, the report noted. Troll farms — professionalized groups that work in a coordinated fashion to post provocative content, often propaganda, to social networks — were still building massive audiences by running networks of Facebook pages. Their content was reaching 140 million US users per month — 75% of whom had never followed any of the pages. They were seeing the content because Facebook’s content-recommendation system had pushed it into their news feeds.

As of October 2019, around 15,000 Facebook pages with a majority-US audience were being run out of Kosovo and Macedonia, known bad actors during the 2016 election, according to MIT Technology Review.

Collectively, those troll-farm pages — which the report treats as a single page for comparison purposes — reached 140 million US users monthly and 360 million global users weekly. Walmart’s page reached the second-largest US audience at 100 million.

Photo by Brett Jordan on Unsplash

IT’S ADDICTIVE. IT’S DANGEROUS. IT’S NOT GOOD FOR YOU

In an article I read on September 17 in the New York Times titled “What Facebook Knows,” the newspaper suggested that Facebook plays down awareness of its flaws. The Wall Street Journal’s bombshell report last week described these flaws, based on documents obtained through a whistle-blower.

To this very day, Facebook exempts high-profile users from some of its rules. The system, called “XCheck,” allows at least 5.8 million V.I.P. users to avoid Facebook’s normal enforcement process.

Instagram’s own research shows risks to teenagers’ mental health: the app makes body-image issues worse for one in three teen girls. Yet the company does nothing.

Facebook knows its algorithm rewards outrage. It knows that misinformation, toxicity, and violent content are prevalent among reshares. Yet it does nothing.

It can’t even stop drug cartels and human traffickers from using its platform.

Yet, all we get is the endless Facebook apology.

Facebook is the new cigarette of the 21st century. It’s addictive. It’s dangerous. It’s not good for you. As it did with cigarette companies, the US government needs to step in and truly regulate what’s happening with Facebook.

“The Facebook Files” published by The Wall Street Journal are incriminating documents that can no longer go unnoticed.

It is time to act against Facebook, as well as Google and its subsidiary YouTube, Apple, Amazon, and Microsoft, and even the newer additions, SpaceX and Tesla.

I cannot understand how the United States of America would allow such tech companies to monopolize the world and leave us at their mercy. What happened to the free economy? What happened to freedom of choice? What free choice do we have when we can only create videos on YouTube, post photos on Instagram, share tweets on Twitter, buy the latest iPhone from Apple, and install the latest operating system from Microsoft?

I can only reiterate what Milton Friedman once said: “Underlying most arguments against the free market is a lack of belief in freedom itself.

“Many people want the government to protect the consumer. A much more urgent problem is to protect the consumer from the government.”



Michael Taifour

Irrepressible, opinionated, and always politically incorrect, satirist Michael covers the week’s news and features its main events in his own distinct way.