How Facebook neglected the rest of the world, fueling hate speech and violence in India; Porter Stansberry and I debate Facebook; Chamath Palihapitiya on DWAC; Greetings from Las Vegas
1) I think Facebook (FB) and the major services it owns, Instagram and WhatsApp, do a lot of good. I use all three almost every day to help me connect with old friends, make new ones, and keep in touch with my friends and family.
But there's a very dark side to these services as well... As I wrote in my open letter to chief operating officer Sheryl Sandberg, "It's clear that you have unwittingly created a monster that is doing enormous damage to individuals, institutions, societies, and governments around the world."
Many Americans might view this as hyperbole because their feeds aren't filled with hatred and misinformation (mine isn't).
But it's a different story outside North America, where 90% of Facebook's users reside. As the latest documents released by whistleblower Frances Haugen reveal, despite being aware of serious problems for years, the company has failed to invest in the people and artificial intelligence needed to eliminate some of the most toxic content globally. Here's a Washington Post article about it: How Facebook neglected the rest of the world, fueling hate speech and violence in India. Excerpt:
"Hate spreads like wildfire on Facebook," Junaid said. "None of the hate speech accounts were blocked."
For all of Facebook's troubles in North America, its problems with hate speech and disinformation are dramatically worse in the developing world. Internal company documents made public Saturday reveal that Facebook has meticulously studied its approach abroad – and was well aware that weaker moderation in non-English-speaking countries leaves the platform vulnerable to abuse by bad actors and authoritarian regimes.
Here's a related article in the New York Times: In India, Facebook Grapples With an Amplified Version of Its Problems. Excerpt:
On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India.
For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook's algorithms to join groups, watch videos and explore new pages on the site.
The result was an inundation of hate speech, misinformation, and celebrations of violence, which were documented in an internal Facebook report published later that month.
"Following this test user's News Feed, I've seen more images of dead people in the past three weeks than I've seen in my entire life total," the Facebook researcher wrote.
"The test user's News Feed has become a near-constant barrage of polarizing nationalist content, misinformation, and violence and gore."
The report was one of dozens of studies and memos written by Facebook employees grappling with the effects of the platform on India. They provide stark evidence of one of the most serious criticisms levied by human rights activists and politicians against the world-spanning company: It moves into a country without fully understanding its potential impact on local culture and politics, and fails to deploy the resources to act on issues once they occur.
With 340 million people using Facebook's various social media platforms, India is the company's largest market. And Facebook's problems on the subcontinent present an amplified version of the issues it has faced throughout the world, made worse by a lack of resources and a lack of expertise in India's 22 officially recognized languages.
The internal documents, obtained by a consortium of news organizations that included The New York Times, are part of a larger cache of material called The Facebook Papers. They were collected by Frances Haugen, a former Facebook product manager who became a whistleblower and recently testified before a Senate subcommittee about the company and its social media platforms. References to India were scattered among documents filed by Ms. Haugen to the Securities and Exchange Commission in a complaint earlier this month.
The documents include reports on how bots and fake accounts tied to the country's ruling party and opposition figures were wreaking havoc on national elections. They also detail how a plan championed by Mark Zuckerberg, Facebook's chief executive, to focus on "meaningful social interactions," or exchanges between friends and family, was leading to more misinformation in India, particularly during the pandemic.
Facebook did not have enough resources in India and was unable to grapple with the problems it had introduced there, including anti-Muslim posts, according to its documents. Eighty-seven percent of the company's global budget for time spent on classifying misinformation is earmarked for the United States, while only 13 percent is set aside for the rest of the world — even though North American users make up only 10 percent of the social network's daily active users, according to one document describing Facebook's allocation of resources.
I continue to think that Facebook should do what I wrote in my letter to Sandberg:
... immediately reactivate the Civic Integrity team and then persuade Frances Haugen to come back to lead it. I've been watching 60 Minutes since I was a kid in the 1970s, and she was one of the most impressive people I've ever seen on the show. Give her a blank check – at least ten times the budget that the team used to have. Lastly, make sure she reports directly to you and Mark and then do what she tells you.
2) I always like to consider – and share with my readers – well-articulated points of view that are contrary to my own. So with that in mind, here's what my old friend Porter Stansberry wrote to me in response to my recent e-mails about Facebook:
I'm writing only as a reader and not your business partner.
And I know we don't agree about lots of things or see the world in the same way. That's fine with me, of course. That's what makes a market and makes life interesting.
As your reader, I'm disappointed with your Facebook criticism. You've lost credibility with me on this topic.
Here's why.
1. You seemed far too eager to believe the 'whistleblower' in a way that was devoid of any skepticism. Critical part being 'devoid of skepticism.' Can't tell you how many midlevel employees have been similarly critical of our leadership and business model – usually because of personal reasons that have nothing to do with their public comments.
Her view of the company and its products just seems woefully naive to me. Could she be right anyway? Maybe. But think about who owns Dow Jones. Shouldn't you at least acknowledge that a source made famous by a competing media conglomerate isn't likely to be telling the whole story?
Maybe I'm sensitive to this because of the shots that hacks at Bloomberg and elsewhere have taken at me over the years: half-truths without any context.
But you are saying she's the most credible person you've seen on 60 Minutes? Man, I feel like you've totally lost your judgment about this issue.
2. Saying Zuck is the most powerful person on the planet? That's laugh-out-loud ridiculous. Zuck has to depend on people choosing to use his products, none of which are necessary to life or happiness, and on advertisers who are willing to pay for them. That's a tall order in a market with zero barriers to entry and competitors emerging constantly – like Snapchat and TikTok.
You might recall that there are still several people hanging around who control thousands of nuclear warheads. That seems a lot scarier than a computer nerd who will post your pictures for you... doesn't it?
Anyways... just wanted you to know I'm reading and loving your work – even when I don't agree.
Your pal,
Porter
I replied:
Hi Porter,
To No. 1, that's a fair point. It's only one person. But I did find her awfully credible, and her story was very consistent with many other data points, as I explained in my follow-up e-mail. I also dedicated an entire e-mail to Zuckerberg's and my friend's responses to her. (You may recall my friend called her "an idiot.") So I feel like I fairly presented the other side.
Re. No. 2, I stand by my viewpoint. With 2.9 billion monthly users (37% of humans on earth), Facebook reaches more people, more often, for more time than any company or organization, past or present (for context, the largest religion in the world, Christianity, has 2.4 billion followers).
There have been plenty of studies showing how even small tweaks to its algorithms regarding what type of content users see can have a significant influence on what they think and do.
And Zuckerberg controls Facebook completely.
No, he's not the president and doesn't control the U.S. military or have his finger on the nuclear button. But given that the last two presidential elections were decided by fewer than 80,000 total votes (in 2016) and 44,000 (in 2020) across the three states that determined the winner, I'd argue that Zuckerberg, if he wanted to, could decide who becomes president.
Thanks for the feedback – it's always welcome!
Best,
Whitney
3) Speaking of presenting the other side...
In Friday's e-mail, I heaped scorn on the latest foolish meme stock, Digital World Acquisition (DWAC), writing that "this is one of the stupidest things I've ever seen and that this stock is going to implode, likely within days."
The "King of SPACs," Chamath Palihapitiya, has a more nuanced view, which he discussed with Internet entrepreneur, angel investor, and author Jason Calacanis in the duo's latest All-In podcast, posted on YouTube here.
4) And speaking of Porter, he and I both spoke this morning at the 2021 Stansberry Conference & Alliance Meeting in Las Vegas. I'll be reporting on some of the highlights in my e-mails this week.
Best regards,
Whitney
