Members of the Oversight Board will meet with Facebook whistleblower Frances Haugen as the board investigates the company’s controversial “cross check” system.
“In light of the serious claims made about Facebook by Ms. Haugen, we have extended an invitation for her to speak to the Board over the coming weeks, which she has accepted,” the Oversight Board wrote in a statement. “Board members appreciate the chance to discuss Ms. Haugen’s experiences and gather information that can help push for greater transparency and accountability from Facebook through our case decisions and recommendations.”
In a statement, Haugen confirmed the upcoming meeting. “Facebook has lied to the board repeatedly, and I am looking forward to sharing the truth with them,” she wrote.
The board has also been pushing Facebook to provide more information about the program, in light of Haugen’s disclosures. “Cross check” is the internal designation used by the social network for high profile accounts, including celebrities, politicians and athletes. The company has said it’s meant to provide an extra level of scrutiny when those accounts might break the platform’s rules. But according to documents Haugen provided to The Wall Street Journal, Facebook often doesn’t review violations from these accounts, effectively allowing them to break its rules without consequences. In other cases, reviews are so delayed that rule-breaking content is viewed millions of times before it’s removed.
Cross check was also a central issue in the Oversight Board’s handling of Donald Trump’s Facebook suspension. The board had asked Facebook for more details about cross check, saying that the company’s rules “should apply to all users.” But Facebook said it was “not feasible” to provide additional information, even though Haugen’s disclosures suggested the company has been tracking problems related to the program.
Facebook didn't immediately respond to a request for comment. The company said last month, following The Wall Street Journal's reporting, that it had asked the board to provide recommendations on how to improve cross check. The Oversight Board will release its first transparency report later this month, which will provide an update on cross check, based on its discussions with Facebook officials and Haugen. The report will be the board’s first assessment of how the social network has responded to its policy recommendations.
Facebook and its apps are down yet again, according to the company. The extent of the latest outage wasn't immediately clear, but Facebook and Instagram both acknowledged that "some" users are having trouble accessing their services.
Facebook is reportedly slowing down its product development so it can conduct “reputational reviews” in the wake of whistleblower Frances Haugen’s disclosures about the company.
According to The Wall Street Journal, Facebook has “put a hold on some work on existing products” while a team of employees analyzes how the work could further damage the company’s reputation. The group is looking at potential negative effects on children, as well as criticism the company could face.
Zuckerberg alluded to the change in a statement Tuesday — his first since the whistleblower’s disclosures became public. “I believe that over the long term if we keep trying to do what's right and delivering experiences that improve people's lives, it will be better for our community and our business,” he wrote. “I've asked leaders across the company to do deep dives on our work across many areas over the next few days so you can see everything that we're doing to get there.”
The change is one of the clearest signs yet of how much Haugen’s disclosures have rocked the company in recent weeks. Facebook has already “paused” its work on an Instagram Kids app, after a WSJ report on company research showing Instagram is harmful to some teens’ mental health. Though Facebook has attempted to downplay its own research, pressure has mounted since Haugen, a former product manager, stepped forward and testified in a three-hour Senate hearing this week.
She told lawmakers Zuckerberg and other executives have prioritized the social network’s growth over users’ safety, and that the company has misled the public about its AI-based moderation technology. She’s called on Facebook to make its research more widely available, and urged Congress to impose new regulations on the platform.
The whistleblower behind “bombshell” disclosures that have rocked Facebook in recent weeks spent much of Tuesday's three-hour hearing explaining to Congress how Facebook could fix itself.
While the hearing was far from the first time a Facebook critic has briefed lawmakers, her insider knowledge and expertise in algorithm design made her particularly effective. Her background as part of the company’s civic integrity team meant she was intimately familiar with some of the biggest problems on Facebook.
During the hearing, Haugen spoke in detail about Facebook’s algorithms and other internal systems that have hampered its efforts to slow misinformation and other problematic content. She also praised the company’s researchers, calling them “heroes,” and said Facebook should be required to make their work public.
Remove algorithmic ranking and go back to chronological feeds
One of the most notable aspects of Haugen’s testimony was her expertise, which gives her a nuanced understanding of how algorithms work and the often unintended consequences of using them.
“I hope we will discuss as to whether there is such a thing as a safe algorithm,” Sen. Richard Blumenthal said at the start of the hearing. While Haugen never addressed that question directly, she did weigh in on the ranking algorithms that power the feeds in Facebook and Instagram. She noted that Facebook’s own research has found that “engagement-based ranking on Instagram can lead children from very innocuous topics like healthy recipes… to anorexia-promoting content over a very short period of time.”
She also said that Facebook’s AI-based moderation tools were much less effective than what the company has publicly portrayed. “We've seen from repeated documents within my disclosures that Facebook's AI systems only catch a very tiny minority of offending content,” Haugen said. “Best case scenario, in the case of something like hate speech, at most they will ever get to 10 to 20%.”
To address this, Haugen said that Facebook could move to a chronological feed where posts are ordered by recency, rather than what is most likely to get engagement. “I'm a strong proponent of chronological ranking, or ordering by time with a little bit of spam demotion, because I think we don't want computers deciding what we focus on,” Haugen said.
She noted that Facebook would likely resist such a plan, since content that gets more engagement benefits its platform by prompting people to post and comment more. “I've spent most of my career working on systems like engagement-based ranking,” Haugen said. “When I come to you and say these things, I’m basically damning 10 years of my own work.”
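To illustrate the distinction Haugen draws, here is a minimal, purely hypothetical sketch (not Facebook's actual code) contrasting engagement-based ranking with chronological ordering plus a simple "spam demotion" step; the post fields and scoring are assumptions for the example:

```python
# Hypothetical sketch of the two ranking approaches Haugen contrasts.
# Posts are plain dicts; the fields and scoring below are illustrative only.

def engagement_rank(posts):
    """Order posts by a crude engagement score (likes + comments), highest first."""
    return sorted(posts, key=lambda p: p["likes"] + p["comments"], reverse=True)

def chronological_rank(posts):
    """Order posts newest-first, demoting posts flagged as likely spam.

    Sorting on the tuple (is_spam, -timestamp) keeps non-spam posts ahead of
    spam, and within each group orders by recency.
    """
    return sorted(posts, key=lambda p: (p["is_spam"], -p["timestamp"]))

posts = [
    {"id": 1, "timestamp": 100, "likes": 900, "comments": 300, "is_spam": False},
    {"id": 2, "timestamp": 300, "likes": 5, "comments": 1, "is_spam": False},
    {"id": 3, "timestamp": 200, "likes": 40, "comments": 10, "is_spam": True},
]

# Engagement ranking surfaces the high-interaction post first: [1, 3, 2]
print([p["id"] for p in engagement_rank(posts)])
# Chronological ranking shows the newest non-spam post first: [2, 1, 3]
print([p["id"] for p in chronological_rank(posts)])
```

The sketch shows why the two orderings diverge: the older, high-engagement post dominates the first feed, while the second simply reflects recency with spam pushed to the bottom.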
Reform Section 230
In a similar vein, Haugen said that Section 230 — the 1996 law that protects companies from being liable for what their users say and do on their platforms — should be reformed “to make Facebook responsible for the consequences of their intentional ranking decisions.” She said that such a law would likely “get rid of engagement-based ranking” because it would become too big of a liability for the company.
At the same time, she cautioned lawmakers to not let Facebook “trick” them into believing that changing Section 230 alone would be enough to address the scope of its problems. She also noted that using the law to police Facebook’s algorithms could be easier than trying to address specific types of content. “User generated content is something that companies have less control over, they have 100% control over their algorithms,” Haugen said.
The focus on Section 230 is significant because lawmakers from both parties have proposed various changes to the law. During the hearing, Blumenthal indicated that he too supported “narrowing this sweeping immunity when platforms’ algorithms amplify illegal conduct.” Senator Amy Klobuchar has also proposed ending 230 protections for vaccine and health misinformation. Meanwhile, Republicans have tried to eliminate Section 230 for very different reasons.
Slow down virality
Likewise, Haugen suggested that Facebook should slow down its platform with “soft interventions” that would add small bits of friction to the platform. She pointed to Twitter’s “read before sharing” prompts as the kind of measure that can reduce the spread of misinformation.
“Small actions like that friction don't require picking good ideas and bad ideas,” she said. “They just make the platform less twitchy, less reactive. And Facebook's internal research says that each one of those small actions dramatically reduces misinformation, hate speech and violence-inciting content on the platform.”
Facebook has taken these steps in the past. Notably, it applied these “break glass” measures in the days after the presidential election, though the company rolled some of them back the following month. The company implemented similar changes again, less than a month later, in the aftermath of the January 6th insurrection.
Haugen said that Facebook has mischaracterized these changes as being harmful to free speech, when in fact the company is concerned because it “wanted that growth back.” During the hearing, she said that Mark Zuckerberg had been personally briefed on just how impactful changes like this could be. But, she said, he prioritized the platform’s growth “over changes that would have significantly decreased misinformation and other inciting content.”
Open Facebook’s research to people outside the company
Access to Facebook’s data has become a hot button issue in recent weeks as researchers outside the company have complained that the company is stifling independent research. Haugen said the social network should work toward making its own internal research available to the public.
She proposed that there should be a set period of time — she suggested as long as 18 months — when Facebook is able to keep its research under wraps. But after that, the research should be made accessible to people outside the company.
“I believe in collaboration with academics and other researchers that we can develop privacy-conscious ways of exposing radically more data than is available today,” Haugen said. “It is important for our ability to understand how algorithms work, how Facebook shapes the information we get to see, that we have these data sets be publicly available for scrutiny.”
She went on to say that Facebook's researchers are among its “biggest heroes” because “they are boldly asking real questions and willing to say awkward truths.” She said it was “unacceptable” that the company has been “throwing them under the bus” in its effort to downplay her disclosures.
A dedicated ‘oversight body’
Besides internal changes, Haugen also said that there should be a dedicated “oversight body” with the power to oversee social media platforms. She said that such a group within an agency like the Federal Trade Commission could provide “a regulatory home where someone like me could do a tour of duty after working at a place like this.”
“Right now, the only people in the world who are trained to analyze these experiments, to understand what's happening inside of Facebook, are people who grew up inside of Facebook or Pinterest or another social media company,” she said.
Importantly, this “oversight body” would be separate from the Facebook-created Oversight Board, which advises Facebook on specific content decisions. While Facebook has said the creation of the Oversight Board is proof it’s trying to self-regulate, Haugen wrote in prepared remarks that the Oversight Board “is as blind as the public” when it comes to truly knowing what happens inside of the company.
It’s also worth noting that Haugen said she was opposed to efforts to break up Facebook. She said that separating Facebook and Instagram would likely result in more advertisers flocking to Instagram, which could deplete Facebook’s resources for making changes to improve its platform.
What’s next
While it’s unclear which, if any, of Haugen’s recommendations Congress will act on, her disclosures have already caught the attention of regulators. In addition to providing documents to Congress, she has also given documents to the Securities and Exchange Commission. She has alleged that Zuckerberg and other executives have “misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection," according to SEC filings published by 60 Minutes.
Meanwhile, Facebook has continued to push back on Haugen’s claims. A week after an executive told lawmakers that “this is not bombshell research,” the company tried to discredit Haugen more directly. In a statement, Facebook’s Director of Policy Communications Lena Pietsch said Haugen “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives. We don’t agree with her characterization of the many issues she testified about.” Pietsch added that “it’s time to begin to create standard rules for the internet.”
In an appearance on CNN following the hearing, Facebook VP Monika Bickert referred to Haugen’s disclosures as “stolen documents” and said the company’s research had been “mischaracterized.” Later that night, Zuckerberg publicly weighed in for the first time since The Wall Street Journal began publishing stories based on Haugen's disclosures (Zuckerberg did once refer to earlier coverage of the scandals, complaining that a news article had mistakenly described his hydrofoil as an "electric surfboard"). In his first substantive statement, he said "many of the claims don't make any sense," and that "the argument that we deliberately push content that makes people angry for profit is deeply illogical."
It could still get more difficult for Facebook to counter Haugen, though, particularly if new documents become public. Her letter to the SEC suggests that Facebook knew much more about QAnon and violent extremism on its platform than it let on, as Vice reported earlier. Haugen may also appear before lawmakers in other countries. European lawmakers, many of whom have expressed similar concerns as their US counterparts, have indicated they want to talk to Haugen and conduct new investigations of their own.
The Facebook whistleblower who has provided a trove of internal documents to Congress and the Securities and Exchange Commission is testifying about research she says proves the social network has repeatedly lied about its platform. The documents were the basis for The Wall Street Journal's reporting on Facebook's controversial rules for celebrities, and the disastrous effect of Instagram on some teens' mental health.
“Facebook and big tech are facing their big tobacco moment,” committee chairman Sen. Richard Blumenthal said at the start of the hearing. “Facebook knows its products can be addictive and toxic to children. They value their profit more than the pain that they cause to children and their families.”
In her opening statement, Frances Haugen, the former Facebook product manager turned whistleblower, said that the company has ignored much of its own research and is “buying its profits with our safety.” She urged Congress to adopt new regulations.
“The choices being made inside of Facebook are disastrous for our children, our public safety, our privacy and for our democracy,” Haugen said. “And that is why we must demand Facebook make changes.”
She said Facebook’s unwillingness to make data available outside of its own research teams has helped the company mislead the public. “The company intentionally hides vital information from the public, from the US government, and from governments around the world,” Haugen said. “The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages.”
She also said that Congress should not be swayed by Facebook’s insistence on “false choices,” and that simply reforming privacy laws or Section 230 would not go far enough. “We can afford nothing less than full transparency,” Haugen said. “Facebook wants you to believe that the problems we're talking about are unsolvable... Facebook can change, but it's clearly not going to do so on its own."
Haugen’s appearance comes days after Facebook sent its head of safety, Antigone Davis, to testify in front of the same committee. She and other executives have repeatedly tried to downplay the company's research, with Davis saying that the documents "were not bombshell research." In Tuesday’s hearing, some senators called out Mark Zuckerberg, saying that they should be hearing from him instead. “Rather than taking personal responsibility, showing leadership, Mark Zuckerberg is going sailing,” Blumenthal said, in an apparent reference to a recent Facebook post from the CEO.
Facebook services seem to be slowly coming back online after one of the biggest outages in recent memory. Facebook, Instagram and Messenger’s apps appear to be working again, though the websites are loading more slowly than usual. Meanwhile, WhatsApp's website seems to be back, but the app is still having issues connecting.
As of 6:05pm ET Monday, the "Facebook for Business Status" page was still showing "major disruptions" to the social network's core services. But that was still an improvement from earlier in the day, when the website was offline entirely.
Facebook didn’t immediately comment or elaborate on the cause of the outage. In an earlier tweet, the company’s outgoing Chief Technology Officer, Michael Schroepfer, cited “networking issues.”
The outage lasted more than six hours, taking down Instagram, WhatsApp, Messenger and Oculus. It also wreaked havoc on the company internally, with employees reportedly unable to access emails, Workplace and other tools. The New York Times reported that employees were also physically locked out of offices as workers’ badges stopped working.
It also shaved billions of dollars off of Mark Zuckerberg’s personal net worth as Facebook’s stock tanked, Bloomberg reported. Elsewhere, the company is still reeling from the fallout of a whistleblower who has accused the company of prioritizing “profits over safety.” The whistleblower was The Wall Street Journal’s primary source for several articles that detail how Instagram is harmful to teens and the company’s controversial “cross check” program that allows high profile users to break its rules.
Facebook is once again asking a federal judge to dismiss the Federal Trade Commission’s antitrust suit against the social network. In a new filing, the company argued that the government “still has no factual basis for alleging monopoly power.”
The FTC originally filed antitrust charges against the company last December. A judge dismissed that complaint in June, saying the government’s case was “legally insufficient,” but gave the FTC a chance to refile. The FTC filed a new complaint in August. The amended complaint relied on the same arguments but was more detailed than the initial suit. In it, the government argued that Facebook used its acquisitions of WhatsApp and Instagram to quash rivals it viewed as an “existential threat.”
“The complaint alleges that after repeated failed attempts to develop innovative mobile features for its network, Facebook instead resorted to an illegal buy-or-bury scheme to maintain its dominance,” the FTC wrote in a statement at the time. “Lacking serious competition, Facebook has been able to hone a surveillance-based advertising model and impose ever-increasing burdens on its users.”
The judge has until November 17th to respond. Even if Facebook is successful in getting the new FTC suit dismissed, the company is still facing numerous other investigations into its policies and practices. European regulators have also opened an antitrust probe into the social network, and the UK’s competition watchdog is also reportedly investigating the company. Meanwhile, in the US, Facebook is still reeling from the fallout of a whistleblower who has provided thousands of documents to Congress and the Securities and Exchange Commission, which she says prove the company “chooses profit over safety.” The whistleblower, former product manager Frances Haugen, is scheduled to testify at a Senate Commerce Committee hearing Tuesday morning.
Facebook recently introduced its first wearable: Ray-Ban Stories, smart sunglasses with cameras, microphones and speakers built in. If that sounds familiar, it might be because the glasses are pretty similar to what Snapchat has been doing for the last five years with Spectacles. Even the name, Ray-Ban Stories, feels like a big subtweet at Snap. But despite its head start, Spectacles have yet to be a big hit for the company. And, with a $300 price tag and Facebook’s name on the box, Ray-Ban Stories may also prove to be a difficult sell.
Both Spectacles and Ray-Ban Stories represent something much bigger to the social media companies that made them. Snapchat and Facebook are hoping to define the future of augmented reality, and are betting that camera-enabled sunglasses will help them get there.
But look closely, and the companies have taken very different approaches. While Ray-Ban Stories look pretty close to regular Wayfarers, Spectacles have never looked like a typical pair of sunglasses. Snapchat has also been more ambitious about integrating its augmented reality effects into the glasses. And the company recently began experimenting with a new set of Specs that are capable of real AR, though they aren’t for sale.
Non-AR “smart glasses” are still a niche product, but Ray-Ban Stories might be one of the best iterations yet. The frames make it easy to capture first-person photos and videos, and the built-in speakers sound surprisingly good. Most importantly, they look more like designer sunglasses than a piece of tech. But Facebook’s reputation is hard to ignore, especially when you’re wearing a camera it designed on your actual face.
But if you’re excited about the future of augmented reality, and what one day might be possible, both Ray-Ban Stories and Spectacles offer an intriguing look at how two of the biggest social media platforms are thinking about getting there.
Yet another Facebook official just spent hours being grilled by members of Congress about the company’s policies, and whether or not it does enough to protect some of its most vulnerable users. And once again, the Facebook executive — today it was Head of Safety Antigone Davis — seemed to do her best to dodge the most difficult questions.
But the latest hearing on teen mental health, which came in response to reporting from The WSJ, was different from past hearings. That’s because, thanks to a whistleblower, members of the Senate Commerce Committee now have access to thousands of internal documents written by the company’s own researchers.
The documents, some of which have been made public, paint a very different picture of Facebook and Instagram’s understanding of how their services impact teens’ mental health than what they’ve publicly portrayed. Those documents are in the hands of lawmakers, making the findings that much harder for Facebook to spin. The disclosures have already forced Facebook to "pause" work on an Instagram Kids app.
“We now have a deep insight into Facebook's relentless campaign to recruit and exploit young users,” Senator Richard Blumenthal said at the start of the hearing. “We now know, while Facebook publicly denies that Instagram is deeply harmful for teens, privately, Facebook, researchers and experts have been ringing the alarm for years.”
This has forced Facebook into the uncomfortable position of trying to downplay the significance of its own research. “This is not bombshell research,” Davis repeated multiple times during the hearing. One day earlier, Facebook released heavily annotated versions of two of the documents, with notes that also tried to explain away its own findings. Those documents, which were just two of the “thousands” Blumenthal said he now has access to, used words like “myopic” and “sensationalizing” to try to minimize findings like the fact that Instagram makes “body images worse for 1 in 3 teen girls.”
The tactic didn’t go over well in the Senate on Thursday. “This research is a bombshell,” Blumenthal said. “It is powerful, gripping, riveting evidence that Facebook knows the harmful effects of its site on children, and that it has concealed those facts and findings.”
As with past hearings, there were some cringey moments. At one point, Blumenthal demanded to know if Facebook would “commit to ending finsta” — a reference to the secondary accounts often used by teens to stay anonymous. That forced Davis to awkwardly explain that so-called “finstas” are not an official Instagram feature. At another point, Sen. Ted Cruz demanded Davis explain why she wasn’t appearing at the hearing in person (she cited COVID-19 protocols).
But even with those moments, it was difficult to ignore the significance of these issues. It may seem obvious, but kids and teens are incredibly important to the company, which is consistently behind rivals like TikTok and Snapchat for that demographic. So much so that a former employee who worked on Messenger Kids recently said that “losing the Teen audience was considered an 'existential threat,'” for Facebook.
Worse for Facebook, there are very likely more bombshells coming. The whistleblower who provided the documents to The Journal and lawmakers is appearing on 60 Minutes Sunday night. And she is testifying at a separate Commerce Committee hearing next week. So while Facebook executives may be able to dodge questions and insist that their researchers’ conclusions have been mischaracterized, it will be much harder to rebut someone who was closely involved with that work.
Some senators hinted that there would be more to come at the next hearing. Senator Ben Ray Luján asked Davis whether “Facebook ever tested whether a change to its platform increases an individual's or a group of users' propensity to post a violent or hateful language.” Davis said that it wasn’t her “area of expertise.”
“We might get more responses to that one next week,” he said.
Facebook has published two slide decks detailing its research into how Instagram affects teens’ mental health. The slides were heavily cited by The Wall Street Journal earlier this month in a story that reported the company’s own researchers had found that “Instagram is harmful for a sizable percentage” of teens, particularly teenage girls.
Instagram has attempted to rebut those claims, saying its research was mischaracterized. But the ensuing backlash has already forced the company to “pause” its work on an Instagram Kids app. It also raised pressure on Facebook to release the underlying research, which the company ultimately agreed to do. Facebook’s head of safety is scheduled to testify at a Senate Commerce Committee hearing on child safety on Instagram Thursday.
Many of the slides include lengthy annotations with additional “context” on the more controversial aspects of the research. For example, a slide titled “The Perfect image, feeling attractive, and having enough money are most likely to have started on Instagram,” states that the information in the slide “should not be used as estimates of average experience among teen users.”
Other annotations, like one on a slide titled “One in five teens say that Instagram makes them feel worse about themselves, with UK girls the most negative,” attempt to downplay the findings. “This research was not intended to (and does not) evaluate causal claims between Instagram and health or well-being.” (That line is repeated on several other slides.)