One week after a massive Facebook outage that took all of the social network’s apps offline for more than six hours, Instagram says it’s testing notifications that will alert users to “temporary issues” like outages and other technical problems.
The new alerts would appear in users’ Activity Feed, alongside other in-app notifications. The messages could be used to let users know about specific issues, like Story uploads not working, or a more widespread problem, like the two outages last week. Importantly, Instagram says it doesn’t plan to alert users to every issue, only ones that may be a source of widespread confusion.
“We won’t send a notification every single time there is an outage, but when we see that people are confused and looking for answers, we’ll determine if something like this could help make things clearer,” Instagram wrote in a blog post. The company added that it’s testing the feature in the US “for the next few months.”
Separately, Instagram also showed off a new “account status” section of its app, which is meant to alert users to “what's going on with your account” more generally. Instagram says it’s starting with notifications about posts that are removed and when an account “is at risk of being disabled” due to rule violations.
According to Instagram, the feature is meant to make it easier for users to understand why a post may have been removed, and whether or not they may be in danger of losing an account altogether. While the app has notified users in the past when a post is labeled or removed, the company hasn’t always done a good job of letting people know which policy they violated. The Oversight Board has repeatedly told Facebook it needs to do a better job of explaining rules to users, and account status could help the company do just that.
Account status could also help the app address a more Instagram-specific issue: concerns over “shadowbanning.” Instagram says that “in the coming months” it plans to update account status to let people know “how their content is being distributed and recommended across different parts of Instagram.”
All Twitter users can now remove a follower without having to block them. The company started testing this option last month, and starting today, everyone will have access to it. To quietly stop someone from seeing your tweets in their feed, go to the Followers tab on your profile, click the three-dot menu next to the user in question and select the "Remove this follower" option.
This is part of Twitter's efforts to reduce harassment on the platform. Blocking someone you don't want following you could lead to retaliation from that person via their allies or secondary accounts once they find out. Cutting them loose in this fashion and muting them means they're none the wiser that they're out of the loop.
This method won't prevent someone you boot from your followers list from seeing your public tweets. Only blocking them or making your account private will do that. Elsewhere, Twitter is testing a Safety Mode, which automatically blocks accounts that use “potentially harmful language.” It's also looking into more ways to filter and limit replies, so it seems the company is making its anti-harassment efforts a bigger priority.
Members of the Oversight Board will meet with Facebook whistleblower Frances Haugen as it investigates the company’s controversial “cross check” system.
“In light of the serious claims made about Facebook by Ms. Haugen, we have extended an invitation for her to speak to the Board over the coming weeks, which she has accepted,” the Oversight Board wrote in a statement. “Board members appreciate the chance to discuss Ms. Haugen’s experiences and gather information that can help push for greater transparency and accountability from Facebook through our case decisions and recommendations.”
In a statement, Haugen confirmed the upcoming meeting. “Facebook has lied to the board repeatedly, and I am looking forward to sharing the truth with them,” she wrote.
The board has also been pushing Facebook to provide more information about the program in light of Haugen’s disclosures. “Cross check” is the internal designation used by the social network for high-profile accounts, including celebrities, politicians and athletes. The company has said it’s meant to provide an extra level of scrutiny when those accounts might break the platform’s rules. But according to documents Haugen provided to The Wall Street Journal, Facebook often doesn’t review violations from these accounts, effectively allowing them to break its rules without consequences. In other cases, reviews are so delayed that rule-breaking content is viewed millions of times before it’s removed.
Cross check was also a central issue in the Oversight Board’s handling of Donald Trump’s Facebook suspension. The board had asked Facebook for more details about cross check, saying that the company’s rules “should apply to all users.” But Facebook said it was “not feasible” to provide additional info, even though Haugen’s disclosures suggested the company has been tracking problems related to the program.
Facebook didn't immediately respond to a request for comment. The company said last month, following The Wall Street Journal's reporting, that it had asked the board to provide recommendations on how to improve cross check. The Oversight Board will release its first transparency report later this month, which will provide an update on cross check, based on its discussions with Facebook officials and Haugen. The report will be the board’s first assessment of how the social network has responded to its policy recommendations.
Facebook is trying to mend its reputation in the wake of whistleblower Frances Haugen's testimony, and that includes promises of features meant to lessen potential harm to teens. CNN and Reuters report that Facebook Global Affairs VP Nick Clegg promised Instagram would introduce a "take a break" feature encouraging teens to simply stop using the social network for a while. Clegg didn't say when it would be ready, but it's clearly meant to reduce addiction and other unhealthy behavior.
The social media exec also said Facebook would "nudge" teens away from material in its apps that "may not be conducive to their well-being." He didn't provide specifics for this new approach. He did, however, suggest that Facebook's algorithms should be "held to account," including by regulation if needed, to be sure real-world results matched intentions.
The new methods might address some of Haugen's concerns. She claimed Facebook was aware its algorithms were destructive, leading children to harmful material and removing only a fraction of hate speech. Haugen also felt Congress should reform the Communications Decency Act's Section 230 to increase Facebook's liability for algorithm-chosen content, and that Facebook should add friction to reduce the virality of content and force users to think about posts rather than share them reflexively.
At the same time, this might not satisfy Haugen and fellow critics. Breaks and nudges may reduce exposure to harmful content, but they won't remove the content in question. Clegg's statements also reflect a familiar strategy at Facebook. It likes to invite regulation, but only the regulation it's comfortable with. While the proposed changes could help, politicians may demand more — in part to prevent Facebook from dictating its own regulation.
Facebook and its apps are down yet again, according to the company. The extent of the latest outage wasn't immediately clear, but Facebook and Instagram both acknowledged that "some" users are having trouble accessing their services.
We’re aware that some people are having trouble accessing our apps and products. We’re working to get things back to normal as quickly as possible and we apologize for any inconvenience.
More than a decade after Aaron Sorkin and David Fincher dramatized the rise of Facebook with The Social Network, a new TV series will attempt to tell the story of the company’s more recent history. Per Deadline, production companies Anonymous Content and Wiip, best known for their work on Mr. Robot and Dickinson, are working on a show titled Doomsday Machine that will star two-time Emmy winner Claire Foy as COO Sheryl Sandberg.
Big News! The insanely talented @ayadakhtar has written a TV drama based on our book and Claire Foy is set to play Sheryl Sandberg. I am so psyched to see what they do! https://t.co/u38CGeXnA1
Based in part on An Ugly Truth: Inside Facebook’s Battle for Domination, Deadline reports the series will cover everything from Facebook’s actions during the 2016 presidential election up to more recent revelations about its business. That includes recent reporting from The Wall Street Journal that showed Facebook has for years run a program called XCheck, which has allowed high-profile users, including former President Donald Trump, to skirt its content moderation rules.
The announcement comes as Facebook faces increasing scrutiny from federal lawmakers. On Wednesday, whistleblower Frances Haugen told the Senate Commerce Committee that Congress should regulate the social media giant. It also comes after the company went through an hours-long outage on October 4th that left people unable to access Facebook, Instagram and WhatsApp. In the aftermath of that event, there have been renewed calls from American lawmakers, including Representative Alexandria Ocasio-Cortez, to break the company up into smaller entities.
The Internet Archive is marking its 25th anniversary by peering into the future to predict what the web might look like a quarter of a century from now. The non-profit took the opportunity to rail against internet regulation by offering a grim vision of what lies ahead.
Punch a URL into the Wayforward Machine and you'll see a version of that page covered in pop-ups. The messages include one reading "Classified content. The website you are trying to access features information that the owner(s) have opted to restrict to users that have not shared their personal information." Another reads "This site contains information that is currently classified as Thought Crime in your region."
The way things are going, the Internet Archive suggests, free and open access to knowledge on the web may become far more limited. A Wayforward subsite includes a timeline of things that might go awry in the coming years, starting with the repeal of Section 230 of the Communications Decency Act, which protects websites and internet platforms from being liable for things that users post. A repeal could have enormous consequences for the web, though some, such as Facebook CEO Mark Zuckerberg, have proposed that the provision should be reformed.
The timeline includes some other wild-but-not-inconceivable suggestions, such as a law allowing corporations to copyright facts, forcing Wikipedia to move to the Dark Web, and more countries introducing their own versions of China's Great Firewall. The Internet Archive teamed up with several digital rights organizations for this project, including the Electronic Frontier Foundation, Fight for the Future and the Wikimedia Foundation. The subsite includes resources on how to help protect freely available information.
The Wayforward Machine is, of course, a satirical version of the Wayback Machine, which has archived hundreds of billions of web pages over the last two and a half decades. It's an important resource for helping preserve the history of the internet, including things like Flash games and animations, so it's probably worth paying attention to the Internet Archive's vision of the future.
YouTube scrapped its Rewind 2020 video due to COVID-19 and social unrest, but it's not coming back now that the turmoil is (partly) calming down. As Tubefilter first confirmed, YouTube is cancelling its year-end Rewind videos once and for all. The service insists it's not due to the blowback from Rewind 2018, however. Rather, YouTube is reportedly so large that it would be impractical to summarize the site with a yearly video.
The Google-owned brand will instead trust creators like MrBeast and Slayy Point to produce end-of-year videos, and promote them through social networks. You'll also see annual trend lists, awards shows and a currently mysterious "interactive experience."
Rewind debuted in 2010 and was popular for most of its history as a snapshot of the online zeitgeist. That all fell apart with Rewind 2018, however. Many felt the video both ignored major creators like PewDiePie and had more than a few cringe-worthy moments (Will Smith's "oh, that's hot" haunts people to this day). When YouTube returned with Rewind 2019, it abdicated editorial control and let the statistics guide the content, to the frustration of viewers. Even if YouTube is right about the site becoming too large for Rewind, the demand just isn't what it used to be — a revival might not have much of an impact.
The whistleblower behind “bombshell” disclosures that have rocked Facebook in recent weeks spent much of Tuesday's three-hour hearing explaining to Congress how Facebook could fix itself.
While the hearing was far from the first time a Facebook critic has briefed lawmakers, her insider knowledge and expertise in algorithm design made her particularly effective. Her background as part of the company’s civic integrity team meant she was intimately familiar with some of the biggest problems on Facebook.
During the hearing, Haugen spoke in detail about Facebook’s algorithms and other internal systems that have hampered its efforts to slow misinformation and other problematic content. She also praised the company’s researchers, calling them “heroes,” and said Facebook should be required to make their work public.
Remove algorithmic ranking and go back to chronological feeds
One of the most notable aspects of Haugen’s testimony was her expertise, which gives her a nuanced understanding of how algorithms work and the often unintended consequences of using them.
“I hope we will discuss as to whether there is such a thing as a safe algorithm,” Sen. Richard Blumenthal said at the start of the hearing. While Haugen never addressed that question directly, she did weigh in on the ranking algorithms that power the feeds in Facebook and Instagram. She noted that Facebook’s own research has found that “engagement-based ranking on Instagram can lead children from very innocuous topics like healthy recipes… to anorexia-promoting content over a very short period of time.”
She also said that Facebook’s AI-based moderation tools were much less effective than what the company has publicly portrayed. “We've seen from repeated documents within my disclosures that Facebook's AI systems only catch a very tiny minority of offending content,” Haugen said. “Best case scenario, in the case of something like hate speech, at most they will ever get to 10 to 20%.”
To address this, Haugen said that Facebook could move to a chronological feed where posts are ordered by recency, rather than what is most likely to get engagement. “I'm a strong proponent of chronological ranking, or ordering by time with a little bit of spam demotion, because I think we don't want computers deciding what we focus on,” Haugen said.
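Her phrase describes a simple ordering rule. Below is a minimal sketch of the contrast she is drawing, in Python; the Post fields, the score ranges and the 0.8 demotion threshold are illustrative assumptions, not details of Facebook's actual systems.

```python
# A minimal sketch of the two ranking approaches Haugen contrasts.
# Post fields, score ranges and the 0.8 threshold are assumptions
# for illustration, not Facebook's actual implementation.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Post:
    author: str
    created_at: datetime
    engagement_score: float  # predicted likes, comments and shares
    spam_score: float        # 0.0 (clean) to 1.0 (likely spam)


def engagement_ranked(posts: list[Post]) -> list[Post]:
    """Engagement-based ranking: surface whatever is predicted to draw
    the most reactions, regardless of when it was posted."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)


def chronological_with_spam_demotion(posts: list[Post]) -> list[Post]:
    """Haugen's alternative: order purely by recency, except that
    likely spam sinks to the bottom of the feed."""
    return sorted(
        posts,
        key=lambda p: (p.spam_score > 0.8, -p.created_at.timestamp()),
    )
```

The key design difference: the second function needs no prediction of what users will react to, only a timestamp and a spam check, which is what Haugen means by not wanting “computers deciding what we focus on.”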
She noted that Facebook would likely resist such a plan, because content that gets more engagement drives people to post and comment more, which is better for its business. “I've spent most of my career working on systems like engagement-based ranking,” Haugen said. “When I come to you and say these things, I’m basically damning 10 years of my own work.”
Reform Section 230
In a similar vein, Haugen said that Section 230 — the 1996 law that protects companies from being liable for what their users say and do on their platforms — should be reformed “to make Facebook responsible for the consequences of their intentional ranking decisions.” She said that such a law would likely “get rid of engagement-based ranking” because it would become too big of a liability for the company.
At the same time, she cautioned lawmakers to not let Facebook “trick” them into believing that changing Section 230 alone would be enough to address the scope of its problems. She also noted that using the law to police Facebook’s algorithms could be easier than trying to address specific types of content. “User generated content is something that companies have less control over, they have 100% control over their algorithms,” Haugen said.
The focus on Section 230 is significant because lawmakers from both parties have proposed various changes to the law. During the hearing, Blumenthal indicated that he too supported “narrowing this sweeping immunity when platforms’ algorithms amplify illegal conduct.” Senator Amy Klobuchar has also proposed ending 230 protections for vaccine and health misinformation. Meanwhile, Republicans have tried to eliminate Section 230 for very different reasons.
Slow down virality
Likewise, Haugen suggested that Facebook should slow down its platform with “soft interventions” that add small bits of friction. She pointed to Twitter’s “read before sharing” prompts as the kind of measure that can reduce the spread of misinformation.
“Small actions like that friction don't require picking good ideas and bad ideas,” she said. “They just make the platform less twitchy, less reactive. And Facebook's internal research says that each one of those small actions dramatically reduces misinformation, hate speech and violence-inciting content on the platform.”
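The mechanic she describes gates the action rather than the content. Here is a toy sketch of a “read before sharing” prompt along those lines; the function names and the opened-article check are invented for illustration, not Twitter's or Facebook's actual logic.

```python
# Hypothetical sketch of a "read before sharing" soft intervention;
# names and flow are assumptions, not any platform's real code.
from typing import Callable


def attempt_reshare(opened_article: bool, confirm: Callable[[str], bool]) -> bool:
    """The reshare is never forbidden; it just takes one extra
    deliberate step when the user hasn't opened the link they're
    about to amplify."""
    if opened_article:
        return True  # no friction needed
    # The prompt doesn't judge the content as a good or bad idea;
    # it only makes the platform "less twitchy, less reactive."
    return confirm("Want to read the article before sharing?")


# Example: a reshare of an unread link goes through only if confirmed.
shared = attempt_reshare(opened_article=False, confirm=lambda msg: False)
print(shared)  # False: our stub declines; a user who confirms still shares
```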
Facebook has taken these steps in the past. Notably, it applied these “break glass” measures in the days after the presidential election, though the company rolled some of them back the following month. The company implemented similar changes again, less than a month later, in the aftermath of the January 6th insurrection.
Haugen said that Facebook has mischaracterized these changes as being harmful to free speech, when in fact the company is concerned because it “wanted that growth back.” During the hearing, she said that Mark Zuckerberg had been personally briefed on just how impactful changes like this could be. But, she said, he prioritized the platform’s growth “over changes that would have significantly decreased misinformation and other inciting content.”
Open Facebook’s research to people outside the company
Access to Facebook’s data has become a hot-button issue in recent weeks, as outside researchers have complained that the company is stifling independent research. Haugen said the social network should work toward making its internal research available to the public.
She proposed that there should be a set period of time (she suggested as long as 18 months) when Facebook is able to keep its research under wraps, after which the company should make it accessible to outsiders.
“I believe in collaboration with academics and other researchers that we can develop privacy-conscious ways of exposing radically more data than is available today,” Haugen said. “It is important for our ability to understand how algorithms work, how Facebook shapes the information we get to see, that we have these data sets be publicly available for scrutiny.”
She went on to say that Facebook's researchers are among its “biggest heroes” because “they are boldly asking real questions and willing to say awkward truths.” She said it was “unacceptable” that the company has been “throwing them under the bus” in its effort to downplay her disclosures.
A dedicated ‘oversight body’
Besides internal changes, Haugen also said that there should be a dedicated “oversight body” with the power to oversee social media platforms. She said that such a group within an agency like the Federal Trade Commission could provide “a regulatory home where someone like me could do a tour of duty after working at a place like this.”
“Right now, the only people in the world who are trained to analyze these experiments, to understand what's happening inside of Facebook, are people who grew up inside of Facebook or Pinterest or another social media company,” she said.
Importantly, this “oversight body” would be separate from the Facebook-created Oversight Board, which advises Facebook on specific content decisions. While Facebook has said the creation of the Oversight Board is proof it’s trying to self-regulate, Haugen wrote in prepared remarks that the Oversight Board “is as blind as the public” when it comes to truly knowing what happens inside of the company.
It’s also worth noting that Haugen said she was opposed to efforts to break up Facebook. She said that separating Facebook and Instagram would likely result in more advertisers flocking to Instagram, which could deplete Facebook’s resources for making changes to improve its platform.
What’s next
While it’s unclear which, if any, of Haugen’s recommendations Congress will act on, her disclosures have already caught the attention of regulators. In addition to providing documents to Congress, she has also given documents to the Securities and Exchange Commission. She has alleged that Zuckerberg and other executives have “misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection," according to SEC filings published by 60 Minutes.
Meanwhile, Facebook has continued to push back on Haugen’s claims. A week after an executive told lawmakers that “this is not bombshell research,” the company tried to discredit Haugen more directly. In a statement, Facebook’s Director of Policy Communications Lena Pietsch said Haugen “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives. We don’t agree with her characterization of the many issues she testified about.” Pietsch added that “it’s time to begin to create standard rules for the internet.”
In an appearance on CNN following the hearing, Facebook VP Monika Bickert referred to Haugen’s disclosures as “stolen documents” and said the company’s research had been “mischaracterized.” Later that night, Zuckerberg publicly weighed in for the first time since The Wall Street Journal began publishing stories based on Haugen's disclosures. (Zuckerberg did once refer to earlier coverage of the scandals, complaining that a news article had mistakenly described his hydrofoil as an "electric surfboard.") In his first substantive statement, he said "many of the claims don't make any sense," and that "the argument that we deliberately push content that makes people angry for profit is deeply illogical."
It could still get more difficult for Facebook to counter Haugen, though, particularly if new documents become public. Her letter to the SEC suggests that Facebook knew much more about QAnon and violent extremism on its platform than it let on, as Vice reported earlier. Haugen may make appearances in front of lawmakers in other countries, too. European lawmakers, many of whom have expressed similar concerns as their US counterparts, have also indicated they want to talk to Haugen and conduct new investigations of their own.
Facebook CEO Mark Zuckerberg didn't testify at today's whistleblower hearing, but he has posted a lengthy reply to the accusations being lobbed at the company. He said Frances Haugen's claims don't make sense and that they paint a "false picture" of the social network. "At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That's just not true," he wrote in his post. The Facebook chief cited the Meaningful Social Interactions (MSI) update to News Feed, which was designed to show fewer viral videos and more content from friends and family.
He said the company went through with the change knowing that it would make people spend less time on the website, because research suggested it was the right thing to do for people's well-being. In her testimony, Haugen painted MSI in a less flattering light. She said Zuckerberg chose to apply "metrics defined by Facebook" like MSI "over changes that would have significantly decreased misinformation and other inciting content." The whistleblower said the CEO was presented with solutions to make Facebook "less viral, less twitchy," but he decided not to use them because they had a negative impact on the MSI metric.
In the SEC complaint she filed, Haugen claimed that Facebook allowed "hateful" and "divisive" content, because it is "easier to inspire people to anger than it is to other emotions." Zuckerberg addressed that in his post, as well, calling it "deeply illogical." Facebook makes money from ads, he said, and advertisers apparently tell the company that they don't want their ads next to harmful or angry content.
In addition, Zuckerberg said the research into how Instagram affects young people was mischaracterized. He didn't explicitly mention it, but The Wall Street Journal published an article in mid-September about how it knows Instagram is toxic for teen girls based on internal documents detailing Facebook's own research. The social network eventually published a couple of documents from that research, but Haugen provided Congress with four more. Zuckerberg defended the platform, writing that many teens the company heard from actually "feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced."
Haugen, who joined Facebook in 2019, worked on democracy and misinformation issues when she was with the company. She brought "tens of thousands" of pages of internal Facebook documents to Whistleblower Aid founder John Tye in addition to filing a whistleblower complaint with the SEC. There were several reports that came out based on those documents, including the existence of a VIP program that enabled high-profile users to skirt Facebook's rules. Haugen also accused Facebook of contributing to election misinformation and the January 6th US Capitol riots.
As for Zuckerberg, part of his post reads:
"If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn't care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we're doing? And if social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the US while it stays flat or declines in many countries with just as heavy use of social media around the world?"