Posts with «social & online media» label

Facebook Gaming creators can now stream together

If you turn to Facebook Gaming to watch other people play your favorite games, you’ll now see creators collaborate on joint streams. Facebook is introducing support for co-streaming, a feature that allows up to four accounts to stream together concurrently. With today’s rollout, co-streaming is available to all content creators, not just those enrolled in Facebook’s partner program. You can access the functionality, including searching for and tagging other creators, through the Live Producer left rail, as well as through the Gaming Tab and the Stream Dashboard of the edit stream module.

“With co-streaming, we aim to increase discoverability for creators, encourage collaboration between creators and elevate the overall viewing experience for everyone,” the company said in a blog post. The addition of co-streaming support is a case of Facebook playing catch up. Twitch has offered similar functionality through its Squad Stream feature since 2019. It’s also worth pointing out the idea itself isn’t new. While it’s no longer around, OnLive introduced a multi-view feature back in 2012. All the same, it’s a significant addition for Facebook Gaming, particularly as it continues to try to compete with Twitch.

Facebook whistleblower hearing: ‘Facebook and big tech are facing a big tobacco moment’

The Facebook whistleblower who has provided a trove of internal documents to Congress and the Securities and Exchange Commission is testifying about research she says proves the social network has repeatedly lied about its platform. The documents were the basis for The Wall Street Journal's reporting on Facebook's controversial rules for celebrities, and the disastrous effect of Instagram on some teens' mental health.

“Facebook and big tech are facing their big tobacco moment,” committee chairman Sen. Richard Blumenthal said at the start of the hearing. “Facebook knows its products can be addictive and toxic to children. They value their profit more than the pain that they cause to children and their families.”

In her opening statement, Frances Haugen, the former Facebook product manager turned whistleblower, said that the company has ignored much of its own research and is “buying its profits with our safety.” She urged Congress to adopt new regulations.

“The choices being made inside of Facebook are disastrous for our children, our public safety, our privacy and for our democracy,” Haugen said. “And that is why we must demand Facebook make changes.”

She said that Facebook’s unwillingness to make data available outside of its own research teams has helped the company mislead the public. “The company intentionally hides vital information from the public, from the US government, and from governments around the world,” Haugen said. “The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages.”

She also said that Congress should not be swayed by Facebook’s insistence on “false choices,” and that simply reforming privacy laws or Section 230 would not go far enough. “We can afford nothing less than full transparency,” Haugen said. “Facebook wants you to believe that the problems we're talking about are unsolvable... Facebook can change, but it's clearly not going to do so on its own."

Haugen’s appearance comes days after Facebook sent its head of safety, Antigone Davis, to testify in front of the same committee. She and other executives have repeatedly tried to downplay the company's research, with Davis saying that the documents "were not bombshell research." In Tuesday’s hearing, some senators called out Mark Zuckerberg, saying that they should be hearing from him instead. “Rather than taking personal responsibility, showing leadership, Mark Zuckerberg is going sailing,” Blumenthal said, in an apparent reference to a recent Facebook post from the CEO.

Developing...

Instagram brings IGTV videos out of their silo and into your regular feed

Instagram may have been down for hours, but it's back with an arguably overdue change to how the social network handles videos. The Facebook brand is merging its long-form IGTV format with the regular videos from your feed, leaving just one format for all Instagram footage — you won't have to distinguish between the two. We've asked Instagram whether this will lead to longer videos playing directly in your feed; either way, there will always be a Video tab in profiles where you can explore more content.

The move might help Instagram-based creators, too. While they'll still have to produce long-form video to qualify for ads (now called Instagram In-Stream video ads), they should have an easier time reaching viewers who would have glossed over IGTV in the past. Instagram is also merging post and video insights into a single metric.

You may have an easier time polishing that video, too. Instagram is bringing a few common editing features to its videos, including trimming, filters, location tags and people tags. Still images have had filters and tags for a while, of course, but this theoretically puts moving pictures more on par. You may have an easier time finding videos linked to a friend or a memorable concert.

The merger could boost uptake for Instagram video. IGTV demand wasn't what Instagram expected, and this update makes the longer video format more accessible, not to mention less confusing. Creators might be more inclined to share videos on Instagram as a result — and that might help the social media giant better compete against TikTok, YouTube and other video-focused heavyweights.

Facebook is coming back online after hours-long outage

Facebook services seem to be slowly coming back online after one of the biggest outages in recent memory. Facebook, Instagram and Messenger’s apps appear to be working again, though the websites are loading more slowly than usual. Meanwhile, WhatsApp's website seems to be back, but the app is still having issues connecting. 

As of 6:05pm ET Monday, the "Facebook for Business Status" page was still showing "major disruptions" to the social network's core services. But that was still an improvement from earlier in the day, when the website was offline entirely.

Facebook didn’t immediately comment or elaborate on the cause of the outage. In an earlier tweet, the company’s outgoing Chief Technology Officer, Michael Schroepfer, cited “networking issues.”

*Sincere* apologies to everyone impacted by outages of Facebook powered services right now. We are experiencing networking issues and teams are working as fast as possible to debug and restore as fast as possible

— Mike Schroepfer (@schrep) October 4, 2021

The outage lasted more than six hours, taking down Instagram, WhatsApp, Messenger and Oculus. It also wreaked havoc on the company internally, with employees reportedly unable to access emails, Workplace and other tools. The New York Times reported that employees were also physically locked out of offices as workers’ badges stopped working.

It also shaved billions of dollars off of Mark Zuckerberg’s personal net worth as Facebook’s stock tanked, Bloomberg reported. Elsewhere, the company is still reeling from the fallout of a whistleblower who has accused the company of prioritizing “profits over safety.” The whistleblower was The Wall Street Journal’s primary source for several articles that detail how Instagram is harmful to teens and examine the company’s controversial “cross check” program, which allows high-profile users to break its rules.

Developing...

Facebook whistleblower reveals identity, says company 'chooses profits over safety'

Internal documents published by the Wall Street Journal (WSJ) recently revealed that Facebook allowed VIPs to break its rules and that it was aware of how Instagram affected the mental health of teens. Now, the whistleblower who brought that information to light has revealed herself as Frances Haugen in an interview with 60 Minutes, the New York Times has reported.

"I’ve seen a bunch of social networks and it was substantially worse at Facebook than what I had seen before," Haugen told the NYT. "Facebook, over and over again, has shown it chooses profit over safety." 

Haugen joined Facebook in 2019, working on democracy and misinformation issues, while also handling counterespionage, according to a personal website and Twitter account she and her team set up. She worked as a Facebook product manager and left the company in May. 

She first brought "tens of thousands" of pages of internal Facebook documents to Whistleblower Aid founder John Tye, requesting legal protection and help in releasing the information. The trove included internal company research, slide decks, cover letters and more. She also filed a whistleblower complaint with the Securities and Exchange Commission (SEC), accusing Facebook of taking internal actions that didn't match its public statements. 

Whistleblower Frances Haugen is a data scientist from Iowa with a computer engineering degree and a Harvard MBA. She told us the only job she wanted at Facebook was to work against misinformation because she had lost a friend to online conspiracy theories. https://t.co/csgaRe6k5h pic.twitter.com/tSNav057As

— 60 Minutes (@60Minutes) October 3, 2021

In the SEC complaint, Haugen compared Facebook's internal research and documents to public statements and disclosures made by CEO Mark Zuckerberg and other executives. In one example, she said that Facebook contributed to election misinformation and the January 6th US Capitol insurrection. 

"Facebook has publicized its work to combat misinformation and violent extremism relating to the 2020 election and insurrection," she wrote in a cover letter on the subject. "In reality, Facebook knew its algorithms and platforms promoted this type of harmful content, and it failed to deploy internally recommended or lasting countermeasures."

On top of being in touch with the SEC's whistleblower office, which normally provides protections for corporate tipsters, she and her legal team contacted Senators Richard Blumenthal (D) and Marsha Blackburn (R). She also spoke to lawmakers in France and Britain, along with a member of the European parliament. 

Facebook, which has struggled to quell leaks of late, preemptively pushed back ahead of the 60 Minutes interview, calling the accusations "misleading." VP for policy and global affairs Nick Clegg told CNN that Facebook represented "the good, the bad and the ugly of humanity" and that it was trying to "mitigate the bad, reduce it and amplify the good." He added that it was "ludicrous" to blame January 6th on social media.

In a statement to the New York Times, Facebook spokesperson Lena Pietsch said it was continuing "to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true." On Tuesday, October 5th, Haugen is set to testify in Congress about issues surrounding Facebook's impact on young users.

Facebook keeps downplaying its own research and lawmakers aren’t buying it

Yet another Facebook official just spent hours being grilled by members of Congress about the company’s policies, and whether or not it does enough to protect some of its most vulnerable users. And once again, the Facebook executive — today it was Head of Safety Antigone Davis — seemed to do her best to dodge the most difficult questions.

But the latest hearing on teen mental health, which came in response to reporting from The WSJ, was different from past hearings. That’s because, thanks to a whistleblower, members of the Senate Commerce Committee now have access to thousands of internal documents written by the company’s own researchers.

The documents, some of which have been made public, paint a very different picture of Facebook and Instagram’s understanding of how their services impact teens’ mental health than what they’ve publicly portrayed. Those documents are in the hands of lawmakers, making the findings that much harder for Facebook to spin. The disclosures have already forced Facebook to "pause" work on an Instagram Kids app.

“We now have a deep insight into Facebook's relentless campaign to recruit and exploit young users,” Senator Richard Blumenthal said at the start of the hearing. “We now know, while Facebook publicly denies that Instagram is deeply harmful for teens, privately, Facebook researchers and experts have been ringing the alarm for years.”

This has forced Facebook into the uncomfortable position of trying to downplay the significance of its own research. “This is not bombshell research,” Davis repeated multiple times during the hearing. One day earlier, Facebook released heavily annotated versions of two of the documents, with notes that also tried to explain away its own findings. Those documents, which were just two of the “thousands” Blumenthal said he now has access to, used words like “myopic” and “sensationalizing” to try to minimize findings like the fact that Instagram makes “body image issues worse for 1 in 3 teen girls.”

The tactic didn’t go over well in the Senate on Thursday. “This research is a bombshell,” Blumenthal said. “It is powerful, gripping, riveting evidence that Facebook knows the harmful effects of its site on children, and that it has concealed those facts and findings.”

As with past hearings, there were some cringey moments. At one point, Blumenthal demanded to know if Facebook would “commit to ending finsta” — a reference to the secondary accounts often used by teens to stay anonymous. That forced Davis to awkwardly explain that so-called “finstas” are not an official Instagram feature. At another point, Sen. Ted Cruz demanded Davis explain why she wasn’t appearing at the hearing in person (she cited COVID-19 protocols).

Here's the full exchange. Unlike Sen. Orrin Hatch in 2018 (who was mocked based on a clip that was taken out of context to suggest he didn't know Facebook runs ads), the longer clip in this case doesn't really get any less awkward for Sen. Blumenthal. https://t.co/dn0LCmdiQ4

— Will Oremus (@WillOremus) September 30, 2021

But even with those moments, it was difficult to ignore the significance of these issues. It may seem obvious, but kids and teens are incredibly important to the company, which is consistently behind rivals like TikTok and Snapchat for that demographic. So much so that a former employee who worked on Messenger Kids recently said that “losing the Teen audience was considered an 'existential threat,'” for Facebook.

Worse for Facebook, there are very likely more bombshells coming. The whistleblower who provided the documents to The Journal and lawmakers is appearing on 60 Minutes Sunday night. And she is testifying at a separate Commerce Committee hearing next week. So while Facebook executives may be able to dodge questions and insist that their researchers’ conclusions have been mischaracterized, it will be much harder to rebut someone who was closely involved with that work.

Some senators hinted that there would be more to come at the next hearing. Senator Ben Ray Luján asked Davis whether “Facebook ever tested whether a change to its platform increases an individual's or a group of users' propensity to post violent or hateful language.” Davis said that it wasn’t her “area of expertise.”

“We might get more responses to that one next week,” he said.

Google adds more information to its ‘About this result’ feature

At the start of the year, Google added a feature to its search engine called About This Result. By tapping on the three-dot icon located next to most results, the tool allows you to find out more about a website before you navigate to it. With the initial rollout of About This Result, Google displayed information from Wikipedia, and, if that wasn’t available, it pulled what it could from one of its services. The panels also included details about the website, like when it was first indexed by the company and whether you could expect a secure connection.

Today, Google is making those panels more robust by adding to the diversity of information they display. To start, in addition to a description from Wikipedia, you’ll see what each website has to say about itself in its own words. You will also see what others have had to say about them — be that through reviews or a simple news article. In the “about the topic” section, Google will include other coverage or results from different sources.

As before, the idea behind the About This Result feature is to save you an extra search when you want to find out more about a website you’re about to visit. Google also sees it as a way to help people make more informed decisions about how they use the internet, and to provide peace of mind when you’re looking for information on important topics.

Aside from these "Information Literacy features," the company also announced new additions to its results pages during its Search On event today: Things to Know, Refine This Search, Broaden This Search and Related Topics, all designed to make it easier to learn more about different topics. Things To Know, for example, will pull up the basics you'll need to understand a new subject, while refining and broadening your search can help you explore related issues.

Google Search users in the US will see today's expansion roll out over the coming weeks and months.

Facebook will promote Instagram Reels in News Feed

Facebook’s TikTok clone is no longer just for Instagram. As of today, the social network is officially bringing Reels to the main Facebook app. With the change, users can create Reels directly from Facebook, and the company will recommend the short-form videos in all users’ News Feeds.

Facebook has been testing out various ways of bringing Reels out of Instagram for a while, and began testing cross-posting features last month. Now, it’s also testing a feature that allows Reels creators on Instagram to promote their videos in Facebook’s News Feed directly, even if they don’t use the app.

Reels has become increasingly important to Facebook and its efforts to challenge TikTok. The company has been steadily adding features to the service, and is attempting to lure creators with the promise of payouts for hitting certain milestones. Now, the company is adding a new "invite-only" bonus program to coincide with Reels' launch on Facebook and encourage creators to start posting on the social network.


But it’s the potential challenge to TikTok that could be most significant for the company. Documents reported by The Wall Street Journal show that Facebook has been struggling to incentivize teens and younger users to post original content. Internally, the company is reportedly worried about ceding influence to TikTok, where teens spend much more time than on Facebook’s apps.

Promoting Reels in the main Facebook app, which is already not especially popular with teens, may not seem like the most direct way to solve that. But getting more eyes on users’ Reels will help the feature grow even if its top users don’t spend much time on Facebook itself.

YouTube bans all content containing vaccine misinformation

YouTube has banned all videos containing misinformation about vaccines that are currently administered and have been approved by local health authorities or the World Health Organization. The measure is an expansion of a policy covering COVID-19 vaccines.

The service says that users shouldn't, for instance, post videos in which they claim that vaccines lead to chronic side effects (other than rare side effects that health authorities have acknowledged); content that alleges vaccines don't reduce transmission or contraction of diseases; or videos that have inaccuracies about vaccine ingredients.

There are some exceptions. YouTube "will continue to allow content about vaccine policies, new vaccine trials and historical vaccine successes or failures." Users can also share scientific discussions of vaccines and personal testimonials about their experiences, as long as they don't have a history of promoting vaccine misinformation and their video complies with YouTube's other rules. Posting videos that "condemn, dispute or satirize misinformation" that violates YouTube's policies should be okay too.

YouTube told the Washington Post that it's taking down channels linked to prominent anti-vaccine advocates, including Joseph Mercola and Robert F. Kennedy Jr. The reason it didn't move to ban all anti-vaccine content sooner is that it was focusing on COVID-19 vaccine misinformation.

“Developing robust policies takes time,” YouTube’s vice president of global trust and safety Matt Halprin told the publication. “We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.”

YouTube, as well as Facebook and Twitter, banned COVID-19 misinformation in the early days of the pandemic in the spring of 2020. YouTube has removed more than 130,000 videos that broke the COVID-19 vaccine rules it announced last October, and more than a million videos in total that included coronavirus misinformation.

Meanwhile, Facebook has been working to reduce the spread of anti-vaccine content since at least 2019. It formally banned vaccine misinformation in February.

CNN restricts access to its Facebook pages in Australia

CNN has become the first US media organization to restrict Australians' access to its Facebook pages, according to The Wall Street Journal. The move comes weeks after the country's high court ruled that media companies are liable for the comments left by other people on their Facebook posts. A CNN spokesperson told the publication that users in Australia will no longer be able to see its main Facebook page, its CNN International page and the dedicated pages for its shows. 

Dylan Voller filed the original case that prompted Australia's courts to decide whether media organizations should be liable for comments left on their Facebook pages. Voller became famous back in 2016 after a TV exposé on the mistreatment of minors in the criminal detention system showed a photo of him hooded and strapped to a chair when he was only 17. Major news outlets used that photo for their articles that were then posted on Facebook, where commenters falsely accused Voller of serious crimes, such as raping an elderly woman. 

A CNN source told The Journal that the organization asked Facebook if it would help media companies disable comments entirely. However, the social network reportedly declined to disable all comments on CNN's pages in Australia. Facebook rolled out a tool back in March that allows celebrities, politicians and news outlets to restrict who can comment on their pages, but they'd still have to set a restriction for every post. CNN has decided that doing so for all its properties would be time-consuming and opted to completely block Australia instead.

A CNN spokesperson said:

“We are disappointed that Facebook, once again, has failed to ensure its platform is a place for credible journalism and productive dialogue around current events among its users.”

As for Facebook, it told The Journal that it supports the reform of Australia's defamation laws. In addition, it said it provided CNN with features it can use to manage comments and that it continues to "provide Australians a destination for quality journalism, including through Facebook News which we launched in August." 

Earlier this year, Australia also passed a law that requires tech giants to pay news outlets for using their content. As a response, Facebook blocked Australian publishers and residents from sharing news content. It quickly rolled back the ban, however, and agreed to pay some news organizations for their content.