Posts with «media» label

Warner Bros. 'Reminiscence' promo uses deepfake tech to put you in the trailer

If you want to see yourself on screen with Hugh Jackman, this is your chance. The promo for Warner Bros.' upcoming Reminiscence movie uses deepfake technology to turn a photo of your face — or anybody's face, really — into a short video sequence with the star. According to Protocol, a media startup called D-ID created the promo for the film. D-ID reportedly started out wanting to develop technology that could protect consumers against facial recognition, but then realized its tech could also be used to optimize deepfakes.

For this particular project, the firm created a website for the experience, where you'll be asked for your name and a photo. You can upload a photo of anybody you want, and the experience will then conjure up an animation for the face in it. The animation isn't perfect by any means, and the face can look distorted at times, but it's still not bad, considering the technology created it from a single picture.

Reminiscence is a sci-fi thriller about Nick Bannister, a "private investigator of the mind." The idea behind the promo is that you're a client looking into your memories to solve a case. The movie will be shown in theaters on August 20th, but like most new releases these days, it will also be available for streaming on HBO Max.


Why is Facebook so bad at countering vaccine misinformation?

It’s been six months since Facebook announced a major reversal to its policies on vaccine misinformation. Faced with a rising tide of viral rumors and conspiracy theories, the company said it would start removing vaccine mistruths from its platform. Notably, the effort encompassed not only content about COVID-19 vaccines, but all vaccines. That includes many of the kinds of claims it had long allowed, like those linking vaccines and autism, or statements that vaccines are “toxic” or otherwise dangerous.

The move was widely praised, as disinformation researchers and public health officials have long urged Facebook and other platforms to treat vaccine misinformation more aggressively. Since then, the company has banned some prominent anti-vaxxers, stopped recommending health-related groups and shown vaccine-related PSAs across Facebook and Instagram. It now labels any post at all that mentions COVID-19 vaccines, whether factual or not.

Yet, despite these efforts, vaccine misinformation is still an urgent problem, and public health officials say Facebook and other social media platforms aren’t doing enough to address it. Last month, the Surgeon General issued an advisory warning of the dangers of health misinformation online. The accompanying 22-page report didn’t call out any platforms by name, but it highlighted algorithmic amplification and other issues commonly associated with Facebook. The following day, President Joe Biden made headlines when he said that misinformation on Facebook was “killing people.”

While Facebook has pushed back, citing its numerous efforts to quash health misinformation during the pandemic, the company’s past lax approach to vaccine misinformation has likely made that job much more difficult. In a statement, a Facebook spokesperson said vaccine hesitancy has decreased among its users in the US, but the company has also repeatedly rebuffed requests for more data that could shed light on just how big the problem really is.

“Since the beginning of the pandemic, we have removed 18 million pieces of COVID misinformation, labeled hundreds of millions of pieces of COVID content rated by our fact-checking partners, and connected over 2 billion people with authoritative information through tools like our COVID information center,” a Facebook spokesperson told Engadget. “The data shows that for people in the US on Facebook, vaccine hesitancy has declined by 50% since January, and acceptance is high. We will continue to enforce against any account or group that violates our COVID-19 and vaccine policies and offer tools and reminders for people who use our platform to get vaccinated.”

Facebook’s pandemic decision

Throughout the pandemic, Facebook has moderated health misinformation much more aggressively than it has in the past. Yet for the first year of the pandemic, the company made a distinction between coronavirus misinformation — e.g., statements about fake cures or disputing the effectiveness of masks, which it removed — and vaccine conspiracy theories, which it said did not break the company’s rules. Mark Zuckerberg even said that he would be reluctant to moderate vaccine misinformation the same way the company has with COVID misinformation.

That changed this year, with the advent of COVID-19 vaccines and the rising tide of misinformation and vaccine hesitancy that accompanied them, but the damage may have already been done. A peer-reviewed study published in Nature in February found that exposure to misinformation about the COVID-19 vaccines “lowers intent to accept a COVID-19 vaccine” by about 6 percent.

People are also more likely to be unvaccinated if they primarily get their news from Facebook, according to a July report from the COVID States Project. The researchers sampled more than 20,000 adults in all 50 states and found that those who cited Facebook as a primary news source were less likely to be vaccinated. While the authors note that it doesn’t prove that using Facebook affects someone’s choice to get vaccinated, they found a “surprisingly strong relationship” between the two.

“If you rely on Facebook to get news and information about the coronavirus, you are substantially less likely than the average American to say you have been vaccinated,” they write. “In fact, Facebook news consumers are less likely to be vaccinated than people who get their coronavirus information from Fox News. According to our data, Facebook users were also among the most likely to believe false claims about coronavirus vaccines.”

The researchers speculate that this could be because people who spend a lot of time on Facebook are less likely to trust the government, the media or other institutions. Or, it could be that spending time on the platform contributed to that distrust. While there’s no way to know for sure, we do know that Facebook has for years been an effective platform for spreading disinformation about vaccines.

A spotty record

Doctors and researchers have warned for years that Facebook wasn’t doing enough to prevent lies about vaccines from spreading. Because of this, prominent anti-vaxxers have used Facebook and Instagram to spread their message and build their followings.

A report published earlier this year by the Center for Countering Digital Hate (CCDH) found that more than half of all vaccine misinformation online could be linked to 12 individuals who are part of a long-running, and often coordinated, effort to undermine vaccines. But while the company has banned some accounts, some of those individuals still have a presence on a Facebook-owned platform, according to the CCDH. Facebook has disputed the findings of that report, which relied on analytics from the company's CrowdTangle tool. But the social network’s own research into vaccine hesitancy indicated “a small group appears to play a big role” in undermining vaccines, The Washington Post reported in March.

There are other issues, too. For years, Facebook’s search and recommendation algorithms have made it extraordinarily easy for users to fall into rabbit holes of misinformation. Simply searching the word “vaccine” would be enough to surface recommendations for accounts spreading conspiracy theories and other vaccine disinformation.

Engadget reported last year that Instagram’s algorithmic search results associated anti-vaccine accounts with COVID-19 conspiracies and QAnon content. More than a year later, a recent study from Avaaz found that although this type of content no longer appears at the top of search results, Facebook’s recommendation algorithms continue to recommend pages and groups that promote misinformation about vaccines. In their report, researchers document how users can fall into misinformation “rabbit holes” by liking seemingly innocuous pages or searching for “vaccines.” They also found that Facebook’s page recommendation algorithm appeared to associate vaccines and autism.

“Over the course of two days, we used two new Facebook accounts to follow vaccine-related pages that Facebook suggested for us. Facebook’s algorithm directed us to 109 pages, with 1.4M followers, containing anti-vaccine content — including pages from well-known anti-vaccine advocates and organizations such as Del Bigtree, Dr. Ben Tapper, Dr. Toni Bark, Andrew Wakefield, Children's Health Defense, Learn the Risk, and Dr. Suzanne Humphries. Many of the pages the algorithm recommended to us carried a label, warning that the page posts about COVID-19 or vaccines, giving us the option to go directly to the CDC website. The algorithm also recommended 10 pages related to autism — some containing anti-vaccine content, some not — suggesting that Facebook’s algorithm associates vaccines with autism, a thoroughly debunked link that anti-vaccine advocates continue to push.”

Facebook has removed some of these pages from its recommendations, though it’s not clear which. Avaaz points out that there’s no way to know why Facebook’s recommendation algorithm surfaces the pages it does as the company doesn’t disclose how these systems work. Yet it’s notable because content associating vaccines with autism is exactly one of the claims that Facebook said it would ban under its stricter misinformation rules during the pandemic. That Facebook’s suggestions are intermingling the topics is, at the very least, undermining those efforts.

Claims and counterclaims

Facebook has strongly opposed these claims. The company repeatedly points to its messaging campaign around COVID-19 vaccines, noting that more than 2 billion people have viewed the company’s COVID-19 and vaccine PSAs.

In a blog post responding to President Biden’s comments last month, Facebook’s VP of Integrity Guy Rosen argued that “vaccine acceptance among Facebook users in the US has increased.” He noted that the company has “reduced the visibility of more than 167 million pieces of COVID-19 content debunked by our network of fact-checking partners so fewer people see it.”

He didn’t share, however, how much of that misinformation was about vaccines, or details on the company’s enforcement of its more general vaccine misinformation rules. That’s likely not an accident. The company has repeatedly resisted efforts that could shed light on how misinformation spreads on its platform.

Facebook executives declined a request from their data scientists who asked for additional resources to study COVID-19 misinformation at the start of the pandemic, according to The New York Times. It’s not clear why the request was turned down, but the company has also pushed back on outsiders’ efforts to gain insight into health misinformation.

Facebook has declined to share the results of an internal study on vaccine hesitancy on its platform, according to Washington DC Attorney General Karl Racine’s office, which has launched a consumer protection investigation into the company’s handling of vaccine misinformation.

“Facebook has said it’s taking action to address the proliferation of COVID-19 vaccine misinformation on its site,” a spokesperson said. “But then when pressed to show its work, Facebook refused.”

The Biden Administration has also — unsuccessfully — pushed Facebook to be more forthcoming about vaccine misinformation. According to The New York Times, administration officials have met repeatedly with Facebook and other platforms as part of its effort to curb misinformation about coronavirus vaccines. Yet when a White House official asked Facebook to share “how often misinformation was viewed and spread,” the company refused. According to The Times, Facebook responded to some requests for information by talking about “vaccine promotion strategies,” such as its PSAs or its tool to help users book vaccine appointments.

One issue is that it’s not always easy to define what is, and isn’t, misinformation. Factual information, like news stories or personal anecdotes about vaccine side effects, can be shared with misleading commentary. This, Facebook has suggested, makes it difficult to study the issue in the way that many have asked. At the same time, Facebook is a notoriously data-driven company. It’s constantly testing even the smallest features, and it employs scores of researchers and data scientists. It’s difficult to believe that learning more about vaccine hesitancy and how misinformation spreads is entirely out of reach.

Facebook Messenger rolls out end-to-end encrypted voice and video calls

Facebook is rolling out a host of features for Messenger users who switch on end-to-end encryption (E2EE). You can now call Messenger contacts using voice or video with E2EE enabled, just like in WhatsApp.

No one other than the person you're speaking with can see or listen to your E2EE chats or calls, so you can add an extra layer of protection to your voice and video conversations on Messenger. However, Facebook says you can still report messages if needed.

There are updates for disappearing messages as well. You'll see an option for them when you tap your profile photo in a chat, as well as in the message compose field (tap the timer icon there). You can now activate disappearing messages for everyone in a chat, not just yourself. On top of that, you'll have more control over how long messages remain visible before they vanish — anywhere from five seconds to 24 hours.

Facebook has some other E2EE features in the works. It's planning to start public tests of E2EE group chats and calls in Messenger in the coming weeks. The company will also begin a limited test of E2EE for Instagram direct messages. You'll need to have an existing chat with someone or follow each other before you can enable E2EE on a DM exchange.

Apple's Tom Hanks sci-fi movie 'Finch' arrives November 5th

Apple has given a release date for the second of two Tom Hanks films it acquired during the pandemic. Finch, a futuristic tale about a reclusive inventor and his canine and robot road buddies, hits Apple TV+ on November 5th. Like Hanks' war movie Greyhound before it, the film became a casualty of the pandemic, mired by release date delays until Apple swooped in to acquire it from Universal. The robot (pictured above) is played by Caleb Landry Jones, fresh off a best actor win at Cannes. 

Hanks plays the titular character, an ailing robotics engineer who emerges from his self-imposed underground exile to journey across a desolate American wasteland. Along for the ride are his dog, Goodyear, and an android who names himself Jeff. Together, they make a dysfunctional family, but can they learn to get along? I guess we'll have to wait till November to find out. 

Finch is also loaded with offscreen talent. The good-natured sci-fi flick is directed by Emmy-winning Game of Thrones lead Miguel Sapochnik, with Alien scribe Ivor Powell and newcomer Craig Luck on screenwriting duties. Director Robert Zemeckis serves as an executive producer, and the film hails from Steven Spielberg's Amblin Entertainment, which may explain why the plot gives off major Cast Away and AI vibes.

The November release date also signals Apple's possible confidence in the film's awards chances. Either that or it's just looking to add a major movie to its fall line-up, which is already stocked with big hitters like sci-fi series Foundation, the second season of The Morning Show and current affairs show The Problem with Jon Stewart.

'Not Tonight 2' tackles capitalism and political greed in an 'alternative' US

Three years after taking on Brexit in Not Tonight, publisher No More Robots and developer PanicBarn are working on a sequel to the satirical political RPG. Not Tonight 2 takes place in a supposedly alternate version of the US, where "capitalism and political greed have taken center stage and democracy is a thing of the past." 

Not Tonight 2 centers around three intertwined stories written by a group of POC artists and authors. Three characters — Malik, Kevin and Mari — embark on a road trip to try and save their friend Eduardo from deportation or another terrible fate.

The original game drew comparisons to Papers, Please for its core gameplay of checking IDs as a bouncer at pubs and nightclubs. Those mechanics are back in the sequel, along with a variety of minigames, including everything from word association and rhythm games to working at the border wall, joining a cult and serving poutine to Canadians. 

More than 400,000 people have played Not Tonight, which landed on Nintendo Switch last year. Not Tonight 2 should hit Steam later this year.


Facebook may be forced to sell Giphy following UK regulator findings

The UK's competition regulator has found that Facebook's acquisition of GIF-sharing platform Giphy will harm competition within social media and digital advertising. As part of its provisional decision, the watchdog voiced concerns that Facebook could prevent rivals including TikTok and Snapchat from accessing Giphy, a service they already use. It added that Facebook could also require customers of the GIF platform to hand over more data in return for access. If its objections are confirmed as part of the ongoing review, the regulator said it could force Facebook to unwind the deal and to sell off Giphy in its entirety.

The Competition and Markets Authority (CMA) ultimately determined that the deal stands to increase Facebook's sizeable market power. Together, its suite of apps — including Facebook, WhatsApp and Instagram — account for 70 percent of social media activity and are accessed at least once a month by 80 percent of internet users, the CMA said.

Beyond social media, the watchdog suggested that the acquisition could remove a potential challenger to Facebook in the $5.5 billion display advertising market. Citing Facebook's termination of Giphy's paid ad partnerships following the deal, the regulator said the move had effectively stopped the company's ad expansion (including to additional countries like the UK) in its tracks. This in turn had an impact on innovation in the broader advertising sector, the CMA explained.

Facebook's announcement last May that it was acquiring Giphy for a reported $400 million, with plans to integrate it with Instagram, immediately raised alarm bells for regulators. The social network is facing antitrust complaints in the US and the EU over its social media and advertising monopolies, respectively. At the same time, the UK has ramped up its scrutiny of Big Tech by creating a dedicated Digital Markets Unit to oversee the likes of Google, Facebook and Apple. The fledgling agency sits within the CMA and is designed to give people more control over their data.

Today, the CMA echoed those principles in its initial decision. The regulator said that it would "take the necessary actions" to protect users if it concludes that the merger is detrimental to competition. It will now consult on its findings as part of the review process. A final decision is slated for October 6th.

Facebook told Variety that it "disagrees" with the CMA's preliminary findings. "We will continue to work with the CMA to address the misconception that the deal harms competition,” the company added. It previously argued that Giphy has no operations in the UK, meaning that the CMA has no jurisdiction over the deal. In addition, it has claimed that Giphy's paid services cannot be classified as display advertising under the regulator's own market definition. 

The new season of ‘Star Trek: Lower Decks’ stays true to the show’s core

The following contains some spoilers for the second season premiere of ‘Star Trek: Lower Decks.’

The first season of Lower Decks was a pleasant surprise to many in the Star Trek fandom. What a lot of people had written off as Family Guy- or Rick and Morty-Trek ended up being a wholesome love letter to the history of the franchise. It was filled with plenty of low-brow humor, sure, but it also showcased characters who genuinely cared about each other and what they do. Thankfully, season two is more of the same.

Lower Decks takes its name from a season-seven The Next Generation episode that revolved around the lives of four ensigns, and the parts they played in a mission that only the bridge crew really understood the full scope of. It’s generally considered one of the best episodes of the franchise, which meant that anything that even vaguely referenced it had a lot to live up to. Luckily, Lower Decks creator and executive producer Mike McMahan was a big fan with deep knowledge of Trek. He is also the creator of the @TNG_S8 parody Twitter account, as well as a veteran of animated shows like South Park, Axe Cop and, yes, Rick and Morty.


The conceit of a Lower Decks series was that the stories would focus on a core group of four ensigns on the USS Cerritos: Beckett Mariner, Brad Boimler, D’Vana Tendi and Sam Rutherford, also known as “beta shift.” There was a bridge crew, voiced by stars such as Jerry O’Connell and Dawnn Lewis, but their storylines would always be what’s going on in the background, and the ensigns wouldn’t always be privy to what’s happening with the ship.

Unlike the TNG episode, however, even the audience has been kept out of the loop on many occasions, with the ensigns even being forced to testify on their commanders’ behalf in an unexplained trial. (It turned out to be a party in honor of the senior officers, which confused our protagonists even more.)


It’s a pretty great idea for a show, one that’s yielded hilarious results. But Star Trek doesn’t have a good track record of sticking to a concept. The majority of shows since TNG have started out as one thing and become something else over the course of their runs. All shows evolve, but the changes in Trek have been obvious and purposeful. Deep Space Nine was intended to be a “frontier outpost” type of show, depicting the long-term relationship between Starfleet and one of the planets it encountered, Bajor. By season three the crew was given a warship, and season four brought in TNG veteran Worf and a war with the Klingon Empire.

Star Trek: Voyager operated on the premise of “what if a Starfleet ship was lost far from home?” And it stuck with that, sure, but it also continued to operate like any other Starfleet vessel over seven seasons, and the ship remained in surprisingly good condition despite the lack of spacedocks for repair — something that frustrated writer Ronald D. Moore and later spurred him to create the Battlestar Galactica reboot (the title ship was a wreck by the finale). They also ended up re-establishing contact with the Federation in later seasons, which dampened the whole “alone in a strange quadrant” theme.

Enterprise wasn’t even called Star Trek until its third season. And though it was a show that promised to show us the early origins of Starfleet and the Federation, the first two seasons got bogged down in a “Temporal Cold War,” and later episodes brought in 24th-century-era baddies like the Borg and Ferengi.

The latest concept switcheroo was the premiere Paramount+ Trek show, Discovery. The producers touted it as the first series where the captain was not the main character, with the program focusing on Commander Michael Burnham instead. This sounded great in theory, as it could show us a different side of Starfleet. In practice, however, even if Burnham wasn’t the captain the entire universe seemed to revolve around her anyway: the mysterious “Red Angel” of season two turned out to be her mother (and her). The show ended up jettisoning its 23rd century setting after that, traveling to the 32nd century to a galaxy with a Federation in tatters. By the end of season three, Burnham had become the captain anyway. So much for any Lower Decks-esque perspective on that show.


Season two of Lower Decks starts off a bit shaky in that regard — after the events of the last few episodes, Mariner is now BFFs with the captain (who is also her mother) and Boimler is a bridge officer on the USS Titan. Neither of them feels like the scrappy underdog anymore. At least Tendi and Rutherford are still pretty minor players, though Tendi is alarmed at sudden changes in Rutherford’s personality and worries she may be losing his friendship.

The premiere finds Mariner with carte blanche to go on any side missions she wants, and in one of these authorized-unauthorized missions she accidentally turns first officer Jack Ransom into a god-like being set on taking over a planet. The latter event is, at least, a pretty standard plot contrivance for Star Trek. Where Lower Decks stands apart is that as Ransom is threatening the Cerritos and banging away at its shields, the camera cuts to Tendi attacking Rutherford in a corridor, afraid that his new personality traits mean he’s suffering a serious disease, or that he just doesn’t like her anymore. The larger existential threat is background color in this scene (literally, as you can see rainbow beams blasting outside the window) while the show chooses to focus on the individual struggles of these two characters.


By the end of the episode Rutherford and Tendi sort things out, and even Mariner gets put back in her place, with the partnership between her and her mother dissolved and Beckett back in the brig. The only missing piece of the fabulous four is Boimler, and well, he’s not having a great time on the Titan, because maybe things are a bit too exciting up on the bridge. The lower decks of the USS Cerritos are still the place to be, with season two off to a solid start.

Criterion is releasing 'Citizen Kane' and five other classics on 4K Blu-ray

Criterion has unveiled its first 4K Ultra HD Blu-ray releases with a six-film slate that includes Citizen Kane, Menace II Society, The Piano, Mulholland Dr., The Red Shoes, and A Hard Day’s Night. The new releases will give film buffs a chance to see some of these films in more detail than was possible even during their theatrical runs.

Criterion notes that Orson Welles's 1941 masterpiece Citizen Kane was its first laserdisc release 37 years ago. "It now rejoins the library after a long absence, making its first appearance in 4K Ultra HD," Criterion wrote in its blog. The other releases span an eclectic range of periods, from the 1940s (The Red Shoes) and 1960s (A Hard Day's Night) to the '90s and aughts (Menace II Society, The Piano and Mulholland Dr.).

Each title will come in a combo pack that includes both 4K UHD and 1080p Blu-ray versions, along with Criterion's popular special features about each film. Some films will also be available in Dolby Vision HDR and Dolby Atmos, though Criterion has yet to say which. 

Criterion's new 4K UHD releases will likely represent the best way to watch classic films at home. If you don't need the super pristine quality, however, Criterion launched its own streaming service in 2019, which now offers over "1,000 important classic and contemporary films," according to the site.

Marvel’s ‘What If…?’ is a fun diversion, but not required viewing

Marvel has often been taken to task for poor pacing on its shows. The Netflix programs were always said to be padded out, with more installments than they really needed per season. The Disney+ era has given us shows with fewer episodes, but that hasn’t deterred complaints about slow pacing. What If…?, premiering this week on the service, has a different problem: It’s frantic and rushed, like a podcast episode played at 1.5x speed.

The concept behind What If…? is simple. Take a pivotal moment from the Marvel Cinematic Universe, change one thing, see what happens. In the premiere episode, set during the events of Captain America: The First Avenger, Peggy Carter chooses to stay in the room where Steve is receiving the super soldier serum. Steve gets shot, forcing Peggy to jump into the machine and get bulked up in his place.

You’d need to be intimately familiar with the original movie to spot the difference, which is why the omniscient narrator is there to point it out. It’s probably the only time the episode stops to catch its breath.

The problem is that this is a half-hour show attempting to present an alternate version of a two-hour movie. It isn’t enough to just say that Captain Carter has super powers; the show feels the need to demonstrate how the events of the entire movie play out, down to the final battle with the Red Skull. There isn’t a lot of time for character development, because the creators assume you already know the characters well from seeing them on the silver screen. (Also, why is she Captain Carter and not Captain Britain?)


It runs from plot point to plot point, a highlight reel of the film with some small changes and a lot of big ones. You’ll probably want to rewatch the original movie either before or after, just because there are so many winks and nudges to it that the episode simply cannot stand alone. It’s like a DVD extra and fan fiction had a baby — which, to be fair, is what the original comic felt like.

The difference here is that this is a version of What If...? that gets to play in the MCU sandbox, with the voices and likenesses to boot (except for Hugo Weaving, who is once again replaced by Ross Marquand as the Red Skull). Animation is the only way to pull it off, given that the cast and setting change with every episode; a live-action production would be prohibitively expensive.


But, despite being owned by one of the most famous animation studios in the world, Marvel Studios went with third-party animators. It’s a cel-shaded style, which is more often used in video games and here looks a lot like rotoscoping. It’s sort of stiff and awkward, with more attention paid to making characters look like their actors than to being fluid or expressive. It’s a shame, given that Disney’s 2012 short film Paperman utilized a hybrid 2D/3D style which looks similar to this, but with a lot more personality.

Future episodes will explore other divergences from the MCU, like T’Challa becoming Star Lord or Tony Stark getting saved by Killmonger. So it’s likely some episodes will be far more enjoyable than others based on their conceit alone, though Captain Carter is still a solid start. But a good concept can’t completely overcome animation and pacing issues.

Facebook's Oversight Board orders a post criticizing the Myanmar coup to be restored

Facebook's Oversight Board has instructed the social network to restore a post from a user that criticized the Chinese state. According to the board, Facebook mistakenly removed the post for violating its hate speech policy under the belief it targeted Chinese people.

"This case highlights the importance of considering context when enforcing hate speech policies, as well as the importance of protecting political speech," the Oversight Board wrote. "This is particularly relevant in Myanmar given the February 2021 coup and Facebook’s key role as a communications medium in the country."

The user, who appeared to be in Myanmar, posted the message in question in April. The post argued that, rather than providing funding to Myanmar's military following the coup in February, tax revenue should be given to the Committee Representing Pyidaungsu Hluttaw, a group of legislators that opposed the coup. The post, which was written in Burmese, was viewed around half a million times.

Although no users reported the post, Facebook decided to take it down. The post used profanity while referencing Chinese policy in Hong Kong. Facebook's translation of the post led four content reviewers to believe that the user was criticizing Chinese people. 

Under its hate speech rules, Facebook doesn't allow content that targets someone or a group of people based on ethnicity, race or national origins that use “profane terms or phrases with the intent to insult.” The user who wrote the post claimed in their appeal that they shared it in an effort to “stop the brutal military regime.”

The Oversight Board says context is particularly important in this case. The Burmese language uses the same word to refer to both a state and people who are from that state. Other factors made it clear the user was referring to the Chinese state, according to the board.

Two translators who reviewed the post "did not indicate any doubt" that the word at the heart of the case was referring to a state. The translators told the board the post includes terms that Myanmar’s government and the Chinese embassy commonly use to refer to each other. Public comments the board received regarding the case indicated the post was political speech.

The Oversight Board ordered Facebook to restore the post and recommended Facebook ensure "its Internal Implementation Standards are available in the language in which content moderators review content. If necessary to prioritize, Facebook should focus first on contexts where the risks to human rights are more severe."

The company has had a complicated history with Myanmar. In 2018, Facebook was accused of censoring information about ethnic cleansing in the country. It admitted it didn't do enough to stop people from using the platform to incite offline violence and "foment division," following a report it commissioned about the matter.

Soon after the coup, Facebook was temporarily blocked in Myanmar. After it returned, Facebook took steps to limit the reach of the country's military on its platform, and later banned the military outright on Facebook and Instagram.

The Oversight Board previously told Facebook to restore a post from another user based in Myanmar. As with the latest ruling, the board said Facebook misinterpreted the post as hate speech. While it was “pejorative or offensive,” the post didn't “advocate hatred” or directly call for violence.