Jack Dorsey says that Square is “considering” building its own Bitcoin mining system using custom silicon and open source software. “Square is considering building a Bitcoin mining system based on custom silicon and open source for individuals and businesses worldwide,” Dorsey wrote in a tweet thread on Friday.
He added that such a project would follow a similar approach to the bitcoin hardware wallet Square began working on earlier this summer. But building a mining system would be considerably more complicated for the payments company than simply building a wallet. Creating custom chips is, as Dorsey points out, “very expensive,” and would be new territory for Square, which has long been a major supporter of Bitcoin.
“Mining needs to be more efficient,” Dorsey wrote. “Driving towards clean and efficient energy use is great for Bitcoin’s economics, impact, and scalability. Energy is a system-level problem that requires innovation in silicon, software, and integration.”
3/Silicon design is too concentrated into a few companies. This means supply is likely overly constrained. Silicon development is very expensive, requires long term investment, and is best coupled tightly with software and system design. Why aren’t more companies doing this work?
As with his earlier tweets about plans for the hardware wallet, Dorsey didn’t share many details about how the mining system would actually work. But he said the goal would be to make mining more efficient and accessible to more people, which could address two of the most important issues related to cryptocurrency mining.
Bitcoin-related power usage has reached record highs in recent years, raising concerns about the cryptocurrency’s impact on climate change. Mining has also driven up the prices and scarcity of mining hardware, which has made it increasingly difficult for the average crypto enthusiast to mine on their own.
Our team led by @jessedorogusker will start the deep technical investigation required to take on this project. We’d love your thoughts, ideas, concerns, and collaboration. Should we do this? Why or why not? We’ll update this thread as we make our decisions. And now over to Jesse.
"Bitcoin mining should be as easy as plugging a rig into a power source,” Dorsey said. Whether Square will be able to accomplish that is less clear. He said that the company “will start the deep technical investigation required to take on this project,” and is hoping to hear feedback on the idea in the meantime.
Facebook is ramping up its fight against leakers following the disclosures of whistleblower Frances Haugen. According to The New York Times, Facebook is restricting access to some internal groups that deal with “sensitive” issues like safety and elections. That the change, which was made to prevent further leaks, immediately leaked is both highly amusing and emblematic of some of the bigger issues the company is currently facing.
Ever since Haugen revealed herself as the whistleblower, one of the more noteworthy aspects of her story has been that the documents she provided to Congress and the Securities and Exchange Commission were widely accessible to employees. The documents included slides detailing the company’s research into teen mental health, as well as numerous memos about how the company has handled moderation for VIPs, and other thorny issues.
As The Times points out, the reason these documents were so readily available is because Facebook has long had an open culture that promotes sharing. And employees themselves often take to its internal communication platform, Workplace, to discuss controversial issues facing the company.
But now the social network is moving away from that openness. The company is making some internal groups private, and will remove employees “whose work isn’t related to safety and security,” according to the report. “Sensitive Integrity discussions will happen in closed, curated forums in the future,” the company told employees in a memo.
On one hand, the fact that news of the change immediately leaked would seem to back up the idea that the company is in fact leakier than it has been in years past. But it could also signal increasing unrest among employees, some of whom are reportedly concerned that walling off teams that work on important issues could ultimately do more harm than good.
It also underscores just how much Facebook is still reeling from Haugen’s disclosures. In addition to the Senate hearing last week, Haugen is expected to meet with the select committee investigating the Jan. 6th insurrection, as well as lawmakers in other countries investigating the company. The SEC also appears to be looking into her claims.
Facebook is slowly expanding its effort to weed out political content from News Feed. The company is now testing its “less political” feed in 75 new countries, Facebook said in an updated blog post.
The company has already introduced a version of the revamped News Feed in the US, as well as Costa Rica, Sweden, Spain, Ireland, Canada, Brazil, and Indonesia. But the latest update marks a significant expansion of the effort, and brings the total number of countries involved to more than 80. Facebook didn’t identify the latest countries to join the test, but a spokesperson confirmed the company is showing the News Feed changes to “a small percentage of people” in each country. The spokesperson added that countries with upcoming elections and those “at higher risk of conflict” are not included in the tests.
Mark Zuckerberg first announced plans to make News Feed less political in January, just weeks after the insurrection. “People don’t want politics and fighting to take over their experience,” he said at the time.
Rolling out the changes to more countries could help Facebook learn more about how to lower the temperature on its platform, which could be particularly useful as the company is accused of making its service more divisive in order to boost engagement. At the same time, the company has acknowledged the changes could hurt publishers. “As we get more insights from these tests, we’ll share updates on what we’re learning and will continue to make changes accordingly,” Facebook wrote in an updated blog post.
One week after a massive outage that took all of the social network’s apps offline for more than six hours, Instagram says it’s testing a new feature that will alert users to “temporary issues” like outages or other technical problems.
The new alerts would appear in users’ Activity Feed, alongside other in-app notifications. The messages could be used to let users know about specific issues, like Story uploads not working, or a more widespread problem, like the two outages last week. Importantly, Instagram says it doesn’t plan to alert users to every issue, but ones that may be a source of widespread confusion.
“We won’t send a notification every single time there is an outage, but when we see that people are confused and looking for answers, we’ll determine if something like this could help make things clearer,” Instagram wrote in a blog post. The company added that it’s testing the feature in the US “for the next few months.”
Separately, Instagram also showed off a new “account status” section of its app, which is meant to alert users to “what's going on with your account” more generally. Instagram says it’s starting with notifications about posts that are removed and when an account “is at risk of being disabled” due to rule violations.
According to Instagram, the feature is meant to make it easier for users to understand why a post may have been removed, and whether they may be in danger of losing their account altogether. While the app has notified users in the past when a post is labeled or removed, the company hasn’t always done a good job of letting people know which policy they violated. The Oversight Board has told Facebook it needs to do a better job of explaining its rules to users, and account status could help the company do just that.
Account status could also help the app address a more Instagram-specific issue: concerns over so-called shadowbanning. Instagram says that “in the coming months” it plans to update account status to let people know “how their content is being distributed and recommended across different parts of Instagram.”
Members of the Oversight Board will meet with Frances Haugen as the board investigates Facebook’s controversial “cross check” system.
“In light of the serious claims made about Facebook by Ms. Haugen, we have extended an invitation for her to speak to the Board over the coming weeks, which she has accepted,” the Oversight Board wrote in a statement. “Board members appreciate the chance to discuss Ms. Haugen’s experiences and gather information that can help push for greater transparency and accountability from Facebook through our case decisions and recommendations.”
In a statement, Haugen confirmed the upcoming meeting. “Facebook has lied to the board repeatedly, and I am looking forward to sharing the truth with them,” she wrote.
I have accepted the invitation to brief the Facebook Oversight Board about what I learned while working there. Facebook has lied to the board repeatedly, and I am looking forward to sharing the truth with them.
The board has also been pressing Facebook to provide more information about the program, in light of Haugen’s disclosures. Cross check is the internal designation used by the social network for high profile accounts, including celebrities, politicians and athletes. The company has said it’s meant to provide an extra level of scrutiny when those accounts might break the platform’s rules. But according to documents Haugen provided to The Wall Street Journal, Facebook often doesn’t review violations from these accounts, effectively allowing them to break its rules without consequences. In other cases, reviews are so delayed that rule-breaking content is viewed millions of times before it’s removed.
Cross check was also a central issue in the Oversight Board’s handling of Donald Trump’s Facebook suspension. The board had asked Facebook for more details about cross check, saying that the company’s rules “should apply to all users.” But Facebook said it was “not feasible” to provide additional info, even though Haugen’s disclosures suggested the company has been tracking problems related to the program.
Facebook didn't immediately respond to a request for comment. The company said last month, following The Wall Street Journal's reporting, that it had asked the board to provide recommendations on how to improve cross check. The Oversight Board will release its first transparency report later this month, which will provide an update on cross check, based on its discussions with Facebook officials and Haugen. The report will be the board’s first assessment of how the social network has responded to its policy recommendations.
Facebook and its apps are down yet again, according to the company. The extent of the latest outage wasn't immediately clear, but Facebook and Instagram both acknowledged that "some" users are having trouble accessing their services.
We’re aware that some people are having trouble accessing our apps and products. We’re working to get things back to normal as quickly as possible and we apologize for any inconvenience.
Facebook is reportedly pausing some of its product development so it can conduct “reputational reviews” in the wake of whistleblower Frances Haugen’s disclosures about the company.
According to The Wall Street Journal, Facebook has “put a hold on some work on existing products” while a team of employees analyzes how the work could further damage the company’s reputation. The group is looking at potential negative effects on children, as well as criticism the company could face.
Zuckerberg alluded to the change in a post on Tuesday, his first since the whistleblower’s disclosures became public. “I believe that over the long term if we keep trying to do what's right and delivering experiences that improve people's lives, it will be better for our community and our business,” he wrote. “I've asked leaders across the company to do deep dives on our work across many areas over the next few days so you can see everything that we're doing to get there.”
The change is one of the clearest signs yet of how much Haugen’s disclosures have rocked the company in recent weeks. Facebook has paused its work on an Instagram Kids app, after a WSJ report on company research showing Instagram is harmful to some teens’ mental health. Though Facebook has attempted to downplay its own research, pressure has mounted since Haugen, a former product manager, came forward and testified in a three-hour Senate hearing this week.
She has alleged that Zuckerberg and other executives have prioritized the social network’s growth over users’ safety, and that the company has misled the public about its AI-based moderation technology. She’s called on Facebook to make its research more widely available, and urged Congress to impose new regulations on the platform.
The whistleblower behind “bombshell” disclosures that have rocked Facebook in recent weeks spent much of Tuesday's three-hour hearing explaining to Congress how Facebook could fix itself.
While the hearing was far from the first time a Facebook critic has briefed lawmakers, her insider knowledge and expertise in algorithm design made her particularly effective. Her background as part of the company’s civic integrity team meant she was intimately familiar with some of the biggest problems on Facebook.
During the hearing, Haugen spoke in detail about Facebook’s algorithms and other internal systems that have hampered its efforts to slow misinformation and other problematic content. She also praised the company’s researchers, calling them “heroes,” and said Facebook should be required to make their work public.
Remove algorithmic ranking and go back to chronological feeds
One of the most notable aspects of Haugen’s testimony was her expertise, which gives her a nuanced understanding of how algorithms work and the often unintended consequences of using them.
“I hope we will discuss as to whether there is such a thing as a safe algorithm,” Sen. Richard Blumenthal said at the start of the hearing. While Haugen never addressed that question directly, she did weigh in on the ranking algorithms that power the feeds in Facebook and Instagram. She noted that Facebook’s own research has found that “engagement-based ranking on Instagram can lead children from very innocuous topics like healthy recipes… to anorexia-promoting content over a very short period of time.”
She also said that Facebook’s AI-based moderation tools were much less effective than what the company has publicly portrayed. “We've seen from repeated documents within my disclosures that Facebook's AI systems only catch a very tiny minority of offending content,” Haugen said. “Best case scenario, in the case of something like hate speech, at most they will ever get to 10 to 20%.”
To address this, Haugen said that Facebook could move to a chronological feed where posts are ordered by recency, rather than what is most likely to get engagement. “I'm a strong proponent of chronological ranking, or ordering by time with a little bit of spam demotion, because I think we don't want computers deciding what we focus on,” Haugen said.
She noted that Facebook would likely resist such a plan because content that gets more engagement causes people to post and comment more, which is better for its business. “I've spent most of my career working on systems like engagement-based ranking,” Haugen said. “When I come to you and say these things, I’m basically damning 10 years of my own work.”
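The difference between the two approaches Haugen describes can be shown in a minimal sketch. This is purely illustrative: the `Post` fields, scores, and the 0.8 spam threshold are invented for the example and are not Facebook’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float          # Unix time the post was created (hypothetical field)
    engagement_score: float   # predicted likes/comments/shares (hypothetical field)
    spam_score: float = 0.0   # spam-classifier output in [0, 1] (hypothetical field)

def engagement_ranked(posts):
    """Engagement-based ranking: surface whatever is predicted to draw
    the most reactions, regardless of how old it is."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_with_spam_demotion(posts, spam_threshold=0.8):
    """Chronological feed with "a little bit of spam demotion":
    newest posts first, but likely spam is pushed to the bottom.
    Sorting on a (is_spam, -timestamp) tuple puts non-spam posts
    first, each group ordered by recency."""
    return sorted(
        posts,
        key=lambda p: (p.spam_score >= spam_threshold, -p.timestamp),
    )
```

Under engagement ranking, an old but provocative post can outrank everything newer; under the chronological variant, it simply appears in time order unless the spam classifier flags it.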
Reform Section 230
In a similar vein, Haugen said that Section 230 — the 1996 law that protects companies from being liable for what their users say and do on their platforms — should be reformed “to make Facebook responsible for the consequences of their intentional ranking decisions.” She said that such a law would likely “get rid of engagement-based ranking” because it would become too big of a liability for the company.
At the same time, she cautioned lawmakers to not let Facebook “trick” them into believing that changing Section 230 alone would be enough to address the scope of its problems. She also noted that using the law to police Facebook’s algorithms could be easier than trying to address specific types of content. “User generated content is something that companies have less control over, they have 100% control over their algorithms,” Haugen said.
The focus on Section 230 is significant because lawmakers from both parties have proposed various changes to the law. During the hearing, Blumenthal indicated that he too supported “narrowing this sweeping immunity when platforms’ algorithms amplify illegal conduct.” Senator Amy Klobuchar has also proposed ending 230 protections for vaccine and health misinformation. Meanwhile, Republicans have tried to eliminate Section 230 for very different reasons.
Slow down virality
Likewise, Haugen suggested that Facebook should slow down its platform with “soft interventions” that would add small bits of friction to the platform. She pointed to Twitter’s “read before sharing” prompts as the kind of measure that can reduce the spread of misinformation.
“Small actions like that friction don't require picking good ideas and bad ideas,” she said. “They just make the platform less twitchy, less reactive. And Facebook's internal research says that each one of those small actions dramatically reduces misinformation, hate speech and violence-inciting content on the platform.”
Facebook has taken these steps in the past. Notably, it applied these “break glass” measures in the days after the 2020 presidential election, though the company rolled some of them back the following month. The company implemented similar changes again, less than a month later, in the aftermath of the January 6th insurrection.
Haugen said that Facebook has mischaracterized these changes as being harmful to free speech, when in fact the company is concerned because it “wanted that growth back.” During the hearing, she said that Mark Zuckerberg had been personally briefed on just how impactful changes like this could be. But, she said, he prioritized the platform’s growth “over changes that would have significantly decreased misinformation and other inciting content.”
Open Facebook’s research to people outside the company
Access to Facebook’s data has become a hot button issue in recent weeks as researchers outside the company have complained that the company is stifling independent research. Haugen said the social network should work toward making its own internal research available to the public.
She proposed that there should be a set period of time — she suggested as long as 18 months — when Facebook could keep its research under wraps. After that, the company should make it accessible to outside researchers.
“I believe in collaboration with academics and other researchers that we can develop privacy-conscious ways of exposing radically more data than is available today,” Haugen said. “It is important for our ability to understand how algorithms work, how Facebook shapes the information we get to see, that we have these data sets be publicly available for scrutiny.”
She went on to say that Facebook's researchers are among its “biggest heroes” because “they are boldly asking real questions and willing to say awkward truths.” She said it was “unacceptable” that the company has been “throwing them under the bus” in its effort to downplay her disclosures.
A dedicated ‘oversight body’
Besides internal changes, Haugen also said that there should be a dedicated “oversight body” with the power to oversee social media platforms. She said that such a group within an agency like the Federal Trade Commission could provide “a regulatory home where someone like me could do a tour of duty after working at a place like this.”
“Right now, the only people in the world who are trained to analyze these experiments, to understand what's happening inside of Facebook, are people who grew up inside of Facebook or Pinterest or another social media company,” she said.
Importantly, this “oversight body” would be separate from the Facebook-created Oversight Board, which advises Facebook on specific content decisions. While Facebook has said the creation of the Oversight Board is proof it’s trying to self-regulate, Haugen wrote in prepared remarks that the Oversight Board “is as blind as the public” when it comes to truly knowing what happens inside of the company.
It’s also worth noting that Haugen said she was opposed to efforts to break up Facebook. She said that separating Facebook and Instagram would likely result in more advertisers flocking to Instagram, which could deplete Facebook’s resources for making changes to improve its platform.
While it’s unclear which, if any, of Haugen’s recommendations Congress will act on, her disclosures have already caught the attention of regulators. In addition to providing documents to Congress, she has also given documents to the Securities and Exchange Commission. She has alleged that Zuckerberg and other executives have “misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection," according to SEC filings published by 60 Minutes.
Meanwhile, Facebook has continued to push back on Haugen’s claims. A week after an executive told lawmakers that “this is not bombshell research,” the company tried to discredit Haugen more directly. In a statement, Facebook’s Director of Policy Communications Lena Pietsch, said Haugen “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives. We don’t agree with her characterization of the many issues she testified about.” Pietsch added that “it’s time to begin to create standard rules for the internet.”
In an appearance on CNN following the hearing, Facebook VP Monika Bickert referred to Haugen’s disclosures as “stolen documents” and said the company’s research had been “mischaracterized.” Later that night, Zuckerberg publicly weighed in for the first time since The Wall Street Journal began publishing stories based on Haugen's disclosures (Zuckerberg did once refer to earlier coverage of the scandals, complaining that a news article had mistakenly described his hydrofoil as an "electric surfboard.") In his first substantive statement, he said "many of the claims don't make any sense," and that "the argument that we deliberately push content that makes people angry for profit is deeply illogical."
It could still get more difficult for Facebook to counter Haugen, though, particularly if new documents become public. Her letter to the SEC suggests that Facebook knew much more about QAnon and violent extremism on its platform than it let on, as Vice reported earlier. Haugen may also make appearances in front of lawmakers in other countries. European lawmakers, many of whom have expressed similar concerns as their US counterparts, have indicated they want to talk to Haugen and conduct new investigations of their own.
The Facebook whistleblower who has provided a trove of internal documents to Congress and the Securities and Exchange Commission is testifying about research she says proves the social network has repeatedly lied about its platform. The documents were the basis for The Wall Street Journal's reporting on Facebook's controversial rules for celebrities, and the disastrous effect of Instagram on some teens' mental health.
“Facebook and big tech are facing their big tobacco moment,” committee chairman Sen. Richard Blumenthal said at the start of the hearing. “Facebook knows its products can be addictive and toxic to children. They value their profit more than the pain that they cause to children and their families.”
In her opening statement, Frances Haugen, the former Facebook product manager who provided the documents, said that the company has ignored much of its own research and is “buying its profits with our safety.” She urged Congress to adopt new regulations.
“The choices being made inside of Facebook are disastrous for our children, our public safety, our privacy and for our democracy,” Haugen said. “And that is why we must demand Facebook make changes.”
She highlighted how Facebook’s refusal to make data available outside of its own research teams has helped the company mislead the public. “The company intentionally hides vital information from the public, from the US government, and from governments around the world,” Haugen said. “The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages.”
She also said that Congress should not be swayed by Facebook’s insistence on “false choices,” and that simply reforming privacy laws or Section 230 would not go far enough. “We can afford nothing less than full transparency,” Haugen said. “Facebook wants you to believe that the problems we're talking about are unsolvable... Facebook can change, but it's clearly not going to do so on its own."
Haugen’s appearance comes days after Facebook sent its head of safety, Antigone Davis, to testify in front of the same Senate subcommittee. She and other executives have repeatedly tried to downplay the company's research, with Davis saying that the documents "were not bombshell research." In Tuesday’s hearing, some senators called out Mark Zuckerberg, saying that they should be hearing from him instead. “Rather than taking personal responsibility, showing leadership, Mark Zuckerberg is going sailing,” Blumenthal said, in an apparent reference to a recent Facebook post from the CEO.
Facebook services seem to be slowly coming back online after one of the biggest outages in recent memory. Facebook, Instagram and Messenger’s apps appear to be working again, though the websites are loading more slowly than usual. Meanwhile, WhatsApp's website seems to be back, but the app is still having issues connecting.
As of 6:05pm ET Monday, the "Facebook for Business Status" page was still showing "major disruptions" to the social network's core services. But that was still an improvement from earlier in the day, when the website was offline entirely.
Facebook didn’t immediately comment or elaborate on the cause of the outage. In an earlier tweet, the company’s Chief Technology Officer, Michael Schroepfer, cited “networking issues.”
*Sincere* apologies to everyone impacted by outages of Facebook powered services right now. We are experiencing networking issues and teams are working as fast as possible to debug and restore as fast as possible
The outage lasted more than six hours, taking down Instagram, WhatsApp, Messenger and Oculus. It also wreaked havoc on the company internally, with employees reportedly unable to access emails, Workplace and other tools. The New York Times reported that employees were also physically locked out of offices as their badges stopped working.
It also shaved billions of dollars off of Mark Zuckerberg’s personal net worth as Facebook’s stock tanked, Bloomberg reported. Elsewhere, the company is still reeling from the fallout of a whistleblower who has accused the company of prioritizing growth over safety. The whistleblower was The Wall Street Journal’s primary source for several articles detailing how Instagram is harmful to teens and the company’s controversial “cross check” program that allows high profile users to break its rules.