- The move is the most severe punishment any social media company has taken in response to Trump, who used online platforms to encourage the violent mob that stormed the Capitol on Wednesday.
- Facebook and YouTube removed the video message, citing the risk of violence and the baseless allegations of election fraud. Following an uproar, Twitter also removed the video.
- Nearly two months after Election Day, Facebook still prohibits political ads. The ban is frustrating some elected leaders who say it makes it harder to get out information about the pandemic.
- Twin complaints from the Federal Trade Commission and 48 attorneys general paint a portrait of a company protecting its power at all costs.
- The state and federal officials say Facebook's acquisitions of WhatsApp and Instagram violated competition laws and served to stifle rivals by giving the social network an unfair advantage.
- After Congress failed to aid local election offices, a nonprofit backed by Mark Zuckerberg gave $350 million in crucial funds that helped the presidential election run surprisingly smoothly.
- The new ban is an expansion of the social network's rules against misinformation that could lead to imminent physical harm. It comes as governments prepare to roll out the first vaccinations.
- The social network says hate speech accounts for a tiny fraction of the posts people see. It's relying on automated systems to catch it, but is under pressure to do better.
- The social network largely outsources its content review jobs. Workers say they are now under pressure to return to the office despite the pandemic.
- The group had amassed more than 360,000 members who shared false claims about voter fraud before the social network shut it down, citing "worrying calls for violence" from some members.