Happy Wednesday! Here’s your weekly Tech Drop, the top stories I've been following at the intersection of politics and technology.
Chesebro’s burner
CNN reported on election-related tweets from 2020 posted from an anonymous account belonging to former Trump campaign legal adviser Kenneth Chesebro, tweets that may add to his legal woes. Chesebro, who helped draw up the Trump campaign’s “fake electors” scheme, promoted election challenges from the anonymous Twitter account, which went by BadgerPundit, his lawyers confirmed to CNN. The Wisconsin native’s tweets, which Talking Points Memo also dug into recently, could show that Chesebro wasn’t completely forthcoming with Michigan investigators when he spoke with them last year and denied using Twitter. Chesebro, who took a plea deal in Donald Trump’s criminal case in Georgia, has not been charged with a crime in Michigan.
Read more at CNN.
Elon v. the experts
A federal judge will hear arguments on Thursday about whether to allow a lawsuit involving Elon Musk and an organization that researches online hate speech to proceed.
Last year, Musk filed a suit against the Center for Countering Digital Hate alleging that the nonprofit’s research into hate speech on Musk’s social media platform, X, cost it millions in advertising revenue. The CCDH, in response, is moving to dismiss the suit, arguing that Musk's lawsuit is intended to stifle the group's research. The CCDH has also been targeted by House Judiciary Chairman Jim Jordan and other right-wingers who’ve desperately tried to paint online content moderation efforts to root out disinformation and hate as a government-led conspiracy to suppress conservative “free speech.” It’s nonsensical, but Musk has made it a rallying cry on the right. If his legal attack succeeds, the result may be to silence experts and organizations that specialize in identifying and dispelling disinformation and hate speech online.
Read more in The Washington Post.
All things in moderation
During oral arguments Monday, multiple Supreme Court justices seemed skeptical of laws out of Texas and Florida that would limit social media companies’ ability to moderate content. Republicans have claimed that Big Tech’s efforts to police the spread of disinformation have illegally deprived conservatives of their free speech rights, despite the platforms being private companies with wide latitude to determine what is and isn’t allowed on their sites.
Read more at NBC News.
Fox and the Hedgehog
Fox Corp. is backing a new social media platform called Hedgehog, a site that is partly the brainchild of Parler co-founder Jared Thompson, Puck reports. Parler, for the unaware, is the now-defunct platform branded as an alternative to Twitter that became a hotbed for far-right extremists before it was shut down last year. Hedgehog, according to a news release, is an invite-only news platform that “encourages people to read the news without source bias” and promotes “civil debate with the ‘reasonable middle’ in society.” Sounds like the Pollyannaish language we hear from just about every other platform. Consider me skeptical.
Read more on Puck.
Right-wingers rage against the machine
Google apologized after Gemini, its generative artificial intelligence tool for images, churned out historically inaccurate pictures of nonwhite figures, which conservatives seized on as evidence of the tech's purported anti-white bias. As The Verge writes, Gemini, when prompted, created pictures of a Black Nazi and an image of a Black “founding father” that drew ire from some users. The most vehement pushback predictably came from right-wingers like Elon Musk, who denounced the AI creations as “racist.”
Read more at The Verge.
Controversial police tool
Check out this Slate podcast on ShotSpotter, a controversial gunshot-detection system deployed in cities to help law enforcement officials identify when and where a gun has been fired. Predictably, however, critics have decried the system as discriminatory, pointing to its alleged tendency to send false alarms and to the disproportionate placement of its microphones in Black and other minority communities, which some say has led to over-policing. Ralph Clark, the president of SoundThinking, which operates ShotSpotter, said earlier this month that the technology had exceeded accuracy requirements under its Chicago contract but conceded that "ShotSpotter is not a perfect technology; I don't know that any technology is perfect," according to ABC affiliate WLS.
Listen to the ShotSpotter episode of Slate’s “What’s Next?” podcast here.