- Malware by Boys Club
ghost ride the whip
AI poison pill, White House report, self-driving fail
gm,
Have you been following our SBF trial coverage? Catch up here and don’t worry, we’ll stop talking about it soon.
Love,
The Boys
Cruise control. Or lack thereof. Self-driving taxi company Cruise had its license revoked after an accident seriously injured a pedestrian. The DMV deemed the cars “not safe for the public’s operation” just three months after they were allowed to operate in San Francisco 24/7. The DMV claims the company intentionally withheld video footage of the accident, presumably to hide the parts that make them look real bad. Cruise competitor Waymo, on the other hand, just launched a partnership with Uber in Phoenix, allowing riders to request a self-driving car. Sounds like the jury’s still out on whether they’re more or less safe than human drivers, but at least you’ll never have to listen to them blasting EDM when you’re trying to take a work call.
Trick or treat. A new tool lets artists prevent AI companies from using their work to train models without their consent. This “poison pill,” called Nightshade, makes changes to images that are invisible to the human eye but confuse the f- out of AI image models. That means if a company such as Midjourney or DALL-E scrapes one of these images and adds it to its training set without the artist’s permission, the poison pill will corrupt the data and start making the model go mad (for example, generating dogs when a user asks for cats). It’s giving… Edgar Allan Poe.
Hot off the press. The White House released its hotly anticipated executive order governing AI development in the U.S. on Monday. The 111-page order has been met with general support from big tech and politicians, though critics feel it’s perhaps too broad. The order directs multiple government agencies to take on new initiatives, such as establishing an AI safety board to review infrastructure, creating guidance on avoiding bias and discrimination in models, and protecting workers from job elimination. It also mandates that the AI big bois share the results of their safety testing with the gov before releasing new models, showing they’re taking AI safety seriously and want to collaborate with private companies rather than lay down the law. This whole thing is somehow both too much and too little? Idk don’t @ me.
This newsletter is supported by your new project management bff. Join the waitlist at www.catalistai.com
you are listening to the new blink 182 album for the 17th time today and about to play the new mario with your friends who brought over mountain dew.
you ordered dominos with a discount code your mom gave you, $10/pizza.
the year is 2023, and you are 38 years old. life is good
— Sam Altman (@sama)
2:54 AM • Oct 26, 2023
has anyone tried training an artificial intelligence to be as dumb as possible
— █̶̳̘͛̄̃͒̄̃͜█̴͇̱̅͒̅█̵̻̣̝͒̈̄̈͝͝█̴̞̜̻̝͍̂̽͜█̷̢ (@SHL0MS)
4:35 PM • Oct 29, 2023
the 399 pound taylor swift pumpkin has not left my mind since i saw it this morning
— jar jar binky (@caseyaonso)
10:42 PM • Oct 25, 2023
@allhailthealgorithm This one's for my queen @Julia fox who liked the last one 🤩 now with AI DeSantis voice! #deepfake #ai #ml #aivoice #uncutgems #muse #julia...