barbie.ai
AI that can see, writers' strike, Elon's new thing
gm,
It’s a big day so we’ll keep it quick.
Love,
The Boys
The AIs have Eyes. In Bard’s latest update, users can upload a photo and it can, well, look at it. Meaning you might upload a photo of a meal you ate and ask Bard for the recipe (hot dog or not hot dog, v2). Or a screenshot of a sign and ask it to type out the text for you. But the coolest use case might be that it can take a screenshot, or even a sketch, of an app or website and actually turn it into working code. Fuck me up Sundar! Related but different: NVIDIA’s creepy Eye Contact tool.
Will there be a Happily Ever SAG-AFTRA? If you haven’t been Keeping Up with the Kardashians, there’s a big strike going on in Hollywood right now, and the actors just joined the party. AI is center stage, and one of the main plot lines is that film studios have been digitizing background actors and using the tech to render extras instead of hiring them, or paying established actors to use their avatars. The strikers are looking to reach a deal with producers that compensates them fairly. Will they come to terms before all of our favorite fall shows get cancelled? Stranger Things have happened. Ayo!
Is there a method to the madness? Elon announced he’s starting a new company called xAI, and it’s all starting to come together. In what seemed like a “billionaires will be billionaires” act of buying Twitter, Musk also bought himself a treasure trove of text conversations and images he can use to train his new toy AI model. Twitter’s recently imposed rate limits and data-scraping lawsuits hint that he might be planning to make tweets proprietary to xAI, which he claims won’t be as “woke” as some other companies’ models. And let’s not forget about Tesla, which brings massive hardware capabilities and the advanced image detection behind its self-driving tech to the table. Are we all pawns in Musk’s game of chess? Probably!
We hang out in AI-image bot chatrooms so you don’t have to.