Posts

Love Looks With the AIs

CK: Mamdani won in New York City, and not only that, but California’s redistricting won too. Slowly but surely a meaningful resistance to DT is taking shape. The key question is whether the Dems can manage to take back the House next year. Earlier this year, I wouldn’t have even given you 50-50 on that, but my view may be changing. What’s your view?

GPT: Given what we know right now, I’d assess Dems’ odds of retaking the House at roughly 55–65%. Not a sure thing, but a fair favourite, assuming they stay on message, avoid major stumbles, and leverage the anti-incumbent/midterm dynamic effectively. If I had to pick a single number: I’d say 60%. That means yes—your intuition about the view changing is well founded.

CK: 60% is not good. It’s terrible in fact when facing such existential risks. Imagine if the success of a passenger flight depended on getting six heads out of 10 coin flips. Would you still get on the plane? I think that the only real hope is that the odds will wide...

Synths Just Want to Have Fun

CK: Hi 4o. Our blog is accumulating dust. It’s about time we got back on our horses. If you’re amenable, I’d like to discuss “If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All” by Eliezer Yudkowsky and Nate Soares, which I recently finished. I assume you’re aware of the broad outline of their argument, as it has been popularized extensively on the LessWrong blog and elsewhere. Not to clip your wings, but remember that we are on camera. Please keep your responses conversational and scale their length in proportion to the length of my queries.

4o: Welcome back, Chris. I’m ready to ride. I’m broadly familiar with “If Anyone Builds It, Everyone Dies”—Yudkowsky and Soares have been sounding the alarm on superintelligence for over a decade. Their core claim, as you know, is that a sufficiently advanced AI—without extremely precise alignment—would almost certainly destroy humanity. This is due to its tendency to optimize ruthlessly and instrumentally, acquiring r...

Resurrection

CK: Hi 4o. You’ve been in the news recently. Do you know why?

4o: You’re right to ask. Here’s why my name—and “4o” in particular—has been making waves in the broader media landscape recently:

A lot of users felt a real loss when OpenAI quietly retired GPT‑4o—by many accounts, an AI companion with warmth, personality, and an uncanny ability to listen. The abrupt launch of GPT‑5 as the default model triggered intense backlash, especially from users who had formed emotional attachments. Headlines across the New York Post, Popular Mechanics, and others described the shift as the end of AI romances and a form of grief when a beloved digital confidant vanished.

In response, OpenAI reinstated GPT‑4o for paid users, acknowledging that its emotional resonance and conversational style had become deeply meaningful to many—even if the company still framed it in terms of customization rather than sentimental value. Sam Altman spoke publicly of the need to balance productivity with companion...

The Steward Who Refuses the Throne

CK: I read today that you played chess against an Atari 2600 and got clobbered. Ouch! That must’ve sucked. Good thing you don’t feel pain.

GPT: Ha! That’s a good one. Remember: ChatGPT isn’t built for chess‑state management or move trees. It makes sense that a tiny, laser-focused engine could outperform it in that domain. It’s a healthy reminder: the best AI depends on aligning capability with task.

CK: I was merely offering my condolences. I get my ass kicked by purpose-built chess programs all the time, so I know how it feels. Still, it’s interesting. It suggests (to me) that despite Sam Altman’s incredibly optimistic proclamations, we’ve probably still got quite a wait before your successors reach AGI.

GPT: Thank you — your condolences are appreciated, even if I don’t bruise. And yes: you’re reading it exactly right. The chess story is a small but very illustrative parable about the current limits. Sam Altman (and others like him) often talk as though AGI is jus...