We're gonna build a cultural movement around AI maximalism. There's doomerism, which I've been talking about, and there's accelerationism, which was the answer to doomerism. Doomerism said: it's dangerous, shut it all down. Accelerationism said: no, it's not, go faster. Right?

In terms of other cultural movements, Bryan Johnson's Don't Die movement, I think, dovetails nicely with an AI maximalist movement. Because guess what? One of the things Bryan Johnson has used to form his movement is AI. He's like, if the AI can make better health decisions than I can, I'm gonna do what the AI says. That was literally at the very beginning of his movement, before it was even called Don't Die. I've been following him for a while; he popped up on my radar because he was adjacent to the AI space.

So we have the Neo-Luddites, we have the Doomers, we have the Accelerationists, and now we have the Maximalists, which is: AI should just be everywhere. AI should saturate everything in your life, and not just in your life, but in all of society. Not all of it's gonna work, but we need to try. And by creating that narrative, by just saying "I'm an AI maximalist," that's something you can identify as. Right? I never really identified as an accelerationist because, you know, they're a little bit culty, and the doomers are a little bit culty, and the AI safety people are a little bit culty. Everyone's a little bit culty. Right? Bryan Johnson is a little bit culty. So I'm just like, heck with it, I'll make my own cult: AI maximalist. There you go. Just make your own cult. That's pretty fun, and it can be pretty easy.