• 1 Post
  • 1.24K Comments
Joined 2 years ago
Cake day: March 22nd, 2024




  • This feels like another case where the specific context matters more than whatever supposed principle the thought experiment is supposed to illuminate. The example that came to mind when I tried to think about how to justify “voting red” was running into a burning building. Sure, if some large fraction of people did so, their combined numbers would presumably let them get everyone out. But on the other hand, throwing yourself in is a wholly unnecessary risk, and the only people in need of rescuing are the ones who ran in trying to do the right thing without thinking. Noble, but stupid, and it creates that much more risk for the firefighters, who now have to not only stop the fire from spreading but also figure out how to rescue the failed good Samaritans.

    But then what really makes the difference between the examples is purely in the details not included, which is kind of the null case. Nobody has to go into a burning building who isn’t already in there when it catches fire. The danger of harm is entirely optional and voluntary. But you can’t just choose not to eat; the danger in your framing is the omnipresent threat of starvation, and the question is whether to prioritize individual or collective well-being.

    Edit: also, to reference the scholarly work of Christ, Wiener, et al.:

    RED IS MADE OF FIRE




  • I don’t have much sympathy for the “let’s wait and see” moderates, but I do think there’s a coherent difference between people who have tried AI tools and found some use for them in some limited context and people who go full Howard Hughes with it like John McGasTown or whatever that idiot’s name is. To me it feels like an extension of the argument that these so-called AI systems are a normal technology. They aren’t a harbinger of the end times, whether you interpret that as the singularity or the biblical Armageddon. It’s a normal technology that is breaking in normal ways, and is breaking society and the economy in the ways we would expect late capitalism to break them. If it wasn’t this it would probably be something else. Hell, there’s still a chance that the wheel turns to “Quantum” or something else after this and we stretch another few years out of that before the music stops.

    AI is a bad tool for any given job, and is fundamentally not worth the price that we as a society are paying to let it exist at this scale. If it wasn’t being subsidized by capitalists chasing ridiculous returns and buoyed by an economic system structured entirely around giving it to them, then there’s no way in hell it would have hit this point. But that’s not incompatible with people being able to find utility in it in some cases, and I think we lose credibility by treating any admission that someone has found any value in AI products as a confession of unseriousness. That doesn’t mean their use isn’t still part of the problem, but if we framed the critique in terms of “how much would you actually be willing to pay for your ‘occasional’ use?” it would redirect the discussion away from the subjective “well I found it useful for X” to the more objective question of just how expensive and destructive these things are to operate, and how much of those costs are going to have to be subsidized forever if these things are going to stick around.



  • Given the Star Wars discussion alluded to in the next paragraph, I think we’re looking at “try rereading your first book while being less of a self-important dumbass.” Like, I get it, Revan is one of the best characters in that canon, and where Vader fell for very human if selfish reasons, Revan pushed even farther, using the dark side to conquer the galaxy in order to try and save it from… being conquered by a Sith empire that drew great and terrible power from the dark side of the Force. What happened to Vader again? Oh yeah, he sought the dark side for the power to save his wife and became a great and terrible warlord by calling on his rage and despair over… killing his wife. Like, the fact that trying to gain power through the dark side is at best a self-destructive shortcut that will undermine your actual goals is pretty goddamn consistent, and this is Star Wars Legends, a canon not exactly known for being internally consistent. I’m not saying you need to “agree” with that premise, and I think the franchise as a whole is usually too conservative, with the passivity of the light side being a big part of that. It’s just deeply absurd to me for that to be the takeaway from that story. Like all the people whose main takeaway from Jurassic Park was “man, wouldn’t it be cool if we had real dinosaurs?” who then went on to be the victims and villains of Jurassic World.


  • I can understand what they’re saying, though. Like, his defining moment is the finale of SC1, where he does sacrifice himself and becomes this major culture hero. There is definitely room to question that warrior ethos, what it says about the Protoss, and what that in turn says about how we think about the real-world cultures and ideas that inspired them, and I’m pretty open to those constructs not being particularly respectful. But within those background structures and the culture they describe, the immediate storyline is about how the Conclave and even the Khala itself is ultimately destructive and makes the Protoss more vulnerable even as it is their source of strength and identity, which feels actually pretty timely if you read it that way.


  • Okay, today’s Rat fixation that I want to rant about is “constructing hypothetical examples to justify my idiosyncratic position.” Like, I’m not even interested in arguing about whether their conclusion makes sense in their hypothetical world, I’m more curious about what kind of chain of thought leads you to speculate about that in 2026. Like, maybe I’m reading way too much into this but in practical terms it feels like “how do I justify voting for the Republicans no matter how far-right they might go, if my local Democrats try to move the tiniest bit left?” which feels like the rat/tech ethos in a nutshell.

    Or maybe it’s the more traditional pastime of trying to construct arguments in favor of controversial-sounding positions so that you can feel smarter and more open-minded than everyone else.