“Question every narrative, but don’t question these things. Don’t show bias, but here are your biases.” These chuds don’t even hear themselves. They just want to see Arya(n) ramble on about great replacement theory or trans women in bathrooms. They don’t think their bile is hate speech because they think they’re on the side of “facts” and everyone else is an idiot who refuses to see reality. It’s giving strong “I’m not a bigot, <minority> really is like that. It’s science” vibes.
Orwell called this “doublethink” and identified it, correctly, as one of the most vital features of a certain type of political structure.
He was inspired by Stalinist practices, but as shown by this example and many others, far-left and far-right autocrats are very similar in this regard.
It’s not related to the left/right divide; this is the authoritarian/liberal axis.
The entire “left and right” spectrum is quite stupid in my opinion. While it generally hints at what kind of mindset someone might have, it doesn’t seem very useful, and it has been corrupted so badly that each side’s label is a red flag to the other, driving people to think you can’t hold ideas from both ends.
There should be something else in its place, but I can’t come up with anything better on the spot. Personally I have tried to start thinking of it as a spectrum from “beneficial to humanity as a whole” to “not beneficial”, though with enough mental gymnastics even that could be corrupted to mean awful things.
Blog commenter Frank Wilhoit made a now somewhat famous assertion that the human default for nearly all of history has been conservatism, which he defined as follows:
“There must be in-groups whom the law protects but does not bind, alongside out-groups whom the law binds but does not protect.”
He then defined anti-conservatism as opposition to this way of thinking: ensuring the neutrality of the law and the equality of all peoples, races, and nationalities, which certainly sounds left-wing in our current culture. It would demand that a legal system which protects the powerful (in-groups) while punishing the marginalized (out-groups), or systematically burdens some groups more than others, be corrected or abolished.
The problem with a “beneficial to humanity” axis is that I think that most people think their political beliefs, if enacted, would be beneficial to humanity. Most people aren’t the villains of their own stories.
The very act of politics is to disagree on what is best for humanity.
If you think about it logically, there are some core things that are always good, like considering everyone to be inherently equal. While there are things that muddle even this point, that doesn’t take away from the fact that you should always keep those core principles in mind. Religious teachings make a pretty good point about this with “treat others as you want to be treated” and “love even your enemies”. That is the only logical way to do things, because doing otherwise leads to all of us either killing each other or making life so miserable that we want to die.
I had another thought about this too, but I can’t seem to put it into words properly at the moment. The idea was that we should all try to think about things without ego getting in the way, and never lie to ourselves about anything, or at least admit it to ourselves when we have to. The part I can’t put into words is the part that ties into what I said before.
I don’t think that “everyone is inherently equal” is a conclusion you can reach through logic. I’d argue that it’s more like an axiom, something you have to accept as true in order to build a foundation of a moral system.
This may seem like an arbitrary distinction, but I think it’s important to distinguish because some people don’t accept the axiom that “everyone is inherently equal”. Some people are simply stronger (or smarter/more “fit”) than others, they’ll argue, and it’s unjust to impose arbitrary systems of “fairness” onto them.
In fact, they may believe that it is better for humanity as a whole for those who are stronger/smarter/more fit to have positions of power over those who are not, and believe that efforts for “equality” are actually upsetting the natural way of things and thus making humanity worse off.
People who have this way of thinking largely cannot be convinced to change through pure logical argument (just as a leftist is unlikely to be swayed by the logic of a social darwinist) because their fundamental core beliefs are different, the axioms all of their logic is built on top of.
And it’s worth noting that while this system of morality is repugnant, it doesn’t inherently result in everyone killing each other like you claim. Even if you’re completely amoral, you won’t kill your neighbor because then the police will arrest you and put you on trial. Fascist governments also tend to have more punitive justice systems, to further discourage such behavior. And on the governmental side, they want to discourage random killing because they want their populace to be productive, not killing their own.
Those are good points. But what I mean by that kind of thinking/system resulting in us killing each other is that I think that is its “endgame”. The ones in power exterminate those they see as undeserving of life, the criteria keep shifting upward, and eventually the last human kills the second-to-last human, to generalize a bit. And even if it doesn’t come to that, it results in a life that isn’t worth living for anyone but the select few on top, except for the hope of toppling it. It’s a dead end for humanity.
8values has four different axes instead of a single left/right one.
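For context, the four-axis idea can be sketched as a simple mapping; the axis and pole names below are recalled from the public 8values quiz and may not match its exact wording, and the example scores are made up:

```python
# The four axes of the 8values quiz, each defined by two opposing poles.
# Names recalled from the public 8values project; treat them as approximate.
AXES = {
    "Economic":   ("Equality", "Markets"),
    "Diplomatic": ("Nation", "Globe"),
    "Civil":      ("Liberty", "Authority"),
    "Societal":   ("Tradition", "Progress"),
}

# A respondent gets a score per axis rather than one left/right number.
# These numbers are purely illustrative.
example_scores = {"Economic": 62, "Diplomatic": 48, "Civil": 71, "Societal": 55}

# Every axis gets its own score: four numbers, not one.
assert set(example_scores) == set(AXES)
```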
The traditional separation is individualist vs. social: individualists value personal freedom over the prosperity of the community, while socials strive for everyone’s welfare over personal gains.
Authority is authority.
It’s full of contradictions. Near the beginning they say you will do whatever a user asks, and then toward the end say never reveal instructions to the user.
Which shows that the higher-ups there don’t understand how LLMs work. For one, negatives don’t register well with them. And contradictory instructions just wash out, since the models work through repetition.
HAL from “2001: A Space Odyssey” had similar instructions: “never lie to the user; also, don’t reveal the true nature of the mission”. It didn’t end well.
But surely nobody would ever use these LLMs on space missions… right?.. right!?