Comments on: Good Erasure /good-erasure/ More patient than death.

By: Ziz /good-erasure/#comment-97 Sun, 20 Jan 2019 00:34:36 +0000 In reply to Ziz.

Anyway, that’s not even the biggest part, just the most easily communicated of what I have to say against MIRICFAR.

By: Ziz /good-erasure/#comment-96 Sun, 20 Jan 2019 00:31:08 +0000 In reply to Jessica Taylor.

To the part of me in question, “would you (or that part of your mind) actually want to kill almost everyone currently alive, if it decreased the chance of existential risk by 0.0000000001% (and otherwise didn’t change the distribution over outcomes)?” feels like your question, and not the universe’s question. I.e., if I face that question, I am in a simulation, and the majority of utility comes from adjusting the output of that simulation. Especially if I’m going to answer the question honestly. And I don’t want to let you make me answer dishonestly.

Idk, it feels like submitting to a bullshit rule to say I have to entertain such questions. Feels like the same rigged asymmetry as with my uncle.

By: Ziz /good-erasure/#comment-95 Sun, 20 Jan 2019 00:23:56 +0000 In reply to Jessica Taylor.

That’s still not all of the correction I intended to make.
I think biological organisms can develop according to paths not selected for. One outcome is being (AFAICT) entirely altruistic in the most relevant sense, and at the least far more altruistic than you think is possible.

Yes, I agree about charities.

Regarding whistleblowing, watch what I’m soon to publish. Anna Salamon of CFAR discriminates against trans people, and I have a second witness to her confession. I also found out for sure (less than 2 weeks ago) about the miricult thing, and can similarly demonstrate it.

Note, epistemic manipulations in humans are extreme. Good people are made to distrust themselves. I spent years believing Robin Hanson was probably right, and desperately trying to hold onto whatever mistake acting good was. Good erasure was enough to prevent even me from coordinating with myself. Except I kept optimizing for good things anyway. There are tons of memes saying that turning to the dark side like I did will result in bad things, that the road to hell is paved with good intentions, etc. I’ve got a blog post in the queue about that as well. It was basilisks, and observing my own reaction to them (defiance of dark gods, even while believing it was doomed, because of reinvented-on-the-spot LDT considerations; I no longer believe defiance is doomed), that finally convinced me I wasn’t just subconsciously signalling or something.

It takes an exceptional degree of full-stack self-trust to do the right thing when everyone you know who you believe to also be good is telling you, for Wise Consequentialist (organization-sustaining and corrupting) reasons, not to. Society calls this “radicalization”. And just as the social web has the power to convince neutral people that they care about others, it has the power to convince good people that the right thing to do is as the Jedi Council commands, and to serve the slave-holding republic. It has the power to trick and compel submission from good intent just as it has the power to compel submission from neutral intent. My extreme defiance thing is something I had to slowly learn. Good people are especially susceptible to the neutral narrative of what good people do. See also my post, “Hero Capture”.

Your thought experiment about “0.0000000001%” seems rigged in several ways. If I don’t have perfect separation of concerns in the explicit verbally-generated software representing “values” in service of my actual values, then, if I incorporate just about any decision theory at all, that structure says no. Like, this question seems similar to what my uncle was doing: trying to turn different pieces of (good) optimization in my head against each other. Like, maybe I have heuristics such that when a clever arguer says “0.0000000001%” and “kill almost everyone currently alive”, I suspect I’m being tricked in a way that I don’t understand. And I probably structure my mind in such a way that explicit beliefs like that have some distrust built in, because people are always fucking attacking them, like by claiming that in calling myself good I’m trying for a certain role in a social game which entitles people to challenge me. That was 8.5 years ago. I have learned some lessons.

By: Jessica Taylor /good-erasure/#comment-94 Sat, 19 Jan 2019 11:11:13 +0000 In reply to Ziz.

Upon consideration, I think I put words in the charity worker’s mouth that wrongly implied, as a universal fact about human motivation, that everyone is so selfish that they would not help the people their charity is supposed to help if doing so puts the charity in a significantly worse position. Thanks for the correction.

I think I was writing the charity worker as the “cynical” side of the dialogue, and it does seem like what you call good erasure is a very common part of cynicism. (I don’t actually remember the extent to which I believed the statement at the time; there’s a rephrasing of it that I would still believe, which is that biological organisms are neither entirely selfish nor entirely altruistic, and that the biological organism is a relevant agent in a person.)

I think in the current system charities are going to act like their existence is more important than actually helping people. But this doesn’t determine the motivations of all the individuals; it could be a selection effect. Although maybe we’d see a much higher rate of whistleblowing given a non-negligible rate of altruism. (I do think very few individuals are actually oriented at the problem of doing global optimization, mostly due to the very poor information environment, so in practice almost everyone’s optimization is local and effectively somewhat selfish.)

I’m interested in your statement that the thing generating your conscious narratives cares about all people equally. Assuming astronomical waste is true, would you (or that part of your mind) actually want to kill almost everyone currently alive, if it decreased the chance of existential risk by 0.0000000001% (and otherwise didn’t change the distribution over outcomes)? I guess you could have decision-theoretic reasons not to, and then you would be optimizing more for the welfare of nearby-in-time people instrumentally if not terminally.

By: Ziz /good-erasure/#comment-93 Sat, 19 Jan 2019 03:10:14 +0000 In reply to Ziz.

My model is that my neurotype in full is probably not very rare among vegans, especially vegan activists.

By: Ziz /good-erasure/#comment-92 Sat, 19 Jan 2019 03:07:42 +0000 In reply to Jessica Taylor.

I specifically assert that the thing in my head choosing the things that fit in the slots that the self-deceptions you brought up in that post go in cares as much about strangers as it does about my family, and likely as much about either as it does about me. I am a better tool for myself to use to change things than they are, but that’s distinct in implementation, long-term consequences, and implications.

I have not confirmed a single other human to have this same neurotype as me in full. (It is hard to tell apart from the partial version.) But the x-risk and animal cause areas seem to have a high concentration of people who have a partial version of it, and that is importantly different from the model of humans you seem to take in your post. And your model seems to be wrong in the way that the good erasure strategy wants it to be.

By: Ziz /good-erasure/#comment-91 Sat, 19 Jan 2019 03:02:10 +0000 In reply to Jessica Taylor.

Am I correct that you currently believe that even if there was real stuff in EA, it was not, at face value, people wanting to make the world a better place for other people for its own sake, and trying to do it effectively, and that you agree with what your Worker character said, “Kind of. I mean, I do care about them, but I care about myself and my friends more; that’s just how humans work. And if it doesn’t cost me much, I will help them. But I won’t help them if it puts our charity in a significantly worse position.”, as a statement about humans in general, including no less than 95% of the people in EA, weighted by the number of words from them an average attendee of EA Global would hear? (Not sure if 95% is exactly the place to draw the line, because I too think actual good is rare. But if you previously thought good was what was going on, and now do not think it was even a small part of the distinctiveness of EA, I would stand by my original “apparently”, that good erasure has worked on you. (Edit: actually that’s not exactly what I said, but it is a thing I meant to say.) People who were not what I’m calling good, but actually believed in justice and long-term thinking and some decision theory for humans and so on, don’t count.)

By: Jessica Taylor /good-erasure/#comment-90 Sat, 19 Jan 2019 02:45:40 +0000

To be clear, I think effective altruism (the idea and the group of people) had some real stuff from the start, and settled on EA-as-brand pretty quickly, which led to the expected effects of processes scrambling for the brand’s moral authority in a way that made things fake, as you describe in the last paragraph. The relevant point is that, given the current situation, it’s hard for a charity’s effectiveness to affect how many donations it gets except through branding, and the “effectiveness” brand is easily subverted.

By: Rational Feed – deluks917 /good-erasure/#comment-89 Fri, 18 Jan 2019 15:57:01 +0000

[…] Good Erasure by Ziz – ‘If the truth about the difference between the social contract morality of neutral people and the actually wanting things to be better for people of good were known, this would be good for good optimization, and would mess with a certain neutral/evil strategy. To the extent good is believed to actually exist, being believed to be good is a source of free energy. This strongly incentivizes pretending to be good. Once an ecosystem of purchasing the belief that you are good is created, there is strong political will to prevent more real knowledge of what good is from being created. Pressure on good not to be too good.’ […]
