{"id":179,"date":"2018-01-21T07:27:45","date_gmt":"2018-01-21T07:27:45","guid":{"rendered":"http:\/\/sinceriously.fyi\/?p=179"},"modified":"2018-01-21T07:27:45","modified_gmt":"2018-01-21T07:27:45","slug":"lies-about-honesty","status":"publish","type":"post","link":"https:\/\/sinceriously.fyi\/lies-about-honesty\/","title":{"rendered":"Lies About Honesty"},"content":{"rendered":"

The current state of discussion about using decision theory as a human is one where none dare urge restraint. It is rife with light side narrative breadcrumbs and false faces. This is utterly inadequate for the purposes for which I want to coordinate with people, and I think I can do better. The rest of this post is about the current state, not about doing better, so if you already agree, skip it. If you wish to read it, the concepts I linked are serious prerequisites, but you need not have gotten them from me. I'm also gonna use the phrase "subjunctive dependence", defined on page 6 here, a lot.
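As a toy illustration of the term, not the paper's formalism: two decision processes are subjunctively dependent when their outputs are determined by the same underlying computation, so you cannot counterfactually vary one without varying the other. The names in this sketch are mine, invented for illustration.

```python
# Toy illustration (my own construction, not from the paper): two agents
# are subjunctively dependent when their choices are outputs of the same
# underlying decision procedure.

def shared_decision_procedure(observation: str) -> str:
    """One decision algorithm; both agents below are instances of it."""
    return "cooperate" if observation == "symmetric game" else "defect"

# Two physically separate agents, same algorithm, same input:
agent_a = shared_decision_procedure("symmetric game")
agent_b = shared_decision_procedure("symmetric game")

# Their outputs necessarily match -- not because one causes the other,
# but because both are determined by the same computation.
assert agent_a == agent_b

# Agents running unrelated procedures have no such guarantee, which is
# the situation this post argues is the common case in real life.
```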

I am building a rocket here, not trying to engineer social norms.

I've heard people working on the most important problem in the world say decision theory compelled them to vote in American elections. I take this as strong evidence that their idea of decision theory is fake.

Before the 2016 election, I did some Fermi estimates which took my estimates of subjunctive dependence into account, and decided it was not worth my time to vote. I shared this calculation, and it was met with disapproval. I believe I had found people executing the algorithm,
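For concreteness, here is the general shape such a Fermi estimate can take. Every number below is an illustrative placeholder of mine, not a figure from the actual calculation; the point is only the structure: how many decisions plausibly covary with yours, how likely that bloc is to be decisive, what the outcome is worth, and what your time costs.

```python
# A sketch of a voting Fermi estimate under subjunctive dependence.
# All numbers are illustrative placeholders, not the figures actually used.

correlated_voters = 5            # people whose decisions plausibly covary with yours
p_single_vote_decisive = 1e-8    # chance one marginal vote flips the outcome
value_of_better_outcome = 1e9    # dollar-equivalent value difference between outcomes, to your values
hours_to_vote = 2.0
value_per_hour = 100.0           # opportunity cost of your time

# Your choice effectively controls the correlated bloc's votes.
expected_benefit = correlated_voters * p_single_vote_decisive * value_of_better_outcome
cost = hours_to_vote * value_per_hour

print(f"expected benefit: ${expected_benefit:,.0f}")
print(f"cost of voting:   ${cost:,.0f}")
print("vote" if expected_benefit > cost else "not worth the time, under these numbers")
```

The conclusion is entirely driven by the inputs, especially the first one; that is exactly why being fake about how much subjunctive dependence exists matters.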

The author of Integrity for consequentialists writes:

> I'm generally keen to find efficient ways to do good for those around me. For one, I care about the people around me. For two, I feel pretty optimistic that if I create value, some of it will flow back to me. For three, I want to be the kind of person who is good to be around.
>
> So if the optimal level of integrity from a social perspective is 100%, but from my personal perspective would be something close to 100%, I am more than happy to just go with 100%. I think this is probably one of the most cost-effective ways I can sacrifice a (tiny) bit of value in order to help those around me.

This seems to be clearly a false face.

Y'all's actions are not subjunctively dependent with that many other people's actions, or with their predictions of you. Otherwise, why do you pay your taxes, when you could coordinate so that a reference class including you decides not to? At some point, with enough defection against it, the government becomes unable to punish you.
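To make the threshold point concrete, here is a toy model of my own, not a real estimate: enforcement capacity is roughly fixed, so the chance any individual defector gets punished falls as the defecting bloc grows, and defecting only starts to pay once the bloc is enormous, far larger than the set of people whose decisions actually depend on yours.

```python
# Toy model (mine, illustrative numbers only) of "enough defection and the
# government can't punish you": punishment probability falls as the number
# of defectors outgrows a fixed enforcement capacity.

def p_punished(defectors: int, enforcement_capacity: int = 100_000) -> float:
    """Chance an individual tax defector is caught, with fixed capacity."""
    if defectors == 0:
        return 1.0
    return min(1.0, enforcement_capacity / defectors)

def expected_gain_from_defecting(defectors: int,
                                 tax_saved: float = 10_000.0,
                                 penalty: float = 50_000.0) -> float:
    p = p_punished(defectors)
    return (1 - p) * tax_saved - p * penalty

for n in (1, 10_000, 100_000, 1_000_000, 10_000_000):
    print(f"{n:>12,} defectors: expected gain {expected_gain_from_defecting(n):>12,.0f}")

# Under these placeholder numbers, defection only pays once the correlated
# bloc is hundreds of thousands strong.
```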

In order for a piece of software like TDT to run outside of a sandbox, it needs to have been installed by an unconstrained "how can I best satisfy my values" process. And people are being fake, especially in the "is there subjunctive dependence here" part, talking only about positive examples.

Here's another seeming false face:

> I'm trying to do work that has some fairly broad-sweeping consequences, and I want to know, for myself, that we're operating in a way that is deserving of the implicit trust of the societies and institutions that have already empowered us to have those consequences.

Here's another post I'm only skimming right now, seemingly full of nothing but exploration of how subjunctively dependent things are and how often you should cooperate.

If you set out to learn TDT, you'll find a bunch of mottes that can be misinterpreted as the bailey, "always cooperate, there's always subjunctive dependence". Everyone knows that's false, so they aren't going to implement it outside a sandbox. And no one can guide them to the actual, more complicated position of just how much subjunctive dependence there really is in real life.

But you can't blame the wise in their mottes. They have a hypocritical light side mob running social-enforcement-of-morality software to look out for.

Socially enforced morality is utterly inadequate for saving the world. Intrinsic or GTFO. The same goes for decision theory.

Ironically, this whole problem makes "how to actually win through integrity" sort of like the Sith arts from Star Wars. Your master may have implanted weaknesses in your technique. Figure out as much as you can on your own and tell no one.

Which is kind of cool, but fuck that.
