
May 16, 2008

Comments

Vladimir Nesov

But you do take some risks to your life to avoid inconvenience; for example, you drive cars, cross roads, etc. There are no cast-iron algorithms in your nature that say you must survive no matter what. Asserting otherwise is factually incorrect. You may choose the moral imperative for yourself to adhere to such a goal, but how do you know that it's the right goal to follow? This is not a rhetorical question; I'm very much interested in rational and increasingly precise ways of determining the right goals to follow.

Vladimir Nesov

(On second thought, in case I got you wrong and your point wasn't what I thought it was.) If you need to choose a tradeoff between investing in cryonics, SENS, pie-in-the-sky schemes, etc., how can the probability of success be irrelevant to the choice? Whatever the strategy or goal of such a decision, it's graded by probability of success, because that is what determines the meaning of the technique. If I say that I'm going to fly to the Moon, but with 10^-100 probability of succeeding, I'm not really going to fly to the Moon; it's merely irrelevant wordplay when I say that I'm going to fly to the Moon. On the other hand, if I have a (1-(10^-100)) (that is, near 100%) probability of flying to Mars when I say that I'm going to fly to the Moon, the meaning of my venture is in fact flying to Mars, not flying to the Moon. So probability of success matters, as it defines the meaning of the action. And since it's success that you care about, not wordplay about what the action is supposed to achieve, you shouldn't ignore probabilistic estimates.
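A small sketch of that point, with entirely made-up probabilities: the distribution over outcomes, not the label attached to the plan, determines what the plan is effectively a plan to do. The plan names and numbers below are hypothetical, chosen only to mirror the Moon/Mars example.

```python
# Toy illustration: the expected outcome of an action is set by the
# probabilities of its possible results, not by how the action is named.
# All probabilities here are invented for illustration.

def most_likely_outcome(outcome_probs):
    """Return the (outcome, probability) pair the action is most likely to produce."""
    return max(outcome_probs.items(), key=lambda kv: kv[1])

# "I'm going to fly to the Moon" -- under two different (made-up) distributions.
# (Floating point rounds 1 - 1e-100 to 1.0, which is fine for illustration.)
plan_a = {"reach the Moon": 1e-100, "stay on Earth": 1.0 - 1e-100}
plan_b = {"reach the Moon": 1e-100, "reach Mars": 1.0 - 2e-100, "stay on Earth": 1e-100}

for name, plan in [("plan_a", plan_a), ("plan_b", plan_b)]:
    outcome, prob = most_likely_outcome(plan)
    print(f"{name}: in effect a plan to '{outcome}' (p ~ {prob})")
```

Under these numbers, plan_a is in effect a plan to stay on Earth and plan_b a plan to reach Mars, whatever either plan is called.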

Frank McGahon

HA, you seem determined to read into my caveats and objections some sort of opposition to the notion or desirability of "maximising persistence". Far from it: knock yourself out, I say. Personally, I'd love to maximise my own persistence and am reasonably optimistic about the prospects of extending lifespans even within my own (currently) projected lifetime. However, I have a major problem with cryonics in particular and believe it to be a dangerously seductive fantasy. My caution to you is that, even given your professed aim of maximising persistence, any resources you divert towards cryonics detract from other, perhaps more fruitful, methods of extending lifespan (perhaps indefinitely). Nothing you have posted indicates that you are taking seriously the fatal, paradoxical flaw of cryonics, which is that the only way it can bear fruit is for technology that will render it obsolete to develop. That is, you should apply a very, very low probability to the notion that cryonics will pay off and adjust your calculations accordingly.
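Frank's suggested adjustment can be made concrete as an expected-value comparison. The interventions, probabilities, and payoffs below are purely illustrative assumptions, not figures from either commenter.

```python
# Hypothetical expected-value comparison of the kind suggested above:
# discount each intervention's payoff by an estimated success probability
# before deciding where resources go. All figures are invented.

interventions = {
    # name: (estimated probability of paying off, extra years of persistence if it does)
    "cryonics":              (0.001, 1000),
    "SENS-style research":   (0.05,   200),
    "diet/exercise/safety":  (0.90,     5),
}

for name, (p_success, payoff_years) in interventions.items():
    expected_gain = p_success * payoff_years
    print(f"{name:22s} expected gain ~ {expected_gain:6.2f} years")
```

Under these made-up numbers the research option dominates; assigning a sufficiently low probability to cryonics shrinks its expected value accordingly, which is the adjustment being urged.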

Frank, you seem to miss my central, solipsistic point of view: that all that's functionally real to me is the successive "nows" in which my subjective conscious experience exists. Your claim that there are things I should value more than maximizing its persistence odds is absurd, weird, and unrelatable to me. Of COURSE I would play the long odds of avoiding death by asteroid. That you think the asteroid argument would somehow persuade me to be blithely hedonistic in that context illustrates how deep our disconnect is. I aspire to be hedonistic only to the degree that it maximizes my persistence odds.

I don't miss the point at all. I am prone to solipsism myself and am fully aware that I can't verify that anything else in the universe is "real"; I just adopt the notion that it is as a working hypothesis. As for valuing anything more, I'm already assuming that you don't value anything outside your own existence. Even with that assumption, wasting money (or life) on longshot payoffs is not rational. Persistence is instrumental to living: being able to live is precisely why anyone would want to persist, and it can't be an intrinsic value that trumps all other personal values (even disregarding entirely any values outside one's own existence).

The comments to this entry are closed.