
Pessimism + Optimism + Suicide - Page 2

RiKD    United States. Jul 27 2017 04:52. Posts 8522

I watched "It's a Wonderful Life" tonight at the suggestion of Marion Cotillard. George Bailey owes $8,000 and is going to get thrown in jail and is thinking about ending it. He wishes he had never been born. So, he comes to the right conclusion. Then the movie gets a bit daft, and then it gets a bit hokey, but George gets it right only for the movie to get it so wrong. I was not really a fan. I suppose there is some truth to "You are not a failure if you have friends." I suppose the ending was ok. That is one thing about suicide: all angles have to be examined. Help should be sought. It is not wise to make decisions like that caught in a pit of depression. If I have family, friends, psychiatrist, and therapist all working together, some quicksand situations become manageable.


Loco   Canada. Jul 28 2017 14:14. Posts 20963


  On July 27 2017 01:43 bigredhoss wrote:
Saying we’re more likely to do X than Y isn’t a valid argument for eliminating the possibility of Y from consideration.



You said that the odds are in favor of X happening, and I responded that it's naive to believe that because Y is a lot more likely than X; I never said that Y isn't worth any consideration.

I'm not denying life extension at all; we keep living longer and longer on average, and that trend is obviously going to continue. But yes, personally it doesn't jibe with me past a certain point. It's not about fatalism at all. It's not even a pessimistic remark. I know optimistic people whose life philosophy is built on the acceptance that life is temporary and that this is what gives life meaning. They wouldn't have it any other way. They don't want to upload their consciousnesses into a machine.

As for immortality, I'm not denying it any more than I'm denying the existence of goblins. It's a fantasy. Life extension can only go so far. It's only a matter of time before mankind perishes like 99.8% of the species that have ever existed did. Every great empire and civilization thought itself immortal: the Mesopotamians, the Egyptians, the Romans, the Persians, the Ottomans, the Mayans, the Aztecs, the Incas... they all disappeared. That's history. This faith in technology as humanity's salvation is crypto-religious in nature. It is not founded on rationality. Even if all these predictions about AGI turn out to be true, and we become something completely different, without death and decay as we know it, there's still no immortality. In every best scenario you can imagine, it ends in death, because your existence depends on a universe that has an expiration date we can't escape.


bigredhoss   Cook Islands. Jul 28 2017 18:56. Posts 8648


  On July 28 2017 13:14 Loco wrote:
You said that the odds are in favor of X happening, and I responded that it's naive to believe that because Y is a lot more likely than X; I never said that Y isn't worth any consideration.



I think there’s some confusion because we’re focusing on different parts of each other’s points. I said I think we’re likely to achieve extreme longevity if AI doesn’t destroy us, and I don’t know what the odds of that are. I’m inclined to believe the emergence of superhuman AI is more likely to destroy us than save us (although I really have no clue and have heard interesting arguments for both sides). So, I don’t believe the odds are in favor of humanity achieving immortality; I think they are in favor of that only if the AI is “nice”. This is where I start having trouble with some of the pie-in-the-sky views that it will happen: it seems like the most likely scenario is that the AI views us as a useless/damaging drain on resources.


I'm not denying life extension at all; we keep living longer and longer on average, and that trend is obviously going to continue. But yes, personally it doesn't jibe with me past a certain point. It's not about fatalism at all. It's not even a pessimistic remark. I know optimistic people whose life philosophy is built on the acceptance that life is temporary and that this is what gives life meaning. They wouldn't have it any other way. They don't want to upload their consciousnesses into a machine.



I specifically avoided using the terms optimistic/pessimistic because you said you have no desire for immortality (with caveats, I think), so there isn’t a mutual assumption that immortality = optimistic. Either way, those terms are innately tied to emotions, and I am only talking about probability.

Sure, I think immortality would be good. But even if I thought it was a bad thing, I would still acknowledge it as a possible development in the future.


As for immortality, I'm not denying it any more than I'm denying the existence of goblins. It's a fantasy. Life extension can only go so far. It's only a matter of time before mankind perishes like 99.8% of the species that have ever existed did. Every great empire and civilization thought itself immortal: the Mesopotamians, the Egyptians, the Romans, the Persians, the Ottomans, the Mayans, the Aztecs, the Incas... they all disappeared. That's history. This faith in technology as humanity's salvation is crypto-religious in nature. It is not founded on rationality. Even if all these predictions about AGI turn out to be true, and we become something completely different, without death and decay as we know it, there's still no immortality. In every best scenario you can imagine, it ends in death, because your existence depends on a universe that has an expiration date we can't escape.



It seems like you are underestimating the future (and maybe even the present) of AI, both the potential for good and bad. For example, it’s almost certainly a greater threat to humanity than any of the environmental issues that you’re concerned about (and the environmental issues are obviously legitimate). There is no uncertainty about the fact that robots capable of performing all human functions while possessing superhuman intelligence are going to exist eventually. And if they don’t, it will only be because something terrible happened first.

You can draw whatever conclusions you want from this. To me, it seems obvious that it means our future - in terms of what “we” want - will have a fat-tailed distribution of outcomes (meaning a higher chance of extreme deviations and a lower chance of moderate ones). Immortality certainly falls into one of those extremes.
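Since “fat tail” is doing real work in that argument, here is a minimal Python sketch of the idea (my own illustration, not anything from the thread: a Student-t distribution with 2 degrees of freedom is an arbitrary stand-in for a fat-tailed outcome distribution, and a standard normal is the thin-tailed baseline):

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Thin-tailed baseline: standard normal draws.
thin = rng.standard_normal(n)

# Fat-tailed stand-in: Student's t with 2 degrees of freedom
# (a hypothetical choice, picked only because its tails are heavy).
fat = rng.standard_t(df=2, size=n)

# How often does each process produce an "extreme" outcome,
# here defined as landing more than 5 typical units from center?
for name, draws in (("thin-tailed", thin), ("fat-tailed", fat)):
    p = np.mean(np.abs(draws) > 5)
    print(f"{name}: P(|X| > 5) = {p:.6f}")

With these particular choices, the fat-tailed draws land more than five units out roughly 4% of the time, versus fewer than one in a million for the normal: moderate outcomes dominate either way, but extremes go from negligible to routine.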


 