2023-01-10
------------------------------------------------------------------

In season 3, episode 29 of Conversations with Coleman, Coleman is
talking with Will MacAskill. I started to listen, then remembered
almost immediately who MacAskill is, or rather what his main
topic is. Had to stop and just rant about it for a while.

I actually quite like Will and the stuff he has to say. It's just
the one thing, the one thing they start the conversation with. I
loathe the idea with a very visceral disgust. Coleman recounts
the basic idea of the movie Tenet, in which the future wages war
against the past, and asks if Will would be on the side of the
future in that scenario, to which Will enthusiastically answers
that yes, he would be. He continues that to him 100 lives saved
is in some absolute way always better than 10 lives saved, and
that there is no reason to think differently when comparing our
own lives to the lives of people in the future.

At this point I start to feel the recognition, and start to
cringe, so I can't remember now if he went further. But I have
heard him make this case in hours-long podcasts before, several
times actually, so I think I can safely say that he would
continue saying stuff like "The future is so vast, potentially,
that no matter what discount value you give to the future people,
you will have to come to the conclusion that our lives are always
going to be less important than the future people's lives". I
even listened through the longest podcast he has about this on
The 80,000 Hours Podcast, just to see if he qualifies the
argument in relation to other concerns. Meaning: does he
recognise that his argument might not be a sort of black hole
with more gravity than anything else we might be worried about?
He does at some point say that he is not aiming to use all or
even most of our energy on his future people, but just to widen
the extremely tiny margin that is currently devoted to them. He
goes over this so fast, though, that he might as well not have
said it. To me that seems like the most important point,
actually, if he wants to be an effective communicator. I don't
know if he gets to that point in this conversation, so I am
basically railing against my preconceived notion about the guy,
but I just want to put this down.

It took me quite a long time to actually realize what it was
that was so off-putting about the idea. Factually I could see the
logic and it seems sound: All people are different, so there is
no reason to say that any one life is more important
(statistically) than another. The differences are there, but they
are not quantifiable. So in that sense it makes sense to say that
if you can save 10 lives, or 100 lives, you should choose the
100. It seemed like I should agree with Will, but I just didn't.

I now realise the reason I can't agree with him, but I still
can't back it up through the sort of logic that would be
acknowledged on his turf. It could even be that my reason for
disagreeing with him makes either me or him delusional,
depending on who is judging.

My point is simply that you cannot compare the suffering of real
people to any amount of suffering by hypothetical people.

There are ways to harden or soften the argument. I could allow
more "reality" to unborn babies, to people being born a year from
now compared to 100 years from now and so on, but the basic core
of the argument doesn't change. I think it is disgusting to gloss
over the suffering that is happening right now and jump to some
imaginary future suffering.

Also, it is quite obvious that the people in the future will not
exist if we fail to live as a somewhat unified globe right now,
so in that sense empathy for the present generations is the
foundation for the existence of the future generations. So the
more subtle point would be that Will seems to be sawing off the
branch he is sitting on (or imagines himself to be sitting on
in the future).

From an even more subtle (and probably unnecessary) point of
view, we could say that the modern global powers have already
skipped over the current state of affairs in several epochs of
history, so it might be fucking time to ground yourself in the
present and not go running after some imaginary triumphs of
altruism when there is plenty to do here and now.

Of course, then Will points out that he is only looking for the
tiny marginal increase, and I have to calm myself and agree that
it is surely overdue.

Yeah. But the interesting rub is that it isn't clear to me who
is fooling themselves here. Either Will is willing to give his
attention to imaginary beings, or I am giving some magical value
to the present beings when in fact they are no different from
the future beings. Either position could quite realistically be
superstition.

------------------------------------------------------------------