About this episode
Welcome back to another thought-provoking episode of Don't Be Caught Dead! Today, we dive deep into the fascinating world of neuroscience and the radical idea of abolishing death with our special guest, Ariel Zeleznikow-Johnston. As a neuroscientist and author of The Future Loves You, Ariel explores the potential of brain preservation and the philosophical implications of what it means to be alive. If you've ever wondered about the future of dying and whether we can truly cheat death, this episode is for you!
In our chat, Ariel shares his journey into the realm of neuroscience, where he combines big philosophical questions about consciousness with the nitty-gritty of brain biology. We discuss the historical evolution of the definition of death, the concept of the connectome, and how advancements in technology might one day allow us to preserve and potentially revive those who have died. Ariel challenges us to think about the future of medical technology and the ethical considerations surrounding it. What if we could put people in stasis and bring them back to life when medical science has advanced enough to cure their ailments? It’s a wild ride through science fiction becoming science fact!
We also touch on the emotional aspects of dying, grief, and the societal implications of such technologies. How would our relationships with loved ones change if we could preserve them for future generations? Ariel's insights are not only scientifically grounded but also deeply human, reminding us of the importance of connection and the legacy we leave behind.
Remember: you may not be ready to die, but at least you can be prepared.
Take care,
Catherine
Show notes
Guest Bio

Neuroscientist and Author of The Future Loves You
Ariel Zeleznikow-Johnston is a neuroscientist based in Melbourne, Australia, and the author of The Future Loves You: How and Why We Should Abolish Death.
He is currently a Research Fellow at Monash University, working within the Monash Neuroscience of Consciousness laboratory. His research focuses on developing novel methods to characterise conscious experiences, contributing to the broader effort to understand the neural basis of consciousness. Ariel completed his PhD at The University of Melbourne in 2019, where he investigated how genetic and environmental factors influence cognition in both healthy and diseased brains.
He has published extensively across the field of cognitive neuroscience, with work spanning the decline, preservation, and restoration of cognitive function across the lifespan, as well as investigations into how people consciously experience colour.
As an author, Ariel explores how cutting-edge neuroscientific developments may one day make it possible to suspend death through brain preservation, potentially offering individuals the chance of future revival. His work examines the scientific, medical, and philosophical foundations underlying this radical proposition.
Summary
Key points from our discussion:
- The evolution of the definition of death and its implications for modern medicine.
- The concept of the connectome and its significance in understanding consciousness.
- The potential of brain preservation technologies and the ethical considerations involved.
- How advancements in neuroscience could reshape our understanding of life and death.
- The emotional impact of preserving loved ones and the societal responsibilities that come with it.
Transcript
1
00:00:02,009 --> 00:00:07,740
Maybe we'd be able to take people who are
dying somehow put them in stasis where
2
00:00:07,740 --> 00:00:10,925
they're unchanging, sort of inert.
3
00:00:11,565 --> 00:00:15,135
And then at some future time point,
pull 'em out, restore them to
4
00:00:15,135 --> 00:00:19,185
health, cure their cancer or injury
or whatever was harming them, and
5
00:00:19,185 --> 00:00:20,535
then they can continue living.
6
00:00:23,475 --> 00:00:25,335
Welcome to Don't Be Caught Dead.
7
00:00:25,605 --> 00:00:30,615
A podcast encouraging open conversations
about dying and the death of a loved one.
8
00:00:31,064 --> 00:00:36,074
I'm your host, Catherine Ashton, founder
of Critical Info, and I'm helping to
9
00:00:36,074 --> 00:00:40,815
bring your stories of death back to
life because while you may not be ready
10
00:00:40,815 --> 00:00:43,335
to die, at least you can be prepared.
11
00:00:46,695 --> 00:00:47,595
Don't Be Caught Dead
12
00:00:47,595 --> 00:00:51,075
acknowledges the lands of the
Kulin Nations and recognizes their
13
00:00:51,075 --> 00:00:53,745
connection to land, sea, and community.
14
00:00:54,065 --> 00:00:58,145
We pay our respects to their elders
past, present, and emerging, and
15
00:00:58,145 --> 00:01:02,045
extend that respect to all Aboriginal
and Torres Strait Islander and First
16
00:01:02,045 --> 00:01:03,845
Nation peoples around the globe.
17
00:01:08,255 --> 00:01:13,520
Today we are speaking with Ariel
Zeleznikow-Johnston, a neuroscientist
18
00:01:13,520 --> 00:01:17,630
based in Melbourne, and the author
of The Future Loves You: How
19
00:01:17,630 --> 00:01:19,970
and Why We Should Abolish Death.
20
00:01:20,720 --> 00:01:24,710
He is currently a research fellow
at Monash University working
21
00:01:24,710 --> 00:01:27,705
within the Monash Neuroscience
of Consciousness Laboratory.
22
00:01:28,515 --> 00:01:32,595
His research focuses on developing
novel methods to characterize
23
00:01:32,745 --> 00:01:37,245
conscious experiences, contributing to
the broader effort to understand
24
00:01:37,245 --> 00:01:39,164
the neural basis of consciousness.
25
00:01:39,555 --> 00:01:43,815
Ariel completed his PhD at the
University of Melbourne in 2019
26
00:01:44,115 --> 00:01:49,634
where he investigated how genetic and
environmental factors influence cognition
27
00:01:49,875 --> 00:01:52,155
in both healthy and diseased brains.
28
00:01:52,570 --> 00:01:57,100
He has published extensively across
the field of cognitive neuroscience
29
00:01:57,369 --> 00:02:02,710
with work spanning the decline,
preservation, and restoration of
30
00:02:02,710 --> 00:02:07,660
cognitive function across the lifespan,
as well as investigations into how
31
00:02:07,660 --> 00:02:10,030
people consciously experience color.
32
00:02:11,079 --> 00:02:16,390
As an author, Ariel explores how cutting
edge neuroscientific developments may
33
00:02:16,390 --> 00:02:21,430
one day make it possible to suspend
death through brain preservation,
34
00:02:21,430 --> 00:02:26,310
potentially offering individuals
a chance of future revival.
35
00:02:26,850 --> 00:02:32,100
His work examines the scientific,
medical, and philosophical foundations
36
00:02:32,370 --> 00:02:35,250
underlying this radical proposition.
37
00:02:35,940 --> 00:02:38,850
Thank you so much for
being with us today, Ariel.
38
00:02:39,225 --> 00:02:40,275
Thanks for having me on the show.
39
00:02:40,785 --> 00:02:44,625
I have to say, I can't remember
how I came across your book.
40
00:02:44,625 --> 00:02:50,175
It might have been an interview
that you did on ABC Radio
41
00:02:51,165 --> 00:02:53,655
and I went, oh my goodness.
42
00:02:53,655 --> 00:02:57,780
Who is promising a
future without death?
43
00:02:58,260 --> 00:02:59,429
Is this possible?
44
00:02:59,519 --> 00:03:05,070
And so needless to say, I went and
purchased the book and have been
45
00:03:05,070 --> 00:03:09,780
taken on a fascinating journey
for the last month of reading it.
46
00:03:10,470 --> 00:03:16,350
So tell me what drew you to this
particular avenue of neuroscience
47
00:03:16,350 --> 00:03:20,820
and, and more specifically,
when we die and what happens?
48
00:03:21,590 --> 00:03:27,110
So my background is as a neuroscientist,
and I've always found neuroscience in
49
00:03:27,110 --> 00:03:31,670
general to be fascinating because it's
the combination of big philosophical
50
00:03:31,670 --> 00:03:36,230
questions of who we are and what
it is to be conscious, what it is
51
00:03:36,230 --> 00:03:39,950
to be a person combined with the
fun, sort of nitty gritty of like.
52
00:03:40,270 --> 00:03:42,490
How does biology make us who we are?
53
00:03:42,580 --> 00:03:44,290
How do our brains actually function?
54
00:03:44,290 --> 00:03:46,240
What are all these like
squishy cell things doing?
55
00:03:46,570 --> 00:03:49,690
And like, that's what I've
been working in for the past, I
56
00:03:49,690 --> 00:03:51,460
guess over 15 years now or so.
57
00:03:52,120 --> 00:03:56,110
But at the same time as working as an
actual scientist, I've always been really
58
00:03:56,110 --> 00:04:01,480
interested in both the past and future
of scientific and medical developments.
59
00:04:01,660 --> 00:04:06,610
So things like looking back to how we got
to the world we have today, where we have,
60
00:04:06,850 --> 00:04:12,070
you know, antibiotics, surgery, anesthesia
compared to the primitive state of
61
00:04:12,070 --> 00:04:13,480
the world a few hundred years ago.
62
00:04:13,899 --> 00:04:18,010
And also looking forwards to, well, what
sort of things do we expect to come about
63
00:04:18,039 --> 00:04:20,229
in the decades and centuries to come?
64
00:04:20,890 --> 00:04:24,400
And as part of that, I enjoy reading
sci-fi and, and watching sci-fi.
65
00:04:25,065 --> 00:04:28,304
And in particular there's, there's
one sort of technology that's often
66
00:04:28,304 --> 00:04:32,625
explored in science fiction that is
quite relevant to, you know, discussions
67
00:04:32,625 --> 00:04:38,114
of death and future medicine, which is
this idea that like maybe we'd be able
68
00:04:38,114 --> 00:04:42,825
to take people who are dying somehow
put them in stasis where they're
69
00:04:43,065 --> 00:04:49,034
unchanging, sort of inert, and then
at some future time point, pull 'em out.
70
00:04:49,295 --> 00:04:53,585
Restore them to health, cure their cancer
or injury or whatever was harming them.
71
00:04:53,855 --> 00:04:55,355
And then they can continue living.
72
00:04:56,195 --> 00:05:00,395
And when I came across this, like
in Futurama for one example, or in
73
00:05:00,395 --> 00:05:04,535
the movie Interstellar, these sorts of
depictions, I used to wonder like, you
74
00:05:04,535 --> 00:05:06,485
know, is this, is this really real?
75
00:05:06,515 --> 00:05:07,295
Could it be real?
76
00:05:07,295 --> 00:05:11,435
Is it just a plot device that's used in
stories or is there something to this?
77
00:05:11,465 --> 00:05:15,335
And like, I guess I'd always speculated
on that in the back of my mind.
78
00:05:15,885 --> 00:05:20,595
But in the past decade or so, I
kept coming across, I guess, new
79
00:05:20,595 --> 00:05:25,185
developments that made me think maybe
there is something to this sort of
80
00:05:25,185 --> 00:05:27,285
idea of being able to make it work.
81
00:05:27,855 --> 00:05:32,175
The two big things being, we've developed
a much better understanding of how
82
00:05:32,175 --> 00:05:34,335
memories are actually stored in the brain.
83
00:05:34,515 --> 00:05:38,055
So like the structures that
underlie sort of the continuation
84
00:05:38,055 --> 00:05:40,005
of our psychology over time.
85
00:05:40,635 --> 00:05:45,105
And also some of the technologies
to preserve brains and preserve
86
00:05:45,105 --> 00:05:48,585
bodies has gotten a lot, lot
better in the recent past as well.
87
00:05:49,155 --> 00:05:52,725
So being aware of those, I was like,
well, maybe, maybe it would be good to
88
00:05:52,725 --> 00:05:56,955
actually dive in and see if anyone has
written like a proper write-up of like,
89
00:05:57,195 --> 00:05:58,875
you know, this is why maybe it could work.
90
00:05:58,875 --> 00:06:02,355
Or actually, here are all the technical
reasons why it's forever impossible.
91
00:06:02,880 --> 00:06:06,690
And when I went looking for that, I
realized firstly that there was no long
92
00:06:06,690 --> 00:06:11,159
form analysis of that, and secondly,
that like perhaps I was in a particularly
93
00:06:11,159 --> 00:06:12,870
good place to, to do the assessment.
94
00:06:13,635 --> 00:06:16,605
That's how I ended up doing the
work and writing up the book.
95
00:06:17,115 --> 00:06:20,715
And I think that was interesting, the
fact that when you were looking back,
96
00:06:21,015 --> 00:06:25,485
it made me really realize how far we
have come in such a short period of time
97
00:06:25,755 --> 00:06:30,645
in relation to, you know, a condition
such as diabetes you may have, you
98
00:06:30,645 --> 00:06:32,236
know, died of that many years ago.
99
00:06:32,925 --> 00:06:35,445
But now it's something that's just,
you can just take a tablet and,
100
00:06:35,445 --> 00:06:40,065
and you can manage it, or you have
insulin or, and there was many case
101
00:06:40,065 --> 00:06:44,085
studies that you used throughout the
book that clearly, you know, give an
102
00:06:44,085 --> 00:06:46,425
example of just how far we have come.
103
00:06:46,425 --> 00:06:49,545
So it made me start thinking well.
104
00:06:50,085 --> 00:06:54,105
I do wonder what we're going to
develop in the next 100 years and
105
00:06:54,105 --> 00:06:55,515
what that's going to look like.
106
00:06:56,025 --> 00:06:58,605
Yeah, it, it's very easy
to take for granted
107
00:06:58,664 --> 00:07:02,835
the medical technology we have today as
basically always having been there or as
108
00:07:02,985 --> 00:07:07,305
like the absolute bare minimum standards
of, of what we expect people to have.
109
00:07:07,605 --> 00:07:11,414
So whether it's things like insulin
for diabetes, whether it's dialysis
110
00:07:11,414 --> 00:07:12,495
for people with kidney failure.
111
00:07:13,245 --> 00:07:16,245
Whether it's statins for people with
heart disease, these are all like
112
00:07:16,245 --> 00:07:17,985
very common and ubiquitous today.
113
00:07:18,345 --> 00:07:21,705
But most of these things were only
developed since the late 19th century
114
00:07:21,705 --> 00:07:26,415
or so, and there's a lot of people alive
today with chronic health conditions
115
00:07:26,445 --> 00:07:29,025
who just would've died in earlier times.
116
00:07:29,295 --> 00:07:33,885
In the most dramatic sense, it's the
case that, prior to the late 19th century,
117
00:07:34,229 --> 00:07:37,289
one in two children died before
they reached the age of 15.
118
00:07:37,289 --> 00:07:37,349
Yeah.
119
00:07:37,739 --> 00:07:41,340
So we really shouldn't take
what we have today for granted.
120
00:07:41,370 --> 00:07:46,229
It, it's miraculous what we can do,
and hopefully people in the future
121
00:07:46,229 --> 00:07:49,979
will look back on technology we have
today as equivalently primitive.
122
00:07:50,370 --> 00:07:53,789
At least that's what I, I think is
worth exploring and taking seriously.
123
00:07:54,115 --> 00:07:58,045
And let's talk about how you
discuss that within the book.
124
00:07:58,045 --> 00:08:03,685
So I found it really interesting
that the term of medical term of
125
00:08:03,685 --> 00:08:08,545
when you're actually the definition
of death, I found that fascinating.
126
00:08:08,545 --> 00:08:11,845
Could you maybe talk through
that, like how that is?
127
00:08:12,205 --> 00:08:15,775
Yeah, I, I think it's useful probably
thinking through it with a bit of
128
00:08:15,805 --> 00:08:19,765
historical context of how the definition
of death has changed over time.
129
00:08:20,455 --> 00:08:20,905
So.
130
00:08:21,225 --> 00:08:25,425
Prior to the middle of the 20th century
or so, people were defined as dead,
131
00:08:25,455 --> 00:08:29,025
just based on, you know, they stopped
breathing or their heart stopped.
132
00:08:29,385 --> 00:08:32,715
And that was a perfectly functional
definition because at that time
133
00:08:32,715 --> 00:08:35,925
point, there wasn't anything we
could do for people who'd stop
134
00:08:35,925 --> 00:08:37,875
breathing or stop having blood flow.
135
00:08:37,875 --> 00:08:42,520
But with the invention of things
like ventilators to help people breathe
136
00:08:42,549 --> 00:08:46,150
even when their lungs had failed,
and then subsequently things like
137
00:08:46,180 --> 00:08:50,260
cardiopulmonary bypass machines, so
devices that could keep people's blood
138
00:08:50,260 --> 00:08:52,270
flowing even when their heart had stopped.
139
00:08:52,270 --> 00:08:55,800
It became apparent that, you know,
there were circumstances where it
140
00:08:55,800 --> 00:08:59,880
seemed like people were still alive
despite heart and lung function
141
00:08:59,880 --> 00:09:01,800
having at least temporarily ceased.
142
00:09:02,190 --> 00:09:07,200
So in the 1950s and sixties and onwards,
people realized that we needed a more
143
00:09:07,230 --> 00:09:09,450
sophisticated definition of death.
144
00:09:10,050 --> 00:09:14,520
And in particular in the US there was
an effort by legislators and ethicists
145
00:09:14,520 --> 00:09:17,280
and doctors to come up with something.
146
00:09:17,585 --> 00:09:21,935
Which ended up being called the Uniform
Determination of Death Act, which
147
00:09:21,935 --> 00:09:26,495
involved defining someone as dead based
on one of two possible definitions.
148
00:09:27,155 --> 00:09:30,065
The first was that they'd
suffered irreversible cessation
149
00:09:30,065 --> 00:09:31,745
of heart and lung functions.
150
00:09:31,925 --> 00:09:36,185
So essentially the old definition, but
the newer definition that they also used
151
00:09:36,335 --> 00:09:41,465
was defining someone as dead based on
irreversible cessation of all functions
152
00:09:41,465 --> 00:09:45,125
of the brain, which is what we commonly
think of these days as brain death.
153
00:09:45,944 --> 00:09:49,875
So what that definition acknowledged
was that there are circumstances
154
00:09:49,875 --> 00:09:52,755
where, you know, sometimes people's
hearts and lungs have stopped for a
155
00:09:52,755 --> 00:09:55,155
while, but they can be brought back.
156
00:09:55,215 --> 00:10:00,255
They're still alive in that sense,
or there might be circumstances where
157
00:10:00,345 --> 00:10:05,385
people's hearts and lungs are still
functioning either intact or with support
158
00:10:05,385 --> 00:10:10,425
from medical machines, but the person's
brain has essentially been destroyed,
159
00:10:10,425 --> 00:10:12,075
and they'll never be conscious again.
160
00:10:12,165 --> 00:10:14,325
Their memories are gone,
their personality's gone.
161
00:10:14,715 --> 00:10:18,885
And maybe in that circumstance, you
know, it, it feels like the person has
162
00:10:18,885 --> 00:10:21,165
been lost and should be defined as dead.
163
00:10:21,720 --> 00:10:26,670
I think that was passed in the 1980s in
the US and it then became used generally
164
00:10:26,670 --> 00:10:28,440
worldwide, including in Australia.
165
00:10:28,860 --> 00:10:32,160
But even at the time, people noted
that there were still problems
166
00:10:32,160 --> 00:10:35,430
with this updated definition
of death for two reasons.
167
00:10:36,210 --> 00:10:41,130
The first was it defined people
as dead only if they've suffered a
168
00:10:41,130 --> 00:10:43,175
cessation of all functions of the brain.
169
00:10:44,040 --> 00:10:47,970
If you actually go and you look in the
brains of people who've been declared
170
00:10:48,030 --> 00:10:52,170
brain dead while still being kept on
some life support, so heart and lung
171
00:10:52,170 --> 00:10:56,520
function, you can often find little bits
of brain activity that are still going.
172
00:10:56,820 --> 00:11:01,140
So for example, their hypothalamus, a
small brain region controlling like body
173
00:11:01,140 --> 00:11:05,400
temperature and some instincts is often
still functional in these patients.
174
00:11:06,165 --> 00:11:09,015
Doctors are just like, well,
like that doesn't really matter.
175
00:11:09,015 --> 00:11:10,035
It doesn't count.
176
00:11:10,035 --> 00:11:11,505
Like the memories aren't there.
177
00:11:11,505 --> 00:11:12,615
The consciousness isn't there.
178
00:11:12,615 --> 00:11:15,855
It doesn't matter if their body
temperature's still going, which I think
179
00:11:15,855 --> 00:11:19,545
is fair, but it does suggest that there's
something wrong with the definition.
180
00:11:20,355 --> 00:11:25,365
The other part that's also problematic is
this irreversible cessation of all brain
181
00:11:25,365 --> 00:11:27,495
functions being part of the definition.
182
00:11:28,454 --> 00:11:32,535
What's irreversible at some time
point might not stay irreversible.
183
00:11:33,074 --> 00:11:36,405
So in the same sense that, you know,
once upon a time, if your heart and
184
00:11:36,405 --> 00:11:39,944
lung stopped, you had irreversible
cessation of heart and lung function.
185
00:11:40,275 --> 00:11:42,765
Nowadays we have machines
to replace those.
186
00:11:43,425 --> 00:11:47,564
In the same way we're increasingly
developing better and better sort of
187
00:11:47,834 --> 00:11:53,084
brain implants, neural prosthesis,
which can replace lost brain functions.
188
00:11:53,594 --> 00:11:55,995
And there's a question of
as that continues to get
189
00:11:55,995 --> 00:11:57,375
better and better and better.
190
00:11:57,885 --> 00:12:02,204
How are we gonna define irreversible
cessation of all brain functions
191
00:12:02,505 --> 00:12:03,824
as part of a definition of death?
192
00:12:04,454 --> 00:12:09,045
And so given all that background,
that's why I and some other philosophers
193
00:12:09,074 --> 00:12:12,824
have a slightly different, and I
think better definition of how we
194
00:12:12,824 --> 00:12:14,415
should be defining people as dead.
196
00:12:14,815 --> 00:12:18,955
Well, I just have to ask Ariel, what is
that definition that you're proposing?
197
00:12:19,825 --> 00:12:24,655
Yeah, so a definition that was first
proposed again in the 1980s by, on the
198
00:12:24,655 --> 00:12:29,185
one hand, some bioethicists at Harvard,
and also almost simultaneously by a
199
00:12:29,185 --> 00:12:33,505
computer scientist, was this idea that
instead of defining someone as dead
200
00:12:33,505 --> 00:12:38,815
based on loss of heart functions or
breathing, or irreversible loss of all
201
00:12:38,815 --> 00:12:40,945
brain functions, including things like
202
00:12:40,960 --> 00:12:44,320
controlling your body posture or
controlling your breathing rates.
203
00:12:44,800 --> 00:12:50,710
What really matters is, you know,
for someone's survival is that their
204
00:12:50,770 --> 00:12:53,860
consciousness still exists, that
their memories are still there, that
205
00:12:53,980 --> 00:12:57,670
their personality is still there, that
the things that make them them in a
206
00:12:57,670 --> 00:13:00,100
psychological sense are still continuing.
207
00:13:01,035 --> 00:13:04,995
And what they proposed is that, well,
if those are the things that matter for
208
00:13:04,995 --> 00:13:09,345
survival, then the definition of death
we should be using for when a person
209
00:13:09,345 --> 00:13:11,985
stops is when those things are lost.
210
00:13:12,195 --> 00:13:16,605
So basically defining death as the
irreversible loss of personal identity.
211
00:13:17,475 --> 00:13:20,745
And what that implies is both
that there might be circumstances
212
00:13:20,745 --> 00:13:24,015
where, you know, someone's brain
activity has stopped for a while.
213
00:13:24,390 --> 00:13:28,949
Either under anesthesia or in a
case like induced hypothermia, or
214
00:13:28,949 --> 00:13:33,810
even in the weird preservation cases
that I speculate on where as long
215
00:13:33,810 --> 00:13:37,469
as there's some chance someone could
be brought back in the future, that
216
00:13:37,560 --> 00:13:39,510
they may actually still be alive.
217
00:13:39,750 --> 00:13:41,910
They're just, you know,
not currently conscious.
218
00:13:42,900 --> 00:13:45,930
And that there might be other
circumstances where, you know, someone
219
00:13:45,930 --> 00:13:49,890
may have suffered extremely severe brain
damage to the point where they've had
220
00:13:50,070 --> 00:13:53,520
massive personality change or where
they're, they'll never be conscious
221
00:13:53,520 --> 00:13:58,410
again, where it's arguably questionable
that the person is still there.
222
00:13:58,740 --> 00:14:01,950
Even if there's some very primitive
functions remaining, like being able
223
00:14:01,950 --> 00:14:05,505
to drive breathing or maintain body
temperature or those sorts of things.
224
00:14:06,990 --> 00:14:07,830
Yeah, it's interesting.
225
00:14:07,865 --> 00:14:13,410
I, I think that what really in the
examples that you use were people who have
226
00:14:13,439 --> 00:14:18,810
had acquired brain injuries, how that has
impacted on their personality and who they
227
00:14:18,810 --> 00:14:24,480
are as a person, both before the injury
and then after the injury, and then also.
228
00:14:25,005 --> 00:14:29,445
Both from their reflections, but also
those perceptions of those around
229
00:14:29,445 --> 00:14:31,125
them I thought was really interesting.
230
00:14:31,485 --> 00:14:37,005
And also the fact that, you know, the
ability to make memories from there on in.
231
00:14:37,005 --> 00:14:42,740
Like if you have had some damage
sustained, you know, how much does that
232
00:14:42,740 --> 00:14:45,800
actually create, who we are as a person?
233
00:14:46,340 --> 00:14:51,800
So I find your whole discussion around
this topic really fascinating and
234
00:14:52,130 --> 00:14:57,380
perhaps if you could explain the term
connectome, because I know that you go
235
00:14:57,380 --> 00:15:01,820
speak a lot about levels of consciousness
and what consciousness means, where
236
00:15:01,820 --> 00:15:05,120
memories are stored in the brain,
where we think they are, where we, we.
237
00:15:05,180 --> 00:15:09,770
You know, there's still so much to learn,
but something that I did take out of your
238
00:15:09,770 --> 00:15:14,750
book was that term connectome, and I'd
love if you could explain that to us.
239
00:15:15,020 --> 00:15:15,380
Yeah.
240
00:15:15,709 --> 00:15:16,040
Okay.
241
00:15:16,370 --> 00:15:19,430
So I, I think one thing I should
just be very clear about before I get
242
00:15:19,430 --> 00:15:23,420
into this is just I wanna make clear
that I don't think any amount of
243
00:15:23,665 --> 00:15:28,435
brain damage or trauma is sufficient to,
you know, stop a person from existing
244
00:15:28,435 --> 00:15:30,235
or turn them into a different person.
245
00:15:30,625 --> 00:15:34,255
There's all sorts of circumstances
where people have mild brain injuries or
246
00:15:34,255 --> 00:15:38,485
concussions or things where, you know,
there's definitely a loss of neurons and
247
00:15:38,485 --> 00:15:42,835
some damage, but we, we continue to think
that someone is still the same person.
248
00:15:43,465 --> 00:15:46,735
Kind of what I get into in the book
though, is we can see these circumstances
249
00:15:46,735 --> 00:15:50,574
where there's increasing levels of
change or increasing levels of damage.
250
00:15:51,020 --> 00:15:54,650
Up to the point where there's complete
brain damage and complete loss of
251
00:15:54,650 --> 00:15:59,030
consciousness, and the question does
arise as to like, well, what level
252
00:15:59,030 --> 00:16:03,680
of damage or what level of integrity
is required for a person to continue
253
00:16:03,680 --> 00:16:05,360
to exist and continue to survive?
254
00:16:06,350 --> 00:16:10,400
Now, as part of trying to really give a
technical answer to that sort of question,
255
00:16:10,490 --> 00:16:13,130
you've gotta ask, well, what is a person?
256
00:16:13,130 --> 00:16:14,060
What is their brain?
257
00:16:14,060 --> 00:16:17,330
How does the structure of someone's
brain contain and give rise to
258
00:16:17,330 --> 00:16:18,650
their psychological properties?
259
00:16:19,230 --> 00:16:21,870
And as part of that to, to finally
get to your question about what the
260
00:16:21,870 --> 00:16:27,089
connectome is, that's the term that
neuroscientists use for the complete
261
00:16:27,089 --> 00:16:32,400
set of connections between brain cells
in a person or an animal's brain.
262
00:16:32,835 --> 00:16:37,155
That store all of the like neural
information that makes them who they are.
263
00:16:37,305 --> 00:16:41,955
So their memories, their personality,
all the, the things they've learned, or
264
00:16:42,105 --> 00:16:46,485
all the things that shape their behavior
that gets referred to as their connectome.
265
00:16:46,935 --> 00:16:52,185
And to be really specific for a human,
that's all 86 billion or so neurons in
266
00:16:52,185 --> 00:16:57,165
your head, plus the roughly quadrillion
connections between those neurons.
267
00:16:57,405 --> 00:16:59,985
And for different animals,
it'll be different numbers.
268
00:17:00,525 --> 00:17:05,865
It's a homage to the term genome where
genome refers to the full 20,000 genes
269
00:17:05,865 --> 00:17:07,694
or so that make up each individual.
270
00:17:07,875 --> 00:17:10,395
Your connectome is the full
set of neural connections that
271
00:17:10,665 --> 00:17:13,214
give rise to your psychology.
272
00:17:13,514 --> 00:17:17,744
And I think when you stipulated
those numbers in the book, I'm like.
273
00:17:18,095 --> 00:17:18,155
Wow.
274
00:17:19,475 --> 00:17:24,995
And, and I think the comparison that you
drew to current computers and what their
275
00:17:24,995 --> 00:17:30,035
computing capabilities are, and then when
you compare it to the, the current brain,
276
00:17:30,815 --> 00:17:37,895
our brains are really working hard by
comparison, and the figures really blew
277
00:17:37,895 --> 00:17:41,825
my mind when you actually think that.
278
00:17:42,360 --> 00:17:46,500
Like this is all something
that happens naturally.
279
00:17:47,010 --> 00:17:47,220
Yeah.
280
00:17:47,220 --> 00:17:52,110
One thing that's incredible is today's
most advanced supercomputers have
281
00:17:52,110 --> 00:17:56,670
computing power that is probably
roughly equivalent to a human brain,
282
00:17:57,090 --> 00:18:00,660
except a human brain can run on a
sandwich, and these like computing
283
00:18:00,660 --> 00:18:04,350
clusters take essentially small
power plants worth of energy to run.
284
00:18:04,740 --> 00:18:07,200
So we are much more energy
efficient than these things.
285
00:18:07,680 --> 00:18:08,550
We're a bit slower.
286
00:18:08,610 --> 00:18:12,570
They work on, on faster time speeds
due to the, the electronic components,
287
00:18:12,840 --> 00:18:16,080
whereas human brains work on, you
know, tens of milliseconds, hundreds of
288
00:18:16,080 --> 00:18:18,150
milliseconds, so eons compared to them.
289
00:18:18,480 --> 00:18:21,000
But in terms of energy efficiency,
they've got nothing on us yet.
290
00:18:22,890 --> 00:18:23,760
I like that.
291
00:18:23,760 --> 00:18:26,730
It makes us feel that we are,
we are not redundant, you know,
292
00:18:26,755 --> 00:18:28,140
not for a few years at least.
293
00:18:28,170 --> 00:18:28,740
Yeah, yeah.
294
00:18:28,740 --> 00:18:29,610
Yeah, that's great.
295
00:18:29,940 --> 00:18:31,590
But it does really make me
296
00:18:32,010 --> 00:18:39,060
think just how difficult it must be to
try and work out what functionality is
297
00:18:39,060 --> 00:18:41,160
associated with what parts of the brain.
298
00:18:41,190 --> 00:18:45,150
When you see that there's that many
different, you know, electrical
299
00:18:45,450 --> 00:18:50,580
components as part of that that
you speak about, like it is phenomenal.
300
00:18:51,090 --> 00:18:51,150
Yeah.
301
00:18:51,150 --> 00:18:54,840
I mean, so on the one hand,
neuroscience is still a very long
302
00:18:54,840 --> 00:18:58,470
way from being able to give a full
description of how the brain works.
303
00:18:58,830 --> 00:19:02,760
In particular, we're stuck on
questions of like, how exactly is it that
304
00:19:02,760 --> 00:19:04,379
the brain gives rise to consciousness?
305
00:19:04,649 --> 00:19:06,030
That's still a long way from now.
306
00:19:06,750 --> 00:19:09,480
On the other hand, we have made
a lot of progress over the last
307
00:19:09,480 --> 00:19:13,230
century or so in understanding
some of the basic elements and even
308
00:19:13,230 --> 00:19:14,760
some of the more complicated ones.
309
00:19:15,149 --> 00:19:19,215
So in the book, for example, I outline
how, with human studies, you can learn quite
310
00:19:19,260 --> 00:19:21,000
a lot from looking at cases where
311
00:19:21,335 --> 00:19:25,265
people have had particular kinds
of brain injuries, or they've taken
312
00:19:25,265 --> 00:19:29,045
particular kinds of drugs, or they
have particular kinds of developmental
313
00:19:29,045 --> 00:19:33,935
variants, and seeing how individuals
in those circumstances differ in
314
00:19:33,935 --> 00:19:38,375
their behavior from those with, you
know, more typical brain structures.
315
00:19:38,735 --> 00:19:42,545
So to give some examples, we know that
memory depends on the hippocampus to
316
00:19:42,545 --> 00:19:46,115
a large degree because there are cases
where people have had damage to the
317
00:19:46,115 --> 00:19:49,955
hippocampus and they've been unable to
form long-term memories going forwards.
318
00:19:50,564 --> 00:19:53,504
Or we know that there's other
circumstances where, for example,
319
00:19:53,504 --> 00:19:58,094
there's this pair of twins in Canada
that are conjoined at the brain.
320
00:19:58,125 --> 00:20:01,395
So essentially they have what would
look normally like two brains that
321
00:20:01,395 --> 00:20:02,919
have been joined together into one
322
00:20:03,885 --> 00:20:04,935
brain, I guess.
323
00:20:05,385 --> 00:20:09,525
And they report in some ways having
slightly different experiences
324
00:20:09,525 --> 00:20:10,725
and still being separate.
325
00:20:11,055 --> 00:20:14,085
But on the other hand, being able
to share a large degree of neural
326
00:20:14,085 --> 00:20:17,835
information and like being able to see
out of each other's eyes, for example.
327
00:20:18,135 --> 00:20:19,845
And that also gives us
a bit of a sense of
328
00:20:20,405 --> 00:20:23,555
what functions are localized to
which bits of the brain and what
329
00:20:23,555 --> 00:20:24,665
gets shared and what doesn't.
330
00:20:25,085 --> 00:20:26,465
So that's from a human perspective.
331
00:20:27,095 --> 00:20:30,395
Simultaneously, we're getting better
and better at being able
332
00:20:30,395 --> 00:20:34,596
to manipulate the brains of animals
that we can use to learn more about
333
00:20:35,365 --> 00:20:39,175
neuroscience generally, to the point
where these days there are techniques
334
00:20:39,175 --> 00:20:43,405
that can be used to like label the
specific neural circuits involved in
335
00:20:43,405 --> 00:20:48,295
the formation of a particular memory
and even erase particular memories.
336
00:20:48,535 --> 00:20:51,805
Like there's some great studies that have
recently come out where they train these
337
00:20:51,805 --> 00:20:54,505
mice to do two different motor tasks.
338
00:20:54,800 --> 00:20:58,285
One was like balancing on a high
beam and one was sort of like
339
00:20:58,320 --> 00:21:00,419
balancing on a rotating rod.
340
00:21:00,960 --> 00:21:04,290
Um, and then they show that they could
map like which circuits were involved
341
00:21:04,290 --> 00:21:08,550
in each of the tasks, and they could
selectively erase only one of the two
342
00:21:08,939 --> 00:21:10,500
while leaving the other one intact.
343
00:21:11,010 --> 00:21:13,860
And from doing these sorts
of experiments, we learn, you know,
344
00:21:13,860 --> 00:21:18,899
that memories do depend on specific
sets of connections between specific
345
00:21:18,899 --> 00:21:23,100
populations of neurons in a way that
we can then generalize to how it is
346
00:21:23,100 --> 00:21:24,659
that human brains function as well.
347
00:21:25,050 --> 00:21:27,480
So there is quite a lot of
progress being made in this area.
348
00:21:27,870 --> 00:21:28,170
Yeah.
349
00:21:28,170 --> 00:21:32,220
I found that the evidence and the
research that you've referred to all
350
00:21:32,220 --> 00:21:36,930
the way through is very compelling
because it does support just how far
351
00:21:36,930 --> 00:21:42,420
we have come over the last 100 years
in understanding the functionality of
352
00:21:42,420 --> 00:21:47,160
the brain, but also answering a lot of
those like big philosophical questions,
353
00:21:47,340 --> 00:21:52,470
you know, that you do tackle and there
are so many different philosophies that
354
00:21:52,595 --> 00:21:57,185
you refer to in there that were
fascinating just in relation to how
355
00:21:57,185 --> 00:22:02,375
our brain functions, how we function,
memory, it was really quite amazing.
356
00:22:02,435 --> 00:22:07,865
Um, when you are thinking about
consciousness and referring to the connectome,
357
00:22:07,865 --> 00:22:12,995
what is it that's so important
about understanding that when we are
358
00:22:12,995 --> 00:22:17,285
talking about death and the possibility
of cheating death in the future?
359
00:22:17,765 --> 00:22:21,515
Yeah, so in an ideal circumstance,
360
00:22:22,050 --> 00:22:27,120
we would be able to give to people who
were dying some sort of pill or drug or
361
00:22:27,120 --> 00:22:31,950
standard treatment that would just cure
their cancer, cure their heart disease,
362
00:22:32,190 --> 00:22:37,050
just give them continuous years of healthy
living without any sort of complications
363
00:22:37,050 --> 00:22:40,980
about weird neuroscience, philosophy
problems, any of those sorts of things.
364
00:22:40,980 --> 00:22:43,950
It would just be a simple standard
treatment available to people.
365
00:22:45,074 --> 00:22:49,334
Unfortunately, I don't think that's
gonna become available anytime soon,
366
00:22:49,965 --> 00:22:55,064
and the only prospect I see for being
367
00:22:55,155 --> 00:22:56,834
able to help those who are dying but
wish that they could have more time
368
00:22:57,225 --> 00:23:00,675
is the possibility that maybe
we could take them and preserve
369
00:23:00,675 --> 00:23:05,294
them in some sort of unconscious
inert state for a period of time.
370
00:23:05,699 --> 00:23:07,784
But if that's what one is proposing,
371
00:23:08,610 --> 00:23:12,510
you've gotta make claims about like why
you think that person isn't necessarily
372
00:23:12,510 --> 00:23:16,770
dead or why you think you might be able to
revive them at some point in the future.
373
00:23:16,949 --> 00:23:20,129
And how it is exactly that
you would be able to revive
374
00:23:20,280 --> 00:23:22,409
a preserved body in some way.
375
00:23:22,409 --> 00:23:27,014
And getting into those questions
involves really getting to grips with
376
00:23:27,014 --> 00:23:28,965
like, well, what exactly is a person?
377
00:23:29,115 --> 00:23:30,794
What does it mean for
a person to be alive?
378
00:23:30,855 --> 00:23:32,534
What does it mean for a person to die?
379
00:23:32,865 --> 00:23:35,925
What sort of changes can
happen to a person and still
380
00:23:35,925 --> 00:23:37,304
have them be the same person?
381
00:23:37,335 --> 00:23:38,445
All those sorts of things.
382
00:23:38,655 --> 00:23:41,985
And that's, that's why I had to get
into the neuroscience and philosophy
383
00:23:41,985 --> 00:23:46,605
of really drilling into these topics,
because I'm kind of speculating on the
384
00:23:46,605 --> 00:23:48,135
development of technology that I think
385
00:23:48,295 --> 00:23:52,465
will maybe one day exist. I'm
making the case for it, but it
386
00:23:52,465 --> 00:23:56,335
doesn't yet exist, and so one has
to understand the possibilities.
387
00:23:56,715 --> 00:24:01,695
We've been fortunate enough to speak
with the founder of Southern Cryonics
388
00:24:01,695 --> 00:24:06,435
last year on the podcast, and they
successfully
389
00:24:06,825 --> 00:24:10,065
cryopreserved the first person in
the Southern Hemisphere.
390
00:24:10,605 --> 00:24:13,305
What is cryopreservation?
391
00:24:13,575 --> 00:24:19,065
How does it differ from what you refer
to in the book, which is a different
392
00:24:19,065 --> 00:24:21,675
process, which is vitrifixation?
393
00:24:22,379 --> 00:24:26,430
So the general philosophy is the same
in both circumstances, where it's this
394
00:24:26,430 --> 00:24:31,560
idea that maybe you can take an animal,
a human, a patient, preserve them for
395
00:24:31,560 --> 00:24:36,000
a while, and then restore them back to
biological activity and consciousness.
396
00:24:36,780 --> 00:24:40,530
And it has precedent in the, the
natural world and the scientific
397
00:24:40,530 --> 00:24:44,220
world for things that are already, you
know, actively working and ongoing.
398
00:24:44,640 --> 00:24:49,140
So some examples are, there are these
things like the Arctic wood frogs,
399
00:24:49,140 --> 00:24:52,950
frogs that, you know, freeze
every winter and then thaw out come
400
00:24:52,950 --> 00:24:54,780
spring and they go back to hopping around.
401
00:24:55,080 --> 00:24:56,340
Yeah, that's bizarre.
402
00:24:56,630 --> 00:25:00,830
Yeah, it is bizarre, but it's also
something we can do with humans, at
403
00:25:00,830 --> 00:25:02,510
least with small bits of biology.
404
00:25:02,750 --> 00:25:06,140
So, you know, it used to be sci-fi in
the seventies and the eighties, but
405
00:25:06,140 --> 00:25:10,160
today there's a lot of people who have
already been, you know, frozen for
406
00:25:10,160 --> 00:25:15,620
part of their life in that we routinely
cryopreserve sperm and eggs and embryos in
407
00:25:15,680 --> 00:25:18,200
IVF and assisted reproduction therapies.
408
00:25:18,980 --> 00:25:22,520
And what happens in those circumstances
is you'll take an embryo, you'll
409
00:25:22,520 --> 00:25:24,140
lower it to very cold temperatures.
410
00:25:24,660 --> 00:25:29,130
And you'll also add in some antifreeze
chemicals, which will prevent ice
411
00:25:29,130 --> 00:25:32,970
formation and ensure that when the
embryo is preserved, it's preserved
412
00:25:32,970 --> 00:25:37,590
in more of a sort of glass state
at cold temperatures rather than
413
00:25:37,740 --> 00:25:39,930
in a crystalline sort of ice state.
414
00:25:40,590 --> 00:25:42,150
That's what's used routinely today.
415
00:25:42,150 --> 00:25:47,504
The issue, though, with trying to scale
that up from embryo size to organ size or
416
00:25:47,534 --> 00:25:52,635
whole-human size, is that the chemicals
required to prevent ice crystal formation,
417
00:25:52,754 --> 00:25:57,735
these antifreeze chemicals, they can
cause some degree of dehydration of the
418
00:25:57,735 --> 00:25:59,175
tissues that you're trying to preserve.
419
00:25:59,774 --> 00:26:02,504
And indeed, if you try and do
cryopreservation
420
00:26:02,534 --> 00:26:05,415
of a whole human using just
these antifreeze chemicals.
421
00:26:05,725 --> 00:26:09,475
As per the Southern Cryonics case
that you mentioned, what we typically
422
00:26:09,475 --> 00:26:13,584
see in that circumstance is that
there's a lot of shrinkage of organs
423
00:26:13,645 --> 00:26:15,024
and particularly of the brain.
424
00:26:15,294 --> 00:26:18,655
So if you look at the brains of patients
preserved this way, you'll see something
425
00:26:18,655 --> 00:26:20,995
like a 50% reduction in size.
426
00:26:20,995 --> 00:26:22,465
So the brain's kind of shriveled
427
00:26:22,705 --> 00:26:25,855
by about half, 30% to 50%,
in these circumstances.
428
00:26:26,545 --> 00:26:30,625
And that's not super encouraging
for thinking that the brain is
429
00:26:30,625 --> 00:26:34,945
still intact and all the connections
between the neurons in those brains
430
00:26:35,215 --> 00:26:38,845
that store someone's memory and
personality and everything is
431
00:26:38,845 --> 00:26:40,495
still in there in an intact way.
432
00:26:41,220 --> 00:26:45,540
It might be the case that there would
be ways to reverse that and, you know,
433
00:26:45,540 --> 00:26:49,679
expand it again and maybe the connections
don't break when they're dehydrated.
434
00:26:50,070 --> 00:26:53,310
But no one has shown that in a
sort of definitive way as of yet.
435
00:26:53,460 --> 00:26:55,500
So it's a concern that I have.
436
00:26:56,159 --> 00:27:00,300
Instead though, there are other ways
of doing preservation that don't induce
437
00:27:00,300 --> 00:27:01,980
this sort of shrinkage-related
438
00:27:01,990 --> 00:27:06,700
damage, which have better evidence behind
them as keeping someone's connectome,
439
00:27:07,000 --> 00:27:08,710
keeping someone's brain intact.
440
00:27:09,370 --> 00:27:12,460
In particular, there's a method that
involves what's called fixation.
441
00:27:12,850 --> 00:27:18,130
You take an animal or a human and
you introduce preservative chemicals,
442
00:27:18,220 --> 00:27:19,390
formaldehyde, glutaraldehyde,
443
00:27:20,324 --> 00:27:22,754
into their circulatory system.
444
00:27:23,385 --> 00:27:25,605
And what those do is
they permeate everywhere.
445
00:27:25,935 --> 00:27:30,375
They get inside all cells, they get
inside all the small organelles within
446
00:27:30,375 --> 00:27:32,804
cells and they fix them in place.
447
00:27:32,834 --> 00:27:37,185
So they stop them from moving, they stop
any sort of further decay processes,
448
00:27:37,635 --> 00:27:40,784
and in doing so, they can give you
very high quality preservation.
449
00:27:41,175 --> 00:27:44,804
So you can then look at the brains after
this preservation procedure and see
450
00:27:44,804 --> 00:27:48,554
that the connectome is still intact,
the neural connections are still intact.
451
00:27:48,929 --> 00:27:50,490
That's the main advantage of it.
452
00:27:50,939 --> 00:27:56,220
The disadvantage is that you could
not, even in principle, just take this
453
00:27:56,220 --> 00:28:00,030
and you know, warm the body back up
again and it just goes straight back
454
00:28:00,030 --> 00:28:01,980
to operating as it did previously.
455
00:28:02,580 --> 00:28:06,000
That's what we can do, for example,
with, you know, cryopreserved embryos
456
00:28:06,360 --> 00:28:09,720
and that's what the people who hope
to cryopreserve whole human bodies
457
00:28:10,169 --> 00:28:11,220
wish could happen.
458
00:28:11,550 --> 00:28:13,740
Maybe they'll develop a
protocol for that one day.
459
00:28:14,220 --> 00:28:17,909
But with the fixation preservation,
you need some more advanced revival
460
00:28:17,909 --> 00:28:21,090
technique to be able to restore
the person to consciousness.
461
00:28:21,540 --> 00:28:24,510
Now I think it's very plausible
that we will have that one day,
462
00:28:24,810 --> 00:28:27,750
but that's the, the trade-off
between the two different ideas.
463
00:28:29,129 --> 00:28:35,070
And Ariel, I was just wondering, um, there
is currently in the Western Australian
464
00:28:35,070 --> 00:28:37,710
Gallery an exhibition.
465
00:28:39,330 --> 00:28:43,410
I'm bringing it up
so we can have a look at it together.
466
00:28:44,010 --> 00:28:45,180
Have you seen this yet?
467
00:28:45,240 --> 00:28:45,420
I'm
468
00:28:45,420 --> 00:28:45,840
not sure
469
00:28:45,840 --> 00:28:46,410
exactly
470
00:28:46,410 --> 00:28:47,070
what you're referring to.
471
00:28:47,070 --> 00:28:51,600
Okay, so what it is is I only
found out about this on Sunday.
472
00:28:51,990 --> 00:28:53,490
I think it's called Revivification.
473
00:28:53,490 --> 00:28:57,210
I can guess while you're looking:
is it one of those exhibitions
474
00:28:57,210 --> 00:29:01,830
where they have the preserved human
bodies plastinated on display?
475
00:29:02,100 --> 00:29:02,460
No.
476
00:29:02,775 --> 00:29:03,390
No, it's not
477
00:29:03,390 --> 00:29:05,550
a Body Worlds kind of, yeah, yeah, yeah.
478
00:29:05,550 --> 00:29:07,890
I know where you're going
with that one, but no, this one,
479
00:29:08,400 --> 00:29:12,960
I'll read this to you and see whether
you can help me with this one.
480
00:29:13,320 --> 00:29:17,400
So it's currently on, at the Art
Gallery of Western Australia.
481
00:29:17,730 --> 00:29:23,600
It's entitled Revivification. Revivification is an
extraordinary immersive exhibition
482
00:29:23,600 --> 00:29:28,250
combining sound and cutting-edge
biological innovation to bring to life
483
00:29:28,250 --> 00:29:31,010
the musical genius of a deceased composer.
484
00:29:31,760 --> 00:29:39,320
Four years in the making, Revivification
delivers a historic first: the in-vitro,
485
00:29:39,645 --> 00:29:44,745
external, they say, brain of
the late composer Alvin Lucier.
486
00:29:45,195 --> 00:29:46,875
Dunno whether I'm
pronouncing that correctly.
487
00:29:47,535 --> 00:29:55,575
He was alive from 1931 to 2021, and
it's creating a new work in real
488
00:29:55,575 --> 00:29:59,835
time as a live performance over
the duration of the exhibition.
489
00:30:00,435 --> 00:30:05,385
So it's actually been developed by
artists by the name of Guy Ben-
490
00:30:05,385 --> 00:30:13,305
Ary, Nathan Thompson, and Matt Gingold,
with a neuroscientist, Stuart Hodgetts, based
491
00:30:13,305 --> 00:30:18,315
at the University of Western Australia,
who individually spent 25 years pushing
492
00:30:18,315 --> 00:30:20,145
the boundaries in biological arts.
493
00:30:20,175 --> 00:30:20,235
Wow.
494
00:30:20,745 --> 00:30:24,764
I am not sure exactly what they
have done, but it does remind me
495
00:30:24,764 --> 00:30:28,575
of something I explore in the book,
which is this question of like, could
496
00:30:28,575 --> 00:30:33,225
you take brains and recreate them
in some sort of digital format in
497
00:30:33,225 --> 00:30:35,294
a way that captures the function,
498
00:30:35,294 --> 00:30:39,875
behavior, memories, everything that
was in the original biological brain?
499
00:30:40,175 --> 00:30:43,775
As an example of the sort of current
cutting-edge progress that's been made
500
00:30:43,775 --> 00:30:49,475
in that area: late last year was the first
publication of the full mapping
501
00:30:49,625 --> 00:30:54,605
of a fly's connectome, a fruit fly's
connectome, so it's about 140,000 neurons
502
00:30:54,605 --> 00:30:58,475
and tens of millions of connections between
those neurons that were fully mapped out.
503
00:30:59,055 --> 00:31:02,565
And from that mapping, the
neuroscientists made a sort of like
504
00:31:02,625 --> 00:31:05,625
digital recreation of that fly brain.
505
00:31:06,075 --> 00:31:09,525
Now, they didn't make it in perfect
quality as they themselves acknowledge,
506
00:31:09,885 --> 00:31:14,655
but even their sort of digital version of
this fly brain was able to show some
507
00:31:14,655 --> 00:31:16,575
of the behaviors of the original fly.
508
00:31:16,935 --> 00:31:20,025
So, for example, they stimulated
some of its neurons that
509
00:31:20,025 --> 00:31:21,525
correspond to tasting sugar.
510
00:31:22,095 --> 00:31:26,625
And they saw outputs from the, the
motor neurons that corresponded to
511
00:31:26,625 --> 00:31:31,155
it, trying to extend its proboscis, or,
essentially, like, stick out its tongue.
512
00:31:31,155 --> 00:31:31,215
Wow.
513
00:31:31,995 --> 00:31:36,764
Now I use this example both as one
to demonstrate that this is, like, cool,
514
00:31:36,764 --> 00:31:40,185
and these are the steps that are
made on the way to maybe being
515
00:31:40,185 --> 00:31:44,205
able to show one way of reviving
preserved humans in the future.
516
00:31:44,790 --> 00:31:47,730
But also to make very clear that
we're not up to humans just yet.
517
00:31:47,775 --> 00:31:51,389
We're just trying to get
things like flies and aspirationally
518
00:31:51,389 --> 00:31:52,980
mice in the next few years to work.
519
00:31:53,370 --> 00:31:56,639
So this example in Western Australia,
I'm not exactly sure what they're
520
00:31:56,639 --> 00:31:59,670
doing, but it's definitely not
the case that they've managed to
521
00:31:59,910 --> 00:32:01,889
upload this composer as of yet.
522
00:32:02,460 --> 00:32:03,240
Yeah, yeah.
523
00:32:03,330 --> 00:32:06,780
I think they may have taken a bit of
poetic license with the exhibition
524
00:32:06,780 --> 00:32:09,540
description, but it's fascinating
to know that that's where we are
525
00:32:09,540 --> 00:32:11,520
currently at, at this point in time.
526
00:32:12,030 --> 00:32:12,629
So.
527
00:32:13,050 --> 00:32:16,230
I'd like to talk about the future
and the process that you sort
528
00:32:16,230 --> 00:32:20,850
of outline, using what we see and
the example you've just given us
529
00:32:20,850 --> 00:32:24,960
then with the fly, what do you see
developing over the next, you know,
530
00:32:24,990 --> 00:32:29,640
100 years, 200 years, and, and
what does that look like for us?
531
00:32:30,000 --> 00:32:36,360
Yeah, so in advocating for preservation as
a means of providing those who are dying,
532
00:32:36,450 --> 00:32:40,560
but who would like to have more time, a
potentially viable method for doing so:
533
00:32:40,980 --> 00:32:42,570
it's sort of a two-part technology.
534
00:32:42,975 --> 00:32:46,754
There's the initial preservation
and storage side, and there's the
535
00:32:46,754 --> 00:32:51,435
eventual revival side, and we can
examine each part individually.
536
00:32:52,350 --> 00:32:56,550
I actually think we already have the
technology available today in order
537
00:32:56,550 --> 00:33:00,959
to preserve people in high quality,
and the issue is mostly around
538
00:33:01,260 --> 00:33:03,179
public knowledge that it exists,
539
00:33:03,300 --> 00:33:06,750
public acceptance, and the rollout
and integration of it as part
540
00:33:06,750 --> 00:33:08,280
of routine medical technology.
541
00:33:08,760 --> 00:33:12,510
It's of no use to anyone if, you know,
there's no provider that's available
542
00:33:12,510 --> 00:33:14,699
to integrate it into hospices
543
00:33:15,060 --> 00:33:19,530
or hospitals or a place where terminally
ill patients might be able to access it.
544
00:33:20,010 --> 00:33:23,010
And even if it was there, if nobody
is aware of it or wants to make
545
00:33:23,010 --> 00:33:25,050
use of it, then there's no point.
546
00:33:25,440 --> 00:33:27,960
But I, I do think that the
technology to preserve people
547
00:33:27,960 --> 00:33:30,000
well today does already exist.
548
00:33:30,060 --> 00:33:33,570
And there's a couple of groups on the
west coast of the US that are trying to
549
00:33:33,840 --> 00:33:38,790
provide it commercially at the moment,
but that's only the preservation side.
550
00:33:39,225 --> 00:33:41,025
Uh, and then keeping people in storage.
551
00:33:41,295 --> 00:33:44,985
The question is like, at what point
would you maybe be able to restore
552
00:33:44,985 --> 00:33:47,745
those individuals back to consciousness?
553
00:33:48,135 --> 00:33:51,675
And there's sort of two ways people
imagine maybe being able to do so.
554
00:33:52,230 --> 00:33:58,020
One is through sort of directly reversing
the preservation procedure itself, so
555
00:33:58,350 --> 00:34:03,480
undoing the fixation that happened with
those preservative chemicals, and then I
556
00:34:03,480 --> 00:34:05,280
guess warming the person back up again.
557
00:34:05,940 --> 00:34:09,420
That would require sort of
very advanced nanomedicine far
558
00:34:09,420 --> 00:34:11,280
beyond what we currently have.
559
00:34:11,635 --> 00:34:16,675
And it's sort of more in the scope
of what it is that, you know, biology
560
00:34:16,675 --> 00:34:20,034
can do, like the machinery within
our cells, the machinery of other
561
00:34:20,034 --> 00:34:25,045
life forms that's currently still far
beyond human medical capabilities.
562
00:34:25,435 --> 00:34:26,574
So that's one side.
563
00:34:27,025 --> 00:34:30,355
The other possibility is more similar
to what I was mentioning with creating
564
00:34:30,355 --> 00:34:32,250
digital brains or uploads just then.
565
00:34:32,805 --> 00:34:36,735
And it's actually the avenue that I think
is likely to arrive sooner: the
566
00:34:36,735 --> 00:34:41,595
idea that it might be possible to take a
preserved individual's brain and body to
567
00:34:41,655 --> 00:34:47,355
scan it at very, very high resolution,
and then to create them a new sort of
568
00:34:47,865 --> 00:34:52,575
artificial, electronic, digital brain
and possibly body to go along with that.
569
00:34:53,065 --> 00:34:55,915
So looking at what's been happening,
for example, in the fly example I
570
00:34:55,915 --> 00:34:59,965
mentioned, and sort of making that
better, making that work at scale,
571
00:35:00,085 --> 00:35:03,475
increasing the computing power and
the knowledge of neuroscience to the
572
00:35:03,475 --> 00:35:08,485
point where we can do that to a human,
with estimates of that being possible,
573
00:35:08,545 --> 00:35:12,865
I don't know, sometime between
the 2060s and the early 2100s,
574
00:35:12,985 --> 00:35:16,765
depending on how optimistic the
neuroscientist in question you ask is.
575
00:35:17,015 --> 00:35:21,305
Because I suppose, with the reference
that you made previously about the
576
00:35:21,305 --> 00:35:24,995
genome, you know, when you think how
far we've come, just the fact that it's
577
00:35:25,325 --> 00:35:29,855
everyday knowledge now, the
genome and DNA, we can do DNA testing
578
00:35:29,855 --> 00:35:34,415
ourselves with a home kit, you know, so
it's come a long, long way from where
579
00:35:34,685 --> 00:35:37,205
we were with understanding the genome.
580
00:35:37,565 --> 00:35:38,105
So.
581
00:35:38,580 --> 00:35:42,240
Are you thinking that
we will get to a stage where we
582
00:35:42,240 --> 00:35:47,850
will understand our connectome
to the point that it can then be
583
00:35:47,850 --> 00:35:50,490
replicated in an artificial manner?
584
00:35:51,000 --> 00:35:56,100
So it's not actually so much
the original tissue coming back and
585
00:35:56,100 --> 00:36:01,620
being alive, but more, maybe, I
feel like I wanna make a reference
586
00:36:01,620 --> 00:36:06,390
to sci-fi here, like, you know,
a cyborg or something like that.
587
00:36:06,855 --> 00:36:10,785
Yeah, I would say it's taking
the logical extension of current
588
00:36:10,785 --> 00:36:14,384
neural prosthesis technology
to its ultimate conclusion.
589
00:36:14,774 --> 00:36:17,480
So at the moment, people have
probably heard about people who
590
00:36:17,484 --> 00:36:21,765
have had spinal cord injuries or
lost limbs, or lost the ability to
591
00:36:21,765 --> 00:36:23,040
speak after a stroke, for example.
592
00:36:23,879 --> 00:36:27,225
But they've had those functions
partially restored through some sort
593
00:36:27,225 --> 00:36:31,395
of brain implant that has taken over
the function of, of what was lost,
594
00:36:31,845 --> 00:36:36,165
where they're given a bionic limb or
they're given some interface that,
595
00:36:36,225 --> 00:36:39,525
you know, enables them to speak
through a sort of artificial means.
596
00:36:39,525 --> 00:36:43,635
Again, the idea of what I see is,
well, we get better and better
597
00:36:43,635 --> 00:36:46,635
at understanding how to make
those things and how to make them
598
00:36:46,635 --> 00:36:48,735
for more and more brain functions.
599
00:36:49,080 --> 00:36:51,960
And we also get better and better
at understanding how the brain works
600
00:36:51,990 --> 00:36:57,360
normally and naturally to the point
where we can essentially recreate a
601
00:36:57,540 --> 00:37:02,400
human brain and we can also scan the
information present in a human biological
602
00:37:02,400 --> 00:37:06,870
brain in a way where we can recreate
it in a new sort of artificial format.
603
00:37:07,380 --> 00:37:11,070
And if we do that just right
in high enough fidelity, then
604
00:37:11,070 --> 00:37:14,370
hopefully we'd be able to restore
a person to consciousness that way.
605
00:37:15,045 --> 00:37:17,955
I mean, there's a lot of philosophy,
neuroscience, everything to
606
00:37:17,955 --> 00:37:20,895
really drill into whether that's
possible and how one would do it.
607
00:37:21,165 --> 00:37:23,295
But that is the general
argument that I'm making.
608
00:37:23,444 --> 00:37:23,835
Yes.
609
00:37:24,225 --> 00:37:29,565
And what I thought was very interesting
is that you also went into just how
610
00:37:29,565 --> 00:37:33,855
much it would cost, and I think
you were touching on it briefly
611
00:37:33,884 --> 00:37:35,384
before, about the fact that
612
00:37:36,290 --> 00:37:40,009
You know, we have the technology available
now, but we don't have the awareness.
613
00:37:40,009 --> 00:37:42,980
We don't have the funding and
the resources allocated to it.
614
00:37:43,310 --> 00:37:47,029
So much so that you actually put a
call out to different people working
615
00:37:47,029 --> 00:37:50,960
within different industries to say,
if you wanna see this happen, you
616
00:37:50,960 --> 00:37:52,879
need to be having conversations.
618
00:37:53,174 --> 00:37:54,585
Yeah, no, exactly.
619
00:37:54,585 --> 00:37:56,475
I mean, so there's two things.
620
00:37:56,475 --> 00:37:59,865
Again, I would say the cost of
doing preservation and storage is
621
00:37:59,895 --> 00:38:03,495
probably a lot cheaper than
the cost of doing revival, which
622
00:38:03,495 --> 00:38:04,935
at the moment is roughly infinite.
623
00:38:04,964 --> 00:38:06,285
'cause we can't do it at all.
624
00:38:06,825 --> 00:38:10,115
And also it's the case that, even
if we don't know which specific
625
00:38:10,115 --> 00:38:13,535
revival method we'll be able to use
in the future, I think there's a
626
00:38:13,535 --> 00:38:17,615
strong case that preservation works
well enough that it'll be compatible
627
00:38:17,615 --> 00:38:20,735
with at least one of the forms of
revival that will become available.
628
00:38:20,915 --> 00:38:24,995
Whether that's uploading or whether
that's direct restoration of reversing
629
00:38:24,995 --> 00:38:26,735
the fixation, whatever it may be.
630
00:38:27,375 --> 00:38:31,035
So in the book I really try and say,
well, if you think these ideas are at
631
00:38:31,035 --> 00:38:35,835
all plausible, then how much is it worth
to spend on this, on preserving people?
632
00:38:36,254 --> 00:38:39,674
Because on the one hand, if it costs a
billion dollars per procedure, then you
633
00:38:39,674 --> 00:38:41,595
know it's of use to basically no one,
634
00:38:41,654 --> 00:38:46,410
even if you can guarantee it works.
And if it's incredibly cheap, well then
635
00:38:46,575 --> 00:38:50,444
even if you are not that confident that
it will work, it might still be worth
636
00:38:50,444 --> 00:38:54,705
exploring the same way as insurance or
any other sort of speculative prospect.
637
00:38:55,200 --> 00:38:58,650
And so what I get into in the book is
like, how do we actually decide how much
638
00:38:58,650 --> 00:39:00,960
to pay for medical treatments generally?
639
00:39:01,320 --> 00:39:05,310
So in a, in a more concrete and more
everyday sense, people have probably
640
00:39:05,310 --> 00:39:08,430
heard about the fact that there's
constantly new cancer therapies that
641
00:39:08,430 --> 00:39:12,270
are being developed, but that often
come with very expensive price tags.
642
00:39:12,270 --> 00:39:17,460
So $50,000, $60,000 for these new
treatments, and when deciding whether
643
00:39:17,460 --> 00:39:21,360
that's worthwhile or whether it's
not, what we have to look at is:
644
00:39:21,720 --> 00:39:25,830
How much more time, in what level of
health do these treatments buy for
645
00:39:25,830 --> 00:39:29,040
someone versus how much does it cost?
646
00:39:29,430 --> 00:39:33,510
And in Australia with our public health
system, we use a threshold of something
647
00:39:33,510 --> 00:39:39,150
like $50,000 per year of healthy
life where, to make that concrete,
648
00:39:39,150 --> 00:39:44,280
that's saying that, you know, if your
cancer drug costs $50,000 but it cures
649
00:39:44,280 --> 00:39:46,080
someone's cancer and gives them 10 years,
650
00:39:46,410 --> 00:39:47,070
then that's great.
651
00:39:47,070 --> 00:39:48,390
Yeah, we'll absolutely fund it.
652
00:39:48,870 --> 00:39:53,220
But if it costs $200,000 and it only
statistically improves, you know,
653
00:39:53,250 --> 00:39:57,270
survival rates by about a month or
so, then we decline to fund that.
654
00:39:57,300 --> 00:40:00,540
We don't think that's a, a good
use of resources in comparison
655
00:40:00,540 --> 00:40:01,920
to what else it could be.
656
00:40:01,920 --> 00:40:04,650
To bring that back to preservation:
657
00:40:04,740 --> 00:40:09,240
The conclusion I have is that if done at
scale, preservation and storage should
658
00:40:09,390 --> 00:40:15,780
probably cost something like 10 to $20,000
per procedure, which means that, you
659
00:40:15,780 --> 00:40:20,130
know, as, as long as you think it has some
chance of giving people only a few more
660
00:40:20,130 --> 00:40:24,120
years of healthy life, then it's probably
a, a worthwhile thing to be considering.
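The threshold reasoning above can be sketched as a quick back-of-the-envelope calculation. The $50,000-per-healthy-life-year threshold and the example drug prices come from the discussion; the success probabilities and life-year figures for preservation are purely illustrative assumptions, not claims about real odds or prices:

```python
# Illustrative cost-effectiveness sketch of the reasoning discussed above.
# The $50,000/year threshold is the figure quoted in the episode; all
# probabilities and life-year numbers below are hypothetical assumptions.

THRESHOLD = 50_000  # dollars per year of healthy life


def cost_per_healthy_year(cost, expected_healthy_years):
    """Cost divided by the expected healthy life-years a treatment buys."""
    return cost / expected_healthy_years


def worthwhile(cost, chance_of_success, years_if_successful):
    """Treat a speculative procedure like insurance: weight the benefit
    by its probability of working, then compare against the threshold."""
    expected_years = chance_of_success * years_if_successful
    return cost_per_healthy_year(cost, expected_years) <= THRESHOLD


# A $50k drug that reliably buys 10 healthy years: $5k/year -> funded.
print(worthwhile(50_000, 1.0, 10))     # True

# A $200k drug buying roughly one month: ~$2.4M/year -> declined.
print(worthwhile(200_000, 1.0, 1 / 12))  # False

# Preservation at $15k: even a 10% chance of 5 more healthy years
# works out to $30k per expected year, under the threshold.
print(worthwhile(15_000, 0.10, 5))     # True
```

This is just the expected-value framing behind the insurance comparison made above, with made-up numbers to show how the threshold test works.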
661
00:40:24,720 --> 00:40:29,730
And tell me, you know, we've
discussed the, the possibility of
662
00:40:29,730 --> 00:40:33,750
this happening, it, we've done the
mapping of the connectome, we've done
663
00:40:33,750 --> 00:40:38,370
the process of the preservation, but
it's really that other side, isn't it?
664
00:40:38,370 --> 00:40:43,620
And I love how you speak about really
the future and how we are dependent
665
00:40:43,620 --> 00:40:46,649
on the generations that come after us
666
00:40:46,965 --> 00:40:50,565
to actually like us enough to
wanna be able to revive us.
667
00:40:51,105 --> 00:40:53,745
Yeah, I mean this is a
common question people ask.
668
00:40:53,745 --> 00:40:56,535
They say like, even if you can
guarantee me that the technology
669
00:40:56,595 --> 00:41:01,065
works, why would people in the future
want to revive their ancestors?
670
00:41:01,065 --> 00:41:04,005
Like, why would they even care? For me,
671
00:41:04,005 --> 00:41:07,815
the reason why I think they might
care is the same sort of reasons
672
00:41:07,815 --> 00:41:12,195
why I am grateful to my ancestors
and the people who came before us.
673
00:41:12,570 --> 00:41:16,080
For example, I mentioned earlier
that one in two children used
674
00:41:16,080 --> 00:41:20,820
to die before the age of 15, but
because nowadays we have things like
675
00:41:20,820 --> 00:41:26,520
antibiotics and clean water and sewage
systems, that doesn't happen anymore.
676
00:41:26,520 --> 00:41:28,770
And I get to live as a result of this.
677
00:41:28,980 --> 00:41:33,510
And I'm very grateful to everyone who
worked in previous times to make the world
678
00:41:33,510 --> 00:41:37,920
a little bit easier for themselves and
their children and their grandchildren.
679
00:41:38,460 --> 00:41:43,020
My hope is that should we meet the
challenges of our time, things like
680
00:41:43,140 --> 00:41:47,700
solving climate change issues, things
like preventing nuclear war, things
681
00:41:47,700 --> 00:41:52,680
like ensuring that everyone has access
to economic security, that we continue
682
00:41:52,680 --> 00:41:56,339
to build a better and better and better
world, to the point where hopefully
683
00:41:56,339 --> 00:41:59,790
people in the future will be living
in such nice conditions that they're
684
00:41:59,790 --> 00:42:03,750
like, yeah, why not bring back our
ancestors who helped give us this world?
685
00:42:04,529 --> 00:42:06,779
And on the other hand, if we fail.
686
00:42:07,140 --> 00:42:10,980
If the world is destroyed or if
it's merely just a, an impoverished
687
00:42:10,980 --> 00:42:14,280
sort of condition, then nobody's
gonna be bothering to bring
688
00:42:14,280 --> 00:42:16,380
back their preserved ancestors.
689
00:42:16,650 --> 00:42:20,910
So it's sort of both a, a hope
and also an impetus to like really
690
00:42:20,910 --> 00:42:26,010
take seriously that there will be a
2100, there will be a 2200, like the
691
00:42:26,010 --> 00:42:29,640
future is going to happen the same
way that the past was really real.
692
00:42:30,120 --> 00:42:34,080
And we should want that to go well,
both for our descendants and maybe
693
00:42:34,080 --> 00:42:35,640
a little bit also for ourselves.
694
00:42:36,705 --> 00:42:40,725
And is that one of the biggest hurdles
with this is that we're trusting
695
00:42:40,725 --> 00:42:42,584
humanity to do the right thing?
696
00:42:43,185 --> 00:42:46,845
Yeah, I, I mean, it's a, an uncertainty
with all endeavors and particularly
697
00:42:46,845 --> 00:42:48,555
this just makes it very concrete.
698
00:42:48,615 --> 00:42:52,095
If anyone wants to build anything,
whether it's improving the governance
699
00:42:52,095 --> 00:42:55,544
of their country, whether it's building
a business that lasts into the future,
700
00:42:55,754 --> 00:42:58,395
whether it's ensuring that their
children and their grandchildren have
701
00:42:58,395 --> 00:43:02,805
good educations and good lives, there's
always a degree of like uncertainty.
702
00:43:03,190 --> 00:43:06,940
About how the world is gonna go,
how future humans are gonna act
703
00:43:06,940 --> 00:43:08,530
and how we're gonna act today.
704
00:43:09,010 --> 00:43:12,670
And normally we, we just, you know,
think that that's a problem, but
705
00:43:12,670 --> 00:43:15,100
we don't dwell on it too much.
706
00:43:15,520 --> 00:43:19,150
But if you're really concretely thinking
about, Hey, is there a chance that
707
00:43:19,270 --> 00:43:23,560
I will exist in 2100, or that I need
something from someone at that future
708
00:43:23,560 --> 00:43:28,390
time point, I guess it makes very
explicit this uncertainty that is there.
709
00:43:28,995 --> 00:43:32,565
But I, I think uncertainty's not the
same as, you know, it will definitely
710
00:43:32,654 --> 00:43:38,565
fail in the same way that I think people
who lived in 1500, 1600 or so would be
711
00:43:38,715 --> 00:43:42,944
very pleasantly surprised by actually
how well the future went compared to
712
00:43:42,944 --> 00:43:44,355
the living standards that they had.
713
00:43:44,654 --> 00:43:49,215
And I think that you do a good job of also
drawing on past quotations from people
714
00:43:49,215 --> 00:43:55,035
who were quite visionary in our past that
have, you know, thought about an idea or
715
00:43:55,035 --> 00:43:58,125
what our, our world could possibly be, and
716
00:43:58,425 --> 00:44:00,075
some of those things have come true.
717
00:44:00,075 --> 00:44:04,875
So, you know, it does pose the
question that people have asked
718
00:44:04,875 --> 00:44:08,775
these things in the past and they
have become a reality for us today.
719
00:44:09,165 --> 00:44:12,735
And that's what I, I really like
about the book all the way through,
720
00:44:12,735 --> 00:44:14,145
is that you're being challenged.
721
00:44:14,385 --> 00:44:18,015
Just when you think that, oh, maybe
this isn't a possibility, then you
722
00:44:18,595 --> 00:44:23,065
challenge me or, or provide me with
another reason about why someone
723
00:44:23,065 --> 00:44:26,365
in the past may have had the same
concerns about, you know, things that
724
00:44:26,365 --> 00:44:30,145
we've already been able to accomplish
in medical, you know, achievements.
725
00:44:30,205 --> 00:44:35,245
Something that I, I went to recently
was a, an exhibition in an art
726
00:44:35,245 --> 00:44:38,095
gallery in Gippsland, and it was
727
00:44:38,180 --> 00:44:43,520
a beautiful collection of
drawings based on centenarians.
728
00:44:43,790 --> 00:44:45,200
I'm always horrible at that word.
729
00:44:45,200 --> 00:44:46,580
I can never pronounce it properly.
730
00:44:46,970 --> 00:44:52,970
For those of you: people who have reached
the ripe old age of 100, and a lot
731
00:44:52,970 --> 00:44:58,190
of them didn't know how they made
it to that age, and a lot of them,
732
00:44:58,504 --> 00:45:03,154
you know, had a sense of grief about
them because they were the only ones left
733
00:45:03,154 --> 00:45:05,015
of their loved ones and their family.
734
00:45:05,705 --> 00:45:11,225
So do you think the success for
something like this is making sure
735
00:45:11,225 --> 00:45:16,115
that it's not just being offered to
the few, but being offered to the many?
736
00:45:17,190 --> 00:45:18,029
Yeah, absolutely.
737
00:45:18,029 --> 00:45:22,830
I, I think this is in the same way as all
medical technology, something where you
738
00:45:22,830 --> 00:45:26,520
want as many people as possible to have
access to it, and something that should
739
00:45:26,520 --> 00:45:29,040
be used to empower people generally.
740
00:45:29,820 --> 00:45:33,930
Now, that's not to say that if for
whatever reason, only a few people
741
00:45:33,930 --> 00:45:38,340
take it up, that it's not worthwhile.
It sort of reminds me of
742
00:45:38,340 --> 00:45:42,840
how my grandparents were Holocaust
survivors, for example, and they told
743
00:45:42,840 --> 00:45:47,460
me stories of friends of theirs
who had had their entire families die.
744
00:45:47,835 --> 00:45:53,145
And who had come to Australia as refugees
with no connections, nothing essentially.
745
00:45:54,015 --> 00:45:58,605
Now, that was horrible for them, but
at the same time, they still went on
746
00:45:58,605 --> 00:46:03,495
to have meaningful lives, to build new
connections, to build essentially a
747
00:46:03,500 --> 00:46:08,819
new and satisfying life for themselves
despite having lost everything previously.
748
00:46:09,645 --> 00:46:10,634
And in, in the same way,
749
00:46:10,634 --> 00:46:11,685
I think that even if
750
00:46:12,020 --> 00:46:16,790
isolated individuals wanted to make use of
something like preservation without their
751
00:46:16,879 --> 00:46:21,950
friends or families or others, then that
that might still be worthwhile for them.
752
00:46:22,590 --> 00:46:26,160
But, but it's obviously the case that
the more people have access to this,
753
00:46:26,160 --> 00:46:30,509
the more people whose communities are
kept intact, who get to live again with
754
00:46:30,509 --> 00:46:33,690
their friends and family and loved
ones, the more this has succeeded.
755
00:46:34,020 --> 00:46:36,780
I definitely think it should
be accessible to everyone.
756
00:46:37,590 --> 00:46:40,995
And I suppose, here's the big question,
Ariel, would you sign up for it?
757
00:46:41,880 --> 00:46:44,160
I absolutely would.
758
00:46:44,340 --> 00:46:49,080
Were I, for example, given a, a
terminal cancer diagnosis or something.
759
00:46:49,410 --> 00:46:52,830
I'm only 32 at the moment, so my
hope is I won't have to make that
760
00:46:52,830 --> 00:46:55,200
choice explicitly anytime soon.
761
00:46:55,890 --> 00:47:00,150
But it is the case that when, for
example, I've looked at surveys of the
762
00:47:00,150 --> 00:47:05,550
terminally ill and their will to live,
where people have gone into hospices and
763
00:47:05,550 --> 00:47:09,575
they've asked people at death's door: do
you still have a strong will to live?
764
00:47:10,665 --> 00:47:15,105
Most people in that circumstance,
about 70% when asked, continue to
765
00:47:15,105 --> 00:47:19,065
say, I have a very strong will to
live even right up until the end.
766
00:47:20,160 --> 00:47:23,040
And I, I think I would likely be the same.
767
00:47:23,220 --> 00:47:26,700
And if someone could credibly
offer me some way of being able to
768
00:47:26,790 --> 00:47:29,790
live longer at some point in the
future, then that is something I
769
00:47:29,790 --> 00:47:31,560
would push very hard to take up.
770
00:47:32,040 --> 00:47:34,680
Because ultimately I think it is
the case that these things should
771
00:47:34,680 --> 00:47:36,690
be under individuals' control.
772
00:47:37,140 --> 00:47:41,460
People should be able to, if they
want to, I guess, end their lives at
773
00:47:41,460 --> 00:47:45,150
the, the point where they're suffering
huge amounts of pain or where they
774
00:47:45,210 --> 00:47:48,600
no longer feel their life is at the
quality that they want it to be.
775
00:47:49,140 --> 00:47:54,120
On the other hand, if we have people
who are terminally ill but who think
776
00:47:54,120 --> 00:47:57,509
that this might give them some chance
at more time, it's absolutely something
777
00:47:57,509 --> 00:47:58,799
they should have access to as well.
778
00:47:59,549 --> 00:47:59,970
Wow.
779
00:47:59,970 --> 00:48:01,230
I think that's amazing.
780
00:48:01,230 --> 00:48:08,174
So how do you think that the role of grief
and mourning comes into this process?
781
00:48:08,490 --> 00:48:14,009
If someone does choose this as an option,
how do you think that that might play out?
782
00:48:14,069 --> 00:48:16,560
You know, play out for those
loved ones left behind.
783
00:48:17,265 --> 00:48:21,915
Yeah, I have not thought a huge
amount about this, to be fair.
784
00:48:22,125 --> 00:48:26,775
And I'm also not a sociologist or a
psychologist in grief, so of course
785
00:48:26,775 --> 00:48:30,110
I, I put some, you know, caveats
on, on what I'm about to say.
786
00:48:30,110 --> 00:48:30,272
Yeah, of course.
787
00:48:30,795 --> 00:48:34,815
My guess, and this is just a guess,
is that it seems somewhat similar
788
00:48:34,815 --> 00:48:37,875
to sort of what was actually
historically more the norm.
789
00:48:38,455 --> 00:48:42,985
Where most people used to believe in an
afterlife, a religious sort of afterlife
790
00:48:42,985 --> 00:48:47,215
where, you know,
their loved ones would die, but they
791
00:48:47,215 --> 00:48:50,965
might be reunited again with them at
some point in the future in heaven also.
792
00:48:51,625 --> 00:48:55,825
And obviously, this is a, a variant
on that insofar as, uh, there's no
793
00:48:55,825 --> 00:48:59,365
guarantee that this technology will
work either from a technical sense
794
00:48:59,370 --> 00:49:03,505
or that future generations will have
the capacity to bring people back.
795
00:49:04,255 --> 00:49:06,025
So I think it's sort of two parts to it.
796
00:49:06,375 --> 00:49:10,904
There's the fear that their loved ones
might be gone forever, but there's also
797
00:49:10,904 --> 00:49:14,145
just the loss that even if they are
sure they'll get to see their loved
798
00:49:14,145 --> 00:49:15,645
ones again at some point in the future.
799
00:49:16,125 --> 00:49:17,685
They've still lost them for now.
800
00:49:17,685 --> 00:49:20,475
They're still grieving the loss
of their friendship, the loss of
801
00:49:20,475 --> 00:49:24,404
their family member all the time
that they won't spend with them.
802
00:49:24,884 --> 00:49:28,154
And I imagine that's gotta be
pretty similar to the standard
803
00:49:28,154 --> 00:49:30,105
sort of grieving process.
804
00:49:30,645 --> 00:49:32,234
At least that's how I think about it.
805
00:49:32,325 --> 00:49:34,515
Even if someone could
guarantee me that I'd see my
806
00:49:35,015 --> 00:49:38,165
friend again in a hundred
years, I'd be pretty devastated
807
00:49:38,165 --> 00:49:39,365
that I wasn't seeing them.
808
00:49:39,365 --> 00:49:39,725
Now
809
00:49:40,385 --> 00:49:44,345
it's interesting because it then
comes back to that age-old challenge
810
00:49:44,345 --> 00:49:50,855
between faith or religion and science
and the possibility of an afterlife.
811
00:49:51,215 --> 00:49:55,715
So this would be a scientific,
you know, version of, of
812
00:49:55,715 --> 00:49:57,155
really an afterlife, wouldn't it?
813
00:49:57,975 --> 00:50:00,975
Well, I mean, I guess in a
technical sense, I would argue
814
00:50:00,975 --> 00:50:02,625
that the person never really died.
815
00:50:02,655 --> 00:50:04,815
So it's, it's not quite
an afterlife, if that's true.
816
00:50:05,175 --> 00:50:05,445
Yeah.
817
00:50:05,445 --> 00:50:05,895
But, but it is.
818
00:50:05,895 --> 00:50:06,135
Yeah.
819
00:50:06,135 --> 00:50:07,635
I, there is a strong analogy.
820
00:50:07,635 --> 00:50:10,035
that I think it's fair to make.
821
00:50:10,560 --> 00:50:13,020
This has been absolutely fascinating.
822
00:50:13,020 --> 00:50:15,870
I've loved the book,
I've loved reading it.
823
00:50:15,930 --> 00:50:21,120
It, it's taken me through so many
different avenues and just really made
824
00:50:21,120 --> 00:50:25,680
me appreciate just how difficult it
is and how complex we are as humans
825
00:50:25,950 --> 00:50:31,200
and, and how far we've come in, in
modern technology and, and science.
826
00:50:31,319 --> 00:50:36,060
For anyone fascinated about the next
steps, you know, how can they advocate?
827
00:50:36,390 --> 00:50:37,740
What do you
828
00:50:38,170 --> 00:50:41,770
suggest people do to support
this kind of, you know, future
829
00:50:41,770 --> 00:50:43,300
that you are, you are suggesting,
830
00:50:43,810 --> 00:50:45,160
if they are interested?
831
00:50:45,160 --> 00:50:46,120
In general,
832
00:50:46,270 --> 00:50:49,900
I have an ongoing newsletter where
they can find out more about sort
833
00:50:49,900 --> 00:50:51,250
of developments that are happening.
834
00:50:51,490 --> 00:50:53,950
It's called Preserving Hope on Substack.
835
00:50:54,220 --> 00:50:55,840
It's also available through my website.
836
00:50:56,515 --> 00:51:00,595
If they want more specific information
about organizations helping push for
837
00:51:00,595 --> 00:51:04,435
this sort of stuff, they can look up, for
example, the Brain Preservation Foundation
838
00:51:04,765 --> 00:51:08,485
and if they're interested in the providers
that actually currently exist in this
839
00:51:08,485 --> 00:51:13,585
space, the two to check out would be
Oregon Brain Preservation and Nectar,
840
00:51:13,915 --> 00:51:17,455
which are on the West Coast of the US.
And hopefully, you know, we'll be able to
841
00:51:17,455 --> 00:51:21,085
expand and grow to the point where they
can start to offer services in Australia
842
00:51:21,475 --> 00:51:22,975
at some point in the near future.
843
00:51:23,145 --> 00:51:26,325
People should also feel free to
email me if they're interested and
844
00:51:26,355 --> 00:51:27,915
those details are also on my website.
845
00:51:28,935 --> 00:51:29,805
That's fantastic.
846
00:51:29,805 --> 00:51:32,775
Ariel, I can't thank you enough for
being a guest on the show today.
847
00:51:32,835 --> 00:51:34,215
I really enjoyed our time here.
848
00:51:34,665 --> 00:51:38,085
I'm glad you had me and I'm glad you're
having these conversations with people.
849
00:51:38,115 --> 00:51:38,535
It's great.
851
00:51:42,210 --> 00:51:45,630
We hope you enjoyed today's
episode of Don't Be Caught Dead,
852
00:51:45,960 --> 00:51:47,700
brought to you by Critical Info.
853
00:51:48,450 --> 00:51:52,710
If you liked the episode, learned something
new, or were touched by a story you
854
00:51:52,710 --> 00:51:54,540
heard, we'd love for you to let us know.
855
00:51:54,810 --> 00:51:58,410
Send us an email, even tell
your friends, subscribe so you
856
00:51:58,410 --> 00:52:00,150
don't miss out on new episodes.
857
00:52:00,330 --> 00:52:01,980
If you can spare a few moments,
858
00:52:02,455 --> 00:52:06,235
please rate and review us as it
helps other people to find the show.
859
00:52:06,535 --> 00:52:07,855
Are you dying to know more?
860
00:52:08,005 --> 00:52:08,995
Stay up to date with
861
00:52:08,995 --> 00:52:12,925
Don't Be Caught Dead by signing up to
our newsletter and follow us on social
862
00:52:12,925 --> 00:52:19,105
media Head to Don't Be Caught dead.com for
more information and loads of resources.
Resources
- Visit the Website: Dr Ariel Zeleznikow-Johnston
- Read the Book: The Future Loves You: How and Why We Should Abolish Death
- Visit the Website: Brain Preservation Foundation
- Visit the Website: Nectar
- Subscribe to Newsletter: Preserving Hope on Substack
- Make Death Admin Easy with The Critical Info Platform: A simple system to sort your personal paperwork for when your information becomes critical.
- My Loved One Has Died, What Do I Do Now?: Our guide, ‘My Loved One Has Died, What Do I Do Now?’ provides practical steps for the hours and days after a loved one's death. Purchase it here.
- Support Services: If you're feeling overwhelmed by grief, find support through our resources and bereavement services here.

