Transcript
WEBVTT
1
00:00:02.680 --> 00:00:03.960
A LiSTNR production.
2
00:00:08.279 --> 00:00:11.279
Hey there, you're listening to Crappy to Happy. I'm Cass Dunn.
3
00:00:11.320 --> 00:00:14.720
I'm a clinical and coaching psychologist, a mindfulness meditation teacher
4
00:00:14.919 --> 00:00:18.000
and author of the Crappy to Happy books. In this show,
5
00:00:18.079 --> 00:00:21.879
I introduce you to interesting, inspiring, intelligent people who are
6
00:00:21.920 --> 00:00:24.399
experts in their fields, with the hope that the insights
7
00:00:24.399 --> 00:00:27.039
and experiences they share can help you to feel a
8
00:00:27.079 --> 00:00:30.839
whole lot less crappy and more happy. And today I
9
00:00:30.879 --> 00:00:34.200
am very excited to be talking to Dr Kristy Goodwin,
10
00:00:34.359 --> 00:00:37.399
who is one of Australia's leading digital health, well being
11
00:00:37.439 --> 00:00:40.759
and productivity experts. She also happens to be a mum
12
00:00:40.799 --> 00:00:43.079
who knows full well what it's like to deal with
13
00:00:43.200 --> 00:00:47.000
kids' techno tantrums. Kristy is the author of Raising Your
14
00:00:47.039 --> 00:00:49.359
Child in a Digital World, and what I love about
15
00:00:49.399 --> 00:00:52.000
her is that she doesn't suggest that we ban the
16
00:00:52.039 --> 00:00:57.479
iPhone or offer other entirely unrealistic solutions. At a time
17
00:00:57.520 --> 00:00:59.719
when we are all working harder than ever to put
18
00:00:59.719 --> 00:01:03.399
limits on our screen time while also relying so heavily
19
00:01:03.439 --> 00:01:06.840
on our devices for our social connection and working from home,
20
00:01:07.200 --> 00:01:10.480
Kristy's advice is more important than ever, and so I
21
00:01:10.519 --> 00:01:12.879
hope that you get as much out of this conversation
22
00:01:13.000 --> 00:01:17.040
as I did. I have been dying to talk to
23
00:01:17.079 --> 00:01:21.079
you for a long time, but especially now after the
24
00:01:21.159 --> 00:01:24.319
year that we have had. So, Kristy, I think we've
25
00:01:24.359 --> 00:01:26.760
all become a lot more aware of the negative impacts
26
00:01:26.760 --> 00:01:28.920
of too much technology. You know, we've all been on
27
00:01:28.959 --> 00:01:33.719
that path of pulling back and having digital detoxes. But
28
00:01:33.879 --> 00:01:38.719
now we're in this world where we're basically being forced
29
00:01:38.879 --> 00:01:41.680
onto our screens to live our lives. All of our
30
00:01:41.719 --> 00:01:44.879
socializing is online. Everybody's working from home and has to
31
00:01:44.920 --> 00:01:48.319
be online. What do you make of this, like this
32
00:01:48.359 --> 00:01:50.239
whole new world that we're living in? That's a very
33
00:01:50.239 --> 00:01:52.000
broad question, but look.
34
00:01:51.840 --> 00:01:55.000
I think we all agree that none of us are
35
00:01:55.040 --> 00:01:58.120
immune to the digital pull, whether you're a toddler, a teenager,
36
00:01:58.239 --> 00:02:00.439
or an adult. I think we would all, if we're
37
00:02:00.480 --> 00:02:03.640
really honest, admit that we're often tethered to technology. We
38
00:02:03.719 --> 00:02:06.959
find it so appealing, and that is for a whole
39
00:02:06.959 --> 00:02:09.599
host of reasons. You know, as you said, in this
40
00:02:09.680 --> 00:02:13.960
particular climate, our digital devices have been our portal for
41
00:02:14.159 --> 00:02:17.159
leisure, they've been our portal for work, they've been
42
00:02:17.199 --> 00:02:22.120
our portal for connection and socialization, and all of us
43
00:02:22.159 --> 00:02:25.360
have relied on these digital devices for a plethora of reasons.
44
00:02:25.800 --> 00:02:28.639
But these devices that we all rely on have been
45
00:02:29.080 --> 00:02:33.639
very deliberately and intentionally designed to be psychologically appealing and
46
00:02:33.719 --> 00:02:35.719
to hook us into wanting to use them.
47
00:02:36.120 --> 00:02:37.520
So I think we feel the tug.
48
00:02:37.680 --> 00:02:41.639
You know, we couldn't imagine doing lockdown without our digital appendages.
49
00:02:42.000 --> 00:02:44.599
But we also recognize that it's really hard to switch
50
00:02:44.639 --> 00:02:47.120
off and that they're often having, if we're really honest,
51
00:02:47.120 --> 00:02:49.919
a negative impact on many facets, you know, on our
52
00:02:49.919 --> 00:02:52.919
mental health, on our physical wellbeing. So I think what
53
00:02:52.960 --> 00:02:54.960
you described, I call it the digital dilemma.
54
00:02:55.400 --> 00:02:57.599
You know, it's hard to live with it,
55
00:02:57.680 --> 00:03:00.240
and it's hard to live without it. And how do
56
00:03:00.280 --> 00:03:00.879
we, you know, what...
57
00:03:00.960 --> 00:03:04.840
I'm actually someone that doesn't suggest digital detoxes. I think
58
00:03:04.879 --> 00:03:07.479
we've got to accept that whether we love it or loathe it, the
59
00:03:07.520 --> 00:03:09.599
technology is here to stay. So we have to leverage
60
00:03:09.599 --> 00:03:12.800
the benefits it offers us and mitigate the potential pitfalls.
61
00:03:12.800 --> 00:03:14.759
And that means we have to find the best ways
62
00:03:14.800 --> 00:03:18.680
to use it, not promote digital abstinence or amputation, especially
63
00:03:18.680 --> 00:03:19.560
if you've got kids.
64
00:03:19.599 --> 00:03:20.879
It just will not work.
65
00:03:21.439 --> 00:03:25.800
Yes, I definitely want to talk, too, specifically about kids
66
00:03:26.039 --> 00:03:28.680
and devices. I have a fourteen year old daughter, so
67
00:03:28.759 --> 00:03:32.080
she's very much super glued to her phone.
68
00:03:32.560 --> 00:03:34.000
But let's just go back.
69
00:03:34.520 --> 00:03:38.000
You said there are all of these potential negative consequences
70
00:03:38.000 --> 00:03:40.280
of too much time on our technology, and that they're
71
00:03:40.280 --> 00:03:41.360
designed to keep us hooked.
72
00:03:41.400 --> 00:03:43.280
Can you just dig in a little bit for
73
00:03:43.280 --> 00:03:47.319
us about what some of those negative effects are on
74
00:03:47.360 --> 00:03:49.879
our physical, mental, emotional well being.
75
00:03:50.599 --> 00:03:52.599
Absolutely. So the research tells us,
76
00:03:52.639 --> 00:03:55.240
and obviously it depends on how we're using it and
77
00:03:55.560 --> 00:03:57.680
for how long and what platforms, so it's hard to
78
00:03:57.759 --> 00:04:02.199
make broad generalizations. But globally speaking, we know that if
79
00:04:02.280 --> 00:04:05.360
we are using whether we're adults or talking about children,
80
00:04:05.840 --> 00:04:08.879
if we're using technology excessively or at the wrong times
81
00:04:08.919 --> 00:04:10.919
of the day or in the wrong ways, it can
82
00:04:10.919 --> 00:04:13.400
have a negative impact on our physical health and mental
83
00:04:13.439 --> 00:04:13.840
well being.
84
00:04:13.879 --> 00:04:16.000
So in physical health, we
85
00:04:16.000 --> 00:04:19.120
know, for example, rates of myopia, so near-sightedness, have
86
00:04:19.120 --> 00:04:21.519
increased in the last ten years.
87
00:04:21.319 --> 00:04:22.759
A very substantial increase.
88
00:04:22.879 --> 00:04:25.560
Now, our initial reaction was to point the blame at
89
00:04:25.600 --> 00:04:28.680
too much screen time, and there is definitely an
90
00:04:28.759 --> 00:04:32.040
element of that. But what new Australian research is suggesting
91
00:04:32.079 --> 00:04:35.639
is that it's the displacement effect of our time on devices,
92
00:04:35.959 --> 00:04:39.160
and that is we're not getting enough time in natural sunlight,
93
00:04:39.240 --> 00:04:42.920
vitamin D. We know that vitamin D actually helps elongate
94
00:04:42.959 --> 00:04:47.319
the eyes and that can contribute to decreasing the likelihood
95
00:04:47.360 --> 00:04:51.720
of developing things like myopia. So often with technology, and
96
00:04:51.759 --> 00:04:55.040
this applies to all of the impacts, it's not necessarily
97
00:04:55.040 --> 00:04:58.079
the technology per se. Sometimes it can be, but
98
00:04:58.079 --> 00:05:02.279
often it's what it's superseding, what it's displacing. So, for example,
99
00:05:02.319 --> 00:05:05.639
when it comes to mental health, we know that, particularly
100
00:05:05.639 --> 00:05:09.759
with young people, we have lots of media headlines decrying
101
00:05:09.800 --> 00:05:12.079
all the negative impacts that technology is having, and we
102
00:05:12.120 --> 00:05:15.519
blame social media and smartphones for the decline in young
103
00:05:15.519 --> 00:05:16.519
people's mental health.
104
00:05:17.000 --> 00:05:19.560
In particular, there are studies that show that there's
105
00:05:19.399 --> 00:05:23.079
definitely a correlation between smartphone and social media use and
106
00:05:23.079 --> 00:05:26.839
poorer mental health outcomes, but that research is only correlational.
107
00:05:27.000 --> 00:05:29.879
We don't know which way the directional arrow points. Is
108
00:05:29.920 --> 00:05:33.439
it that young people with existing mental health issues gravitate
109
00:05:33.480 --> 00:05:36.079
to the online world because it fulfills some of their
110
00:05:36.079 --> 00:05:39.279
psychological needs, or is it the other way around, where
111
00:05:39.319 --> 00:05:42.480
technology is actually causing some of these poor mental health outcomes.
112
00:05:42.959 --> 00:05:45.920
The research is still in its infancy in this regard.
113
00:05:46.279 --> 00:05:48.040
But when it comes to mental health, I think it's
114
00:05:48.079 --> 00:05:52.199
what our digital devices are displacing that's having the negative impact.
115
00:05:52.319 --> 00:05:56.279
We know, for example, young people primary school, secondary school,
116
00:05:56.319 --> 00:06:00.480
and even adults, our sleep is diminishing in both quantity and quality,
117
00:06:00.480 --> 00:06:02.839
and a lot of that can be attributed to our
118
00:06:03.160 --> 00:06:08.240
tech habits. We're often, myself included, guilty of scrolling before
119
00:06:08.279 --> 00:06:10.000
we go to bed, and we can talk in a
120
00:06:10.000 --> 00:06:11.639
moment about why it's so hard
121
00:06:11.360 --> 00:06:12.360
to stop that scroll.
122
00:06:12.439 --> 00:06:15.240
There are some very clever design techniques that make it
123
00:06:15.319 --> 00:06:17.560
near impossible. I don't know about you, Cass, but I'm often
124
00:06:17.600 --> 00:06:21.519
guilty of finding it impossible to stop that scroll late
125
00:06:21.680 --> 00:06:23.199
at night, even though I'm tired, and even though I know
126
00:06:23.319 --> 00:06:25.639
I shouldn't be doing it, but there I am on Instagram,
127
00:06:25.959 --> 00:06:29.680
tapping and swiping away. So it's having a negative impact,
128
00:06:29.720 --> 00:06:32.680
particularly on our sleep. It's having a very big impact
129
00:06:32.720 --> 00:06:36.519
on our interpersonal relationships, how we interact with other people.
130
00:06:36.959 --> 00:06:38.759
Numerous studies have shown that just the
131
00:06:38.639 --> 00:06:41.560
presence of your phone while you are having a conversation
132
00:06:41.759 --> 00:06:45.560
with somebody diminishes the quality and quantity of the conversation.
133
00:06:45.680 --> 00:06:47.680
Even if you don't pick it up. Just the mere
134
00:06:47.720 --> 00:06:51.639
digital presence impacts that. We also know we are
135
00:06:51.839 --> 00:06:56.680
leading incredibly sedentary lives, and now, especially if we're working
136
00:06:56.680 --> 00:06:59.000
from home or some hybrid model of working from home,
137
00:06:59.000 --> 00:07:01.959
we're often sitting down more than we ever have, and that
138
00:07:02.439 --> 00:07:05.240
decrease in physical activity means we're not making you know,
139
00:07:05.279 --> 00:07:08.720
the dopamine and serotonin that we often got from physical activity.
140
00:07:09.560 --> 00:07:11.839
And so there are a whole lot of cascading consequences
141
00:07:11.920 --> 00:07:15.639
because of our screen use. But again, demonizing technology or
142
00:07:15.680 --> 00:07:18.480
saying well, let's give it up is just completely unrealistic.
143
00:07:18.560 --> 00:07:19.759
So it's about...
144
00:07:19.639 --> 00:07:22.040
It is, it really is, and so it's about how do
145
00:07:22.120 --> 00:07:25.319
we find healthy boundaries, how do we use the technology
146
00:07:25.519 --> 00:07:27.480
but use it in ways that aren't going to have
147
00:07:27.639 --> 00:07:29.639
a negative impact on us.
148
00:07:30.360 --> 00:07:32.199
I wanted to pick up on something that you mentioned
149
00:07:32.240 --> 00:07:35.680
there about our distractibility. I'm not sure that's the word
150
00:07:35.759 --> 00:07:39.240
that you used, but I went to grab my phone
151
00:07:39.720 --> 00:07:42.040
yesterday when I was doing some work and I needed
152
00:07:42.079 --> 00:07:44.040
to do some maths, and I went to pick up
153
00:07:44.079 --> 00:07:48.199
my phone to do a simple addition, add up some numbers,
154
00:07:48.439 --> 00:07:52.040
and fifteen minutes later, somehow I was on Instagram reading
155
00:07:52.079 --> 00:07:56.600
messages, then randomly googling something that I'd seen.
156
00:07:57.040 --> 00:07:59.759
What is happening here? And we have a bit of a
157
00:07:59.800 --> 00:08:02.560
joke in our household that we've all become goldfish. We
158
00:08:03.120 --> 00:08:07.319
have about a four second attention span. Is that something
159
00:08:07.839 --> 00:08:11.319
that we're all experiencing because of our technology use?
160
00:08:11.399 --> 00:08:14.120
Absolutely. What you described then, I call it the
161
00:08:14.120 --> 00:08:17.560
digital rabbit hole. And it is so easy for us
162
00:08:17.680 --> 00:08:20.759
to go down that digital rabbit hole. And so one
163
00:08:20.759 --> 00:08:22.800
of the things, I mean, there's so many strategies we
164
00:08:22.800 --> 00:08:24.639
can put in place, and I'm sure we'll get to that,
165
00:08:24.839 --> 00:08:28.680
but just think: the choice, the strategic color of the
166
00:08:29.040 --> 00:08:33.840
icons, has been designed by psychologists to be psychologically appealing.
167
00:08:33.919 --> 00:08:37.200
When Steve Jobs released the first iPod Touch, in his
168
00:08:37.360 --> 00:08:40.000
press release, he said he wanted the icons to be
169
00:08:40.039 --> 00:08:44.080
so appealing that users wanted to lick their devices.
170
00:08:44.639 --> 00:08:45.519
So that tells you
171
00:08:45.480 --> 00:08:49.679
something about the intentional design. The fact that our
172
00:08:50.159 --> 00:08:53.320
notification bubble is usually red and has a number in it.
173
00:08:53.399 --> 00:08:55.240
You know, that metric and color
174
00:08:55.000 --> 00:08:58.600
choice are some of the subtle, but very powerful,
175
00:08:58.639 --> 00:09:02.480
ways that get us hooked into constantly checking. The
176
00:09:02.519 --> 00:09:05.519
fact that we have alerts and notifications pinging and coming
177
00:09:05.559 --> 00:09:09.720
to us constantly tricks our brain into thinking that everything
178
00:09:09.799 --> 00:09:12.159
is urgent and important. We have a brain that is
179
00:09:12.200 --> 00:09:15.919
hardwired to seek novelty and always find new and interesting things,
180
00:09:16.240 --> 00:09:18.159
and when we are constantly being
181
00:09:17.960 --> 00:09:19.840
bombarded, our brain doesn't know
182
00:09:19.799 --> 00:09:23.720
how to cope, because we basically have ancient paleolithic brains
183
00:09:23.799 --> 00:09:26.279
that were designed to go out and forage and hunt
184
00:09:26.320 --> 00:09:29.799
for information. But now the exact opposite is happening. The
185
00:09:29.840 --> 00:09:33.240
information is coming to us constantly. And I often refer
186
00:09:33.279 --> 00:09:36.200
to this as infobesity. You know, we are literally drowning
187
00:09:36.279 --> 00:09:37.600
in information constantly.
188
00:09:38.080 --> 00:09:41.960
That is such an interesting term, infobesity. You're so right,
189
00:09:42.039 --> 00:09:44.039
We're drowning in information.
190
00:09:44.399 --> 00:09:44.600
Yeah.
191
00:09:44.639 --> 00:09:47.279
The average adult now consumes the equivalent of three point
192
00:09:47.279 --> 00:09:50.759
four gigabytes of data every day, which is just mind blowing.
193
00:09:50.799 --> 00:09:53.000
And our brains have a finite as you know, a
194
00:09:53.039 --> 00:09:56.080
finite capacity. We've got a mental load that we can carry.
195
00:09:56.360 --> 00:09:59.159
And so when information is constantly bombarding us, and we
196
00:09:59.200 --> 00:10:02.120
have alerts and notifications that ping and ding and trick
197
00:10:02.159 --> 00:10:05.200
our brain into thinking they're urgent and important, often we
198
00:10:05.279 --> 00:10:08.440
activate our sympathetic nervous system, and so we don't think
199
00:10:08.480 --> 00:10:12.279
logically about this. So we find ourselves picking up and scrolling.
200
00:10:12.879 --> 00:10:15.399
The fact that we never know, and one of the
201
00:10:15.559 --> 00:10:18.639
biggest reasons all of us, particularly kids, can get hooked
202
00:10:18.639 --> 00:10:23.279
on social media and virtual multiplayer video games, is because
203
00:10:23.600 --> 00:10:25.960
your phone and your digital devices that you love have
204
00:10:26.039 --> 00:10:28.440
been designed to be a little bit like a poker machine.
205
00:10:28.639 --> 00:10:31.720
They offer what we call intermittent variable rewards. So when
206
00:10:31.759 --> 00:10:35.320
you dived into your Instagram inbox yesterday, or DMs,
207
00:10:35.279 --> 00:10:36.480
and you weren't quite
208
00:10:36.320 --> 00:10:38.879
sure if there was going to be something interesting or
209
00:10:38.919 --> 00:10:42.039
a kind message, or something a little controversial, and so
210
00:10:42.080 --> 00:10:45.600
it's that unpredictable reward ratio that gets us hooked into
211
00:10:45.639 --> 00:10:50.320
constantly checking. But the biggest design technique, and this is
212
00:10:50.360 --> 00:10:52.720
where I find it hard not to go down the digital
213
00:10:52.799 --> 00:10:57.200
rabbit hole myself, is that these devices are a bottomless bowl.
214
00:10:57.480 --> 00:11:01.480
There's no stopping cue, there's no endpoint. We enter what
215
00:11:01.559 --> 00:11:05.240
I call the state of insufficiency. We never ever feel done.
216
00:11:05.320 --> 00:11:08.600
And I mean think about when you refresh Instagram or
217
00:11:08.879 --> 00:11:10.240
your social media, do you ever
218
00:11:10.120 --> 00:11:11.919
get to the bottom of the feed? You can scroll
219
00:11:11.919 --> 00:11:13.320
and scroll and you never find the bottom.
220
00:11:13.600 --> 00:11:16.000
Yeah, it's like that, you know, those beautiful infinity pools
221
00:11:16.000 --> 00:11:18.639
that just seem to keep going and going...
222
00:11:18.559 --> 00:11:19.799
Even just the gesture.
223
00:11:19.919 --> 00:11:22.200
You know, when you refresh your social media feed, you
224
00:11:22.240 --> 00:11:26.039
pull down. It's the exact same action of pulling the
225
00:11:26.159 --> 00:11:29.120
lever on a poker machine. And so they are all
226
00:11:29.159 --> 00:11:33.840
these very subtle but very powerful techniques, and unfortunately a
227
00:11:33.840 --> 00:11:37.080
lot of our tech companies exploit these. You know, on
228
00:11:37.159 --> 00:11:42.159
YouTube now and most streaming services, the AutoPlay feature is
229
00:11:42.200 --> 00:11:44.919
the default setting, you know, how it rolls from one
230
00:11:44.919 --> 00:11:48.240
episode into another, and you plan to just watch one
231
00:11:48.240 --> 00:11:51.039
episode and before you know it, you're binge watching some
232
00:11:51.360 --> 00:11:53.000
trashy TV series.
233
00:11:53.559 --> 00:11:58.200
These devices have been designed that way. So it's a
234
00:11:58.200 --> 00:12:01.399
hard battle. And that digital rabbit hole is something so
235
00:12:01.480 --> 00:12:02.679
many of us experience.
236
00:12:03.399 --> 00:12:06.639
When we talk about how much information we're consuming or
237
00:12:06.960 --> 00:12:10.879
partially consuming, perhaps. I'm really curious to know how much
238
00:12:10.919 --> 00:12:15.120
of that we're actually retaining and for us as well
239
00:12:15.159 --> 00:12:15.600
as our.