WEBVTT FILE
X-TIMESTAMP-MAP=MPEGTS:0,LOCAL:00:00:00.000

1
00:00:05.088 --> 00:00:07.007
The 41 million Gen-Z

2
00:00:07.007 --> 00:00:10.410
voters could make a difference
in the expected razor-thin election.

3
00:00:10.910 --> 00:00:13.730
As media reporter Aidan
Ryan found out, meeting them

4
00:00:13.730 --> 00:00:16.733
where they are isn't politics as usual.

5
00:00:16.883 --> 00:00:18.568
Well, thank you so much for coming in.

6
00:00:18.568 --> 00:00:19.552
Thanks for having me.

7
00:00:19.552 --> 00:00:25.375
The Globe surveyed 150 local Gen-Z
residents and interviewed a couple dozen

8
00:00:25.375 --> 00:00:29.012
about how they say they are
getting informed about the 2024 election.

9
00:00:29.479 --> 00:00:30.530
What did you find out?

10
00:00:30.530 --> 00:00:32.665
Well, I found out that the youngest voters

11
00:00:32.665 --> 00:00:36.052
are going to a lot of different sources
to get information about the election.

12
00:00:36.603 --> 00:00:38.338
Many are still going
to traditional sources,

13
00:00:38.338 --> 00:00:41.958
you know, getting New York
Times breaking news headlines.

14
00:00:42.659 --> 00:00:44.027
Same thing with the Globe.

15
00:00:44.027 --> 00:00:46.012
But they're also getting
a lot of information

16
00:00:46.012 --> 00:00:49.883
on social media,
particularly TikTok, Instagram, Twitter,

17
00:00:50.300 --> 00:00:53.303
and even platforms like Twitch,
which is the streaming site.

18
00:00:53.369 --> 00:00:56.372
So a lot of these younger voters,

19
00:00:56.456 --> 00:00:59.459
a few of them even said
that they do seek out

20
00:00:59.692 --> 00:01:02.996
news from traditional sources,
but they're seeing the most political

21
00:01:02.996 --> 00:01:04.764
content on places like TikTok.

22
00:01:04.764 --> 00:01:06.382
So this is all algorithm-based.

23
00:01:06.382 --> 00:01:10.970
So are they literally clicking through,
trying to search and look for news,

24
00:01:10.970 --> 00:01:14.874
or is it just what pops up while they're
searching around for other stuff?

25
00:01:14.958 --> 00:01:16.109
It's a lot of the latter.

26
00:01:16.109 --> 00:01:20.663
I think a lot of younger voters
are going to places like

27
00:01:20.663 --> 00:01:24.117
TikTok and Instagram for entertainment,
to catch up on their friends,

28
00:01:24.267 --> 00:01:27.737
but they're also being served a lot
of political content on those platforms.

29
00:01:28.171 --> 00:01:31.641
So you can see a kind of funny
meme or video on TikTok.

30
00:01:31.641 --> 00:01:35.812
And then the next post is
the Harris campaign, which is

31
00:01:36.379 --> 00:01:38.848
spending a lot of time on

32
00:01:38.848 --> 00:01:42.035
TikTok and other social media platforms
to put up clips of Harris.

33
00:01:42.051 --> 00:01:46.473
It could be news influencers
who are kind of this new age of,

34
00:01:47.507 --> 00:01:50.210
journalist
slash personality slash influencer

35
00:01:50.210 --> 00:01:55.598
who are, you know, doing their own news
reporting, or coverage on these platforms.

36
00:01:55.799 --> 00:01:57.200
It runs the gamut.

37
00:01:57.200 --> 00:01:59.969
I mean, in some ways, it's similar

38
00:01:59.969 --> 00:02:02.972
to TV, where you're going
to turn it on to be entertained.

39
00:02:02.972 --> 00:02:05.959
Maybe you flip through and see
a news channel, maybe you catch some news.

40
00:02:06.059 --> 00:02:07.127
Obviously a different experience.

41
00:02:07.127 --> 00:02:11.564
If you went to BostonGlobe.com, went to
politics or whatever other section,

42
00:02:11.581 --> 00:02:15.568
and you're actively seeking it out,
but algorithms are driving it.

43
00:02:15.602 --> 00:02:20.507
How does this new era of media
consumption change politics?

44
00:02:20.690 --> 00:02:21.774
Yeah, well, it changes.

45
00:02:21.774 --> 00:02:25.278
I mean, I think when you go to a place
like the Globe or The New York Times

46
00:02:25.278 --> 00:02:28.398
or a traditional journalism outlet
to get that information, you know, there

47
00:02:28.398 --> 00:02:31.434
are rules and ethics and kind of a,

48
00:02:32.669 --> 00:02:35.672
you know,
a guide to kind of how to do that work.

49
00:02:35.672 --> 00:02:36.573
It's journalistic.

50
00:02:36.573 --> 00:02:39.242
That's not always the case
in this new era of media.

51
00:02:39.242 --> 00:02:42.896
Anyone can set up a TikTok account

52
00:02:42.896 --> 00:02:47.750
and make videos to stream to, people
across the country and across the world.

53
00:02:48.384 --> 00:02:52.071
And so I think what it means for politics
is that the

54
00:02:53.122 --> 00:02:56.309
sources of information
today are just so much more varied,

55
00:02:56.309 --> 00:02:58.344
so much more fragmented
than they used to be.

56
00:02:58.344 --> 00:03:00.864
You can get information
that is completely biased

57
00:03:00.864 --> 00:03:03.867
and doesn't go through
kind of normal journalistic standards.

58
00:03:04.434 --> 00:03:08.838
but that can still be
your primary source of news.

59
00:03:09.389 --> 00:03:11.507
Oh, I'll put you on the spot.
But you're a media reporter.

60
00:03:11.507 --> 00:03:14.961
I know you well enough to know
you have a soft spot for politics.

61
00:03:15.378 --> 00:03:18.381
You've probably thought about
what might happen this election.

62
00:03:19.132 --> 00:03:22.919
are you surprised by anything
that you thought was going to happen?

63
00:03:22.919 --> 00:03:24.837
For example, more disinformation

64
00:03:24.837 --> 00:03:28.908
than we, you know, were prepared for,
or things we thought would go awry.

65
00:03:29.092 --> 00:03:30.560
Did anything surprise you?

66
00:03:30.560 --> 00:03:31.678
I think, you know.

67
00:03:31.678 --> 00:03:33.363
It happened or didn't happen,
I guess I should say.

68
00:03:33.363 --> 00:03:37.617
I think when it comes to disinformation,
you know, in talking

69
00:03:37.617 --> 00:03:39.352
with researchers,
talking with academics

70
00:03:39.352 --> 00:03:41.838
and experts
who study this kind of work.

71
00:03:41.838 --> 00:03:43.673
I think one of the biggest concerns

72
00:03:43.673 --> 00:03:46.960
still is disinformation
coming from political elites.

73
00:03:46.976 --> 00:03:48.928
It's coming from
candidates. It's coming from,

74
00:03:49.879 --> 00:03:51.331
you know, what is

75
00:03:51.331 --> 00:03:55.068
being beamed onto their phones
and into living rooms across the country.

76
00:03:55.068 --> 00:03:58.171
I think that is still the case. I think AI
is still a little bit premature.

77
00:03:58.187 --> 00:04:01.190
I think there's been so much hype
about that in the news.

78
00:04:01.224 --> 00:04:05.278
but I still think that, you know, from
just talking and doing that kind of work

79
00:04:05.278 --> 00:04:09.098
and talking to experts,
the misinformation, disinformation.

80
00:04:09.115 --> 00:04:10.316
It's J.D.

81
00:04:10.316 --> 00:04:13.019
Vance making up a story
about Springfield, Ohio.

82
00:04:13.019 --> 00:04:15.922
It's not AI creating something
that didn't happen.

83
00:04:15.922 --> 00:04:18.308
I think that's still a big concern.
I mean, there certainly are concerns.

84
00:04:18.308 --> 00:04:21.110
And there have been cases
even here in New England, the robocall

85
00:04:21.110 --> 00:04:24.113
there in the primary, with,
you know, someone pretending to be Joe Biden.

86
00:04:25.098 --> 00:04:27.300
I think, you know,
we've seen cases of that,

87
00:04:27.300 --> 00:04:30.303
but I think those are much smaller
versus the,

88
00:04:30.320 --> 00:04:33.573
you know, false narratives
that are being peddled,

89
00:04:34.607 --> 00:04:36.392
by candidates and by.

90
00:04:36.392 --> 00:04:38.344
Yeah, like the example of J.D.
Vance. Correct.

91
00:04:38.344 --> 00:04:39.362
This is so interesting.

92
00:04:39.362 --> 00:04:40.546
And it's going to be really interesting

93
00:04:40.546 --> 00:04:43.549
to see what happens in the actual election
as we get closer.

94
00:04:43.549 --> 00:04:45.618
Media reporter Aidan Ryan,
thanks for joining us.

95
00:04:45.618 --> 00:04:47.220
Thanks for having me.

96
00:04:47.220 --> 00:04:50.640
For daily access
to Boston Globe Today, all segments

97
00:04:50.640 --> 00:04:54.477
and episodes are available on demand
in the Boston Globe app.

98
00:04:54.477 --> 00:04:55.695
by clicking Watch.

