MADDOW DEBUNKS: Weird fake news, A.I. slop stories about her and MSNBC infect social media

Full Transcript

00:00
Speaker A
I do not live in Texas.
00:12
Speaker A
I like Texas very much, but I have never spent all that much time there. I've never put down roots there.
00:22
Speaker A
I definitely do not own a house in Texas.
00:30
Speaker A
Relatedly, because I do not live in Texas and I don't have a house in Texas and I haven't visited there at all this past year.
00:40
Speaker A
I did not need to be rescued in the Texas floods this month.
00:46
Speaker A
Nor was I in Texas personally rescuing people from the Texas floods when they happened.
00:55
Speaker A
Also, just so you know, I do not have a long-lost daughter with whom I have recently been reunited.
01:00
Speaker A
I do not have a daughter. I do not have a baby. I do not have a baby on the way. No kids, no kids at all. I've never had a baby myself. My partner Susan never had a baby.
01:15
Speaker A
We've never adopted a baby or had a baby by surrogacy or by any other means by which one might reasonably end up with a baby.
01:29
Speaker A
So no house in Texas, nothing to do with the terrible Texas floods.
01:40
Speaker A
I don't have a long-lost daughter. I don't have any kind of baby anyway, anyhow.
01:47
Speaker A
I also have not been fired by MSNBC.
01:54
Speaker A
Look.
01:57
Speaker A
Here I am.
02:00
Speaker A
I still work here.
02:03
Speaker A
You see me now on MSNBC.
02:06
Speaker A
I have been here since 2005, I think.
02:11
Speaker A
I am happy as a clam here at MSNBC.
02:16
Speaker A
Relatedly, I have not founded my own news network.
02:20
Speaker A
Nor am I planning to.
02:22
Speaker A
Why would I do that when I work at MSNBC?
02:27
Speaker A
Without a baby.
02:30
Speaker A
And without a house in Texas.
02:34
Speaker A
And I say all this because if you have been looking at the internet machine,
02:40
Speaker A
particularly if you have been looking at Facebook,
02:46
Speaker A
you may have heard or seen stuff about me that contradicts those things that I just told you.
02:58
Speaker A
And, you know, I've been around for a while. I've been in this business for a while.
03:05
Speaker A
I'm used to people saying fake stuff about me to get a rise out of people or to get under my skin or for some other self-serving purpose.
03:12
Speaker A
I'm used to it.
03:14
Speaker A
I don't particularly care.
03:16
Speaker A
But I got to say, it's different now.
03:21
Speaker A
Now, because of AI, the fake stories are more compelling to people and more believable.
03:29
Speaker A
They're really finely targeted and calibrated to tell people things that they want to hear or specifically to tell them things that they want to click on to learn more about.
03:40
Speaker A
The biggest change, I think, is that there are often now AI-generated pictures or even videos that are designed to make you think that you're not only reading a story that has some appeal to you,
03:50
Speaker A
but there's visual proof of what the story says.
03:56
Speaker A
And AI will generate any visual proof that you want.
04:00
Speaker A
So there I am, rescuing people supposedly in the Texas floods.
04:07
Speaker A
And there I am with a baby of some derivation.
04:15
Speaker A
And here I am having a security guard take the White House press secretary off of my set in a TV studio or something.
04:23
Speaker A
None of this is real.
04:25
Speaker A
None of this is real.
04:27
Speaker A
This is all AI generated.
04:31
Speaker A
And to me it all looks really fake.
04:34
Speaker A
Like, if you yourself are the subject of some of this AI slop,
04:40
Speaker A
I can tell you.
04:43
Speaker A
Images like these look clumsy and ham-handed and weird.
04:48
Speaker A
And if you look really closely,
04:51
Speaker A
some of the tells are there.
04:54
Speaker A
The baby has too many fingers.
04:57
Speaker A
There's a random extra arm in the picture with the old lady in the floodwaters.
05:03
Speaker A
And there are other things that are weird and wrong if you look really closely.
05:07
Speaker A
But we don't always all look really closely.
05:10
Speaker A
We don't always have a reason to.
05:12
Speaker A
You know, if you're just scrolling through stuff on the internet, you might think this stuff is real.
05:20
Speaker A
It is way more believable to a lot of people than it used to be before AI took over all social media.
05:30
Speaker A
So, I mean, honestly, don't feel bad.
05:33
Speaker A
It's no fault of your own if you saw any of this and believed any of this.
05:42
Speaker A
For your convenience,
05:46
Speaker A
I will say if you ever do see something about me online and you want to check to see if it's true,
05:53
Speaker A
one place you can go to see if it's true is our website where we catalog this stuff,
05:59
Speaker A
which is
06:03
Speaker A
isthatreallyrachel.com.
06:06
Speaker A
That's the website address.
06:08
Speaker A
isthatreallyrachel.com.
06:11
Speaker A
We try to keep up with this stuff there so you can check to see if it's real.
06:17
Speaker A
Again, isthatreallyrachel.com.
06:21
Speaker A
In addition, I will mention the website Snopes.
06:24
Speaker A
Snopes.com.
06:26
Speaker A
They always do a really good job debunking this kind of stuff.
06:31
Speaker A
Not just stuff about me, but everything.
06:35
Speaker A
Snopes is great.
06:37
Speaker A
And there's almost nobody else who does it as consistently well as they do.
06:46
Speaker A
Makes them pretty invaluable.
06:48
Speaker A
You should bookmark Snopes, snopes.com, and you can always check that regularly against stuff that's circulating online or any rumors that you hear in the news if you want to see if it's true, see if it's been debunked.
07:00
Speaker A
But you know, even though the internet has always been a little bit of a cesspool,
07:07
Speaker A
I do think that something's happening right now, like this summer,
07:14
Speaker A
with the advent of AI.
07:17
Speaker A
There's just been a tipping point.
07:20
Speaker A
It's just been swamped all of a sudden.
07:22
Speaker A
And very thoroughly.
07:23
Speaker A
I mean, even Google is about 90% less useful than it used to be on a day-to-day basis.
07:31
Speaker A
With the way that they've allowed AI and references to social media that are all fueled by AI to take over their search engine results.
07:39
Speaker A
So even Google's less useful than it used to be.
07:42
Speaker A
But social media in particular, and Facebook is the worst of it,
07:47
Speaker A
is pretty much totally overtaken now by AI slop.
07:53
Speaker A
It is really just trash.
07:58
Speaker A
And because the companies don't appear to want to fix it or do anything about it,
08:03
Speaker A
it's not going to get any better.
08:06
Speaker A
This stuff is essentially free and it's become the dominant content in American social media.
08:11
Speaker A
And you can see why.
08:13
Speaker A
It's infinite supply.
08:14
Speaker A
These things don't take human effort; they are created robotically,
08:20
Speaker A
in essentially infinite quantities and for free.
08:25
Speaker A
And what they create and post online and circulate in social media is just stuff that is designed to manipulate you,
08:33
Speaker A
to get you to click something,
08:36
Speaker A
which they can directly monetize.
08:40
Speaker A
Or it's designed to manipulate your feelings or manipulate your perception of what's true and what's going on in the world.
08:50
Speaker A
And, you know, the antidote is the same as it's always been for all of us, right?
08:56
Speaker A
Only use trusted sources of information.
09:00
Speaker A
If somebody is citing something to MSNBC,
09:05
Speaker A
go to msnbc.com and see if you can find it there.
09:08
Speaker A
If somebody is citing something to BBC News or The New York Times or some other name brand news organization,
09:13
Speaker A
you should be able to go to the website of that name brand news organization and find it there.
09:20
Speaker A
Right?
09:24
Speaker A
Always try to figure out where something is coming from exactly.
09:31
Speaker A
Before you believe it and certainly before you share it.
