WEBVTT

1
00:00:00.000 --> 00:00:03.439
[ SOUND ] One of the wildest things I read this 

2
00:00:03.439 --> 00:00:06.229
weekend, the New York Times reported on a facial 

3
00:00:06.229 --> 00:00:08.579
recognition company called Clearview, which is 

4
00:00:08.609 --> 00:00:10.629
partnered with hundreds of police departments across 

5
00:00:10.629 --> 00:00:13.459
the US. Most facial recognition works this way. You 

6
00:00:13.459 --> 00:00:16.399
have a photo in a database, and when it scans faces in 

7
00:00:16.399 --> 00:00:18.999
public, it matches that person with that photo. So 

8
00:00:18.999 --> 00:00:21.329
unless you're already logged in there, it doesn't really 

9
00:00:21.669 --> 00:00:25.099
work on you. Clearview works in a much different way: 

10
00:00:25.559 --> 00:00:28.439
the entire internet is basically the database, so the 

11
00:00:28.439 --> 00:00:32.569
Clearview AI had basically scraped Facebook, 

12
00:00:32.569 --> 00:00:35.369
Venmo, YouTube, anywhere there's a photo 

13
00:00:35.409 --> 00:00:38.699
available online attached to a name publicly. It has it, 

14
00:00:38.699 --> 00:00:41.979
so the way that Clearview would work is if I took a 

15
00:00:41.979 --> 00:00:46.009
photo of you, it would scour this massive database that 

16
00:00:46.009 --> 00:00:48.569
already scraped the entire Internet, to match it with 

17
00:00:48.619 --> 00:00:51.489
somebody instead. So that is the future that we have to 

18
00:00:51.489 --> 00:00:53.739
look forward to with AI and facial recognition. So if 

19
00:00:53.829 --> 00:00:56.619
I'm reading into this correctly, for Clearview, what they 

20
00:00:56.619 --> 00:00:59.029
did goes against the terms and conditions of a lot of 

21
00:00:59.029 --> 00:01:01.619
those websites. But again, it's completely legal for 

22
00:01:01.619 --> 00:01:03.859
them to do that, like they were able to create this 

23
00:01:03.859 --> 00:01:07.419
database, I believe, of billions of different photos. And 

24
00:01:07.419 --> 00:01:09.309
it doesn't go against anything from the government. 

25
00:01:09.309 --> 00:01:11.749
The government can't say, you weren't allowed to do 

26
00:01:11.749 --> 00:01:15.444
that. No, yeah, there are no laws on facial recognition, 

27
00:01:15.444 --> 00:01:18.159
or at least not federally. There are laws in Illinois and 

28
00:01:18.159 --> 00:01:22.329
Texas, but this is different, where it's scraping 

29
00:01:22.329 --> 00:01:24.829
the websites. It's perfectly legal. But there are also 

30
00:01:24.909 --> 00:01:26.429
some additional laws in, what, Somerville, 

31
00:01:26.429 --> 00:01:29.059
Massachusetts. That's different. San Francisco, that's 

32
00:01:29.059 --> 00:01:32.169
specifically where police are banned from using facial 

33
00:01:32.169 --> 00:01:35.189
recognition. So those are some additional layers, but 

34
00:01:35.219 --> 00:01:37.339
yeah, there are some pretty gaping holes here as well. 

35
00:01:37.339 --> 00:01:41.479
Yeah. So the way that this works is more like, if 

36
00:01:41.479 --> 00:01:44.419
I had taken a photo of you, it's like it searches for faces, 

37
00:01:44.769 --> 00:01:48.029
where it looks through this whole database. Yeah, it'd 

38
00:01:48.029 --> 00:01:49.859
find, like, maybe your photo on LinkedIn and 

39
00:01:49.909 --> 00:01:53.089
find your photo on Facebook. More likely than not, it 

40
00:01:53.089 --> 00:01:55.339
would probably find your photo from this video, and 

41
00:01:55.339 --> 00:01:59.139
be able to say, that is Ben of CNET. He covers 

42
00:01:59.139 --> 00:02:01.639
Amazon, and I'm not gonna say on air where you live, 

43
00:02:01.639 --> 00:02:04.479
or anything like that. But it would be able to note that 

44
00:02:04.479 --> 00:02:07.949
detail, and connect it with you there. Yeah, we have an 

45
00:02:08.039 --> 00:02:11.089
editor that works here that is extremely suspicious of 

46
00:02:11.089 --> 00:02:14.089
getting his photo taken and putting it anywhere online. 

47
00:02:14.269 --> 00:02:16.819
And I used to think he was super paranoid, but 

48
00:02:16.859 --> 00:02:20.464
apparently, he was very much onto something. Yeah, I 

49
00:02:20.464 --> 00:02:22.849
was looking for my ski mask before I came onto this 

50
00:02:22.849 --> 00:02:26.239
show, but I think I threw it out by accident. But- That's 

51
00:02:26.239 --> 00:02:29.529
too bad. Yeah, I mean, that's just how it is, we are all 

52
00:02:29.529 --> 00:02:32.769
logged in some database in the future, where anyone 

53
00:02:32.769 --> 00:02:36.849
can find us. Which, that is the future. 

54
00:02:36.919 --> 00:02:39.059
I mean, obviously, there are privacy advocates working 

55
00:02:39.059 --> 00:02:42.389
against it, and lawmakers who are hoping to establish 

56
00:02:42.389 --> 00:02:46.409
rules against this. But honestly, the way it is 

57
00:02:46.409 --> 00:02:49.319
right now, they can just do all this, and that's just how it 

58
00:02:49.319 --> 00:02:53.159
is. Yeah. So if you ever go online, you 

59
00:02:53.159 --> 00:02:54.864
more than likely are being tracked. [ MUSIC ] 