Toronto Escorts

Controversial Clearview AI app could 'end privacy.' So, what now?

Charlemagne

Well-known member
Jul 19, 2017
Controversial Clearview AI app could 'end privacy.' So, what now?

Company says it doesn't have plans to offer app to consumers

Ramona Pringle·CBC News·Posted: Feb 01, 2020 4:00 AM ET | Last Updated: February 1

A powerful and controversial new facial recognition app can identify a person's name, phone number and even their address by comparing their photo to a database of billions of images scraped from the internet. Now, a class-action lawsuit is taking on the startup, arguing that its app is a threat to civil liberties.

In a New York Times investigation, journalist Kashmir Hill revealed how a groundbreaking yet little-known facial recognition tool could "end privacy as we know it."

The app in question, Clearview AI, has the capacity to turn up search results, including a person's name and other information such as their phone number, address or occupation, based on nothing more than a photo.

Who's using it?

While it's not available for public use — you won't find it in the App Store — according to the company, it's already being used by more than 600 law enforcement agencies.

Even though Clearview AI says it doesn't have plans to make a consumer-facing version of the app, it's easy to imagine a copycat jumping on what it deems a lucrative market opportunity. We already outsource parts of our memories, turning to tech to help us remember things like phone numbers; an app that could help you recall people's names at conferences or reunions feels like a natural evolution of our current use of our smartphones.

"More digital memories are going to be appearing," says Ann Cavoukian, the executive director of the Global Privacy and Security by Design Centre. "And if we don't address these issues in terms of preventing non-consenting access to this data, we're going to lose the game."

So now what?

Now that a tool like this is on the market, is there any hope for putting the proverbial data genie back in its bottle, or is this in fact the end of anonymity?

"At this point with facial recognition, the cat is out of the bag. We've seen multiple implementations of it in the public and private sector. Even if this isn't used now, someone will use it," says Tiffany C. Li, an attorney and visiting scholar at the Boston University School of Law.

According to Li, the best option is to regulate both the creation and the use of the technology.

"It's easy to say you should regulate companies like Clearview AI, which create these services," she says. But, she says, the big picture is more complex.

"Who are they selling them to? And if they're working with third parties, how can we make sure that those companies don't misuse the technology?"

Li notes that in addition to laws by which companies would need to abide, there needs to be built-in recourse for individuals to protect their privacy and their rights.

Regulation seems the best bet for preserving privacy

Indeed, that could already be proving to be our best hope. In Illinois, a lawsuit seeking class-action status was just filed against Clearview AI claiming the company broke privacy laws, namely the state's Biometric Information Privacy Act (BIPA). The law safeguards Illinois residents from having their biometric data used without consent, and the lawsuit argues that the app's use of artificial intelligence algorithms to scan the facial geometry of each individual depicted in the images violates multiple privacy laws.


The lawsuit, which is seeking, among other things, an injunction to stop Clearview from continuing its business, argues that the company "used the internet to covertly gather information on millions of American citizens, collecting approximately three billion pictures of them, without any reason to suspect any of them of having done anything wrong, ever."

Laws protecting people's biometric data could prove to be our best bet when it comes to preserving any semblance of privacy, says privacy law scholar Frank Pasquale, but regulatory safeguards like Illinois's BIPA are still few and far between.

Flipping the way we think about the use of big data

Because technology advances at light speed compared to the laws meant to keep those who use it safe, many privacy advocates are pushing for a moratorium on the use of facial recognition.

A temporary ban would give regulators a chance to catch up, lest the technology advance past a point of no return, Pasquale says.

Our current way of dealing with privacy is broken, says Pasquale, who argues that "we can't expect individual users to keep track of all of the data that is being gathered about them, and what is being done with that data."

Instead, he says, we must flip the way we think about how big data, and technologies like facial recognition, are used.

"The current presumption is that any use of this data is fine, absent an explicit governmental regulation," says Pasquale, who argues that the opposite should be the case.

Pasquale says, given the dangers of facial recognition — such as its tendency to misidentify individuals and foster biases — we ought to require organizations to have approvals in place before operating with data. Private entities, he says, should have to obtain a licence from a governmental authority "specifying the nature of the approved use, vetting the validity of the underlying data and specifying modes of recourse for those adversely impacted."

As for whether the availability of a tool like Clearview AI means that privacy as we know it is over, "this is going to be a tough one, because the technology is out there," says Cavoukian.

But, she says, laws that protect users and their data will go a long way toward preventing harm, adding that, in her mind, "there is no turning point that doesn't allow you to return to greater privacy."

https://www.cbc.ca/news/technology/clearview-app-privacy-1.5447420
 

Smallcock

Active member
Jun 5, 2009
What's next is big government abuse. They already have your DNA. You fools gave away all of your rights to privacy, not only when you signed up for social media, but when you sent those little DNA kits back to ancestry.com. Instead of "owning" private companies like Ancestry, the government simply compels them to give up their databases. The separation is in name only.

They now know everything about you. They recognize your face on camera everywhere you go. They track your exact location using your phone. They know all of your habits and likes based on social media and search engine usage. They track your spending because you use plastic instead of paper.

Soon they will begin persecuting political dissidents under the rubric of "science." If soldiers of the State were to handcuff and haul off your neighbour for being a terrorist, and told you that it was proven with DNA, facial recognition, phone/CC tracking evidence, etc., would you confront them? They know how little you know. They know that you're ill-equipped to question them. You never had much power to begin with, but now even that smidgen is gone.

You'll end up like Epstein. Murdered in plain sight and nobody held responsible. His murder was the test. Take a bad guy, murder him with the entire world watching, and see that nothing happens. It worked!
 