Cory Doctorow's sunglasses seem ordinary. But they're far from it: in security camera footage, his face turns into a bright white ball.
At his local credit union, amused tellers notice the curious sight on nearby monitors and sometimes ask, "What's wrong with your head?" Doctorow said, chuckling.
The frames of his sunglasses, from the Chicago-based eyewear line Reflectacles, are made of a material that reflects the infrared light used by surveillance cameras. They represent a fringe movement of privacy advocates experimenting with clothing, elaborate makeup, and accessories to protect themselves against surveillance technologies.
Some wearers are driven by the desire to opt out of so-called "surveillance capitalism" – an economy that converts human experience into data for profit – while others fear government intrusion into their privacy.
"People have long been interested in technologies that can make them invisible," said Dave Maass, senior investigative researcher at the San Francisco-based nonprofit Electronic Frontier Foundation. In response to resurgences of the Ku Klux Klan in the 1920s and 1950s, numerous states passed anti-mask laws prohibiting groups of people from concealing their identities.
And Maass noted an increase in digital surveillance countermeasures after former National Security Agency contractor Edward Snowden revealed details of American surveillance programs around the world in 2013.
Today, artificial intelligence (AI) technology, such as facial recognition, is widely used in public and private spaces – including schools, retail stores, airports, and concert halls, and even to unlock the latest iPhones. Civil rights groups concerned about the potential for abuse have asked politicians to regulate the systems. For example, a recent Washington Post investigation found that FBI and Immigration and Customs Enforcement officials used facial recognition to scan millions of Americans' driver's license photos without their knowledge to identify suspects and undocumented immigrants.
Researchers have long criticized the lack of oversight of AI systems, which can be biased. A recent study by the National Institute of Standards and Technology, which examined facial recognition algorithms from companies including Microsoft and Intel, found that Asian and Black people are up to 100 times more likely to be misidentified than white people. In scenarios where two different photos of a person are compared to confirm identity, such as passport checks, the study found that Native Americans were the most likely of all U.S. demographic groups to be misidentified. Photos of Black women were more likely to be falsely matched to images of other women in an FBI database.
The study builds on earlier research that found Amazon's facial analysis system had a higher error rate when identifying women with darker skin tones than men with lighter skin tones.
Daniel Castro, vice president of the Information Technology and Innovation Foundation, a nonprofit think tank, believes that comparing images against larger, more diverse databases could reduce error rates.
Facial recognition systems have proved effective in criminal investigations, he said, and are more accurate than humans at verifying people's identities at border crossings. Developing policies and practices on data retention and use could prevent government abuse.
"The general use of this technology in the United States is very reasonable," Castro said. "It is carried out by police agencies that try to balance communities' public safety interests with individual privacy."
Nevertheless, in Doctorow's eyes, the glasses serve as a conversation starter about the dangers of giving governments and companies unrestricted access to our personal data.
The urge to seek antidotes to an overreaching power has political and symbolic meaning for Doctorow, an L.A.-based science fiction writer and privacy advocate. His father's family fled the Soviet Union, which used surveillance to control the masses.
"We are far too confident that surveillance technologies will be developed by people we agree with to achieve goals that we are happy to support," he said. "Developing this technology and not taking countermeasures is a road map to tyranny."
Recent iterations of Reflectacles use special lenses that block the infrared light used to map people's faces, preventing certain forms of 3D facial recognition from finding matches in a database, said Scott Urban, the glasses' designer.
The lenses of normal sunglasses turn clear under infrared light, but the wavelength-specific absorbers baked into Urban's lenses soak up the light, rendering them black.
The absorbent quality of Reflectacles effectively blocks Face ID on the latest iPhones. And while Urban said the glasses are not meant to defeat facial recognition systems that don't use infrared light, they decrease the likelihood of a positive match in such systems.
A longtime privacy advocate, Urban has avoided adopting smart technologies that can store his personal data. "My grandma and I are the only people left without a smartphone," Urban said over the phone.
He believes there is an appetite for discreet gear that preserves people's anonymity, as shown by his Kickstarter campaign, in which 311 backers pledged $41,315 after its July launch.
Some of his customers have turned to Reflectacles for safety reasons. As clashes between pro-democracy protesters and police in Hong Kong escalated over the summer, Urban said, activists in the region sought out the glasses to protect their identities.
Other forms of anti-surveillance camouflage include elaborate face paint designed to thwart computer vision, such as the patterns created by artist Adam Harvey. The distinctive black-and-white face paint of Juggalos, the die-hard fans of the hip-hop duo Insane Clown Posse, can also block some facial recognition systems.
Designer Leo Selvaggio sacrificed his own identity to hide others', creating a mask from a 3D scan of his face.
Hyper-realistic masks have also been put to criminal use, for example by bank robbers seeking to hide their identities and thwart police investigations. A recent study in the journal Cognitive Research: Principles and Implications found that participants mistook hyper-realistic masks for real faces 20 percent of the time.
Some gear demonstrates the fallibility of AI systems used in investigations. L.A.-based cybersecurity analyst Kate Rose created a fashion line called Adversarial Fashion to confuse automatic license plate readers.
She printed images of out-of-service and fake license plates on fabric to make shirts and dresses. When wearers pass the AI systems at traffic stops, the machines read the images on the clothing as plates, feeding junk data into the technology.
For Rose, her line is a playful way of showing that "this technology is built on something that's easy to mess with," she said.
Seattle has had its own problems with surveillance technologies, which led to the creation of the Seattle Surveillance Ordinance in 2017. The Seattle Police Department's use of automatic license plate readers has raised concern. In an impact assessment sent to the Seattle City Council last spring, an external working group criticized the SPD's ability to scan more than 13.5 million license plates per year from drivers not suspected of any crime.
Despite the growing fashion trend, more people are buying surveillance technologies like Amazon's Ring to protect their property than are buying anti-surveillance gear, Castro said: "The reason is that people feel a sense of security when using this technology. If a crime is committed, they want to know they have evidence, recourse, and certainty about their own property."
San Francisco, Berkeley and Oakland, Calif., and Somerville, Mass., banned the use of facial recognition technology by government agencies last year. Last fall, California banned the pairing of facial recognition and biometric scanning with police-worn body cameras for the next three years. The Portland City Council is considering going a step further by banning both private companies and government agencies from using the technology. In Congress, legislative proposals such as the Accountability Act and a resolution establishing guidelines for the ethical development of AI signal growing support for federal regulation.
More broadly, the technology is prompting people to push back. Last September, the digital rights group Fight for the Future launched a campaign to stop festivals and venues from using facial recognition to scan concertgoers. Burning Man, Coachella, Bumbershoot, and Lollapalooza have pledged not to use the technology, according to the nonprofit.
While Evan Greer, deputy director of Fight for the Future, welcomes creative ways of evading surveillance, she believes pressing elected officials to regulate AI systems is a must.
"People shouldn't have to wear special glasses, jewelry, or face masks when they leave their homes just to stay safe or protect their basic civil liberties," Greer said. "The public has to fight to keep this technology out of our schools, airports, and public places. We can't give up now and literally throw a bag over our heads."