By Jessica Klein
With the proliferation of tools like facial recognition and video surveillance, the physical security industry has evolved beyond, well, physical precautions. Security companies now monitor threats from behind computer screens, a much more effective method than relying on locks, fences, and on-the-ground surveillance alone.
But technological advancements always come with caveats. Capturing faces on video means holding people’s personal data, and after the many data breaches of the past few years, regulators are paying increasingly close attention to how companies store and use people’s personal information.
The EU’s General Data Protection Regulation (GDPR) “ushered in a much more stringent regulatory landscape in Europe,” says Jonathan Tam, an associate at law firm Baker McKenzie, where he helps clients ranging from startups to multinational corporations navigate privacy regulations and cybersecurity.
Privacy concerns have also led to new legislation in the U.S., like the California Consumer Privacy Act. Tam calls it “one of the first comprehensive privacy laws” in the country.
These privacy regulations are forcing industries from marketing to healthcare and beyond to reform their own data practices. And as the security industry begins to incorporate artificial intelligence and other technologies that capture personal data, it may soon face the same reckoning.
So what do the GDPR and CCPA mean, exactly?
Though an EU regulation, the GDPR applies to any organization that processes the personal data of individuals in the EU, regardless of that organization’s location. The regulation specifies how data processors can remain both compliant (by notifying consumers about data collection and use, collecting data only when absolutely necessary, and storing data securely) and accountable (by keeping detailed documentation and training staff to handle personal data). The regulation also requires many organizations to appoint a Data Protection Officer to ensure accountability.
As for the CCPA, PwC calls it “the beginning of ‘America’s GDPR.’” Like the GDPR, it requires organizations to be transparent, let people opt out of the sale of their data, and erase collected information on request. It applies to companies doing business in California that make at least $25 million a year in revenue, collect data on more than 50,000 people, or earn at least half their annual revenue from selling consumers’ personal data.
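As a rough illustration (and emphatically not legal advice), the CCPA’s three alternative applicability thresholds can be expressed as a simple check. The function name and parameters below are hypothetical, invented for this sketch:

```python
def ccpa_may_apply(annual_revenue_usd: float,
                   consumers_with_data: int,
                   revenue_share_from_selling_data: float) -> bool:
    """Sketch of the CCPA's applicability thresholds (illustrative only).

    A business operating in California falls under the law if it meets
    ANY one of the three tests below.
    """
    return (
        annual_revenue_usd >= 25_000_000            # $25M+ in annual revenue
        or consumers_with_data >= 50_000            # data on 50,000+ consumers
        or revenue_share_from_selling_data >= 0.5   # half of revenue from data sales
    )

# A small startup earning most of its revenue from data sales still qualifies:
print(ccpa_may_apply(2_000_000, 10_000, 0.8))  # True
```

Note that the tests are disjunctive: a company can dodge the revenue threshold and still fall under the law on volume of consumer data alone.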
Companies across industries that meet these thresholds have started adapting. Some have updated their privacy policies and pushed those updates out to users. You may have noticed services you’ve been using for years, from crowdfunding platforms to payroll companies, sending out emails and in-app notices in December asking you to read and agree to their updated data policies. They were anticipating January 1, 2020, the date the CCPA went into effect.
The CCPA reaches further than previous U.S. regulations because it applies across industries and data types. Earlier laws have been industry-specific. For example, the Health Insurance Portability and Accountability Act, better known as HIPAA, only protects people’s medical records, and the federal Gramm-Leach-Bliley Act applies solely to the financial industry.
The CCPA, Tam says, has encouraged other states, including New York, Pennsylvania, and Minnesota, to start pushing forward privacy laws of their own.
How do these regulations affect the physical security industry?
Physical and information security consultant Michael Glasser says he sees two types of data used most in the physical security realm. There’s time- and access-related data, like when people enter a building or swipe in, and then there’s video surveillance.
Biometric data is another major area of privacy concern for those in the physical security industry. Video provides a venue for facial detection and recognition, while some high-security operations require the likes of fingerprints or even iris scans for special building or room access.
Time, access, and video surveillance
Tracking when employees and customers come and go could be seen as a privacy violation if that information is used for what Glasser calls a “non-security purpose.” Most of his past clients’ legal teams and privacy review boards “are okay with recording video in case there’s a crime…or someone’s life is in danger.” But if a company uses it to monitor employee performance, for instance by tracking how many times employees go to the bathroom, that could be overstepping.
“Video surveillance hasn’t attracted too much scrutiny in the U.S. in the past, but that is probably going to change under the CCPA,” says Tam. “Just the fact that you have to notify folks is one requirement that is relatively new in the U.S. landscape.”
So far, Glasser hasn’t noticed many physical security companies having to make changes to their video surveillance notifications. Most already have the appropriate signage posted.
“The one piece that I see changing a little bit is visitor management,” he says. He’s noticed security companies have started having visitors fill out consent forms, some of which specify that visitors waive their right to request their video data later.
Opting out of video surveillance (short of avoiding areas in range of a camera) isn’t easy. One idea so far is to blur out the faces of those who don’t want to be on film after they’ve already made it into video footage, which isn’t the quickest or neatest solution.
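That after-the-fact blurring can be sketched with a simple box blur applied to a detected face region. This toy version operates on a grayscale frame represented as a list of pixel rows; a real pipeline would use a vision library and a face detector, both assumed away here:

```python
def blur_region(frame, top, left, height, width, k=1):
    """Box-blur a rectangular region of a grayscale frame in place.

    frame: list of rows of pixel intensities (0-255).
    (top, left, height, width): the face bounding box to anonymize.
    k: blur radius; each pixel becomes the mean of its (2k+1)^2 neighborhood.
    """
    rows, cols = len(frame), len(frame[0])
    # Read from a copy so already-blurred pixels don't feed the average.
    original = [row[:] for row in frame]
    for r in range(top, min(top + height, rows)):
        for c in range(left, min(left + width, cols)):
            neighbors = [
                original[rr][cc]
                for rr in range(max(0, r - k), min(rows, r + k + 1))
                for cc in range(max(0, c - k), min(cols, c + k + 1))
            ]
            frame[r][c] = sum(neighbors) // len(neighbors)
    return frame
```

Even this crude averaging shows why the approach is slow at scale: every masked pixel touches its whole neighborhood, and the blur has to be re-run on every frame in which the face appears.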
Storing video, meanwhile, calls for encryption. Footage stored on encrypted servers in secure locations helps ensure that neither cybercriminals nor in-person thieves can reach this personal data.
Companies that process biometric data, like fingerprints, faces, and irises, will have to be extra careful. “Biometric data is considered more sensitive,” says Tam, compared to, say, email addresses and credit card numbers, “because it’s generally immutable. You can’t change your iris scan.”
Glasser offers a counterpoint. You bring your face out in public with you all the time. But you don’t write your social security or credit card number on the front of your shirt. That being said, a school in Sweden received the country’s first GDPR fine a few months ago for using video surveillance and facial recognition to track students’ attendance. For now, physical security companies might want to look to the few legal examples out there to determine where to draw the line in how they use facial scanning.
States like Illinois and Texas already have their own biometric privacy laws. Illinois’s law, the Biometric Information Privacy Act, compels biometric processors to obtain consent from subjects and prohibits them from selling or otherwise profiting from biometric data.
Outside of such guidelines, physical security companies are using trial and error to develop best practices for protecting biometric data. “A whole lot more trial and a whole lot less error,” says Glasser. That’s because it’s too soon for the industry to see how the long-term effects of new regulations will play out.
What data is safe for now?
Anonymized data is generally regarded as exempt from data privacy regulations. In the physical security realm, such data could include traffic flow in a smart city, or the number of people gathering in a public space for a big event.
However, that data has to be genuinely anonymized. If the traffic flow information captures license plate numbers, that could be a violation. And if that crowd count is taken via video, that video better not be capturing individuals’ faces unless those people know they’re on camera.
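One common way to keep such counts useful without retaining raw identifiers is to replace each captured license plate with a salted one-way hash before storage. The sketch below uses only Python’s standard library; the salt handling and names are illustrative, and note that hashing is strictly pseudonymization, not the full anonymization regulators exempt — truly anonymous data would drop the identifier entirely:

```python
import hashlib
import secrets

# A per-deployment secret salt makes dictionary attacks against short
# identifiers like license plates much harder (illustrative only).
SALT = secrets.token_bytes(16)

def pseudonymize_plate(plate: str) -> str:
    """Replace a license plate with a salted SHA-256 digest."""
    normalized = plate.strip().upper().encode("utf-8")
    return hashlib.sha256(SALT + normalized).hexdigest()

# The same plate always maps to the same token, so traffic-flow counts
# still work, but the raw plate is never written to storage.
token_a = pseudonymize_plate("7ABC123")
token_b = pseudonymize_plate(" 7abc123")
print(token_a == token_b)  # True: normalization keeps counts consistent
```

Because the salt is secret and per-deployment, tokens from one system can’t be matched against another’s, which limits the damage if one dataset leaks.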
How can the physical security industry stay compliant?
Security departments should start by creating a diverse, multidisciplinary compliance team. “It should not be a physical security director trying to figure out if one of their technology suppliers is compliant,” says Glasser. Instead, heads of security should convene experts from HR, finance, law, data privacy, and encryption technology.
As for finding regulation-compliant technology suppliers, it’s usually best to go with a larger cloud service provider that already encrypts stored information. Creating encrypted technology within your own company is complicated and costly, unless you’re a major organization that employs lots of highly skilled software engineers.
“Most people tell me that they don’t trust the cloud companies,” says Glasser, but that could be a mistake. “They have a whole lot of smart people who are in charge of figuring this out, versus your small or medium business, who has the IT dude who thinks he’s smart because he read a Wired article on encryption.”
Finally, regular security risk assessments will help companies stay on top of possible vulnerabilities. Tam further suggests scrupulous documentation. “If you’re ever sued, those documents help demonstrate that you’ve implemented reasonable and appropriate safeguards,” he says.
Ultimately, the best anyone can do to stay compliant is to stay informed and play it safe. New privacy laws seem to crop up nearly every quarter. “Take nothing for granted,” Tam says.
To play it really safe, Glasser has one last piece of advice. “Don’t be nosy,” he says. “Don’t ask for [information] that you don’t need.”
Jessica Klein is a freelance writer who writes about new technologies in security, marketing, finance and more. The opinions and positions expressed in this article do not necessarily reflect the opinions and positions held by Patriot One Technologies and inclusion of persons, companies, or methods herein should not be interpreted as an endorsement.