Can policy keep pace with surveillance tech? Experts say it must.

By Benjamin O. Powers

The average citizen’s expectation of privacy in a public space is by and large a modest one. When you are out and about, you expect to be caught on camera inside your local convenience store, at a traffic signal or in the background of someone else’s selfie. In fact, in the 10 years after the 9/11 terrorist attacks, the United States added 30 million surveillance cameras to its streets.

Now, though, AI-backed systems can use facial recognition and geo-triangulated social media surveillance to recognize up to 15 people in a crowd simultaneously, then track them in real time as they leave a public space after, for example, a protest. Given this, it’s worth considering how well governments and private security operations are crafting laws and policies that regulate the use of these new tools.

“This is a new era of the evolving and changing nature of government surveillance and technology and there is a real need to put in place policies of civilian review for approval of surveillance technologies, that balance government interest with constitutional rights,” said Berkeley, Calif., Mayor Jesse Arreguín. In 2018, the city passed an ordinance that requires elected representatives on the City Council to determine which new surveillance tech police should acquire and how.

The ordinance named a litany of surveillance tools, illustrating the challenge of regulating the use of varied and rapidly advancing technology. A partial list included cell site simulators, commonly known as “Stingrays,” automatic license plate readers, body-worn cameras, gunshot detectors, facial recognition software, thermal imaging systems, social media analytics software, and gait analysis software, as well as video cameras whose audio and video recordings can be remotely accessed.

“We spent a lot of time crafting an ordinance that was not overly cumbersome and has reasonable exceptions that would allow the city to use these technologies in exigent circumstances and when it wasn’t overly intrusive,” said Mayor Arreguín.

Many in the security community see such exceptions as a necessity, even though the ACLU and other like-minded organizations object to them. Other California cities, such as Davis and Oakland, have adopted similar measures, and Mayor Arreguín believes Berkeley’s ordinance could serve as a model for cities that don’t have one, with civilian oversight at the core of any such policy.

Tim Williams, the Vice Chairman of Pinkerton, a security and risk management firm, believes it is up to corporations to step into the void as tech speeds ahead of regulation. Organizations should take measured steps now to develop protocols that consider technology’s impact on privacy and other areas of concern.  

“You really have to think very defensively, about what you are doing, why are you doing it, and how it is being implemented,” said Williams.

Williams said organizations must define each and every technology and establish the rules of engagement for each, taking into account regulations and laws that may vary from municipality to municipality. Then, they must communicate those policies effectively, and often, to their security teams.

“It’s one thing to produce a protocol or a policy,” said Williams. “But it’s another thing to have it understood.” Corporations don’t always establish clear and effective continuing education around security technology. But they should, he said.

It may not be worth adopting cutting-edge tech until it’s been vetted a number of times and truly understood, Williams said. Facial recognition tech, for example, still often has inherent biases and misidentifies women and people with darker skin tones. Those vulnerabilities could put an organization at risk for claims of privacy invasion and racial profiling.

One way around those claims is to employ a different kind of computer vision, said Martin Cronin, CEO of PatriotOne Technologies, a weapons detection company and the underwriter of this publication. PatriotOne’s system is trained to recognize weapons and other suspicious items first. “If you detect a threat first, and then use facial recognition to identify a person, very few will complain,” he said.

With the Internet of Things proliferating at a rapid rate, firms also need to be cautious of new tech with regard to their own security, said Williams, particularly if they’re holding valuable information. The interconnectedness of devices offers more points of vulnerability that can be exploited by malicious actors. Additionally, said Shahid Buttar, Director of Grassroots Advocacy at the Electronic Frontier Foundation, governments and companies should set limits on how long collected surveillance data can be held.

“There are other ways through policy to constrain surveillance. For instance, a limit on how long a department can retain data collected through a particular method,” says Buttar. “Let’s say you’ve got a hunch, you get the judicial warrant, and you use the license plate reader. In this instance, a jurisdiction could say you can’t retain data from that surveillance method for more than 180 days. The retention limit is another way to dial up or back the privacy protections.”
