The physical security industry is at a crossroads. Video surveillance and analytics have rapidly moved to the cloud over the past decade, bringing improved connectivity and intelligence. But these same innovations also enable new possibilities for mass data collection, profiling and abuse.
As one of the industry’s leading cloud-based providers, Verkada — which offers a range of physical security products including AI-equipped remote surveillance cameras, controllers, wireless locks and more — is trying to chart a privacy-first path amid these emerging tensions.
The San Mateo-based company, which has brought more than 20,000 organizations into the cloud security era, is rolling out features focused on protecting identities and validating video authenticity.
Launched today, the updates come at a pivotal moment for how we exist in public and private spaces. Verkada has drawn significant backlash for past security breaches and controversial incidents, and how it balances innovation with ethics will reveal how it navigates the turbulent physical security industry.
Obfuscation of identities, validation of authenticity
In an interview, Verkada founder and CEO Filip Kaliszan described the motivations and mechanisms behind the new privacy and verification features.
“Our mission is to protect people and property in the most privacy-sensitive way,” Kaliszan said. “[The feature release] is about this privacy-sensitive way to achieve our goal.”
The first update focuses on hiding identities in video streams. Verkada cameras will gain the ability to automatically “blur” faces in video streams, using principles similar to augmented reality filters in social media apps. Kaliszan noted that security guards monitoring the streams “don’t really need to see all these details” about people until an incident occurs.
Making blurring the “default path” where possible is a priority, with the goal that most video is viewed with identities obscured.
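Verkada has not published how its blurring works; the sketch below is only a minimal illustration of the general idea, assuming a face detector has already produced a bounding box. It applies a simple box blur to that region of a grayscale frame (represented as a 2D list) so identifying detail is obscured while the rest of the scene stays sharp.

```python
# Illustrative sketch (not Verkada's implementation): redact a detected
# face region by box-blurring it, leaving the rest of the frame intact.

def box_blur_region(frame, box, radius=1):
    """Return a copy of `frame` (2D list of pixel ints) with the region
    `box` = (x, y, w, h) replaced by a box blur of the given radius."""
    x, y, w, h = box
    rows, cols = len(frame), len(frame[0])
    out = [row[:] for row in frame]  # copy; pixels outside box unchanged
    for j in range(y, min(y + h, rows)):
        for i in range(x, min(x + w, cols)):
            # Average the neighborhood around (j, i), clamped to the frame.
            vals = [
                frame[jj][ii]
                for jj in range(max(0, j - radius), min(rows, j + radius + 1))
                for ii in range(max(0, i - radius), min(cols, i + radius + 1))
            ]
            out[j][i] = sum(vals) // len(vals)
    return out

# Toy 8x8 frame with one bright pixel standing in for identifying detail.
frame = [[0] * 8 for _ in range(8)]
frame[3][3] = 90
blurred = box_blur_region(frame, box=(2, 2, 4, 4))
```

A production system would run this per frame on detector output at the edge; the point here is only that redaction is a localized, reversible-by-design choice — if the raw frame is retained, the blur can be lifted later, which is exactly the concern critics raise below.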
In addition to face-detection-based blurring, Verkada plans to implement “hashing of the video that we shoot across all of our devices… So we’re creating, you can think of it as a signature of the video content as it’s captured,” Kaliszan explained.
This creates a tamper-evident digital fingerprint for each video that can be used to validate authenticity.
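Verkada has not disclosed its actual scheme; a minimal sketch of the general technique Kaliszan describes might hash video segments into a chain as they are captured and sign the result with a per-device key (the key name and segment format below are hypothetical). Any later edit to a segment then fails verification.

```python
# Illustrative sketch (not Verkada's actual scheme): hash-chain captured
# video segments, then sign the chain with a device key via HMAC so the
# footage's origin and integrity can be checked later.
import hashlib
import hmac

DEVICE_KEY = b"example-device-key"  # hypothetical per-camera secret

def fingerprint(segments):
    """Chain SHA-256 hashes over the segments, then HMAC-sign the result."""
    digest = b""
    for segment in segments:
        digest = hashlib.sha256(digest + segment).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify(segments, signature):
    """True only if the segments match the signature made at capture time."""
    return hmac.compare_digest(fingerprint(segments), signature)

captured = [b"frame-chunk-1", b"frame-chunk-2", b"frame-chunk-3"]
sig = fingerprint(captured)

tampered = [b"frame-chunk-1", b"EDITED", b"frame-chunk-3"]
# verify(captured, sig) is True; verify(tampered, sig) is False.
```

Signing with a key held only by the camera is what lets the vendor claim footage “came from one of our sensors”; without the key, a forger cannot produce a valid signature for altered video.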
Such a feature helps address growing concerns about generative AI, which makes it easier to falsify or alter footage.
“We can say that this video is real. It came from one of our sensors and we have evidence of when it was captured and how, or there is no match,” Kaliszan said.
For Kaliszan, adding privacy and verification capabilities aligns with both ethical imperatives and Verkada’s competitive strategy.
“It’s a win-win strategy for Verkada because on the one hand, we’re doing what we think is right for society,” he argued. “But it’s also very wise for us” in terms of building customer trust and preference.
Questions raised about privacy
While Kaliszan positioned Verkada’s new features as a step toward protecting privacy, civil society critics argue that the changes don’t go far enough.
“If you do it where it can be undone — you can undo it later — you’re still collecting this very intrusive information,” said Merve Hickok, president of the independent nonprofit Center on Artificial Intelligence and Digital Policy.
Instead of just blurring images temporarily, Hickok believes companies like Verkada should take a “privacy-enhancing approach where you don’t collect the data in the first place.” Once collected, even blurred footage allows for tracking through “location data, license plate readers, heat mapping.”
Hickok argued that Verkada’s incremental changes reflect an imbalance of priorities. “The security features are so good that the attitude becomes, yeah, go ahead and collect it all, we’ll blur it for now,” she said. “But then the individual rights of the people walking by are not protected.”
Without stricter regulations, Hickok believes we’re on a “slippery slope” toward ubiquitous public surveillance. She advocated legal bans on “real-time biometric identification systems in public places,” similar to those being discussed in the European Union.
A clash of views on ethics and technology
Verkada is at the center of these conflicting perspectives on ethics and technology. On the one hand, Kaliszan aims to show that security can be “privacy sensitive” through features like obfuscation.
On the other hand, civil society critics like Hickok question whether Verkada’s business model can ever be fully aligned with individual rights.
The answer has significant implications not just for Verkada, but for the broader security industry. As physical security moves to the cloud, companies like Verkada are leading thousands of organizations into new technological territory. The choices they make today about data practices and default settings will reverberate far into the future.
This power comes with an obligation, Hickok argues. “We are much closer to enabling a fully surveilled society than we are to a fully private and protected one,” she said. “So I think we should have that safeguard in place, but maybe the key here is that companies just have to be very intentional.”
For Verkada, being conscientious means advancing security without building mass surveillance. “When all of that comes together, that consideration of privacy increases further, right?” Kaliszan said. “And so we’re thinking about how we maintain privacy, how we handle identity locally, doing processing at the edge rather than building a mass surveillance system.”
VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.