Pimloc raises $7.5M for visual AI that protects privacy instead of imperils it
Surveillance video installations are multiplying, along with the sophistication of visual AI and the ways in which that data is used.
Recent laws such as the California Consumer Privacy Act, and increased privacy concerns on the part of the general public, leave governments and businesses open to new legal and reputational liability.
UK-based Pimloc’s Secure Redact SaaS product uses visual AI to quickly and affordably anonymise video in the service of citizen privacy, while preserving 100% of its value for the purposes of security and data analytics.
Pimloc has raised a seed round of $7.5M led by Zetta Venture Partners. Existing investors Amadeus Capital Partners and Speedinvest also participated.
The commitment reflects investors’ belief that the exponential growth in video surveillance is about to collide with regulations and increased sensitivity around privacy issues.
Pimloc’s Secure Redact is already in use by entities that must provide video evidence that complies with new data privacy regulations and protects the Personally Identifiable Information (PII) of other individuals. Secure Redact uses AI to automatically blur faces, heads and license plates in video, a process that previously required time and skilled human operators. It can be accessed directly as an online SaaS product or via APIs and containers that allow integration into local video workflows and systems.
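To make the task concrete, here is a minimal, illustrative sketch of the kind of per-frame detect-and-blur step that such services automate at scale. It is not Pimloc's Secure Redact API; it uses OpenCV's bundled Haar-cascade face detector and Gaussian blur, and the input/output file names are hypothetical.

```python
# Illustrative only: blur detected faces in a video, frame by frame, using OpenCV.
# This approximates the manual redaction work that automated services replace.
import cv2

# OpenCV ships a pre-trained frontal-face Haar cascade
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("input.mp4")  # hypothetical source clip
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(
    "redacted.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Blur each detected face region so identities are not recoverable
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame[y:y + h, x:x + w], (51, 51), 0
        )
    out.write(frame)

cap.release()
out.release()
```

Production systems differ from this sketch in the ways that matter: they also detect heads from behind and license plates, track objects between frames so redaction doesn't flicker, and run inside managed pipelines rather than one-off scripts.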
Low-cost high-def video + AI = The Surveillance Century
Low-cost camera systems, cheap data storage and retrieval, and increasingly useful vision-based AI have triggered exponential growth in video surveillance. There’s now a CCTV camera for every 13 residents in London. In the United States, the number of installed security cameras has nearly doubled, to about 85 million, since 2015.
We accept cameras covering workplaces, hospitals, schools, transport networks and other public places. But fixed cameras in public spaces are just the tip of the iceberg. Home security IoT solutions, body-worn cameras, and vehicles equipped with Advanced Driver-Assistance Systems also capture video. (There are eight cameras on a Tesla Model S!)
In the 20th century, CCTV footage was only watched if there was a security breach; otherwise it was soon overwritten. Now it may be stored indefinitely in the cloud and routinely “seen” by a growing suite of AI solutions such as facial recognition, emotion or anomaly detection, and crowd monitoring. And it’s not just for security anymore: it’s spreading into operations, marketing, and finance. We are tracked around cashier-less stores; we interact with delivery drivers through cameras on our front doors.
The video genie has escaped the bottle. Yet as the volume of personal and sensitive video data expands, so does the risk of cyber breaches. As Pimloc CEO Simon Randall succinctly puts it, “It will be used—but it will also be abused.”
Regulators, workers and their unions, and ordinary citizens are pushing back
Most countries now have at least some data privacy regulations. In the EU, the first cases brought under the General Data Protection Regulation (GDPR) have reached the courts; large fines have been issued and new legal precedents have been set. The California Consumer Privacy Act (CCPA) already offers broadly similar protection; senators from both parties have proposed federal regulations.
Visual AI is increasingly useful in retail, warehouse, and factory settings where data from video analysis tracks safety and efficiency. Amazon has used video to track and enforce social distancing in its warehouses to reduce the spread of COVID. “Smart City” tech also makes extensive use of video. Pimloc is already in talks with companies about anonymising visual data used for production efficiency, to help them prioritise worker privacy.
Private citizens know that Cambridge Analytica used their social media data to politically manipulate them, and that Clearview AI scraped their personal photos to create a global facial-recognition database. People want more transparency around when and how they’re being surveilled and for what purpose. For businesses, the potential reputational damage from getting it wrong can be more punitive than regulations.