Surveillance Cameras Use Big Tech-Designed Software That Considers Everyone a “Potential Threat”

A recent Fox 5 New York article revealed how Big Tech and law enforcement are working together to identify people who could be a “potential threat” using artificial intelligence (AI).
The article describes how private businesses can use Actuate AI to transform every surveillance camera into a predictive AI monster that threatens everyone’s freedoms.
Within seconds of an employee appearing in the office space, a surveillance camera in the room goes into action.

After the video management system recognizes a potential threat, a green flash pulsates across a monitor set up in another corner of the room.

“As you can see it’s pulsing green within about a second or two. That means an alert has been registered,” says Actuate CEO and co-founder Sonny Tai. “Our AI model has made a detection and sent an alert to video management systems.”
What they failed to mention is how Actuate-equipped surveillance cameras are constantly surveilling and IDing people and objects.
The article goes on to explain how a used car lot or a construction worksite using Actuate’s software could be programmed to notify the police when a person walking by a fence is flagged as suspicious:

“It could be at a used car lot or construction site at night with cameras programmed to notify when a suspicious person walks near a fence. The Actuate technology is currently being utilized by 20,000 cameras across the country including at schools, college campuses, construction sites, and used car lots.”

Actuate’s Public Spaces link reveals how building managers can secretly flag tenants and visitors as ‘suspicious’, effectively turning apartment/condo buildings into police department substations.

“Given the current media coverage of workplace violence and terrorist threats, many building managers are increasingly concerned about their tenants’ and patrons’ safety in high-profile new developments. Actuate’s AI software integrates with existing security systems, enabling real-time responses to violence and providing patrons and tenants with peace of mind.”

And there’s the catch. Landlords and private corporations can use Actuate to alert them to a secret list of suspicious activities that would trigger a police or security response. Law enforcement could then use that secret list as probable cause to question, frisk and detain someone simply because Actuate determined them to be suspicious. (Much like ShotSpotter is used today.)
As the above video so proudly boasts, Amazon and Microsoft engineers designed Actuate’s AI to be used by any surveillance camera in the country.
Amazon says Actuate can run on Amazon’s Elastic Compute Cloud (EC2) and a number of other Amazon Web Services (AWS) offerings to flag suspicious people who don’t social distance properly or wear face masks, and to count (ID) pedestrians.

“Actuate AI is able to provide its clients actionable information with fewer false positives and without the racial bias inherent in many facial recognition–based AI models. Focusing on objects also enables Actuate AI to apply its technology to other relevant security and compliance tasks, including mask compliance, social distancing detection, intruder detection, people counting, and pedestrian traffic analysis.”
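
To illustrate what object-focused detection (as opposed to facial recognition) looks like in practice, here is a minimal people-counting sketch using OpenCV’s stock HOG pedestrian detector. This is a generic, publicly available detector used purely as a stand-in; Actuate’s actual models and thresholds are not public, and the image file name below is hypothetical.

```python
# Minimal sketch: object-based people counting with a generic detector.
# OpenCV's built-in HOG + linear SVM pedestrian detector is a stand-in here;
# it is NOT Actuate's model, and "camera_frame.jpg" is a hypothetical file.
import cv2
import numpy as np

def count_people(frame) -> int:
    """Return the number of person-shaped objects detected in a BGR frame."""
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    # detectMultiScale returns bounding boxes and per-box confidence weights
    boxes, weights = hog.detectMultiScale(
        frame, winStride=(8, 8), padding=(8, 8), scale=1.05
    )
    weights = np.asarray(weights).flatten()
    # Keep only reasonably confident detections (0.5 threshold is an assumption)
    return int((weights > 0.5).sum())

if __name__ == "__main__":
    frame = cv2.imread("camera_frame.jpg")  # hypothetical still from a CCTV feed
    if frame is not None:
        print(f"People detected in frame: {count_people(frame)}")
```

Note that nothing in this sketch identifies anyone; it only counts shapes classified as “person,” which is the distinction the marketing copy above is drawing.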

Amazon has helped Actuate amass the world’s largest database of labeled security camera footage, otherwise known as a suspicious-objects database.

“Actuate AI’s inference engine relies on what may be the world’s largest database of labeled security camera footage—a library of more than 500,000 images that helps the company’s AI scour live video to detect very small objects in highly complex scenes with greater than 99 percent accuracy and an industry-leading false positive rate.”

Amazon allows Actuate’s customers to flag suspicious people in less than half a second.

“The AI uses the processing power of an Amazon EC2 C5 Instance to monitor cameras for movement at all times. In doing so, the AI identifies relevant objects in less than half a second with the help of Amazon EC2 G4 Instances. Once the AI has decided that the event is a threat, the metadata is stored in Amazon DynamoDB, a key-value and document database that delivers single-digit millisecond performance at any scale.” 
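
For readers unfamiliar with that stack, the final step described above (writing detection metadata to DynamoDB) can be sketched with the standard boto3 SDK. The table name, attribute names, and confidence threshold below are assumptions for illustration only; Actuate’s actual schema is not public.

```python
# Minimal sketch of the "store detection metadata in DynamoDB" step quoted above.
# Table name, attribute names, and the threshold are illustrative assumptions.
import time
import uuid
from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("detection-events")  # hypothetical table name

def store_detection(camera_id: str, label: str, confidence: float) -> None:
    """Persist one detection event as a DynamoDB item (key-value document)."""
    if confidence < 0.9:  # assumed cutoff for treating the event as a "threat"
        return
    table.put_item(
        Item={
            "event_id": str(uuid.uuid4()),           # partition key (assumed)
            "camera_id": camera_id,
            "label": label,                          # e.g. "person", "vehicle"
            "confidence": Decimal(str(confidence)),  # DynamoDB stores numbers as Decimal
            "timestamp": int(time.time()),
        }
    )

# Example usage (requires AWS credentials and an existing table):
# store_detection("lobby-cam-01", "person", 0.97)
```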

Actuate AI designed by Big Tech

Why is Big Tech amassing such a huge AI database of suspicious objects, collecting millions if not billions of images of people and cars, and storing them in the cloud?

“Actuate’s technology is built completely in the USA by engineers and data scientists from top US universities and companies, such as Georgia Tech, University of Chicago, Amazon, Microsoft, and Northrop Grumman.”

Microsoft’s Dice Corporation helps Actuate customers use multiple CCTV cameras to monitor suspicious people and cars for ‘practically anything’.

As Dice Corporation explains in the video, law enforcement and private corporations can use matrix video monitoring to ID people using facial recognition and also monitor a person’s phone calls.

“Central Stations deploy Actuate’s technology to enable existing dispatchers to monitor more cameras and to provide a wider range of services monitoring anomalies beyond simple people/vehicle detection.”

The video also mentions how DHS’s Fusion Centers use matrix video monitoring to track suspicious people.

A February 2022 news release reveals that Dice Corporation has increased its global footprint with six international data centers.

“Avi Lupo, co-president of DICE Corporation, has announced that the company has increased its global footprint by adding six data centers outside the United States and Canada. The six data centers are supporting markets in Latin America, Europe, Asia, and expanded footprints in North America.”

With plans to create 15 data centers in the U.S. and across the globe, one has to ask: who are they collecting all of this ‘suspicious’ data for?

Actuate’s “Mission” is a window into an AI-run surveillance camera network designed by Big Tech and law enforcement.

“As the gun violence epidemic has swelled here in the U.S., Co-Founder Sonny Tai’s background as a Marine Corps Captain led him to create Actuate. By sourcing key feedback from major stakeholders—law enforcement, educators, other veterans—Sonny and our team understood that early, real-time threat information would help achieve our vision of a safer, more intelligent future.”

The privacy implications of turning every CCTV camera into a suspicious-person tracking network are Orwellian and too horrifying to imagine.

It seems like Actuate AI and Evolv Technology are in a race to see who can create the largest suspicious person tracking network before one gets bought out by the other.

“Consumers can expect to see Evolv Technology scanners a lot more. Sports franchises like the Tennessee Titans and Carolina Panthers now use it; so do the New York Mets and Columbus Crew. The Super Bowl at SoFi Stadium in February deployed it on an outside perimeter. In New York City, public arts institutions such as Lincoln Center are trying it. So is a municipal hospital. (NYC Mayor Eric Adams has touted it as a potential subway security measure.)”

“North Carolina’s Charlotte-Mecklenburg school system, with 150,000 students, has also licensed Evolv. Theme parks are excited, too — all 27 Six Flags parks across the country now use it. Evolv has now conducted 250 million scans, it says, up from 100 million in September.”

The prospect of being considered a potential threat by retail stores, businesses, public places, sports stadiums, hospitals, etc., using software designed by Big Tech flies in the face of a so-called democracy. AI designed by Big Tech to predict that someone is suspicious based on secret algorithms is a threat to everyone.
