After the PEC Approved at ALECE, Ceará Contracted 513 Monitoring Points for R$ 2.6 Million Monthly, While Police Report Cameras Mistaking People for Garbage Bags and Triggering Full-Name Identifications with Time and Location Even When the Image Shows Only Backs or Empty Sidewalks
In Ceará, cameras that confuse people with garbage bags have become the most troubling aspect of a contract estimated at R$ 2.6 million per month for 513 monitoring points. Police officers operating the public system have recorded frozen frames in which the panel reports “accuracy” above 86% yet still triggers alerts with full names that do not match what appears on screen.
The political authorization came after ALECE approved a PEC at the end of 2025, allowing the state government to exceed the spending cap on public security. The debate now is not only about technology but also about cost, accountability, and the type of error that an identification system can produce when the street becomes a permanent target for cameras.
The R$ 2.6 Million Monthly Contract and The 513 Points on The Map
The described contract establishes 513 monitoring points with an estimated monthly cost of R$ 2.6 million.
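To put those two reported figures side by side, a bit of illustrative arithmetic (only the R$ 2.6 million monthly value and the 513 points come from the article; the derived per-point and annual values are simple division and multiplication):

```python
# Illustrative arithmetic using only the figures reported in the article.
MONTHLY_COST_BRL = 2_600_000  # estimated monthly contract value
POINTS = 513                  # contracted monitoring points

cost_per_point = MONTHLY_COST_BRL / POINTS   # ≈ R$ 5,068 per point per month
annual_cost = MONTHLY_COST_BRL * 12          # R$ 31.2 million per year

print(f"Monthly cost per point: R$ {cost_per_point:,.2f}")
print(f"Estimated annual cost:  R$ {annual_cost:,.0f}")
```

In other words, each monitoring point costs the state roughly R$ 5,000 per month under the reported contract value.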
In practice, this creates a network of cameras distributed to inform operational policing decisions, with alerts and deployments based on what the system “sees”.
The problem is that the scale amplifies the effect of error.
If cameras that confuse people with garbage bags appeared at one isolated point, it would be a technical incident; within a system of 513 points, the same failure becomes a repeated statistical risk, with public cost and direct impact on both the citizen and the agent responding to the call.
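A rough sketch of why scale amplifies error. The 513 points and the 86% panel reading come from the reported material; the number of identification events per point per day is an invented assumption, used only to show how even a modest per-point error rate multiplies across the network:

```python
# Hypothetical illustration: EVENTS_PER_POINT_PER_DAY is an assumption,
# not a figure from the article. Only the 513 points and the 86% panel
# reading come from the reporting.
POINTS = 513
EVENTS_PER_POINT_PER_DAY = 10   # assumed, for illustration only
ERROR_RATE = 1 - 0.86           # if "86% accuracy" implied 14% errors

expected_false_alerts = POINTS * EVENTS_PER_POINT_PER_DAY * ERROR_RATE
print(f"Expected false alerts per day: {expected_false_alerts:,.0f}")
```

Under those assumptions, the network would produce on the order of 700 false alerts per day, which is why an error that looks tolerable at one camera becomes a systemic risk at 513.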
When 86% Seems High, But The Image Is Of A Sidewalk, Back, or Object
The records cited by operators indicate “accuracy” above 86% in frozen images.
The shock lies in the contrast between the rate and the scene: instead of a clear face, the system treats backs, marked sidewalks, objects, and even garbage bags as people.
In this type of scenario, the discussion shifts from “does the system get it right or wrong” to “what the system is calling correct”.
A high percentage does not explain the context of the capture, nor does it guarantee that the identification with full name is compatible with framing, distance, lighting, and angle.
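One way to see how a high score can coexist with a faceless image: if a pipeline fires an alert whenever a similarity score crosses a threshold, without first gating on whether a face was detected at all, the percentage measures confidence in a comparison, not the presence of a face. A minimal sketch of that failure mode, with invented names and numbers (this is not the contracted system's actual logic):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    face_detected: bool   # did a face detector actually fire?
    similarity: float     # score against a watchlist entry (0..1)

def naive_alert(frame: Frame, threshold: float = 0.86) -> bool:
    # Failure mode: alerts on score alone, even for backs, sidewalks, objects.
    return frame.similarity >= threshold

def gated_alert(frame: Frame, threshold: float = 0.86) -> bool:
    # Safer variant: no detected face, no identification, regardless of score.
    return frame.face_detected and frame.similarity >= threshold

garbage_bag = Frame(face_detected=False, similarity=0.87)
print(naive_alert(garbage_bag))  # True  — an "86%+" alert with no face
print(gated_alert(garbage_bag))  # False — score alone is not identification
```

The point of the sketch is that “86%” is only meaningful relative to what the system was asked to score; a quality gate on the input is a separate requirement.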
Alerts with Full Name and The Boundary Between Prevention and Embarrassment
The complaint describes that the system returns person, location, time, and identity with full name, even when the image does not show a face.
For those on the operations side, this creates a command for action that appears certain, although the cameras that confuse people with garbage bags show that the certainty may exist only in the interface.
For the citizen, the risk is more delicate.
Being associated with an alert with full name without the image confirming the identification opens up room for improper approaches and embarrassment, especially when the error arises from reading objects, shadows, or movements that should not trigger identification.
Risk to Agents, Cost to The State, and The Effect on The Street
The material points out that the cameras that confuse people with garbage bags put public security agents at risk by deploying teams and organizing operations driven by a faulty system.
An incorrect alert is not just an incorrect piece of data; it mobilizes a vehicle, alters strategy, and can expose teams to unnecessary situations.
At the same time, the cost does not end at R$ 2.6 million per month. There are indirect costs of time, fuel, planning, and institutional wear and tear when the system loses credibility.
When the operator starts to doubt the alert, technology ceases to be a tool and becomes noise.
The PEC at ALECE and The Budget That Grows Along With The Surveillance
The contract was authorized by ALECE, which approved a PEC at the end of 2025 allowing the state government to exceed the spending cap for public security.
This detail is central because it helps explain why a contract of R$ 2.6 million per month advances even in a context of fiscal pressure.
The content also indicates that, in the last four years, the sector has already consumed about R$ 1.5 billion.
In Ceará, the inevitable question is about priority: how much of this expenditure translates into real results and how much becomes dependence on systems that have not yet proven reliability in practice.
Company, Politics, and The Path That Brought The Cameras That Confuse People With Garbage Bags
The model is attributed to the company IPQ and underwent the political process that includes the PEC approved at ALECE.
The case gains weight because the criticism does not come from external observers, but from police officers who operate the system and report recognition errors.
There is also a record that, in 2023, state deputy Sergeant Reginauro defended a facial recognition system when he was a councilor in Fortaleza, but the project did not advance.
Now, under the proposal of Governor Elmano de Freitas, the system exists, and the cameras that confuse people with garbage bags reignite exactly the fear cited: mistaken identification and embarrassment.
What Changes When 513 Points Become The Standard and Not The Exception
With 513 points, the system ceases to be a pilot and becomes permanent public policy.
This means that the error can also become the standard if there are no quick correction mechanisms, auditing, and quality control of the images that feed the recognition.
This is where the debate needs to gain human precision, not just technical precision.
If the system identifies someone without a face, who is responsible for the damage: the operator who acted, the agency that contracted, the company that supplied the system, or the entire chain that accepted that “86%” was enough to act?
In Ceará, the contract of R$ 2.6 million per month for 513 points places surveillance at the center of the budget and the street.
The cameras that confuse people with garbage bags expose a paradox: the system announces “accuracy” above 86%, but generates alerts with full names in images of sidewalks, backs, and objects.
Have you ever seen a police approach motivated by a camera alert, or does that still seem distant from your routine in Ceará? If it were up to you, what should be mandatory before expanding beyond the 513 points: independent auditing of the cameras that confuse people with garbage bags, transparency about the “accuracy” criteria, or review of the R$ 2.6 million contract at ALECE?
