Sep 25, 2023

Re: Comments on Item G2 of Menlo Park City Council Meeting of September 26, 2023 – Automated License Plate Readers (ALPR) – Staff Report #23-215-CC

Dear Council,

As a follow-up to my email below, I also want to share a recent article on the surveillance business. While not about this specific item, it sheds light on why this is so problematic.

Also, I forgot to mention that nowhere in the staff report did I see any ‘justification’ (whether as it relates to policy concerns or economics) for the need for these surveillance tools, other than what could be paraphrased as “it’s available, so let’s get it with a blue-light special.”


This article - for those interested in learning about the surveillance industry - explains the frequent "98% of your fellow citizens think protecting children is more important than privacy" ads many of us may be seeing. Framing these two as alternatives is a logical fallacy aimed at tricking people into accepting mass surveillance.

The best "follow the money" reporting yet on who's behind the global attack on digital privacy.

TL;DR: It's law enforcement x AI companies posing as NGOs, with a commercial interest in selling scammy mass-scanning tech. Deeply cynical, deeply shady.

The proposed regulation is excessively “influenced by companies pretending to be NGOs but acting more like tech companies”, said Arda Gerkens, former director of Europe’s oldest hotline for reporting online CSAM. Among the few traces of Thorn’s activities in the EU’s lobby transparency register is a contribution of 219,000 euros in 2021 to the WeProtect Global Alliance, the organization that had a video conference with Kutcher and Von der Leyen in late 2020.

And to those who've piously corrected those who've expressed the legitimate fear that mass scanning will quickly be abused or repurposed for other purposes ... Europol officials floated the idea of using the proposed EU Centre to scan for more than just CSAM, telling the Commission, “There are other crime areas that would benefit from detection”. According to the minutes, a Commission official “signaled understanding for the additional wishes” but “flagged the need to be realistic in terms of what could be expected, given the many sensitivities around the proposal.”

Regards,
Soody Tronson
Menlo Park Resident



On Sep 23, 2023, at 4:56 PM, Soody Tronson wrote:

Comments on Item G2 of Menlo Park City Council Meeting of September 26, 2023
Automated License Plate Readers (ALPR) – Staff Report #23-215-CC


Dear Council,
Let me count the ways in which this so-called technology is harmful, as is the staff report, which uses many buzzwords without details.

Staff Report Buzzwords. Throughout the report, we see words or statements without any additional details.

Below are examples of empty or disturbing content, as well as comments against the use of these surveillance tools that are backed by civil rights organizations such as the ACLU.

Soody Tronson
Menlo Park Resident

Attachments in PDF

COMMENTS ON THE STAFF REPORT

Flock System:
1. Statement-Page G-2.2: “For the first 30 days after data is collected by the Flock system, the searchable data is more robust – it includes the vehicle color and general details, so that for this period of time, personnel with a documented investigative reason may search in more detail for vehicles involved in crime, safety, and/or missing persons cases to more precisely and accurately sort through data.”
a. Comment: What does “documented investigative reason” mean? What are the guidelines? What is the standard? Is there a warrant?
2. Statement-Page G-2.2: “The software, along with the City’s policies are meant to be utilized in a way that only minimally intrudes on any privacy concerns.”
a. Comment: What does “minimally intrudes on any privacy concerns” mean? With our privacy being eroded each day, every “minimal” encroachment matters.
3. Statement-Page G-2.3: “[i]t made more sense to request equitable deployment of Flock cameras across the entire city jurisdiction.”
a. Comment: What is meant by “equitable deployment”? Just because you use the words “equitable,” “unbiased,” or “fair” does not make it so.

Integration of firearm discharge detection technology (“Raven”):
4. Statement-Page G-2.3: “Firearm-discharge detection technology is already available in adjoining areas of Redwood City, North Fair Oaks, and East Palo Alto through other vendors, and incorporating it with Flock makes more fiscal sense than engaging with this capability ala carte from another vendor.”
a. Comment: It is a red flag that all of this firearm-discharge detection equipment has been deployed in the relatively less-affluent areas.
5. Statement-Page G-2.3: “Flock establishes a four square mile area of our City for gunshot recognition using their Raven technology … covering the majority of the urbanized land area.”
a. Comment: What is the frequency of gunshots in these four square miles of Menlo Park? Where is the data?
b. Comment: “Urban area” is another code word for less-affluent areas with more BIPOC residents.
6. Statement-Page G-2.3: Cost. The installation of 36 fixed ALPRs and integrated gunshot technology will require an initial expense of $284,900 and an ongoing cost of $251,500 annually. The initial agreement prepared by Flock includes the expenses for the first two years, totaling $536,400.
a. Comment: Don’t we have better things to do with half a million dollars, like providing housing and shelter to the unhoused, or healthcare for the uninsured and under-insured?
7. Statement-Page G-2.3: “Since we are an incorporated city, utilizing staff already budgeted and authorized, this type of unanticipated staff expense would not occur in Menlo Park.”
a. Comment: If the existing staff can take on this additional task, that means they are not performing at their current full capacity. So perhaps, instead, stop over-hiring and downsize the police department from its currently inflated numbers.
8. Statement-Page G-2.3: “Flock currently has cameras either deployed, or deployment authorized/pending, in nearly every city.”
a. Comment: The words “authorized” and “pending” are combined. How many are actually authorized versus pending?
b. Comment: If everyone jumps off the cliff, does that mean we should also?
9. Statement-Page G-2.3: “Because this data that is being collected is so objective, and because it is being collected constantly by machines that do not discriminate and collect every plate they can read, this system is incredibly neutral and unbiased. Our deployment of these cameras equitably across our jurisdiction, so that we are collecting data from all parts of our City, only adds to that fairness and objectivity.”
a. Comment: As already stated, there is nothing unbiased or equitable about placing this surveillance equipment in “urban areas” as opposed to affluent neighborhoods.
10. Attachments/Hyperlinks – Page G-2.7: “Examples of Flock transparency portals”
a. Comment: The pages to which the hyperlinks point, in fact, provide no data other than two numbers: data retention (in days) and the number of cameras owned. How is this transparency?

CIVIL RIGHTS CONCERNS:

FLOCK:
The surveillance company Flock Safety is blanketing American cities with dangerously powerful and unregulated automatic license plate recognition (ALPR) cameras. While license plate readers have been around for some time, Flock is the first to create a nationwide mass-surveillance system out of its customers’ cameras.

Unlike a targeted ALPR camera system that is designed to take pictures of license plates, check the plates against local hot lists, and then flush the data if there’s no hit, Flock is building a giant camera network that records people’s comings and goings across the nation, and then makes that data available for search by any of its law enforcement customers. Such a system provides even small-town sheriffs access to a sweeping and powerful mass-surveillance tool, and allows big actors like federal agencies and large urban police departments to access the comings and goings of vehicles in even the smallest of towns. And every new customer that buys and installs the company’s cameras extends Flock’s network, contributing to the creation of a centralized mass surveillance system of Orwellian scope. Motorola Solutions, a competitor to Flock, is pursuing a similar business model.

Not every use of ALPRs is objectionable. For example, we do not generally object to using them to check license plates against lists of stolen cars, for AMBER Alerts, or for toll collection, provided they are deployed and used fairly and subject to proper checks and balances, such as ensuring devices are not disproportionately deployed in low-income communities and communities of color, and that the “hot lists” they are run against are legitimate and up to date. But there’s no reason the technology should be used to create comprehensive records of everybody’s comings and goings — and that is precisely what ALPR databases like Flock’s are doing. In our country, the government should not be tracking us unless it has individualized suspicion that we’re engaged in wrongdoing.

Whether ALPRs are being used for AMBER Alerts, toll collection, or to identify stolen vehicles, a license plate can be run against a watchlist in seconds. The police do not need records of every person’s comings and goings, including trips to doctor’s offices, religious institutions, and political gatherings.

Flock’s data retention policy, as written, is very problematic. Please refer to the attached ACLU document for suggested changes.

The ACLU fully laid out its concerns with this technology in a March 2022 white paper on Flock and in a 2013 report on law enforcement use.

Raven:

There are at least four problems with ShotSpotter-type gunshot detection systems such as Raven.

A critical report on the ShotSpotter gunshot detection system issued by the City of Chicago’s Inspector General (IG) is the latest indication of deep problems with the gunshot detection company and its technology, including its methodology, effectiveness, impact on communities of color, and relationship with law enforcement. The report questioned the “operational value” of the technology and found that it increases the incidence of stop-and-frisk tactics by police officers in some neighborhoods.

The IG’s report follows a similarly critical report and legal filing by the Northwestern School of Law’s MacArthur Justice Center and devastating investigative reporting by Vice News and the Associated Press. The AP profiled Michael Williams, a man who spent a year in jail on murder charges based on evidence from ShotSpotter before having his charges dismissed when prosecutors admitted they had insufficient evidence against him.

1) First
a. ShotSpotter false alarms send police on numerous trips (in Chicago, more than 60 times a day) into communities for no reason, on high alert and expecting to potentially confront a dangerous situation. Given the already tragic number of shootings of Black people by police, that is a recipe for trouble.
b. Indeed, the Chicago IG’s analysis of Chicago police data found that the “perceived aggregate frequency of ShotSpotter alerts” in some neighborhoods leads officers to engage in more stops and pat downs.
c. The placement of sensors in some neighborhoods but not others means that the police will detect more incidents (real or false) in places where the sensors are located. That can distort gunfire statistics and create a circular statistical justification for over-policing in communities of color.
2) Second
a. ShotSpotter’s methodology is used to provide evidence against defendants in criminal cases, but it isn’t transparent and hasn’t been peer-reviewed or otherwise independently evaluated. That simply isn’t acceptable for data that is used in court.
3) Third
a. Vice News and the AP note examples of the company’s analysts changing their judgments about the system’s findings (which ShotSpotter disputes). In addition, the company uses AI algorithms to assist in the analysis — and as with all AI algorithms, that raises questions about reliability, transparency, and the reproducibility of results. The company turned down a request by the independent security technology research publication IPVM to carry out independent tests of its methodologies.
4) Fourth
a. Still up for debate is whether ShotSpotter’s technology is even effective. We can argue over a technology’s civil liberties implications until the end of time, but if it’s not effective there’s no reason to bother. A number of cities have stopped using the technology after deciding that ShotSpotter creates too many false positives (reporting gunshots where there were none) and false negatives (missing gunshots that did take place). The MacArthur Justice Center’s report found that in Chicago, initial police responses to 88.7 percent of ShotSpotter alerts found no incidents involving a gun.