#privacy + #politics

Public notes from activescott tagged with both #privacy and #politics

Wednesday, February 25, 2026

The danger here isn’t just about one contract; it’s about the precedent. If the Pentagon successfully bullies Anthropic into submission or replaces it with a more “flexible” competitor, we are effectively witnessing the birth of an intentionally unethical AI.

The Death of Human Agency

When AI is integrated into weaponry for “all lawful purposes” without restrictions on autonomy, we invite the Responsibility Gap. If an AI-driven drone swarm misidentifies a target, who is at fault? By removing the “human-in-the-loop” requirement, the military is seeking a weapon that offers the ultimate prize of war: lethality without accountability.

Surveillance as a Service

Existing U.S. laws were written for wiretaps, not for generative AI that can ingest millions of data points to build predictive profiles. Under an “all lawful purposes” mandate, an LLM could be turned into a digital Panopticon. Anthropic has warned that current laws have not caught up to what AI can do in terms of analyzing open-source intelligence on citizens.

The Moral Race to the Bottom

If the Pentagon blacklists Anthropic, it sends a clear message to competitors: safety is a liability. To win government billions, firms will be incentivized to strip away safety layers. Reports already suggest OpenAI, Google, and xAI have shown more “flexibility” regarding the Pentagon’s demands.

The Pentagon’s “supply chain threat” maneuver is a scorched-earth tactic designed to force Silicon Valley to choose between its values and its bottom line.

If Anthropic stands firm, it may lose $200 million in revenue and a seat at the defense table. But if it caves, it may well be providing the operating system for the very “Terminator” future it was founded to prevent. In the world of 2026, the most dangerous threat to the supply chain might just be an AI that has been ordered to stop caring about ethics.

Tuesday, November 18, 2025

Privacy in general matters because you never know how your data might be used even if you’re a good guy.

On Thursday, a Skagit County Superior Court judge ruled that pictures taken by Flock cameras in the cities of Sedro-Woolley and Stanwood qualify as public records, and therefore must be released as required by the state's Public Records Act, court records show.

Flock's cameras, also called automated license plate readers, continuously and indiscriminately capture time- and location-stamped photos of any passing vehicles. Those images are then stored, and information about the vehicles, including their condition, make, model and license plate number, is added to a searchable database controlled by the customer.

Last week's Skagit County ruling could oblige the dozens of Washington police agencies that use Flock cameras — ostensibly to help them find stolen vehicles, crime suspects and missing people — to release the photos and data they collect, an outcome privacy advocates warned was possible.

The ruling also exacerbated concerns about potential misuse of Flock data, which swelled after University of Washington researchers released a report Oct. 21 showing federal immigration agencies like ICE and Border Patrol had accessed the data of at least 18 Washington cities, often without their police departments' knowing. The report raised concerns that the agencies might be using the data to target and arrest immigrants as part of Trump's immigration crackdown.