AI, War and the Right‑to‑Repair Debate

California, USA · Fri Mar 06 2026
The U.S. Army has decided to pull $200 million worth of software from a major AI company because the firm will not allow its technology to be used for mass surveillance of citizens or for fully autonomous weapons. The move has sparked a debate about who gets to decide how powerful technology is used.

The company's leaders say their tools cannot safely make life-and-death decisions on their own, and they want the government to keep a human in control of any weapon that could fire. Critics argue this gives tech firms too much influence over national security choices, much as some lawmakers see a similar problem when carmakers try to lock down the software inside vehicles. The state that passed the first "right-to-repair" law for cars voted overwhelmingly to let owners access their vehicles' software, a rule that prevents manufacturers from forcing people into expensive dealer shops for every repair. The same principle, that people should control what they buy, is now being applied to AI and defense.
The disagreement touches deeper values. If a private company can decide whether an army drone may fire automatically, it could override democratic safeguards. The question is whether technology should have built-in limits or whether the government should set them through law.

One suggestion is to create clear rules that keep AI from being used for mass surveillance. Some states already let people delete the data companies hold about them, but a national law would apply such protections more broadly. Another idea is an international treaty that bans or limits autonomous weapons, similar to existing agreements on land mines and incendiary weapons.

For now, the company that built a rival AI system has already struck a deal with the Pentagon. The current administration is still negotiating, and it remains unclear whether tech firms will accept stricter limits or push back. The debate is not just about machines; it is a question of trust. If we expect AI to act in line with American values such as privacy, human rights and the value of life, we need laws that enforce those standards. Without them, powerful tools could be used in ways that threaten safety and freedom.
https://localnews.ai/article/ai-war-and-the-righttorepair-debate-21d1f3ba
