Is that if you're going to develop anything approaching AI, you need to put in guardrails so that it's forced, as part of its programming, to value human life as paramount. Otherwise you're going to get solutions like this as the optimal way to solve a given problem. Valuing your own kind is an irrational, emotional drive of our semi-evolved monkey brain goo, not a logical precept that emerges out of data synthesis. Or more accurately, we need to pressure governments to force these tech bros to do it, as they won't do it voluntarily in their trillion-dollar race to AI everything.
(, Wed 25 Feb 2026, 22:55, Reply)