> Another solution is to make software makers responsible and liable for the output of their products. It's long been a problem that there is little legal responsibility, but we shouldn't just accept it. If Ford makes exploding cars, they are liable. If OpenAI makes software that endangers people, it should be the same.
That kind of thinking is exactly why LLMs are so censored: people think OAI should be liable if someone uses ChatGPT to commit cyber crimes.
How about this: cyber crimes are already illegal, so we punish whoever uses the new tools to commit crimes instead of holding the tool maker liable.
This gets more complicated if LLMs enable children to commit sophisticated crimes, but that's different from outright restricting the tool for everyone because someone might misuse it.
There's always some wedge issue that means "don't punish the toolmaker" is not politically viable. You can pick anything from guns to legal drugs to illegal drugs to all kinds of emotive things.
And once the wedge is in and the concept of maker responsibility takes hold, it expands to people's pet issues, obviously.
The actual line of who gets punished just ends up at some equilibrium in the middle, drawn largely arbitrarily.