Brian Small

> ... as the philosopher Karl Popper noted, “we do not seek highly probable theories but explanations; that is to say, powerful and highly improbable theories.”
>
> The theory that apples fall to earth because that is their natural place (Aristotle’s view) is possible, but it only invites further questions. (Why is earth their natural place?) The theory that apples fall to earth because mass bends space-time (Einstein’s view) is highly improbable, but it actually tells you why they fall. True intelligence is demonstrated in the ability to think and express improbable but insightful things.
>
> True intelligence is also capable of moral thinking. This means constraining the otherwise limitless creativity of our minds with a set of ethical principles that determines what ought and ought not to be (and of course subjecting those principles themselves to creative criticism). To be useful, ChatGPT must be empowered to generate novel-looking output; to be acceptable to most of its users, it must steer clear of morally objectionable content. But the programmers of ChatGPT and other machine learning marvels have struggled — and will continue to struggle — to achieve this kind of balance.
>
> ... In the absence of a capacity to reason from moral principles, ChatGPT was crudely restricted by its programmers from contributing anything novel to controversial — that is, important — discussions. It sacrificed creativity for a kind of amorality.
>
> ...
>
> Note, for all the seemingly sophisticated thought and language, the moral indifference born of unintelligence. Here, ChatGPT exhibits something like the banality of evil: plagiarism and apathy and obviation. It summarizes the standard arguments in the literature by a kind of super-autocomplete, refuses to take a stand on anything, pleads not merely ignorance but lack of intelligence and ultimately offers a “just following orders” defense, shifting responsibility to its creators.
>
> In short, ChatGPT and its brethren are constitutionally unable to balance creativity with constraint. They either overgenerate (producing both truths and falsehoods, endorsing ethical and unethical decisions alike) or undergenerate (exhibiting noncommitment to any decisions and indifference to consequences).
> Given the amorality, faux science and linguistic incompetence of these systems, we can only laugh or cry at their popularity.

#Chomsky, #NoamChomsky #IanRoberts and #JeffreyWatumull on #ChatGPT #AISalami as #Eichmann, #AdolfEichmann #Bureaucrat #BanalityOfEvil in
#NYT, #NewYorkTimes March 8, 2023