I don’t quite know whether it is computer science in particular or its subdiscipline Artificial Intelligence that has such an enormous affection for euphemism. We speak so spectacularly, without recognizing our own superficiality and immeasurable naivete with respect to these concepts. And, in speaking this way, we anesthetize our ability to evaluate the quality of our work and, what is more important, to identify and become conscious of its end use.
The student I mentioned above imagines his work to be about computer games for children, involving perhaps toy kittens, bears, and balls. Its actual end use will likely mean that some day a young man, quite like the student himself, with parents and perhaps a girlfriend, will be set afire by an exploding missile sent his way by a pilot’s associate system shaped by the student’s research. The psychological distance between the student’s conception of his work and its actual implications is astronomical. It is precisely this enormous distance that makes it possible not to know, and not to ask, whether one is doing sensible work or contributing to the greater efficiency of murderous devices.
One can’t escape this state without asking, again and again: “What do I actually do? What is the final application and use of the products of my work?” and, ultimately, “Am I content or ashamed to have contributed to this use?”
This strongly recalls Günther Anders’ “Promethean discrepancy”: our inability, both practical and willful, “to imagine that which we have created or produced.”
We can indeed make the hydrogen bomb, but we are not equally able to envision the consequences of what we have made; our capacity to feel hobbles along behind our capacity to do. We can indeed rain bombs on hundreds of thousands; to regret or weep for them we cannot.