This past semester I presented students in my engineering ethics course with an especially messy problem situation involving the development of a cyclotron for use in proton therapy, an unreliable fellow engineer, a boss playing favorites, the spectacular failure of a control system during a preliminary test, the relative merits of hardware versus software, and a lot of time pressure.
Proton therapy is a relatively recent development in the treatment of cancer; a new facility for proton therapy is under construction only a few blocks from campus.
Students worked on this situation in groups over a period of several weeks. I asked them to analyze the situation, do whatever background research they needed to do, develop at least three options, and offer up a careful, even-handed consideration of the ethical implications of each option in terms of basic values.
The results were mixed.
One common pattern I noticed was a restriction of ethical vision or moral imagination to the scale of the individual or the firm.
Sometimes, this restriction was combined with a tendency toward punishment-avoidance: How can I avoid getting fired? How can we avoid getting a bad reputation? How can we avoid getting sued?
Even when they did take into account more genuinely ethical considerations, though, many students seemed to think their responsibilities began and ended with the firm within which they worked.
It’s generally understood that engineers, by the nature of their profession, have responsibilities beyond the organizations of which they are part. The difficulty I find myself having is getting students to see and to feel those responsibilities as rooted in lived experience.
One way to focus my students’ attention on the ethical implications of a design process, for example, is to get them to imagine more vividly the products of that design process in use under various circumstances . . . including, unfortunately, catastrophic failures.
In the case of the cyclotron, I reminded students that what a cyclotron does is to shoot a beam of protons at a human being. I told them: put yourself in the patient’s place, and imagine all the possibilities.
As they prepared for their final assignment, a student wrote to me for clarification of a point I had made about the ethical irrelevance, in most cases, of matters of reputation and legal liability. Part of my reply is salient here:
Let me take an example from the cyclotron situation. In that case, the person who should be most on your mind is not yourself, or your boss, or Jack, or anyone within the firm, but the patient – already ill, quite possibly terrified – on the receiving end of the beam. If you mess this up, or allow it to remain messed up to save yourself a lot of fuss, that patient could suffer injury or death. The patient no doubt agreed to accept a certain level of risk, but not that much risk, and not risk caused by the incompetence, cravenness, or negligence of an engineer.
The risk of harm to that patient (a matter of “the good”) and the violation of that patient’s reasonable expectations and informed consent (a matter of “the right”) are ethically relevant on their own, regardless of what happens to your reputation or that of your firm.
Part of the point of redesigning my courses for next term is to focus with even more determination on this kind of visceral connection with the responsibilities of engineers.