STEM education is all the rage: Science, Technology, Engineering and Mathematics, the collection of disciplines regarded as the most desirable, the most likely to lead to financial success for individuals and economic growth for nations, the obsession of universities and policy makers alike.
I don’t know who was the first to consider making a concession to the older and richer tradition of liberal-arts education in the United States, but I’ve also come across the acronym STEAM, in which the ‘A’ stands for Arts.
That’s nice. Continue reading
When the conversation opened up on the second day of our November workshop, after my presentation on acceptable risk, the project team and the invited participants spent much of the remainder of the morning developing and jotting down ideas for fostering better, more informed and more constructive public deliberation about hydraulic fracturing.
Our initial ways of phrasing the questions were rough, and many of them were likely to be perceived as biased against one group or another, playing on stereotypes, say, of engineers or of some of the more strident individuals who might show up for a public hearing.
In the weeks that followed, the project team at Georgia Tech revised the list, and reconsidered it, and revised it again.
The end result is a set of questions that will frame the work of our second workshop, now scheduled for early April: Continue reading
I have said that the first day of our workshop on hydraulic fracturing, in November, brought out a long list of risks related to hydraulic fracturing and, indeed, the engineers and scientists who participated were quite adept at identifying such risks and possibilities for mitigation.
Something else came out during those first sessions, though, which I found troubling.
What I heard was simply a repeated assertion or implication that those who oppose hydraulic fracturing are moved to do so only by emotion, especially by fear. The assertion was reinforced with reference to certain bad actors in the public arena who engage in campaigns based on misinformation, distortion and possibly even fraud to manipulate the emotions of an uninformed public.
The underlying assumption of such claims, I think, is that there is a clean distinction between reason and emotion, and that only those who base their decisions on the methods and findings of the sciences have reason on their side.
Beneath this is a still deeper assumption that quantitative analysis is the essence of rationality. Continue reading
My tongue-in-cheek comment on the language of hydraulic fracturing was intended to get at the ways in which metaphors and images can affect – and sometimes skew – our understanding of risks and responsibilities.
This effect can work in any direction, for or against any particular position, and it can be especially pronounced when the problem situation within which people are making decisions – and disagreeing with one another over what decisions to make – is not well understood.
One theme that emerged early in our first workshop on hydraulic fracturing was that nearly every available image of hydraulic fracturing is inaccurate in ways that may exaggerate or, at least, misrepresent the risks involved in the process – and this is true even of images on websites of those who should know better, and on websites of organizations generally favorable to the use of hydraulic fracturing in oil and gas extraction.
One of the students on our project team came across an especially egregious example of the type, an image used in the film Gasland: Continue reading
As I have been hinting, I’m currently caught up in a collaborative project on engineering, ethics and policy related to hydraulic fracturing.
The idea for the project began to take shape in conversations I was having with my colleague, Chloé Arson, who is over in the School of Civil and Environmental Engineering at Georgia Tech. We were exploring opportunities for new directions in engineering ethics education.
I cannot now say which emerged first, but there are twin intuitions at the heart of our discussion:
- Most interesting problems in engineering – and for ethics and policy related to engineering – involve not only risk but also uncertainty, because the underlying dynamics of the problem situation are poorly or only partially understood; and
- We should aim to prepare engineers-in-training to engage in ethical inquiry and policy inquiry at the same time they are engaging in empirical inquiry and in design.
I joked at the time that the second intuition goes both ways: we need engineers who can think like ethicists and ethicists who can think like engineers. Continue reading
In my post of Tuesday afternoon, I made brief mention of an exercise in my environmental ethics class involving a pencil:
In my environmental ethics class, I gave each of six groups a single no. 2 pencil – a classic yellow Ticonderoga, as it happens – and asked them first to write down everything they already knew about no. 2 pencils or could find out from physically examining and using the object itself. Then I told them to go to the ‘net to find out what else they could learn.
I should put this exercise in context.
The fearful truth of the matter is that I am inventing the idea for my environmental ethics course as I teach it. Continue reading
Following up on my post from last week, I discussed the use of electronics in the classroom with each of my two classes today.
I told students that I used to ban electronic devices, and why I thought I needed to do so, but noted that I lifted that ban when I switched to problem-based learning: the internet can be a valuable resource for groups grappling with practical problems . . . though the trick is to know when and where to look.
I told them they should make sure to bring some kind of internet-capable device with them to class, so it’s available if they need it. I added that we would try out and reflect on various ways of using electronic devices for collaborative work, since some are likely to be more fruitful than others.
I went on to say that there might be particular activities and assignments for which I would ask them to put away their electronics, but that there would be a good pedagogical reason for it. For example, I might want them to see what they can make of a particular problem using only the resources of their own minds and bodies, and perhaps of a paper book. Continue reading
This past semester I presented students in my engineering ethics course with an especially messy problem situation involving the development of a cyclotron for use in proton therapy, an unreliable fellow engineer, a boss playing favorites, the spectacular failure of a control system during a preliminary test, the relative merits of hardware versus software, and a lot of time pressure.
Proton therapy is a relatively recent development in the treatment of cancer; a new facility for proton therapy is under construction only a few blocks from campus.
Students worked on this situation in groups over a period of several weeks. I asked them to analyze the situation, do whatever background research they needed to do, develop at least three options, and offer up a careful, even-handed consideration of the ethical implications of each option in terms of basic values.
The results were mixed. Continue reading