Utilizing an Engineering Ethical Reasoning Instrument in the Curriculum

Paper ID #7607

Dr. Carla B. Zoltowski, Purdue University, West Lafayette

Carla B. Zoltowski, Ph.D., is Education Administrator of the EPICS Program at Purdue University. She received her B.S. and M.S. in electrical engineering and Ph.D. in engineering education, all from Purdue University. She has served as a lecturer in Purdue’s School of Electrical and Computer Engineering. Dr. Zoltowski’s academic and research interests include human-centered design learning and assessment, service-learning, ethical reasoning assessment, leadership, and assistive technology.

Prof. Patrice Marie Buzzanell, Purdue University, West Lafayette

Prof. William C. Oakes, Purdue University, West Lafayette

William (Bill) Oakes is the director of the EPICS Program and one of the founding faculty members of the School of Engineering Education at Purdue University. He has held courtesy appointments in Mechanical, Environmental and Ecological Engineering as well as Curriculum and Instruction in the College of Education. He is a registered professional engineer and on the NSPE board for Professional Engineers in Higher Education. He has been active in ASEE, serving in the FYP, CIP and ERM. He is the past chair of the IN/IL section. He is a fellow of the Teaching Academy and listed in the Book of Great Teachers at Purdue University. He was the first engineering faculty member to receive the national Campus Compact Thomas Ehrlich Faculty Award for Service-Learning. He was a co-recipient of the National Academy of Engineering’s Bernard Gordon Prize for Innovation in Engineering and Technology Education and the recipient of the National Society of Professional Engineers’ Educational Excellence Award and the ASEE Chester Carlson Award. He is a fellow of the American Society for Engineering Education and the National Society of Professional Engineers.

© American Society for Engineering Education, 2013

 

 

 

 

Utilizing an Engineering Ethical Reasoning Instrument in the

Curriculum

Abstract

 

The need for understanding and enhancing engineering students’ ethical development has been

the subject of numerous publications and has been embedded in ABET criteria. Although there

are reliable and valid measures of individual ethical development (e.g., Defining Issues Test,

Version 2 (DIT2)1), engineering ethics offers a unique site in which the confluence of

disciplinary concerns, professional codes, industry regulations, accreditation and other Board

considerations, and insight into human issues enter design considerations. As a result, we

developed the Engineering Ethical Reasoning Instrument (EERI). For this paper, we describe

how educational benefits can be achieved by using the EERI within the curriculum. First, we

present some background information on an instrument that is in its final validation phases and

that offers an engineering scenario-based assessment of individual students’ ethical reasoning.

Second, we present how we can utilize this instrument for instructional exercises in three

different class formats. We found that it was particularly important in the service-learning design

class for students to learn what issues to consider and frameworks to engage, but also when and

how to better recognize ethical issues in their own projects. The service-learning context offered

an ideal site in which engineering educators could assist students in reasoning through alternative

choices in their design and their team processes. Through these curricular strategies, students can

develop deep insights into personal, team, and professional ethics.

 

Introduction

 

There have been abiding concerns with ethics in the engineering discipline. These concerns

center on professional codes of ethics as well as the development of ethical decision making

abilities in everyday engineering work. The need for understanding and enhancing engineering

students’ ethical development has been the subject of numerous publications and has been

embedded in ABET criteria. Ethical development scholarship has had two main branches

focusing on moral development phases aligned with justice or more universalized standards

whereby ethical quandaries are adjudicated2,3,4 and on contextualized ethical processes where

care or more individualized standards are incorporated5.

 

Although there are reliable and valid measures of individual ethical development such as the

DIT21,6, engineering ethics offers a unique site in which the confluence of disciplinary concerns,

professional codes, industry regulations, accreditation and other Board considerations, and

insight into human issues enter design considerations. Although most if not all professions have

unique contexts for ethics, engineering has its own particular values that center on technical

aspects, design, and problem solving7, increasingly in human-centered design. In human-centered

design, the immediate and long-term effects on the potential users in their lived conditions

warrant greater concern than design typified as more “thing” or technically oriented. Moreover,

engineering has become an increasingly global profession such that prior ethical decisions that

might have had to do with technical design feasibility and other criteria have been reconsidered.

Globalization, along with the proliferation of new technologies, creates new contexts and issues

that are not covered by traditional codes of ethics. Professional engineering codes are

 

 

 

 

insufficient for dealing with complex cultural and social issues as well as with decisions about

emerging technologies. For engineering educators, ethical development is of great importance

because lack of consideration of such issues can result in life-threatening outcomes and enormous

costs to industries.

 

Needing a way to assess ethical development in engineering students, researchers have

constructed measures that consider different types of ethical situations encountered by engineers

in their daily work. Foundational to these measures is Kohlberg’s moral development phases

(originally called “stages”) in which individuals increase in ethical decision making from purely

self-centered responses to dilemmas to (ideally) highly empathic and contextually oriented

choices2,3. The Defining Issues Test, Version 2 (DIT2) is a refined measure of moral judgments

that reworks Kohlberg’s phases into schema. These schema are: preconventional (focusing on

personal interest and encompassing Kohlberg’s stages 2 and 3), conventional (emphasizing

maintaining norms, stage 4), and post-conventional thinking (displaying agile perspective-taking,

ability to appeal to ideals that are shareable and non-exclusive, and expectations for full

reciprocity between laws and the individual, stages 5 and 6)1. For instance, the Engineering and

Science Issues Test (ESIT) examines moral judgments in ways similar to the DIT2 but is based

on engineering and science dilemmas8. Specifically, the ESIT is a scenario-based assessment with

two parts whereby students first read a scenario in which an ethical dilemma related to project-based engineering and/or science work is depicted and then indicate a decision about issue

importance through rating considerations they thought about when they made a decision. The

items used for ratings are intended to utilize moral schemas.

 

Moral schema also are the bases of other instruments. Among these measures, Rudnicka

requested that students complete an engineering task involving professional dilemmas9. She then

videotaped the case discussions and conducted an interaction analysis to construct a model.

Although not an assessment tool, Rudnicka’s model provides insight into the processual nature

of and key considerations in engineering ethical decision making and reasoning. Finally, like the

ESIT, the Engineering Ethical Reasoning Instrument (EERI) is set up like the DIT2. The EERI

provides a scenario-based instrument by which students’ ethical schema or phases of ethical

development regarding the particular issues being described can be ascertained. Although still in

its validation process, the EERI shows promising results for classifying students’ current ethical

schema or mindsets and ascertaining growth in more complex and nuanced reasoning processes

through course work developed to enhance critical thinking, empathy, person-centeredness, and

related learning.

 

Beyond the goal of assessing students’ ethical reasoning in an engineering context, the

instruments can be incorporated into the curricula to enhance student learning. This paper

describes curricular approaches for developing students’ decision making utilizing the

instrument. Traditionally, engineering curricular approaches to ethics have been case-based or

have centered around lecture and discussions about ethical frameworks. While necessary, such

approaches can be supplemented by individual assessments of students’ ethical reasoning

abilities and reflective activities about the tasks. Specifically, we address curricular interventions

in multidisciplinary project teams focused on real world applications. These interventions

leverage the utility of engineering ethical reasoning models and instruments into curricula. We

focus on the EERI but recognize that similar models and instruments can be used similarly.

 

 

 

 

 

Our paper addresses the ways in which instructors can utilize an engineering ethical reasoning

instrument in engineering class or lab sessions. First, we present some background information

on an instrument that is in its final validation phases and that offers an engineering scenario-

based assessment of individual students’ ethical reasoning. Second, we present how we can

utilize this instrument for instructional exercises in three different class formats: large-lecture,

team-based labs, and small-group interactive sessions. In large-lecture classes, we find that

students often engage in discussions by averaging their individual numerical responses to

instrument items without much discussion of content but with assumptions that the numbers

represent similar ethical reasoning logics. The team-based labs offer opportunities to segue from

discussions of hypothetical scenarios into issues regarding team members’ and project partners’

needs, interests, and priorities. Finally, small-group interactive sessions can focus on how

students move from individual processing of ethical decisions to greater understanding of the

multiple and often conflicting perspectives that characterize resolution of ethical challenges.

 

Overview of EERI Development

 

Like the ESIT, the Engineering Ethical Reasoning Instrument (EERI) is modeled after the DIT2

in its basic structure of scenarios, decision, and items for rating and ranking.

 

The EERI has been developed over the course of several years in collaboration with three other

universities. During this time, the EERI has undergone a number of changes in scenario

construction and in item and scale development10. In its current form, there are eight scenarios:

Housing Quality, Soap Box Derby, International Aid, Flood Control, Nurse Schedule Software,

Water Quality, Grant Proposal, and Pedestrian Bridge. These scenarios involve dilemmas that

students might reasonably expect to encounter on a student project team and are intended to be

engineering discipline neutral. Similar to the DIT2 and ESIT, students are asked explicitly what

action they would take in the given situation. Following their recording of their individual

decision, students are next asked to rate a series of items including some fillers (i.e., statements

that would be considered nonsensical in the context of the specific scenarios). In its current form,

students rate 14 items in terms of importance with five response alternatives: great, much, some,

little, or no. The instructions for the ranking of students’ top 4 items were: “Consider the 14

issues you rated above and rank which issues are the most important.” This rating and ranking

process follows the format of the DIT2. Like the DIT2, it is the ranked items that determine

individual students’ ethical reasoning schema phase. The rating process makes students reflect

upon each item. After they have rated each item, they can come to a more informed choice in

the ranking section that follows the 14 rated items.
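To make the rating-and-ranking structure concrete, the sketch below shows one hypothetical DIT2-style way that a student’s ranked items could be tallied into schema scores. The item-to-schema mapping, the rank weights, and the `schema_scores` helper are our own illustration under stated assumptions, not the EERI’s actual scoring procedure, which remains under validation.

```python
# Illustrative sketch only: the mapping and weights below are
# hypothetical, not the EERI's actual scoring procedure.

# Hypothetical mapping from item number to the moral schema it reflects;
# filler items are nonsensical statements that earn no credit.
ITEM_SCHEMA = {
    1: "personal_interest",   # preconventional, stages 2/3
    2: "maintaining_norms",   # conventional, stage 4
    3: "postconventional",    # stages 5/6
    4: "filler",
    # ...a real form would map all 14 items per scenario
}

def schema_scores(top_four, weights=(4, 3, 2, 1)):
    """Credit each of a student's top-4 ranked items to its schema,
    weighting the 1st-ranked item most heavily (DIT2-style)."""
    scores = {}
    for item, weight in zip(top_four, weights):
        schema = ITEM_SCHEMA.get(item, "filler")
        if schema != "filler":
            scores[schema] = scores.get(schema, 0) + weight
    return scores

# A student who ranked items 3, 2, 1, 4 as most important:
print(schema_scores([3, 2, 1, 4]))
# -> {'postconventional': 4, 'maintaining_norms': 3, 'personal_interest': 2}
```

Under this kind of scheme, filler items serve as attention checks: a student who ranks them highly produces a suspect profile, which is one reason the DIT2 format includes them.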

 

Utilization of EERI for Instructional Exercises

 

In addition to using the instrument to assess students’ individual ethical reasoning within the

student engineering context, the EERI can be leveraged in other curricular aspects to further

student learning. Although students are engaged in real projects within the

service-learning context, they often did not relate discussions of ethical frameworks or

discussions of “disaster” ethics to their own student project experiences. In order to help them

connect discussions of ethical decision making in the project context, we have incorporated the

 

 

 

 

instrument in a variety of class formats. In this section, we describe how the EERI can be used

in three different class formats: large-lecture, team-based labs, and small-group interactive

sessions. We summarize our findings and recommendations for the utilization of the EERI for

instructional exercises at the conclusion of this section.

 

Large-lecture

 

One way in which we incorporated the instrument in the curricula was to have students complete

the online instrument prior to coming to a large lecture. This provided a common experience and

set of scenarios that we could discuss in the large lecture. In the lecture, we could review the

scenario quickly and discuss what issues the students thought were important in making the

decision (not bounded by the instrument), as well as start to explore why certain issues were

more important. In the lecture we discussed the Soap Box Derby scenario, which is provided

below:

Your student design team has designed a new Soap Box Derby car that allows children

with physical and cognitive disabilities to race by allowing an adult to ride in a backseat

and maintain full control of the car. Based on suggestions from the adults, you have

added spring tension to the child’s steering wheel in front in order to simulate the feeling

of driving and make the child’s experience more realistic and fun. The child will not have

the ability to control the car, only the illusion of control. Before the first test run with an

adult and a 14-year-old child onboard you hear the child’s parent tell the child to “be

careful” and to “drive safely.” The parent turns to you, explains that because of a

cognitive disability the child likely won’t understand the difference anyway, and asks you

to tell the child that the front steering wheel is actually functional. The request that you

lie to the child would take advantage of the child’s disability and it creates the possibility

that the child would feel responsible if they were to lose the race or have an accident.

Would you lie to the child? ___ Yes ___ Can’t decide ___ No

 

The students then were asked to identify the ethical issues from the scenario, first individually

and then as part of a larger class discussion. The discussion of why issues were important often

suggested overarching frameworks or ideals, such as justice. We discussed these frameworks

and how applying different frameworks might lead to different decisions. We then introduced a

process to help guide students through making the ethical decision, which was similar to a

human-centered design (HCD) process to which they were previously exposed10. We referred to

this ethical design process as the Moral Decision Making Process. Like HCD, this process

involved several steps, beginning with problem discussions that are resolved into a singular

problem statement and continuing through the selection among different solution alternatives.

The five phases in this process are laid out in a linear fashion but they likely would require

students’ reconsiderations of previous “steps” as they learn facts, talk through alternatives, and

test different options against their problem specifications and other issues. The five phases of the

Moral Decision Making Process are:

 

Moral Decision Making Process

1. Stating the problem
   a. Stakeholders
   b. Issues involved
2. Check facts
3. Develop list of options
4. Test options with different frameworks/perspectives
5. Make a choice and then follow through!
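Because the five phases are presented linearly but applied iteratively, the sketch below renders the process as a walk that can jump back to an earlier phase when testing options surfaces a gap. The phase names follow the list above; the control flow and the `walk_process` helper are our own hypothetical illustration, not part of the course materials.

```python
# Hypothetical sketch of the Moral Decision Making Process as an
# iterative walk through its five phases; phase names follow the
# paper, while the loop-back control flow is our illustration.

PHASES = [
    "Stating the problem (stakeholders, issues involved)",
    "Check facts",
    "Develop list of options",
    "Test options with different frameworks/perspectives",
    "Make a choice and then follow through!",
]

def walk_process(needs_revisit):
    """Visit phases in order; when needs_revisit returns an earlier
    phase index, jump back there and continue forward again."""
    trace, i = [], 0
    while i < len(PHASES):
        trace.append(PHASES[i])
        back = needs_revisit(i)
        i = back if back is not None else i + 1
    return trace

# Example: while testing options (index 3) the team discovers a
# missing fact once, so they loop back to "Check facts" (index 1).
state = {"revisited": False}
def once_back(i):
    if i == 3 and not state["revisited"]:
        state["revisited"] = True
        return 1
    return None

trace = walk_process(once_back)
print(" -> ".join(p.split(" (")[0] for p in trace))
```

The trace visits "Check facts" twice, mirroring how students reconsider previous "steps" as they learn facts, talk through alternatives, and test options against their problem specifications.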

 

Then we asked the students to apply the Moral Decision Making process to the Soap Box Derby

scenario by completing the following table:

State the problem

Check facts

Develop list of options

Test options

Make a choice & Follow through!

 

After doing their evaluations and tables separately, students were asked to form small groups,

primarily triads but up to five members, to discuss their results and learn about alternative views.

There was no encouragement for students to attempt to reach consensus about evaluations. As a

final step, the students were asked to consider how this exercise and Moral Decision Making

process applied to their specific projects. Questions used for reflection upon its application were:

 What ethical issues have been or need to be considered regarding your project?

 How did you decide how to address the issues?

 What factors were important in making your decision?

 What was the outcome of your decision?

Team-Based Labs

 

In the second example, the instrument was used in a team-based lab consisting of 120 students

who were part of a learning community that included a first-year engineering class, service-learning

engineering design team courses, and either a communication or English course. In the

first-year engineering class, the 120 students were divided into small groups of four in which

they worked on projects (that were different from their service-learning course experiences).

These groups of four remained the same throughout the semester.

 

Students took the EERI in an online format very early in the semester before discussion about

ethics and design considerations took place. In addition, students were not provided with their

individual EERI test results.

 

One week after taking the online EERI version, students arrived at class and were handed two

different engineering ethical dilemma scenarios out of eight possible scenarios incorporated in

the EERI. Students were asked to read one of the scenarios and complete the items. The items

asked students to make a decision about what they would do as an individual in the situation presented

(e.g., would they rate houses for code violations that could result in low socio-economic status

residents being forced to move from their homes without alternative housing being

provided). Upon making the decision about what they would do in this hypothetical case, items

then asked students to rate the extent to which certain criteria were used by them individually in

their decision. For example, in the house quality scenario, students might be asked about criteria

such as “What if you were blamed for negative outcomes?”, “Is your action consistent with your

 

 

 

 

discipline’s professional code?”, and “Have the potential risks and benefits for all parties been

considered?”, relating to schemas 2/3 (preconventional), 4 (conventional), and 5/6 (post-conventional), respectively.

 

Upon completing the first scenario discussion and criteria ratings individually, students then met

in their small groups of four to reach consensus about a team decision for the dilemma as well as

the most significant criteria that influenced their decision (that is, the rating and the ranking of

criteria). After reaching consensus, students were asked to repeat the process for the second

scenario.

 

As students were engaging in discussion, the primary instructor, teaching assistants, and

researchers involved in developing the EERI walked around and listened in on student

deliberations about their collective decisions and criteria for these decisions. Although not

recorded, we noted that there was a tendency for students to simply average scores to reach a

decision and to prioritize certain criteria as instrumental in their reasoning processes through the

same type of averaging process. Although tendencies to use heuristics or simple

decision-making rules rather than engaging in cycles of discussion are common, particularly when

individuals do not think that the problem or required solution is highly complex or equivocal11,12,

we did not expect to find students consistently averaging their individual results. We encourage

instructors to consider using instructions other than to ask students to achieve consensus. If

students persist in reverting to numbers, instructors could discuss tendencies to quantify life

experiences and complex ideas as well as what is lost when the ideas themselves are not

scrutinized. In particular, students lose the opportunities to hear others’ assumptional bases and

reasoning processes for their decisions. They also may not realize that others thought quite

differently about the scenarios themselves and what constitutes ethical dilemmas in engineering

as well as adequate responses.

 

When both scenarios were completed individually and collectively, students were instructed to

reflect upon the following questions and respond to them individually and online:

1. What most influenced your individual decision? In other words, why were certain
   statements higher in importance to you?
2. What factors were most important to the group?
3. Compare and contrast the factors you considered for the two different scenarios.
4. How did you make changes to the importance you placed on certain statements during
   group discussions? What did you change and why did you make these changes?
5. How did the team make the decisions? Was the process diplomatic or did certain people
   exert more influence than others?

 

Analysis of the individual responses to Question 5 confirmed that most teams simply averaged

the ratings to reach consensus. This use of the averaging process was viewed by students as

democratic, allowing each person to have an equal vote. However, as noted previously, this

“rule” or decision-making heuristic did not encourage the students to discuss the underlying

issues or reasons as to why there were differences in the ratings.
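A toy example makes the weakness of this heuristic concrete: two items can carry identical group means while masking opposite levels of agreement, so averaging alone tells a team nothing about which items actually deserve discussion. The ratings below are invented purely for illustration:

```python
from statistics import mean, stdev

# Invented ratings for illustration (1 = no importance ... 5 = great).
# Four team members rate two hypothetical items from one scenario.
item_a = [3, 3, 3, 3]   # genuine agreement
item_b = [1, 5, 1, 5]   # deep disagreement

for name, ratings in [("A", item_a), ("B", item_b)]:
    print(f"Item {name}: mean = {mean(ratings):.1f}, spread = {stdev(ratings):.2f}")

# Both items average 3.0; only the spread reveals that item B hides
# opposing views the team should talk through rather than average away.
```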

 

In their written reflections, students also identified a variety of factors that most influenced their

decisions:

 

 

 

 

 

The social ramifications involved with each question were my primary concern because if

people could get hurt or more harm than good could be done then I decided not to

partake. Personal gain was not as important to me as protecting others from hardship.

 

The things that influenced my decision was whether or not I directly affected the

consequence. For example in the first example I would not be in fault for doing it because

my job was to rate them and it was someone else’s job to do the moral thing.

 

I focused more on the legal and professional code standards than on the views others

would have of me.

 

Students commented about their ability to compare and contrast their own individual choices and

those of others on their teams during the class activity discussions. These discussions proved

invaluable according to students’ reflection comments. For example, students noted:

 

In the first scenario, the actions were legal but it was a question of doing harm to others

and acting unethically.

 

In the first scenario, people’s lives were at stake. Therefore, the major factor looked at in

this scenario was the well-being of the people. In the second scenario, the idea of using

money illegally was brought to our attention. Therefore, legal issues were at the front of

my mind.

 

The students also identified factors they perceived were most important to the group:

 

The factors that were the most important to the group, were based off what would help

the customer or beneficiary of the project the most.

 

Abiding by the rules or regulations

 

Small Group Interactive Sessions

 

The third way in which the instrument was incorporated into the curriculum involved small

groups who participated in an interactive workshop-like class session. Like the previous

examples, the students had completed the online instrument prior to the session. When they

arrived at the workshop, students were asked to individually reflect on what they had learned

from completing the instrument. Specifically, they were asked: (1) to identify issues or ways of

thinking that the instrument presented that they had not considered before taking the instrument,

and (2) to reflect upon previously unconsidered issues (i.e., students were asked to respond to the

following question, “Did any of the issues present ways of thinking that you had not considered

before?”). Then the students were placed in small groups of approximately four students and

asked to reach a consensus as a group regarding the action to take and the ranking of the

importance of the issues. However, in this case as opposed to the team-based lab, the individual

issues were printed on pieces of paper, and they were asked to physically arrange them in order

of importance. If the group identified an issue that they thought was important but wasn’t

 

 

 

 

represented by the statements from the scenario, they were instructed that they could write it on a

yellow piece of paper and include it in the prioritization of the statements. When they completed

the activity, the groups were asked to respond aloud to the following questions to the rest of the

workshop participants:

 For each of the two scenarios:
o What did your group decide to do? Why?
o Your top 4 issues and why
o Any “yellow” issues?

 The themes, or descriptions of categories of issues that you identified

 Did you think of the scenarios differently?

 Were the decisions unanimous?

At the completion of the discussion, students were asked to reflect individually on the following

questions:

 What did you learn from working with others to complete the activity?
o What issues did it make you consider that you had not thought about before?
o Did any of the issues present ways of thinking that you had not considered before?
o Did you change your way of thinking?

 How did your team make decisions? Did everyone contribute equally or did one person dominate?

 

In the discussion, students reported that participating in this activity allowed them to consider

ways of thinking that they had not considered previously. Through observation of the small

groups working through the decisions, the workshop facilitator noted that the students discussed

significant issues and concerns related to the statements.

 

In order to help students connect this activity with their actual projects, they were asked to

complete a final reflection question:

 Think about your project this semester
o What are specific decisions and actions that you have made/taken on your project?
o What issues did you consider?
o Did you see these as ethical issues?
 Think about your team
o How did your team decide what to do?
o What kind of culture did your team have?

 

Summary of EERI Utilization for Instructional Exercises

 

In large-lecture classes, we find that students often engage in discussions by averaging their

individual numerical responses to instrument items without much discussion of content but with

assumptions that the numbers represent similar ethical reasoning logics. The team-based labs

offer opportunities to segue from discussions of hypothetical scenarios into issues regarding team

members’ and project partners’ needs, interests, and priorities. Finally, small-group interactive

sessions can focus on how students move from individual processing of ethical decisions to

 

 

 

 

greater understanding of the multiple and often conflicting perspectives that characterize

resolution of ethical challenges.

 

Conclusion

 

In closing, we have provided an overview of the theoretical basis and operationalization of the

Engineering Ethical Reasoning Instrument (EERI). Although still in its scale and construct validation

phases, the EERI provides challenging ethical dilemmas for student consideration as these

students learn professional codes of conduct as well as other influences on their moral and

ethical reasoning abilities. In three different class situations, namely, large-lecture, team-based

labs, and small-group interactive sessions, we provide instructions for how the EERI might be

incorporated into online preparatory work for class discussions as well as for in-class individual

and team activities. Through such curricular strategies, students can develop deep insights into

personal, team, and professional ethics.

Bibliography

 

1. Rest, J., Narvaez, D., Bebeau, M., & Thoma, S. (1999). A neo-Kohlbergian approach: The DIT and schema theory. Educational Psychology Review, 11, 291-324.
2. Kohlberg, L. (1981). Essays on moral development. San Francisco: Jossey-Bass.
3. Kohlberg, L. (1985). Resolving moral conflicts within the just community. In C. B. Harding (Ed.), Moral dilemmas: Philosophical and psychological issues in the development of moral reasoning (pp. 71-97). Chicago: Precedent Publishing.
4. Rawls, J. (1999). A theory of justice (rev. ed.). Cambridge, MA: Harvard University Press.
5. Gilligan, C. (1982). In a different voice: Psychological theory and women’s development. Cambridge, MA: Harvard University Press.
6. Rest, J., Narvaez, D., Thoma, S., & Bebeau, M. (2000). A neo-Kohlbergian approach to morality research. Journal of Moral Education, 29, 381-395.
7. Davis, M. (1991). Thinking like an engineer: The place of a code of ethics in the practice of engineering. Philosophy and Public Affairs, 20, 150-167.
8. Borenstein, J., Drake, M. J., Kirkman, R., & Swann, J. L. (2010). The Engineering and Science Issues Test (ESIT): A discipline-specific approach to assessing moral judgment. Science and Engineering Ethics, 16, 387-407.
9. Rudnicka, E. A. (2009). Development and evaluation of a model to assess engineering ethical reasoning and decision making. Ph.D. dissertation, University of Pittsburgh.
10. Titus, C., Zoltowski, C. B., & Oakes, W. C. (2011). Designing in a social context: Situating design in a human-centered, social world. Proceedings of the American Society for Engineering Education Annual Conference & Exposition, Vancouver, BC, Canada.
11. O’Keefe, D. (2002). Persuasion: Theory & research (2nd ed.). Thousand Oaks, CA: Sage.
12. Weick, K. E. (1979). The social psychology of organizing (2nd ed.). Boston, MA: Addison-Wesley.
