
Can Teacherbots Be Programmed with Ethical Behavior?

By Dr. Wanda Curlee, Faculty Member, School of Business, American Public University, and Dr. William Oliver Hedgepeth, Faculty Member, Transportation and Logistics Management, American Public University

This is the second article in a research study on whether an AI system could ever replace online college professors. Several APU School of Business faculty members are conducting this study.

There is ongoing debate about whether teacherbots should be allowed in the classroom, despite the fact that they are already present in classrooms around the world.


For example, Finland’s robot teacher is providing valuable lessons in how robots and human teachers can coexist in the classroom. Japan’s first robot teacher was hailed as a pioneering human-machine interface in the classroom.

Robots are already being used in college and university classrooms as learning tools, not as teacher replacements. Robot librarians are in use at the University of Technology in Sydney, Australia. It seems that it’s just a matter of time before teacherbots end up in university classrooms as teaching assistants or as replacements for instructors.

One of the dilemmas of using teacherbots is ethics: Whose ethics will be coded into the teacherbot? Will the ethical makeup of the teacherbot stay within acceptable standards as it learns from its human-machine interface experiences? Do teacherbots have to be ethically coded for each region or school where they are employed?

What Are Ethics in the Teaching Discipline?

Ethics is “the discipline dealing with what is good and bad and with moral duty and obligation,” according to Merriam-Webster. But what about ethics in teaching? The term ethics is not normally part of students’ descriptions of their best or worst teachers. Can a teacher described as fair, mean, honest or biased be ethical at the same time?

Ethics is also something that teachers should embrace in the classroom, whether online or at a distance, as well as in their personal lives. Teaching can include ethics as part of students’ moral development.

Ethics is about decision-making. And decisions are based on rules and regulations of human behavior.

In educational publications for teachers, you might read terms such as “ethical standard of professional practice,” or “moral code of educator ethics.” Many U.S. states have a code of ethical standards to which their teachers should or must adhere.

Teacherbots would have to adhere to these standards, too. But some of the codes are ambiguous. For example, who decides what it means for a teacherbot to provide “professional education services in a nondiscriminatory manner”?

Teacherbots and Requirements for Ethical Teaching

A teacherbot is part of the human-computer interface in education. A teacherbot can be developed to serve as the student’s sole instructor, or it can act as an assistant to a human teacher. Both roles were examined as part of our research. Does a teacherbot perform 100% of the teaching in certain classes or in all classes, or does it share the teaching with a human instructor?

As English instructor Manuel Campos wrote in How to Become an Ethical Teacher, ethics “…involves systematizing, defending, and recommending concepts of right and wrong conduct.” Ethics has many aspects that a teacherbot should be able to embrace. The following requirements, compiled by Campos, would have to be programmed into teacherbots:

  • They must have the necessary content knowledge to provide students with up-to-date information.
  • They must teach courses within their area of expertise.
  • Teachers must remember that content knowledge is not everything, so they must also have a set of strong pedagogical techniques.
  • Teachers should make the adjustments necessary to improve their teaching effectiveness.
  • Teachers must encourage students to think critically.
  • Educators shouldn’t indoctrinate students into a particular point of view.
  • Teachers should encourage open discussion.
  • Teachers should treat students as individuals.
  • Teachers should avoid policies that are unfair.
  • Teachers should encourage academic integrity.
  • Teachers should communicate and apply consequences for academic dishonesty.
  • Teachers have the ethical responsibility to address potential violations of academic integrity.
  • Teacher assessments should be objective, valid and fair.
  • Teacher assessments should match course objectives.
  • Teachers should maintain a professional and objective relationship with students.
  • Teachers should avoid behaviors that take advantage of their power relationships with students.
  • Teachers should avoid situations that might be construed as discrimination or sexual harassment.

Can a teacherbot accomplish all this? That is part of the ongoing research and development (R&D) of teacherbots for synchronous and asynchronous, online and distance learning classrooms.
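How much of such a code of ethics could actually be captured in software is itself a research question. As a minimal, purely hypothetical sketch in Python, a few of the more mechanical requirements, such as teaching within one's area of expertise, keeping content current and applying a published rubric, could be expressed as automated checks; the class and rule names below are our illustrative assumptions, not part of any existing teacherbot system.

```python
# Hypothetical sketch: encoding a few educator-ethics requirements as
# machine-checkable rules. All names here are illustrative assumptions,
# not an existing teacherbot API.
from dataclasses import dataclass
from typing import Callable


@dataclass
class TeacherbotAction:
    """A single action the teacherbot proposes to take."""
    subject_area: str            # e.g., "statistics"
    certified_areas: list[str]   # areas the bot's content model covers
    content_last_updated: int    # year the course content was last refreshed
    rubric_applied: bool         # was the published grading rubric used?


@dataclass
class EthicsRule:
    """One requirement from a code of ethics, with an automated check."""
    name: str
    check: Callable[[TeacherbotAction], bool]


# A few of the listed requirements expressed as checks; most of the list
# (fairness, avoiding indoctrination, professional relationships) resists
# this kind of simple test and would still need human oversight.
RULES = [
    EthicsRule("teach within area of expertise",
               lambda a: a.subject_area in a.certified_areas),
    EthicsRule("provide up-to-date content",
               lambda a: a.content_last_updated >= 2020),
    EthicsRule("assess against the published rubric",
               lambda a: a.rubric_applied),
]


def audit(action: TeacherbotAction) -> list[str]:
    """Return the names of any rules the proposed action would violate."""
    return [rule.name for rule in RULES if not rule.check(action)]


if __name__ == "__main__":
    proposed = TeacherbotAction(
        subject_area="ethics",
        certified_areas=["statistics", "logistics"],
        content_last_updated=2016,
        rubric_applied=True,
    )
    print(audit(proposed))
    # Prints: ['teach within area of expertise', 'provide up-to-date content']
```

Even in this toy form, the sketch makes the difficulty concrete: the requirements that lend themselves to automated checks are the narrow, mechanical ones, while ambiguous standards such as fairness and nondiscrimination are exactly the ones that resist simple encoding.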

Western Versus Eastern Ethics

According to Difference Between, “Eastern ethics is much more about doing what is right in terms of what is expected of you by your family, society and culture.” This Eastern way of thinking differs from the Western one.

Whereas Eastern ethics focuses in part on the family, Western ethics focuses on the “self and what is rationally or logically true.” As a professor or teacher from a Western or Eastern culture, it would be prudent to understand such differences so that your students or business associates accord the proper respect to your decisions and behavior.

Of course, teacherbots would have to be carefully programmed to adhere to this list while taking into account such differences in how ethics is perceived.

Ben-Gurion University of the Negev in Israel conducted a study of how various intelligent vehicle operators (IVOs) would react to no-win situations in a self-driving car.

The Massachusetts Institute of Technology (MIT) is testing several versions of IVOs. While this work does not relate directly to teacherbots, the data gleaned from the testing show how difficult it will be to program teacherbots.

Besides the Western-Eastern divide, there are also noticeable ethical differences between wealthy and developing nations and between individualistic and collectivist cultures. Some of the differences found were as follows:

  • Save elderly pedestrians (Eastern) over young pedestrians (Western)
  • Tolerance for jaywalkers (Countries with less governmental control)
  • Sparing more lives over a few (No clear answer)
  • Protection of the driver (Depends on who the driver would hit and possibly kill)
  • Save those with a higher social status over a lower status (Those countries with high social inequality)
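Purely as an illustration, and not as a description of how the MIT or Ben-Gurion systems actually work, such culture-dependent preferences could be parameterized as weight profiles that a decision function consults. Every profile name and number in the Python sketch below is an invented assumption.

```python
# Hypothetical sketch of parameterizing culture-dependent ethical preferences
# for a no-win driving scenario. The profiles and weights are invented for
# illustration only; a real system would need weights grounded in survey
# data and local law.
from dataclasses import dataclass


@dataclass
class Outcome:
    """One possible outcome of a no-win driving scenario."""
    description: str
    lives_spared: int
    elderly_spared: int
    jaywalkers_spared: int


# Invented regional preference profiles (higher weight = stronger preference).
PROFILES = {
    "profile_east": {"lives": 1.0, "elderly": 0.9, "jaywalkers": 0.2},
    "profile_west": {"lives": 1.0, "elderly": 0.3, "jaywalkers": 0.4},
}


def score(outcome: Outcome, profile: dict[str, float]) -> float:
    """Weighted score of an outcome under one regional preference profile."""
    return (profile["lives"] * outcome.lives_spared
            + profile["elderly"] * outcome.elderly_spared
            + profile["jaywalkers"] * outcome.jaywalkers_spared)


def choose(outcomes: list[Outcome], profile_name: str) -> Outcome:
    """Pick the outcome the given profile scores highest."""
    profile = PROFILES[profile_name]
    return max(outcomes, key=lambda o: score(o, profile))


if __name__ == "__main__":
    options = [
        Outcome("swerve and spare two elderly pedestrians", 2, 2, 0),
        Outcome("brake and spare three jaywalkers", 3, 0, 3),
    ]
    print(choose(options, "profile_east").description)  # favors the elderly
    print(choose(options, "profile_west").description)  # favors more lives
```

A teacherbot that had to adapt its grading or disciplinary choices to a particular region or school could, in principle, swap in a profile the same way, which is exactly why the question of whose ethics get coded in matters.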

Ethics evolve over time. Will the teacherbot evolve as well? Will teacherbots drive the evolution of ethics? We are moving into an era of artificiality, as Dr. Laura Pana noted in her article “Artificial Intelligence and Moral Intelligence.” We human beings are becoming more technical beings.

Think of today’s medical devices and human technological implants that are so common. As we become more comfortable with being bionic, will we become different? Dr. Pana argues that human ethics cannot simply be coded into machines because machine ethics – if they exist at all – are different from human ethics.

We Are Evolving Ethically

Once we understand the various factors driving ethics and how we are evolving ethically to accept bots in our everyday lives, programming ethics into teacherbots will be complex but understandable. Will students see differences in teaching and grading due to the ethics of a teacherbot? As we step forward into the realm of teacherbots, we need interaction between social and technology researchers to further our understanding of teacherbots’ ethics.

About the Authors

Dr. Wanda Curlee is a full-time professor at American Public University. She has over 30 years of consulting and project management experience and has worked at several Fortune 500 companies. She has a Doctor of Management and Organizational Leadership, an MBA, an M.A. and a B.A. in Spanish Studies. Dr. Curlee has published numerous articles and several books on project management.

Dr. Oliver Hedgepeth is a full-time professor at American Public University (APU). He was program director of three academic programs: Reverse Logistics Management, Transportation and Logistics Management and Government Contracting. He was Chair of the Logistics Department at the University of Alaska Anchorage. Dr. Hedgepeth was the founding Director of the Army’s Artificial Intelligence Center for Logistics from 1985 to 1990, Fort Lee, Virginia.

