
AI Tools Like ChatGPT Are Challenging Scholarly Ethics

To save writing time and effort, today’s writers have the option of using artificial intelligence (AI) tools such as ChatGPT (Chat Generative Pre-Trained Transformer). According to Medium, some of these AI tools are free to use, and students could potentially use them for various writing projects, such as:

  • Writing a college paper
  • Writing a news story for a high school or college newspaper

These AI tools, although convenient, seem to be automating creativity. But what about the ethics of students using these tools for their schoolwork?

The Practicalities and Ethics of Using AI Tools for School

To gain more insight into the use of AI tools, I asked current college students for their views. Some of them saw ChatGPT as just another tool they could use for schoolwork. However, the topic inspired several discussions about the practicalities and ethics of using these tools.

For instance, one student questioned whether a person would have enough thinking ability to craft a comprehensive question for an AI tool to answer. Other students thought that AI tools could help with quickly extracting key information from a variety of sources.


AI tools like ChatGPT could make writing a college paper less challenging and less intimidating for some students. For instance, AI tools could suggest good sentence construction and the proper use of punctuation, capitalization, and grammar.

Asking an AI tool to write a college paper could simply be a way to start a personal reflection on the topic of a writing assignment. The use of AI tools does not have to make a college research paper unoriginal. Instead, such a tool can enhance a student’s creativity; the student could use an AI-created paper as a springboard for an in-depth investigation into an assigned topic.

In fact, the use of AI tools might come to be regarded as ethical in the future. AI tools may provide college students and professional researchers with innovative ideas that could be used to solve complex social and ethical problems. It’s also possible that an AI-created paper could help students overcome writer’s block by giving them a view of which topics would be possible to research and write about.

Other students in my class were concerned about how our society will evolve and learn. They worried about how people will develop their own opinions if they use AI tools to generate those opinions for them.

Already, our University uses AI software such as Turnitin to help students see where their writing might be close to plagiarism. But AI tools may be creating a new generation of people who are largely uninformed about the world around them, how they fit into that world, how to solve problems and how to think.

Using AI tools to write papers could create lazy students and detract from a student’s integrity. AI tools could potentially destroy student creativity. In addition, AI tools could weaken students’ critical thinking, problem solving and ability to describe ideas eloquently.

Right now, college professors are discussing AI tools to decide what students should be told about using them. But it’s still important for students to provide both inline citations and reference lists, which make a critical difference in their grades.


AI Writing Tools Are Likely Here to Stay, So Schools Will Need to Establish Writing Policies

It seems that AI writing tools are here to stay for college students. If these tools are staying around, each college or university should have its own writing policies for students. Here is one potential policy:

1. AI tools may only be used for educational purposes, such as research and class projects, and not for any commercial or unethical purposes.

2. Students and faculty members must properly cite AI tools as a source in any work that uses their output.

3. Any data generated by AI must be handled in accordance with the institution’s policies on data privacy and security.

4. AI tools should not be used as a substitute for human effort or critical thinking.

5. Any misuse of AI tools will result in disciplinary action in accordance with the institution’s policies.

6. Students should confirm their understanding of these rules so that the school has recourse if any problem involving AI tools arises.

7. A higher education governing body should be created to oversee the use of AI tools in educational environments and to mitigate their risks.

For faculty members reading this article: Do these proposed writing policies seem like useful guides for college students using these new AI tools? Should we expect to see AI-written articles in peer-reviewed journals? Would an AI-created article in a professional journal be considered truly scholarly? What is your opinion of this writing policy and these other questions?

Oliver Hedgepeth

Dr. Oliver Hedgepeth is a full-time professor in the Dr. Wallace E. Boston School of Business. He was the program director of three academic programs: Reverse Logistics Management, Transportation and Logistics Management, and Government Contracting. Dr. Hedgepeth was also Chair of the Logistics Department at the University of Alaska Anchorage and the founding director of the Army’s Artificial Intelligence Center for Logistics at Fort Lee, Virginia, from 1985 to 1990.
