Generative AI poses questions to long-standing higher ed ethics

A compass rose embedded in the sidewalk.

As generative AI tools become more common in the office, at home and at school, they raise ethical dilemmas and questions for higher education institutions like LCC. Photo by Carson Lemon.


This article is part of The Lookout's LCC x AI series, a multi-part series on conversations surrounding AI and its impacts at Lansing Community College. 

By Carson Lemon
Staff Reporter

As generative AI tools become more commonly used in the office, at home and at school, they raise ethical dilemmas and questions for higher education institutions like Lansing Community College. Provost Sally Welch spoke about some of the ethical concerns surrounding the rapidly increasing use of generative AI across LCC's campus.

LCC and Generative AI in the Classroom

With the generative AI boom, it seems every big tech company wants its own generative AI bot. Google created Gemini, Microsoft has Copilot, and OpenAI has perhaps the most popular generative AI chatbot in ChatGPT.

Welch pointed out that, even for an educator who wants to avoid AI, doing so is pretty much impossible. “As a faculty member, [avoiding AI] is a challenge because everything has AI kind of built into it. So, if you do a Google search, the very first thing that comes up is the AI interpretation; Copilot is built into our Microsoft Office suite.”

Generative AI use has become something of a situational problem. It’s been hard for the college to determine which uses of AI constitute cheating and which can actually help learning. Welch said that, currently, the main goal is to “teach students not to rely on [AI-generated writing] as the truth, but to take that information and use it [in] their own way or to learn the material in their own way, but write it in their own words.”

LCC, Generative AI and Accreditation

Accreditation is essential to any college or university in the United States: it ensures that the degrees students earn are valid and makes it easier for credits to transfer between institutions. Lansing Community College holds college-wide accreditation as well as accreditation from many program-specific accreditors, covering programs from Emergency Medical Services to Massage Therapy. LCC’s institutional accreditation comes from the Higher Learning Commission, which, according to the LCC website, is “an agency approved by the U.S. Department of Education to accredit degree-granting institutions of higher education throughout the United States.”

Program-specific certifications and accreditations can be verified on the Program Accreditation webpage of the LCC website. For example, LCC’s Aviation Technology program is certified by the Federal Aviation Administration, and LCC’s Dental Hygiene program is accredited by the Commission on Dental Accreditation.

In total, LCC has 24 accrediting bodies to answer to, all with different nuances and requirements. However, Welch confirmed that, in terms of an AI use policy, the accreditors are united: “Nobody has one.”

Elaborating, Welch explained that, because generative AI technologies are evolving so quickly, the issue has been difficult for colleges and their accreditors to write policy on. She cited LCC’s own policy-making process as an example: “We had a basic [AI] policy here that we were going to post, and we had to wait because things kept changing.”

Though the college is not currently navigating new AI-related requirements from accrediting bodies, it very well may be in the near future. One concern for accreditors that lack a generative AI use policy is that the institutions they accredit may not be keeping a close eye on AI use, allowing students to miss the learning experiences needed to become competent in their fields. If accreditors see colleges and universities turning out graduates who carry their stamp of approval but have no real knowledge of their fields, the accreditors’ reputations could come under fire.

Though one could argue that cheating has always existed, the surge in generative AI use truly does allow students to bypass important learning processes entirely, whether by asking a chatbot to write an essay, generate study materials or handle the other uses explored in Nicole Wadkins’ article. The simple act of writing one’s own notecards, or even putting together a poorly written essay, significantly deepens a student’s learning.

LCC and the Generative AI Lawsuits

In December 2023, The New York Times filed a lawsuit against OpenAI and Microsoft, alleging that the companies illegally used writing from the Times to train their AI chatbots, infringing on copyright law. OpenAI’s defense is that its use of the Times’ content was transformative, since the chatbots learned how to write new sentences rather than copying previously written ones, and therefore falls legally under the fair use doctrine.

Though the case was filed in 2023, the suit has not yet gone to trial, and depending on the ruling, it still holds the potential to dramatically alter the field of AI chatbots in the U.S. If OpenAI’s ChatGPT is found to be replicating works from the Times, it could open the door to plagiarism scandals over AI-generated works built largely on stolen writing.

This lawsuit becomes a concern for LCC when students use an AI chatbot to write or create for them. These chatbots may produce work that replicates another author’s. They may cite a source incorrectly, cite sources that don’t exist, or, perhaps most dangerous for LCC, directly rip off the writing of a billion-dollar news outlet like The New York Times.

Currently, the way to steer clear of the ongoing lawsuits and accusations of plagiarism is simply to cite the AI chatbot when its use is permitted in a course. “Citing the material is really important. So, you're giving credit, not necessarily to the right person,” Welch admitted, “but you're at least giving credit to the source that you got the information.”

LCC Embraces Generative AI, but at What Cost?

To run these generative AI chatbots, companies like OpenAI and Microsoft need huge buildings filled with servers dedicated to constant computing to keep the chatbots online. One of the biggest drawbacks of generative AI’s spread is that these data centers are extremely resource-intensive: tapping into vast stores of information and synthesizing new content requires a great deal of energy, as reported by the MIT Technology Review.

Additionally, because of the way power grids are run, data centers built in residential areas can drive up power bills for people living nearby. In New Jersey, residents reported rate hikes of about 20% on their monthly power bills after a data center opened near them, which can add up to hundreds of dollars more per year.

It isn’t just energy these data centers need; water is essential for cooling the servers, which generate large amounts of heat because they run around the clock. The most common cooling method sprays water as a mist to cool the air, which is then blown across the servers. In 2025, The New York Times published a story highlighting people who had lost their water after a data center was constructed nearby. Their experience is echoed throughout the U.S., where about 1,240 data centers run. When people are losing their water so generative AI models can keep running, there is clearly a problem.

If LCC is encouraging students to use generative AI models and to explore the many tools such software offers, is the college co-signing this considerable environmental destruction? Welch hopes not, but cannot be sure. “That's the downfall of AI. I think researchers are already working on trying to mitigate that,” she said.

Presenting her dilemma, Welch continued, “It's really, it's a tough situation. Do I like it? Hurting the environment? Not at all. Do I need students to be prepared to use AI? Yes. I don't know how to weigh those in terms of what's better, what's worse for each of those different parties.”

Higher Learning’s Concessions & Generative AI

We like to think of higher education institutions, LCC included, as bastions of knowledge and learning. But a community college has a particular obligation to prepare students for the workforce. Community colleges are meant to be drivers of the economy, even if that economy seems to focus on wealth generation, the success of questionable industries and uninhibited growth. LCC and community colleges around the country face the same pressure: to educate the community and enable continued learning for all while also directly feeding the local workforce.

According to Welch, employers have homed in on finding and recruiting employees who know how to work with AI. “I went to two conferences this year. One was the Limitless Learning Summit and the other was Achieving the Dream. They both very specifically talked about how the workforce is expecting students to know how to use AI.” Welch continued, “We have to find a way to get the students the skills they need to be ready.”

Some might argue that just because the technology is available does not mean it needs to be used. But LCC leadership seems to know that the college is a small fish in a humongous pond. If it resists the use of generative AI in the classroom, it runs the risk of producing graduates left behind, lacking the tools needed to survive in today’s dog-eat-dog job market. If it encourages the use and exploration of generative AI, leadership seems to believe it is doing its job: preparing students for the workforce that awaits them.
