The University of Mississippi is working to establish artificial intelligence policies by tasking individual academic departments, instead of professors, with creating their own standards for AI usage in coursework. Currently, each professor sets the standard regarding AI for his or her own courses, with little uniformity throughout departments. Work on creating these policies will continue into the fall semester.
The request for department-wide standards came after the University of Mississippi’s AI Task Force teaching and learning subcommittee sent a recommendation for AI use to the university administration.
The teaching and learning subcommittee is one of the four subcommittees of the AI Task Force. The group is composed of faculty members who discuss how AI is developing, how those developments impact the university and what practices and procedures the university should implement as a result.
Joshua Eyler, senior director of the Center for Excellence in Teaching and Learning and assistant professor of teacher education, serves as chair of the teaching and learning subcommittee. Eyler said the university dictated that individual departments create AI policies instead of a standardized, university-mandated policy because of each department’s unique needs.
“(The teaching and learning subcommittee’s AI procedural draft) is a recognition that there are a lot of different practices right now and that departments are looking for and would value support structures in creating policies,” Eyler said. “Not standardized in the sense that everyone’s doing the same thing because that would be very difficult given how different departments are, both in terms of their disciplinary practices and their cultural norms. Giving them not necessarily standardized (policies), but giving them resources and tools to be able to create the policies.”

Different schools and colleges on campus will face varied challenges as they coordinate with their departments to create these AI policies.
Some schools have only a few departments, which can make collaboration easier. The School of Journalism and New Media, for instance, has only three departments — media and communication, journalism and integrated marketing communications. Other schools and colleges have a larger and more diverse set of departments with varying needs.
The College of Liberal Arts houses 21 departments, which have not yet begun creating AI policies.
“At this time, the college has not yet begun work on this initiative, as it will require coordination across 21 programs,” Stacey Smith, assistant to the dean for the College of Liberal Arts, said in an email to The Daily Mississippian. “We anticipate starting the process after graduation, given the number of priorities and activities taking place between now and then.”
Dean of the School of Journalism and New Media Andrea Hickerson elaborated on what AI policies could look like.
“I suspect because our departments have such a similar skill set, similar background, they will look very similar in some cases,” Hickerson said. “Now, we might see differences, for example, about how journalism wants to approach it where they are focused on truth and accuracy, where (integrated marketing communications) would be more comfortable with it in some of the creative aspects of things.”
Hickerson also said that while the policies are being made at the departmental level, they could eventually give way to an overarching School of Journalism and New Media policy to prevent confusion students may have regarding AI use expectations.
“(Departments are) really going to focus on their disciplinary expertise and what works for that area of knowledge. Though, after people make those decisions, it’s going to start at the departmental level and then maybe look across,” Hickerson said. “If journalism adopts this policy and IMC adopts this policy, and students are taking all these classes, they might not realize that they’re in a journalism class versus an IMC class. We’re going to have to be really cautious about that. It very well could be that we’ll end up with a single policy after going through departmental ones.”
The AI Task Force meets quarterly throughout the school year.
“The task force is sponsoring conversations around university-wide policies and guidelines on artificial intelligence as it relates to, specifically, teaching and learning, research, service and business practices,” Robert Cummings, executive director of academic innovation, said.
Beyond dealing with the work of the individual subcommittees, the task force also discusses innovations in AI technology and the effect that AI agents could have on the university. AI agents are software systems that use AI to perform multiple tasks at once, without continued prompting from the user.
Simple generative AI models like ChatGPT require more user control and review than the advanced AI agents that have emerged in the past four to five months. Marc Watkins, assistant director of the academic innovations group and lecturer in composition and rhetoric, gave a presentation on students' growing access to AI agents at the task force meeting on March 27.

“Sometimes you can prompt (simple AI) agents; other times that trigger can send it up, so they work autonomously. That was about 18 to 20 months ago,” Watkins said. “You can have dozens of agents. Faculty might decide that we need it to do feedback. Another agent becomes an email agent … Another agent might … grade.”
Advanced AI agents have the ability to work more independently and are able to perform a wide variety of jobs simultaneously. Watkins addressed some of the implications that tools like this could have on the university.
“That’s really the big shift here. We’ve moved from a space where we’re working iteratively with these tools to one where it might be automated to some degree,” Watkins said. “The one big risk with all this is that you’re still responsible for the output, even when your agent may be working on your behalf.”
These AI agents could also affect teacher evaluations.
“We need to go beyond just assessment and start thinking about risks that generative AI systems pose to us, and one of those areas is our teaching evaluations,” Watkins said. “If a student does use an AI agent to produce an evaluation, the resulting data is not a reflection of student experience.”
AI Task Force members also discussed the possibility of giving students access to the Google chatbot Gemini through the university. This topic was introduced in October; talks about providing Gemini to students will continue through the task force’s next meeting in May.