Here's how Carolina faculty use AI

They are making sure students are literate in artificial intelligence and embracing it in their own teaching and research.

Daniel Anderson sitting at computer in front of whiteboard.
Daniel Anderson, professor of English and comparative literature, developed training modules as part of the Carolina AI Literacy initiative to help students use AI platforms responsibly and effectively. His goal is to ensure that all UNC students become AI literate. (Jon Gardiner/UNC-Chapel Hill)

The concept of artificial intelligence has been around since antiquity. Sacred statues in Egypt were believed to be imbued with real minds and could answer any questions put to them.

These days, we have Google.

We also have streaming services, businesses and social media platforms that use AI to track user behavior and provide personalized content and advertisements. Health apps analyze biometric data to offer insights for a healthier lifestyle. Navigation systems use real-time and historical traffic patterns and predictive analytics to provide route recommendations.

What is different today is the surge in data and computational power, along with the public release of generative AI platforms like ChatGPT and Microsoft Copilot. In the world of artificial intelligence, the more data available, the more robust these models and algorithms become.

This has led to an exponential increase in the ways this technology can be used. For some, ethical issues loom just as large: data privacy, structural bias, copyright, plagiarism and disinformation are all hot topics in the debate over AI ethics.

While some view the rise of AI as a threat, others see its potential in learning. One of the latter is Daniel Anderson, professor in the College of Arts and Sciences' English and comparative literature department. In his research, Anderson studies the intersection of computers and writing.

Last summer, Anderson allowed English composition students to use generative AI platforms throughout the course and found mixed results. When brainstorming topic ideas, identifying key words or summarizing verified documents, AI was helpful. But when asked for a literature review with the 10 best sources, the chatbot generated false references, what is known as an AI hallucination.

These hallucinations taught students a valuable lesson in verification.
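The verification habit the students learned can be partly automated. As a hypothetical sketch (not part of Anderson's course, and with made-up titles), a short script can flag AI-suggested citations whose titles do not appear in a bibliography the writer has already verified:

```python
# Hypothetical sketch: flag AI-suggested references missing from a
# verified bibliography, a simple first check for hallucinated citations.

def normalize(title: str) -> str:
    """Lowercase a title and strip punctuation so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum() or ch.isspace()).strip()

def flag_suspect_references(ai_refs, verified_titles):
    """Return AI-suggested titles with no match in the verified list."""
    verified = {normalize(t) for t in verified_titles}
    return [ref for ref in ai_refs if normalize(ref) not in verified]

# Example data (invented for illustration).
verified = ["Computers and Composition: A History", "Writing with Machines"]
ai_suggested = ["Writing with Machines", "Ten Best Sources for Everything"]
print(flag_suspect_references(ai_suggested, verified))
```

A flagged title is not proof of a hallucination, only a prompt to check the source by hand, which is exactly the lesson the exercise taught.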

"A lot of what I hoped to see, as a teacher, happened," said Anderson. "By spending time with these activities, students were able to have the lightbulb go off for themselves."

Based on this experience, Anderson built training modules as part of the AI Literacy initiative to help students learn how to use these platforms responsibly and effectively.

Anderson's goal is to ensure that all UNC students become AI literate.

"Not all students need a deep understanding of AI, but they should know it's trained on data and that data can come with limitations and affordances," he said. "Your bank is going to be using AI; your doctor is going to be using it. It's useful to have a sense of how this technology is mediating your life."

  • An assistant professor with appointments in the computer science department and the School of Data Science and Society focuses on improving machine learning models, a type of AI that learns from data in order to make predictions.
  • A professor in the biology and genetics departments and in the UNC School of Medicine's integrative program for biological and genome sciences investigates histones and their role in gene expression.
  • An assistant professor in the computer science department works on video understanding and computer vision technologies.
  • An assistant professor in the exercise and sport science department, a core faculty member in the Matthew Gfeller Center and co-director of the STAR Heel Performance Lab, uses AI in athlete performance prediction, injury prevention and recovery.
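The first bullet's definition of machine learning, a model that learns from data in order to make predictions, can be illustrated with a minimal sketch. This is illustrative only, not any researcher's actual method, and the data points are invented: fitting a one-variable line by ordinary least squares, then predicting a new value.

```python
# Minimal illustration of "learning from data to make predictions":
# fit y = a*x + b by ordinary least squares, then predict at a new x.

def fit_line(xs, ys):
    """Return slope a and intercept b minimizing squared error."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

xs, ys = [1, 2, 3, 4], [2.1, 3.9, 6.0, 8.1]   # made-up training data
a, b = fit_line(xs, ys)
prediction = a * 5 + b                         # predict y at x = 5
print(f"{prediction:.2f}")
```

The "learning" is just the fit step; real models differ in scale and flexibility, not in this basic data-in, prediction-out shape.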