Artificial Intelligence, or AI, has arrived, and we are just beginning to see its impacts on how we work, play and, well, study. As it eliminates busywork and opens the door to new learning frontiers, AI is expected to create a paradigm shift in higher education instruction. UCR’s XCITE Center for Teaching and Learning can help faculty redesign course content and curriculum to explore or embrace AI.
So, what can we expect as AI moves into our classrooms and onto our computer screens? We asked Richard Edwards, the executive director of the XCITE Center; Rich Yueh, an assistant professor of teaching in information systems at the School of Business; Matthew Lang, a teaching professor in the Department of Economics; and Yue Dong, an assistant professor of computer science and engineering, to sound off (without help from ChatGPT) on the questions many of us have on our minds.
Question: What are the opportunities and uses of incorporating AI tools, such as ChatGPT, into higher education courses?
Richard Edwards: At this time, instructors in higher education are just beginning to explore potential uses of AI tools such as ChatGPT and Google Bard in their courses. AI has the potential to help tutor students, improve their writing through assistive prompts, and personalize the learner’s experience. Using AI in this way should neither violate academic integrity nor lead to a rise in plagiarism. While there will always be risks associated with AI tools and their potential to be misused in pursuing a college education, their beneficial uses will have to be carefully monitored and explained to students, as we never want any technology solution to diminish the critical rigor and value of a UCR education. We will learn much more about the potential uses of AI in education as it continues to evolve, and we will see whether it becomes a major driver of change in how professors and instructors design their assignments and assessments for today’s learners.
Q: What are some of the potential uses of AI in assessment and grading?
Yue Dong: AI tools such as ChatGPT are extensively trained with human feedback, and as a result, learn how to provide feedback as well. They have the potential to automatically grade responses to questions involving reasoning and natural language, and are especially adept at ranking different answers. On the other hand, these models might inherit biases from the large datasets used during pretraining. The implications of these biases for the fairness of using such tools as evaluation instruments remain unclear.
Edwards: Many innovators in educational technology have been exploring personalized and adaptive learning techniques over the last decade. The goal of this approach has always been to create a more responsive and customized learning experience for students. These types of adaptive technologies, including AI, have the ability to help students better comprehend course materials in some disciplines by having the assessments and grades shift based on the student’s inputs and answers. I would expect such explorations in this area to increase in the coming years.
Q: Could AI potentially offer a more personalized or one-on-one learning experience to students and, if so, how?
Rich Yueh: Absolutely. A student can prompt the AI for practice problems and solutions, plus feedback on performance, tailored to the student’s current needs or level of understanding. Furthermore, each time the student sends a message, the AI reads the entire conversation. This allows the AI to recall mistakes the student made early on, track the student’s progression, and generate instruction to fill in learning gaps, arguably better than a professor or TA can. Students can also prompt the AI and receive output in their native language, though accuracy may vary compared to prompting in English. The AI is always accessible, so the student does not need to wait for office hours or discussion sections to get help.
Matthew Lang: I think this is one of the most intriguing elements of the current proliferation of AI for education. I have always said that one of the best ways to test your understanding of a topic is to teach it to your roommate. Now, using Custom Instructions, we can guide our students to make the AI act like a “student” while they explain the material. I don’t think it will be long before we will be able to train a model specifically for our course and then educators will be able to craft brand new resources for their students.
Edwards: An additional point here is about how AI will improve as a method for personalized learning, and with it, potentially raise learning outcomes. Why might this be the case? Benjamin Bloom, best known for the learning taxonomy named after him, also studied and researched what he called “the 2 Sigma Problem.” Bloom observed that the average student tutored one-on-one performed two standard deviations better than students taught in a conventional classroom; in other words, the average tutored student outperformed roughly 98% of the comparison class. AI has the potential to help with this problem at scale, by making “one-on-one” mastery learning available to a wider group of students in higher education. It will be interesting to see whether AI does have a way of addressing the 2 sigma problem, and with it, of helping students gain mastery in their courses.
Q: Can AI help close educational equity gaps and/or help students with disabilities or learning needs?
Yueh: Yes. Students can use text-to-speech and speech-to-text tools. A student who has difficulties asking questions in person can use AI as a first option to prepare a script or notes to bring to a meeting or send in an email. Current AI tools can make key words bold for students with dyslexia, and there are likely AI tools in development that can tailor input and output for other learning needs and disabilities.
Edwards: This is another area where we are only at the beginning of what will evolve over time. One initial use can involve better and more accurate transcription services for video lectures, for example. Moreover, there are going to be ways that students with certain learning disabilities might have course materials adapted and delivered through different modalities and media to help them learn.
Q: What are some of the limitations of current AI tools that students should be aware of?
Yueh: Current AI tools are trained on sets of data that have distinct cutoff dates. This is known as a static corpus of data. If you prompt AI about something that happened recently, and that event is not in the AI tool’s training data, you will not get a useful response. On the other hand, search engines work with a dynamic corpus of data, meaning that a news article that gets posted today will end up on major search engines within hours.
Edwards: AI is not a substitute for critical thinking. It is an aid to critical thinking, and can stimulate new ideas or suggest meaningful lines of inquiry. Students will still need to do the old-fashioned work of developing their own hypotheses and theses, struggling to find answers that seem elusive, and seeking out additional information from multiple sources, including libraries and primary materials.
Q: What are “hallucinations” in the context of AI?
Dong: Hallucinations in AI typically refer to errors that models commit that are inconsistent with established world knowledge or the given input data. For instance, if asked “What is 1+1?” and the model responds with “3,” that is a hallucination. While such glaring mistakes are becoming increasingly rare, models often make subtler and more nuanced errors. Hallucinations pose a significant challenge for implementing AI tools within educational settings, especially in fields where high precision is important. Unlike traditional textbooks, which adhere to consistent definitions and established facts, AI tools might not always align with accepted knowledge, leading to potential misinformation or misunderstandings.
Q: Will AI tools cause a deterioration in students’ skills in long-form writing, critical thinking, research, etc.?
Yueh: If used inappropriately or without training, yes. Cheating and an overreliance on AI tools are major risks. Since AI tools are trained on data available on the internet, there is an inherent bias in the output (in addition to the hallucinations question above). Students who don’t do their own further iterations on AI output may lose some of these skills.
Lang: I brought this up in my first class this term. Everyone needs to experiment and figure out how to use AI to augment learning and understanding without “turning our brains into mush.” But instead of focusing on the possible deterioration of skills this term, my current aim is to highlight how much more effective AI tools are when you have a clear goal and understanding about why you are using it. If my students go to ChatGPT and ask, “What is the GDP?” they will get an answer, but it will probably produce a wordy version of a textbook answer (hopefully!), no learning is augmented, and poor habits can be formed. If instead, they first work to understand the concept and ask a more thoughtful question, such as, “I know that GDP captures production in an economy, but I am concerned that it is not a good measure of well-being. Can you provide reasons for and against this line of reasoning?” they will get a more meaningful response that will force them to think about the topic and potentially invoke excitement about learning in a way that was not as widely possible before.
Q: How will incorporating AI into the curriculum help prepare students for future careers?
Yueh: In one of many examples, Google is incorporating AI into their Workspace products like Docs and Gmail. ChatGPT is built by OpenAI, in which Microsoft is a major investor. Within a few years, AI will be incorporated into the Microsoft Office Suite. Other companies like Adobe are also building AI into their products. Even in the unlikely situation that AI will not affect a particular career, students will still need to learn how to use the new tools that career requires.
Q: Is AI a “video killed the radio star” moment for textbooks, which have been an expensive cornerstone of traditional college courses?
Yueh: Not in its current stages, but I see the trend approaching. There are nascent AI tools that allow a user or corporation to input their own training data, essentially acting as an AI for that user or corporation (see Google Duet and IBM watsonx). I envision a situation in a few years where a department pools their knowledge—which is peer-reviewed and meets the standards of their field—into an AI tool, creating a knowledge repository that can be turned into a textbook. This can pair with AI tools that generate images, graphics, data visualizations, and so on.
Lang: I am surprised that traditional textbooks haven’t had their “video killed the radio star” moment already. I can only speak for economics, but some of the most expensive and popular textbooks are just repackaged information that is freely available on the internet. My vision for the future of textbooks is that AI tools will be created that allow instructors to feed relevant data into a platform that creates basic resources for a course. From there, students will interact with the “textbook” and the “textbook” will respond to them in a personalized way, catering to any style of learning.
Q: For faculty who are interested, what are the first steps? Who can help them get started? And what are the resources available at UCR?
Yueh: I founded the UCR AI Forum, a UCR enterprise Slack where anyone on campus can learn about, discuss, and collaborate on all aspects of AI usage and research. I have students working with me and we’re happy to come to your department and give an AI demo that is tailored to your field. We and community members answer questions on Slack.
Q: What would you say to faculty who are completely resistant?
Lang: My experience has been that many people are ignoring it as opposed to resisting it. They play around with ChatGPT, it gets a question wrong, and they move on from it. It is not my place to tell anyone how they should teach, and I would never criticize anyone for not integrating something they are not comfortable with into their courses or work activities. At the same time, with a little more experience using these tools, many are likely to find that there are fun, unique, and impactful ways to improve a lecture, class, or something completely unrelated to work. My first uses of AI had less to do with direct work tasks and more with exploring topics I was curious about, like where I should take a vacation with lots of hiking. This provided a wider view of what it could potentially offer and made using it to augment my work more natural. So, before dismissing it outright, I’d urge skeptics to explore how these technologies can genuinely add value to their work, interests, and day-to-day activities more generally.
Q: UCR faculty teach diverse courses—from theater to engineering to neuroscience and philosophy. Should all instructors be looking for a way to incorporate AI?
Yueh: I feel every faculty member should at least be conversant about AI. They should understand what their students are doing with AI and keep track of how AI is shaping their field and industry.
Cover illustration by Getty Images