How ChatGPT is Transforming the Way We Teach Software Development

The revelation came in the summer of 2023, when I took on a high school student as a summer intern. Their task was to develop a machine learning model to predict air quality in our city, using Jupyter notebooks, basic Python and scikit-learn.
One day, while discussing the algorithm's performance with my intern, I asked them to change a graph: instead of plotting the predicted versus true values, they should show the difference between the predicted and true values.
The student switched to another browser tab, prompted ChatGPT to "Calculate the difference between two arrays y1 and y2", and pasted the answer "y1 - y2" into the notebook.
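For readers who want to picture the change, it boils down to a couple of lines of NumPy and matplotlib, something like the sketch below (the variable names and numbers are illustrative, not taken from the actual notebook):

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder data: measured air quality values and the model's predictions
y_true = np.array([42.0, 55.0, 61.0, 48.0, 70.0])
y_pred = np.array([45.0, 52.0, 66.0, 47.0, 68.0])

# The "y1 - y2" the student asked ChatGPT for: the residuals
residuals = y_pred - y_true

# Plot the residuals instead of predicted versus true values
plt.scatter(y_true, residuals)
plt.axhline(0, color="gray", linestyle="--")
plt.xlabel("True value")
plt.ylabel("Predicted minus true")
plt.title("Residual plot")
plt.show()
```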
At first I was amused that they would ask the AI assistant for such a simple line of code, one that is certainly faster to write yourself than to prompt, wait for, and paste. But then I started thinking about the implications of AI assistants for the way we teach software development and for the learning outcomes of students.
In what follows, I outline the implications of the rise of AI assistants for the teaching of coding skills, based on my personal experience as an undergraduate and graduate instructor. I argue for accepting AI assistants in the classroom, rather than trying to restrict their use. Assignments and exams should take into account the use of AI assistants and assess skills that are not – yet – covered by AI. However, students should be given the opportunity to develop their own coding skills, rather than relying on AI technology for every part of their learning journey.
How does learning actually work?
There is a famous quote attributed to the Chinese philosopher Confucius:
"I hear and I forget. I see and I remember. I do and I understand."
Both in my own training and in teaching others, I have found this to be true. In education and psychology, the last part of the quote is known as transfer of learning [1]. Students progress through tasks of increasing complexity. First, they do exercises on the same problem until they master a concept. Later, they can independently identify the concept that is needed to solve a new problem.
For example, when you learned to do addition in elementary school, you practiced with homework problems like 17 + 8 = ?. You mastered the concept of addition, and can now transfer the skill to adding any two integers.
A student who has only ever used AI assistants cannot reach the stage of understanding a concept. When copy-pasting, the student is not actively engaging with the material. They may solve the problem to the teacher's satisfaction, but they have not acquired the skill associated with completing the task.
What is the point of teaching software development?
I teach software development mostly to graduate students and professionals, so my experience is based on that level. My goal is to enable the students to solve their everyday problems with programming. They should learn
- How to identify a problem that can be solved with an algorithm and how to frame it correctly
- How to write the software to solve the problem
- How to evaluate the quality of the algorithm's output
In my opinion, the current state of AI assistants is best suited to the middle task, i.e., writing the software. For my students, the main benefit comes from learning how to perform the first and last tasks in the list, as these depend heavily on their niche application.
We follow the Carpentries' concept [2] of teaching coding skills and currently use simple but instructive datasets to guide the students through data analysis and model development.
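To give a sense of what such a lesson builds up to, the core workflow fits into a few lines of scikit-learn, roughly like the sketch below (the dataset and model choice are placeholders, not our actual course material):

```python
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Load a simple tabular dataset and hold out a test set for an honest evaluation
X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Fit a baseline regression model
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Evaluate the quality of the model's output on held-out data
y_pred = model.predict(X_test)
print("Mean absolute error:", mean_absolute_error(y_test, y_pred))
```

The point of the exercise is less the code itself than the decisions around it: which data to use, which model is appropriate, and how to judge the resulting error.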
ChatGPT can provide the necessary code in seconds, whereas it takes the students several days to learn the basics. How do we motivate students to engage deeply with the material and master a skill?
My students are grown adults, and if they choose to forgo a learning opportunity in favor of copying and pasting everything into ChatGPT, that is fine with me. However, I would expect a professional to have enough understanding of their field to solve a basic problem without resorting to a tool, be it a book, stackexchange.com, or now ChatGPT.
To illustrate my point, consider an accountant who uses a calculator for their daily tasks. Even if they are faster and make fewer errors when using the device, they still know how to add numbers. The same should apply to software developers and their use of AI assistants.
As future knowledge workers, computer science students should develop skills beyond prompting ChatGPT. After all, someone has to develop the next iteration of generative AI models.

Can we always rely on AI assistants?
As promising as AI models like GPT-4 and Gemini are, they are not yet at a stage where we can rely on them completely.
First of all, AI models are prone to hallucinations. When they cannot give an answer, they may just make up facts. This has been widely demonstrated by researchers and software professionals [3].
Second, AI models require a lot of infrastructure. There is no guarantee that you will always have access to a model that needs to run in a large data center. Network issues, short-term corporate decisions, and rising subscription costs can limit our access to AI assistants in unexpected ways.
Third, to evaluate the performance of an AI model, we need critical thinking skills. Humans have a natural disposition toward critical thinking, but the skill still needs deliberate training to be used properly.
In a scenario where education consists only of prompting ChatGPT without ever thinking critically about the generated output, it seems impossible to develop this vital skill. Thus, I argue for integrating AI assistants into our courses and curricula, while still leaving room for human growth.
How are teachers reacting to AI assistants?
When AI assistants first became widely available with ChatGPT, the reactions went in three different directions.
- Some teachers rejected the use of ChatGPT and tried to ban it from assignments and exams.
- Other instructors placed no restrictions on the use of ChatGPT, often due to a lack of awareness.
- A third group welcomed the use of ChatGPT, understood that it was impossible to ban it, and set about developing new teaching concepts.
I will now take a closer look at each of the three approaches and discuss their pros and cons.
Banning AI assistants
Restricting the use of AI assistants at the college level is probably pointless, since all students have access to the Internet on their smartphones at all times. Because AI assistants can be used through a simple browser interface, it seems impossible to stop students from using them. Even during exams, students have found ways to cheat, such as hiding a smartphone in the bathroom, the modern equivalent of the cheat sheets of earlier generations.

Take-home assignments and open-book exams, long favored by instructors because they tend to engage students more deeply with the material, can now be solved with AI alone. Since even OpenAI has difficulty distinguishing between AI-generated and human-generated content, it seems impossible for instructors to detect the illicit use of AI.
In certain settings, it would be beneficial to discourage students from mindlessly prompting ChatGPT. As outlined above, the use of AI assistants can be detrimental to the transfer of learning.
To return to the example of learning to add two integers: in elementary school you were not allowed to use a calculator, even though everyone has quick access to one on their phone. Once the transfer of learning is established, using a calculator for addition is no problem and lets students tackle more complex problems.
Ignoring AI assistants
As a machine learning professional, I was aware of the rise of generative AI, but the fundamental shift came as a surprise. Teachers of other subjects, who may be less inclined to follow the latest AI news, have been overtaken by events.
Ignoring the availability of AI assistants can undermine curricula: students may not master the skills associated with a course despite excellent performance, and teachers should be aware of this. It is therefore critical to educate teachers so that they understand the power of AI assistants and can incorporate them into their courses.
Integrating AI assistants
By accepting AI assistants into our classrooms, we not only acknowledge that it is impossible to ban them, but we also enable our students to learn a new skill that is critical for the digital age: learning how to prompt an AI assistant and evaluate its answers.

This requires new approaches to teaching and assessment. Some ideas for instructors to assess transfer of learning include
- Having students prepare presentations and discuss them interactively in class
- Focusing more on group work and projects, such as collecting data with a sensor the students build themselves before proceeding with data analysis
- Focusing on niche applications and regional aspects that may not be covered by today's AI tools
- Explicitly asking students to use AI for a task, followed by a critical discussion of the generated output
In class, it is worthwhile to consider restricting the use of AI assistants as long as a concept has not yet been mastered.
Concluding thoughts
AI assistants like ChatGPT are here to stay and will have a major impact on the daily work of knowledge workers. Software development is no exception, and in the future, many parts of programming will be done by AI rather than humans.
When teaching software development, educators should think about integrating AI assistants into their curriculum, rather than limiting their use. Students should be given the opportunity to learn basic software development skills on their own, so that they can grasp concepts and achieve transfer of learning. They need to be knowledgeable enough to judge whether an AI assistant has performed well on a given task.
Students should also learn a new skill: how to efficiently prompt AI assistants with their programming questions. Taken together, this will ensure that in the future we still have software professionals who can write basic code without relying on AI assistants, but who can use them to their advantage in a professional context.
What are your experiences and thoughts? Please share them in the comments!
References
[1] Transfer of learning: https://en.wikipedia.org/wiki/Transfer_of_learning
[2] Carpentries software education: https://carpentries.org/
[3] Bang et al., A Multitask, Multilingual, Multimodal Evaluation of ChatGPT on Reasoning, Hallucination, and Interactivity (2023). https://arxiv.org/abs/2302.04023
[4] ChatGPT in schools (German): https://www.spiegel.de/panorama/bildung/chatgpt-und-co-an-schulen-lehrkraefte-in-ki-verantwortung-a-a2a2bd4f-7b13-481b-8c8a-55ed6c1a9ce9