Don't forget to learn

Using AI as a tool to do a thing still relies on knowing how to do the thing yourself.

Generative AI has the potential to streamline our work and make us more efficient. For example, programmers can use AI to translate old code into more modern programming languages, helping with maintenance. AI is also usually good at summarizing, such as condensing meeting notes or pulling the main discussion points into an overview that you can review and edit.

These advances are made possible because we already know how to do the work, but the AI takes care of the repetitive component of these “everyday” tasks. The human remains “in the middle” of the process, reviewing the work produced by the AI. That might mean a developer integrating AI-generated source code into a larger project, or a manager reviewing a summary before attaching it to a report. Using AI as a tool to do a thing still relies on knowing how to do the thing yourself.

You need to learn it first

My concern is when I see people using AI to skip the learning process. My analogy runs like this: When you first learned about math in elementary school, you learned how to add and subtract. Later, you learned simple multiplication and division, then progressed to multiplying two-digit numbers, and long division. Throughout the process, you learned about math using pencil and paper.

And now if someone asks you to multiply 7 × 4, you know the answer is 28, because you learned how to multiply. You can take that a step further: you know that 14 × 4 is just 2 × 7 × 4, or 2 × 28. That’s 56. In the same way, you know that 14 × 8 is 112, because it’s multiplying by 2 again. You can work out the answer because you learned how math works. But as a working professional, when the numbers aren’t so convenient, it’s more efficient to use a calculator to get a precise result like 27.632.

It’s okay to use a tool if you already know how it works. If you understand the process, the tool makes you more efficient. On the other hand, if you skipped the learning process and only used a calculator to do even the most basic arithmetic, you would find yourself at a serious disadvantage later on.

AI can’t replace learning

Communications of the ACM published an article last year about the impact of AI on computer science education that demonstrated the danger of using AI to skip the learning process. In the article, a computer science professor challenged his students to solve a problem using a programming language they didn’t know (Fortran). One group was allowed to use ChatGPT to write the code for them, another group was allowed to use Meta’s Code Llama to suggest code. A third group could only google topics about Fortran; they had to figure out the rest on their own.

As the article highlights, the group using AI finished first, and the group that had to work things out took the longest:

One group was allowed to use ChatGPT to solve the problem, the second group was told to use Meta’s Code Llama large language model (LLM), and the third group could only use Google. The group that used ChatGPT, predictably, solved the problem quickest, while it took the second group longer to solve it. It took the group using Google even longer, because they had to break the task down into components.

A week later, the students were tested on the same problem, this time from memory: the students who had let AI do the job for them couldn’t solve it, but the students who had figured it out by googling things passed.

Then, the students were tested on how they solved the problem from memory, and the tables turned. The ChatGPT group “remembered nothing, and they all failed.” … Meanwhile, half of the Code Llama group passed the test. The group that used Google? Every student passed.

Learn it then do it

I teach some courses on technical and professional writing, and that’s how I approach AI with my students: I want them to learn how to do the work first; after that, it’s okay to use a tool.

For example, when my students learn how to use rhetoric to write a compelling proposal, I don’t let them use AI for that. If they skip the learning process, they will be under-prepared for their careers. One day, many of these students will need to write a client proposal, and if they only used AI to do their work in the classroom, they won’t be able to do it on their own in the office. They might ask an AI to write the proposal draft, but they won’t recognize whether the proposal is good or bad, or how to improve it, because they never learned what makes a great proposal.

But the students who did the work and learned how to write a compelling proposal will be able to evaluate an AI-generated draft and improve it by applying their own understanding of rhetoric and how a client might respond.


This article is adapted from Don’t skip the learning process by Jim Hall, and is republished with the author's permission.