5 Comments

And ChatGPT responded to my essay with the following (sounds like she agrees with me :-) ):

You make several interesting points regarding the use of technology in education, specifically in relation to ChatGPT. It's true that the context in which technology is being considered for use is important to understand, and that different subjects or assignments may call for different approaches.

Regarding the use of calculators in math, it's true that students need to learn arithmetic and practice using it. However, as they progress in their studies, they may encounter more complex problems that require the use of a calculator. Similarly, in writing, students need to learn the elements of composition, but they may also benefit from tools that help them refine their writing skills, such as grammar checkers or thesauruses.

Regarding ChatGPT, you raise an important question about what the program "knows." While it's true that ChatGPT has an impressive facility for language understanding and production, it's also true that its knowledge is largely limited to the English language. Additionally, while it may possess some degree of "world knowledge," this is likely based on patterns it has learned from large amounts of text data rather than any real understanding of the world.

As for the question of whether the program is doing some of the "thinking" in college essays, it's a valid concern. While the student may be coming up with the ideas and arguments, ChatGPT may be assisting with the actual phrasing and sentence structure. However, this is not necessarily a bad thing if the goal is to produce well-written essays that effectively communicate the student's ideas.

In conclusion, the use of technology in education is a complex issue that requires careful consideration of the context in which it is being used. While tools like ChatGPT can be incredibly useful in certain situations, it's important to be aware of their limitations and potential drawbacks. Ultimately, the goal should be to use technology in a way that enhances the learning experience for students and helps them achieve their educational goals.


Wow -- this is great.


I think it's important to understand the context in which technology (whether slide rule, calculator, or ChatGPT) is being considered for use. Let's take the calculator as an example. If you believe that it's important for kids to learn arithmetic, you teach the rules of arithmetic to them and then have them practice those rules through homework and tests. In such a setting, I feel it would be inappropriate for a kid to use a calculator on their homework and tests, since it obviates the need to learn and practice arithmetic. On the other hand, if you're teaching kids physics, where you present facts and laws and they practice with homework and exams, I think it's OK to use calculators, since a calculator's capability (arithmetic, etc.) doesn't clearly overlap with, say, F = ma.

Now, by this reasoning, if you want kids to learn expository writing (as opposed to literature) then you teach them the elements of composition and they practice with essays. In this case (say, High School English) using ChatGPT would, again, obviate the purpose of the teaching, so to my mind, it shouldn't be allowed. On the other hand, if you're talking about using writing skills to convey information in some non-composition subject (say, an essay on why Putin's invasion of Ukraine is a violation of international norms) then the student's product is ideas about political science, economics, ethics, etc., and I'm less inclined to object to a tool that helps map those ideas into verbal form.

But... Let's talk about ChatGPT. Is it in fact a tool that maps ideas into verbal form? Well, yes, it is that, but I think it's more, too. I believe that ChatGPT and other Generative Pretrained Transformers (hence, GPT) have incorporated (through learning) an amazing facility for language understanding and production, in terms of spelling, grammar, semantic constraints, etc.

But is that all? This point has really been itching me since I learned about ChatGPT: what does the program actually "know"? As I said, it really knows language (English only?), but does it really know anything else, what I'd call "world knowledge"? It sure seems to, as suggested by ChatGPT's understanding of clouds (and more generally, non-solid objects), and I'd suggest that it would be very difficult to teach *just* the language part without imparting some world knowledge in the process.

Then, there's a third *possible* layer of understanding, which is synthesizing solutions to problems, what I'd call "general reasoning". Again, it's hard to separate this from the other two layers of knowledge that I would grant ChatGPT possesses, but ChatGPT's ability to write simple script programs seems to embody such a synthesis process (full disclosure: this is a programmer trying to justify his skills). However, I think we're an awfully long way from Artificial General Intelligence, which is the point where I think we have to start worrying a little bit.
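
To make the "simple script programs" point concrete, here's my own illustration (not actual ChatGPT output) of the kind of program I mean: a one-sentence prompt like "write a Python script that counts word frequencies in a text file" requires combining language understanding with enough general reasoning to assemble a working program:

```python
# My own illustration (not actual ChatGPT output): the kind of simple
# script a prompt like "count word frequencies in a text file" asks for.
import sys
from collections import Counter

def word_frequencies(path):
    """Return a Counter of the lowercase words in the file at `path`."""
    with open(path, encoding="utf-8") as f:
        return Counter(f.read().lower().split())

if __name__ == "__main__":
    # Print the ten most common words and their counts.
    for word, count in word_frequencies(sys.argv[1]).most_common(10):
        print(f"{word}\t{count}")
```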

I think the balance among these various layers (presumably changing even as we write) is what makes the use of ChatGPT for, say, writing college essays, *somewhat* uncomfortable for me. The question is: is the student coming up with the ideas and arguments while the program is "rendering", or is the program doing some of the "thinking", too? On the other hand, for topics where the "world knowledge" and "general reasoning" are pretty tough, like technical papers and instruction manuals, I'd be pretty strongly in favor of using ChatGPT, given the poor quality of the language (but excellent world knowledge) typical of such writing.


Mark, you've framed this superbly. The "what does it know?" question. It clearly has deduced a tremendous amount about language -- just like our grandchildren have done -- without being taught explicit rules of grammar or sentence structure. But it still might be able to articulate rules of grammar because it has "read" books on grammar. A fascinating situation.

Back to the slide rule/calculator debate for a moment, I think that the reason calculators were initially prohibited in exams was time -- in a timed exam situation, having a faster way to calculate gave more time to work on what you called the F = ma part, which provided an unfair advantage given that most students couldn't have a calculator. Then, as the cost of calculators dropped, that reason faded away and it became obvious that calculators in physics classes made perfect sense.

Regarding synthesis, it struggled with synthesizing a correct equation for the simple algebra word problem I gave it. So, it is missing some very basic knowledge of formulating a mathematical model from a language description. On the other hand, once I gave it the correct equation, I was impressed with how it used that to construct sentences to give me examples, albeit with some confusion about which tense to use.
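
To illustrate what "formulating a mathematical model from a language description" involves, here's a made-up word problem of my own (not the one I actually gave it): "Bob has some apples; Alice has three more than twice as many; together they have 18." The synthesis step is the translation from that sentence into an equation, after which the algebra is routine:

```latex
% Made-up word problem: Bob has x apples; Alice has 2x + 3;
% together they have 18.
\[
  x + (2x + 3) = 18
  \implies 3x = 15
  \implies x = 5
\]
% So Bob has 5 apples and Alice has 2(5) + 3 = 13.
```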

I'm also interested in understanding what it would be like to use ChatGPT as a writing assistant. Would it, for example, make sense for me to prompt it with the key ideas I want to convey in a post and then have it write a first draft? Even if I felt I had to make extensive revisions (after all, I often extensively revise first drafts that I write the old-fashioned way), it still might help me be more productive. I haven't experimented with that at all but would like to.
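
For concreteness, here's a minimal sketch of that experiment, assuming the openai Python package (the pre-1.0 ChatCompletion interface) and an API key in the environment; the model name and prompt wording are just placeholder assumptions, not a recommendation:

```python
# Minimal sketch: ask for a first draft from a list of key ideas.
# Assumes the openai package (pre-1.0 interface) and an OPENAI_API_KEY
# environment variable; model choice and prompt wording are illustrative.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

key_ideas = """\
- Calculators were once banned from exams, then became standard.
- The same argument may apply to writing tools like ChatGPT.
- Context determines whether a tool helps or hinders learning.
"""

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "You draft blog posts from outlines; the author will revise."},
        {"role": "user",
         "content": f"Write a first draft of a short post covering:\n{key_ideas}"},
    ],
)
print(response.choices[0].message.content)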

It seems likely that it could do an excellent job of certain kinds of writing that are "formulaic". Last year we paid a lawyer to prepare a new estate plan. He asked questions about what we wanted to achieve and then he wrote a long, detailed document. But, surely, he didn't start from a blank sheet; he pasted together template text that covered similar needs of his other clients. Is this something that ChatGPT could do well if given the questions to ask?
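
As a sketch of the "paste together template text" step (illustrative only; the names and clause wording are invented), the mechanical part is simple template filling. The interesting work, and the part I'd be testing ChatGPT on, is choosing the clauses and asking the right questions:

```python
# Illustrative only: the mechanical "fill in the template" step of
# document assembly. Names and clause wording are invented.
from string import Template

executor_clause = Template(
    "I, $name, residing at $address, appoint $executor as the "
    "executor of my estate."
)

print(executor_clause.substitute(
    name="Jane Doe",
    address="123 Main St.",
    executor="John Doe",
))
```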


Excellent analogy, Lee. I appreciate being challenged to consider things from a new perspective.
