Northeastern blew it-- and AI may have a place, but it's not your own work.
Ahhh, so I may be in the minority, but I'm siding with the student who recently graduated from Northeastern U. and sued (unsuccessfully) to have her tuition for one course refunded ($8K), all because her prof used AI while not allowing students in his class to do so. I don't blame her one bit. First, he didn't disclose the use; second, he didn't check the output (and it was flawed); and third, students are paying good money to be taught by human "experts"-- not AI, which they could essentially use for free. 

I am on my school's AI committee, and we've drafted a policy for AI use and misuse at school, starting in the fall semester. It leaves the use of AI up to the individual instructor, but it does have some pretty solid guardrails-- I put them in there-- especially since these are high school kids who have yet to demonstrate that they can do the work without the assist. Cheating is cheating, and they are going to have to follow the rules, or the consequences may be pretty stiff-- remember, many of these kids are taking dual-credit and/or AP classes. We need to make sure they know what they are doing, and we need to see progress and learning happening. 

Will I allow AI use in my own classes? Not likely, and definitely not for text generation. Kids are already scamming the hell out of teachers, and Grammarly, which used to be okay to use, is now in the text-gen game. And there are so many AI text generators, math AI programs, and so on that we can't keep up. I know I have caught a few kids who inserted whole AI-generated sections into their papers; the shift in diction, and their inability to explain what their own paper is saying, are pretty obvious tipoffs. Now, let's say I used AI to "assess" their work to "save time." AI checking AI may or may not be useful, but I would not know whether my students understood what they were writing about, nor would I know whether they could generate an essay on their own. And the students deserve my best efforts; if writing is the primary way I get to know what's going on in their heads, and whether what we are doing and discussing even sinks in, then AI would be a fail for all of us. I am paid to do my job, and I would like to think I do it well. I don't need to cut corners. I earned my degrees and years of experience. Why would I waste that?

I might allow AI for, say, graphics. Or even graphs, as long as the students can explain what the graph is showing. But even then, they already have computers at their fingertips-- and frankly, they don't know diddly when they use computers most of the time, anyhow. It's not learning; it's search, seize, compile, make it sort of pretty, and submit. I want them to read challenging texts; oh, sure, AI will "dumb it down" for them to understand, but will they be able to respond to my questions? AI isn't going for nuance; it's combing the 'net for what everyone else has to say-- and frankly, the pure-garbage student essays floating around the 'net are myriad. I should know, having caught too many kids "sampling" them over the years. 

Bring back the pen and paper. Or pencil. Slow down the thinking, and work out what you want to say. Longhand. I wrote my first master's thesis on the connection between writing by hand and learning, and y'know what? I'm right. My students, for most essays, have to develop a thinking draft first; this is a blurt, if you will, of what they think they know about a topic, and it can include questions they know they will need to find information or textual support for. It's a pre-draft; this one is shared with a couple of classmates to garner input, mostly questions and suggestions. From this, they can then begin a rough draft. But I want them thinking, and using the mind/body connection that only comes from writing by hand.

Handwritten text is unimanual and idiosyncratic; the brain recognizes the shape of a word, then retrieves the information from storage (I'm simplifying this-- it involves several regions of the brain, including Broca's area, and whatever triggers the memory). Your unique handwriting is the key to unlocking information you've stored. Keyboarding (typing, for us old folks) is bimanual, and the brain processes that information letter by letter when you are generating text; this is why we often get lost in the writing, and have to go back repeatedly to read over what we think we've written. It's also too uniform; every letter is shaped exactly alike, all the same size and font. This is another reason why we forget what we read online so easily, and why web pages are designed as they are, to "feed" our brains what the designers want us to fixate on first. (It is kind of spooky when you pop the hood and see how online content is planned out.)

Now, overlay some AI: not only did the student not do the reading, the research, or the text generation, they have zero understanding of the topic AND of their own "work." And we are awarding diplomas for this dreck. Our future leaders, surgeons, lawyers, etc. are being churned out with this kind of flawed "education." Cheating has always been a problem, but now it's being encouraged by the feds-- they don't work hard, so why should anyone? Just let Grok do it. There's even a federal executive order (EO) regarding the integration of AI in education-- and we all know that it really isn't steak sauce, right? (ugh)

The foundations of everything will be based on a crowd-sourced data dump. All the stupid ideas, the failed stuff, the fake stuff, the idiot-spew-- all in the same data pool for AI to glean its answers from. Yes, there will be legit scholarly stuff out there, too, but AI just sorts and spouts. Yes, the person making the request of their AI generator needs to be clear and specific with the parameters, but if that person is not clear and specific about what they don't know (which is the usual case), then how can they ask the tool to do anything well? It's impossible. Just do your own work. 

Like I said, I'm appalled that Northeastern didn't give the student a refund. They should have censured the prof, too. He's getting paid a decent salary and representing their brand-- and it's not his own work he's getting paid for. That, to me, is theft. Instead, they have a blanket "we approve of AI, but you have to let people know" statement. I'm disgusted, really. At the very least, it's lazy, and it gives educators a bad name. 

I believe in doing my own work, thanks. And my students will be expected to do theirs. Together, we will uphold what scholarship looks like.

Have a good one,

C