Not really, Jason. I heard an anecdote about a student taking a business communication course last spring, learning to write various genres of business emails. Her professor insisted that AI not be used for work in the course. Simultaneously, this student had taken a temp job at a business, where she was responsible for personalizing emails generated by AI—which very quickly made the finite number of genres apparent to her. Her time to write a complex email dropped from an hour to ten minutes. Having to customize the text was instructive and improved her ability and speed at completing class assignments, which otherwise would have been busy work. This resistance to AI is not easy to classify as “good” or “bad” resistance. But it’s very real, and it’s not in anyone’s interest. I think part of the issue with writing about an abstraction like “resistance” and then issuing an opinion—everyone has one, right?—is fanning the flames of resistance during a time when the fire is already intensely hot. You are certainly in the right to hold this position and to argue for it. I’m trying to point out some consequences.
Good point!
I definitely think you're right about the difficulty of thinking about (and writing about) resistance in the abstract.
I think it all comes down to context, and how faculty and students are given voice in the process.
This is why we need stronger collaboration between the professional world and academia. Set AI aside ... what we teach about writing in the classroom doesn't match what's happening in the professional world.
Now put AI back, and you are on a different planet altogether.
While I agree with your premise wholeheartedly, I worry a little about the word “resistance.” Right now I have students who want to ignore it, who “are not fans,” and who feel like my talking about it in class is a “bait and switch” from the class they thought they signed up for (note on that last point: it’s not—the class is intro to cognitive psychology, where technology and AI are integrally intertwined and totally appropriate content). So I worry that the word “resist” invites folks to push back on learning about it. What you describe, though—learning, demanding improvements to ethics and transparency, and push-back on corporate greed—are all things I do want my students to embrace.
That's a good point. Thanks for pointing that out!
I think there is a huge spectrum here. There are certainly students and faculty who will use "resistance" as a way to not learn fully about the concept or to dismiss it quickly.
I think so much of it comes down to what resistance (or whatever we want to call it) looks like, and how informed it is.
In the end, it all comes down to context and student/faculty voice.
It is very complex. But resistance in itself—as in refusing to discuss a specific issue at a department meeting—is easy to spot. It’s hard to make any change when isolation is such a pervasive norm. Professors routinely told me the course was dialed in and they didn’t want to change anything, not even to contribute data to a department assessment project. And they could choose not to participate in any form. The ethic of collaborative curriculum is common in K-12, while in the university, academic freedom often masquerades as a defense for keeping the status quo. Night and day.
That is really difficult to figure out, and it's a huge factor when it comes to how colleges and universities are structured.
I'm all for embracing AI and adapting to it. But other voices and approaches help rein us in on an institutional level, so that we're preparing for multiple futures.
It may be unique to California, but it is astounding. Fellow professors who in the past were on board with developing First Year Experience portfolios, for example, are in a deep funk. What is going to happen to literature? Will anyone ever read Homer again? I finally got a GE course approved for Area D focused on utopian/dystopian literature, and now I’m going to redo the syllabus? No thanks. In the SPA disciplines (those requiring special accreditation, like medicine, engineering, architecture, etc.), new standards and criteria are on the horizon, and the response is encouraging: nursing programs I’ve read about are running with AI. It’s a shame universities can’t get their faculties to at least cooperate and collaborate on issues that affect everyone. Students are on pins and needles, expecting to be falsely accused of plagiarism any day. Institutional governance in higher education must draw on faculty senators to lead the way on innovation initiatives by allocating funds for faculty-proposed AI projects.
“But other voices and approaches help rein us in on an institutional level, so that we're preparing for multiple futures.” Wow. This statement is tough to unpack. AI is here. Preparing for it as one possible future among many is over. Other voices as you’ve depicted them want to say: no, it’s not here, I don’t want to face it. In my travels with university colleagues over the past year, I’m stunned at the number of folks who refuse to visit an AI website and take a test drive. It’s as if they are afraid they will be poisoned or something. In all my years involved in faculty development, departmental efforts to secure accreditation, and work with Centers for Teaching and Learning, I have personally never seen such staunch, dug-in resistance. Some statements I’ve seen on system policy in the UC are draconian. It’s clear that faculty need special permission to use AI in, say, large lecture classes. Why???
That's interesting! I have run across very different attitudes, as a whole. Most faculty are very open to playing with it, and often have on their own already.
And I mean that the future of Gen-AI is far from clear, especially as it gets worked into other emerging technologies like biocomputing and other forms of AI.
It's here, but its final form and effects are far from certain.
Jason, your argument is circular if I’m understanding you. Faculty resistance is a ubiquitous response to virtually anything that threatens their tried-and-true techniques and exams. I learned this through years of experience as a university assessment coordinator. They have the right to resist anything requiring change; therefore, anything requiring change should be resisted. Inertia in higher education is eroding the reach of universities in this era. Universities should be leading, not following complacently. I realize I’ve shifted from a factual to an ethical mode of discourse, but I shudder to think of the consequences for universities and for learning if your opinion is widespread. I understand stubborn, but I wonder whether the university can withstand the intensity of stubborn I hear in your post.
Thanks for that, Terry!
Yeah, that's bad/unproductive resistance. That is certainly a thing, though I encounter it less and less every day. It's when we resist for the sake of resisting, or just because it's contrary to what we've done in the past.
(I write about this a bit in the clarification section on here.)
Definitely not what I'm writing about here. That's a whole different conversation!
Maybe I'll write about that in a future post! It's a whole different kind of resistance.
I'm not really sure why this has to be framed as resistance, though I would agree with the "good reasons to resist" list. I would much rather frame this as collaboration.
We should be collaborating with industry to solve these problems in context, not resist.
Now maybe we should resist all the bad decisions college administrations are making about AI ... but collaborating would most likely be more effective.