I agree, especially that it's a hard sell to ask students not to brush away the messiness. One of my concerns is the way in which this could reinscribe the faultlines of educational (dis)advantage. I worry that more advantaged students -- for all the reasons we know -- will be more persuadable, will have stronger support to see the intrinsic value of messiness for learning, and stronger support to persevere. I worry both that advantaged students will have better access to technologically enabled learning when that is helpful, and that they will have greater capacity to resist it when it counts.
That is a fantastic point, Kelly!
I think there is definitely a way that this conversation links up with social inequality, and how this technology could widen the gap considerably!
I hope we'll be able to use these tools to help teachers become more persuasive and supportive towards all kinds of kids.
One problem I see is that current genAI tools don't really enhance what they're given so much as completely re-render or reformat it according to some internal black box of inference. Your image here is a good example: the AI "enhanced" version patently ignores many of the features of your original drawing. It doesn't enhance it, it just replaces it. IMO the tools aren't yet smart enough to be considered enhancers. We're just swapping our own imagination for a machine hallucination.
That’s a good point!
I actually mean enhancement in a neutral (or at least, potentially positive or negative) way.
I think my version is better and more valuable, especially from a learner’s perspective.
For me, enhancement is all about looking cleaner, whether it’s actually better or not. I think that’s what many of our students will be content with, even if enhancement doesn’t actually improve the thoughts or the writing.
Maybe I need a better word!
Nice piece, Jason. I think "messy" is a keyword for helping distinguish human creativity from the cultural outputs of AI models. This Henry Farrell line has stuck with me: "On average, they create representations that tug in the direction of the dense masses at the center of culture, rather than towards the sparse fringe of weirdness and surprise scattered around the periphery."
AI-generated illustration is a nice example of this kind of density at the center of culture. I'm increasingly losing my anxiety about fully automating cultural labor that aims for creativity (art, application development, literary fiction) while understanding that cultural production that aims for the middle (some forms of illustration and graphic design, coding, marketing copy) will be done by machines.
As we should have done with earlier waves of industrial automation, we should think about the humans impacted by these changes and help them in material ways. But as long as the messy and creative work can only be done by humans, I'm okay with it, especially because it means rethinking our curricula and pedagogy entirely.
Thanks, Rob!
I think we need to think about the large-scale, long-term effects of this technology. And I feel like we’re not there yet.
I think you are on the right lines, Jason. But aren't we all trying to find our way through this AI "forest" and then to help guide, encourage and support our students and anyone else who wants to listen to us?
I think I am right to tell my students that AI is just another tool for them, but, as with all tools, they need to learn how to use it and be careful when using it.
Articles like yours are part of the learning, and comments like Kelly's are part of the warning to be careful when using AI.
Finally, my wife seems to be able to spot AI articles, as she says they "lack humanity". I reckon that "messiness" is a sign of humanity. I wonder if you agree?
I think your wife is onto something. There is something so neat, so sterile, about AI prose (especially out of the box)!
And Amen to teaching students to use the tools carefully and responsibly!