We are way past professors using AI to write their papers. Now professors feed a set of papers into AI and ask where the gap is in the logic so they can write a paper to fill it. When submitting, AI tells them who will likely be reviewing the paper for publication so they can strategically cite that person's work. Spoiler: it's working.
I'm in my 70s.
I remember in the late 1960s overhearing a high school teacher say to another teacher, "I can tell which encyclopedia a student used to write their paper."
I remember using the encyclopedia for papers about animals in the 1970s! I distinctly remember thinking it didn't make sense to tell us to write the same thing that's in the encyclopedia but not to copy it.
We students were all told the same thing: write a paper on a subject you don't know and show how knowledgeable you are--but don't paraphrase the encyclopedia.
With AI writing, you are elevated to editor-in-chief and AI is one of your writers.
I have written e-books and e-magazines, and I have one major bone to pick with large news companies. When using AI, stop sounding like a 5th grade book report. "In conclusion" is as much of an AI tell as an image with 8 fingers on one hand.
I love the analogy of editor-in-chief with a writing staff. So true.
For the same reason as misspelling words and posting pictures with 8 fingers on one hand: it's unprofessional, immature, and not to be taken seriously. Signs of laziness. Oh, you want to get paid and taken seriously as a periodical? As a journalist? Stop acting like you just wrote a 5th grade book report, because obviously that's where AI got trained. In conclusion: be more professional and name your sources, or leave them unnamed if you're implying they are AI. But don't take credit for words that aren't yours. See sistermedia dot com or themiccoligroup dot com /magazine for examples of professionally written AI-generated articles that don't insult the reader and don't sound like a 5th grade book report. Penelope, if you want me to create an e-magazine for you, reach out. Would love to collab and work something out.
"When I had paid links on my site, FTC law said I had to disclose them. I did not. My feeling was: if my writing is good, who cares if there's a paid link?"
This is very ENTP of you. Only follow the rules that make sense. (Or have a penalty you want to avoid.)
Penny,
Can originality of thought be programmed?
Is original thought different from simply saying shocking things for notoriety?
Do people want original thought or do they simply want to read what they already think, but stated in outrage?
Peace,
D
I am not sure what people want in general, but I know what they want from me. Shocking posts aren't reliable for good reader results any more than unshocking posts are. I mean, seriously, if I knew I could make a lot of money from shock shock shock I would have done it. My readers care way more about engaging writing and interesting thinking than they do about shock. The thing AI can never do is tell the story of me. AI can replicate stories about things I've already told stories about. But that wouldn't be interesting. So I have to remember to tell my personal experience instead of a general experience - and give people space to generalize for themselves.
Everyone is absolutely NOT using AI to write. I, and many others, 100% refuse.
If you had put this comment through AI, it would have given you this edit: Everyone is absolutely NOT using AI to write. Many of us completely refuse.
It's not a huge edit, but it retains your voice and makes it easier for someone to receive your message. I see this as similar to how poets labor over precise language—they're showing respect for their readers through clarity.
It takes away from my voice and erases my identity (which includes warts and all) - one small edit at a time. Poets labor over getting THEIR words exact, not any exact words. Another entity picking words for them is their nightmare.
I can understand you feeling that way. I’ve had editors for most of my life, so it feels like an enhancement to me - and a way to learn.
Is it possible that after learning your writing style, someone could use AI to write an article and falsely attribute it to you? I could be wrong, but has this not already happened with authors? Perhaps one is the editor-in-chief, but could someone who is not you be the editor-in-chief of work attributed to you?
They could totally do that. But it's been happening for 15 years: people take all or part of a post of mine for their website and say they wrote it. It doesn't matter to me because they can't build a readership that way. I also think that using my voice and saying something stupid is stupid. And if someone says interesting things and chooses to write in my voice, maybe the contribution I make to the world is a voice that feels genuine to a lot of people.
I neglected to say that I think you make excellent points in the essay. I would suggest you use the technology in a responsible, innovative manner, one that was an aspirational intention of its developers. In a world where everything is energy and you are the determinant of your energetic footprint, this is truly an excellent disposition to have (your reply to my comment). Do you think AI is better at plagiarizing than a human being?
AI is better than 98% of humans at everything. Which means unless you're at the very top of what you're doing, AI will be better. This reality has actually made me think twice before I do anything: I have to be really honest with myself about whether I intend to work hard enough to be in the top 2% of all humans.
We'll see such AI shit commodified to death... One doesn't even have to "falsely attribute" anything; it will lose value all on its own.
I generally agree, but most people don't have 4,500 articles they can upload to guide the AI. Writing is thinking, and people who don't put in the reps will not be good thinkers.
Good point, but if most of your writing is not online, it’s probably in email. So what do you think about uploading 4500 emails you’ve written?
That's a good strategy for now, and I imagine Google will do something like that with Gmail and Gemini.
But I wonder how much young people even use email any more? Do they mostly interact over text and Slack?
Edit: also I have written hundreds of thousands of words as comments on blogs and reddit. I wish there were a way for an AI to slurp up all that without blowing up privacy.
It's an interesting question whether Gen Z will even care if their voice comes through. Maybe it's not worth having a unique voice anymore. Maybe creative writing will be about structure rather than voice.
Don't worry, soon all of the words that *can* be attributed to you *will* be. Isn't that the goal of all the data-oligarchs? Match everything to its source so they can monetize more of us. Our health care data, financial data, purchasing habits, perhaps even our AI-usage habits. Think I'm paranoid? Just ask CHATboy.