Scared of AI? Good. Here’s What Writers Should Do Next
Fear isn’t the problem. It’s just information. If you can listen to what it’s revealing, you can meet AI from a place of creative authority, not panic.
My last post made the case that creators are asking the wrong question about AI.
The question isn't whether machines will replace us. It's who gets to define what creative work means going forward.
By now you’re either nodding along or you’re quietly freaking out. Or both. Probably both.
So let's talk about the fear.
What AI Fear Sounds Like for Writers
Just to be clear, I’m not talking about abstract anxiety about the future of AI. I mean the specific 4 a.m. thoughts that creative people are waking up with right now.
What if everything I've spent years building just doesn't matter anymore?
I've had that one.
I've been writing a sci-fi trilogy for five years. I've built a newsletter, a blog, a body of work I care about deeply. And there are moments—in a writer’s chat room, in the middle of the night, after reading some manic headline—where I think: what if none of this counts in two years?
Here's another one I've sat with: What if by using AI in my creative process, I'm betraying the craft I love? That one is sneaky, because it comes dressed as integrity. It feels noble. And it can keep you frozen for a very long time.
Then there are the fears that feel less personal and more political. AI companies are training on our work without consent—they're stealing from us. And: The environmental cost of running these systems is enormous and nobody's talking about it.
These concerns carry real weight. Data use and environmental impact are legitimate, unresolved issues that deserve serious attention and better policy. I'm not going to wave them away.
But I want to gently separate two things that are getting tangled together in a lot of creator conversations right now: legitimate policy concerns that require collective action, and personal fear about your own relevance and future that is masquerading as conviction.
They can coexist. They often do. And the place where they get tangled is exactly where people get stuck.
Why Creative Fear About AI Is Useful
Here's what I've learned from sitting with my own fear about AI, and from hundreds of conversations with writers, editors, and creators over the past three years:
Fear points at the thing you value most.
If you're afraid your creative work won't matter anymore, that's telling you that creative work is central to your identity.
If you're afraid of being replaced, that's telling you your work carries meaning you may not have fully articulated—even to yourself. The fear is pointing at something worth understanding better.
If you're afraid that engaging with AI means selling out, that's telling you that artistic integrity is a core value. Okay. So what does integrity actually look like in a landscape that's shifting this fast? That's a question worth staying with instead of letting the fear answer it for you.
The problem with fear is that it only knows one kind of move: protect. Withdraw. Draw the line. Throw up the defenses.
Sometimes that's the right call. But when our creative futures are up for grabs, holding position isn't actually standing still. It's falling behind while the ground moves under your feet.
“Fear about AI points at the thing you value most as a creator. And that can be a powerful compass.”
How to Move Through AI Fear as a Creator
I ended The Wrong Question by talking about positive freedom—the idea that true freedom requires building something, not just defending against threats. Preparing the ground for new growth.
That's a beautiful concept. It's also genuinely hard to practice when you're scared.
So here is the practical, daily-life version of moving from defense to agency. It's three things, and none of them is “just stop being afraid.”
First: Get in the room with other creators and have honest conversations.
Look for opportunities to have real conversations with people you trust, where you can say “I'm terrified and I don't know what to do” and nobody tries to fix it or dismiss it. Where you can also say “I tried something with AI and it was actually amazing” without getting side-eyed.
We are processing a collective disruption individually, and it's making us lonely and rigid. The writers' conference chat I described in my last post was a room full of scared people performing certainty at each other. That's what happens when there's no space for honest conversation—you get loyalty tests instead of learning.
Find your people. Start a group chat. Have coffee. Sit around someone’s fire pit and talk about what you're afraid of and also what you're excited about and also what you have no idea about yet. That's where the real thinking happens—in relationship, in real time, with people who are grappling with the same questions you are.
Second: Commit to facts over rumors, and get comfortable saying “I don't know.”
Things are moving so fast in AI that what was true six months ago may be irrelevant today. And the loudest voices in creator communities are often repeating things they heard from someone who read a headline about a study they didn't actually read.
Information moves fast and loose when people are scared. Fear makes us grab onto whatever confirms what we already believe.
So:
Read primary sources when you can. Follow people who are tracking the actual developments, not just reacting to them.
Be willing to update your position when new information arrives.
When someone asks what you think about AI and copyright, or AI and the environment, or AI and the future of publishing—practice saying “here's what I understand so far, and I'm staying open as things develop.”
I get it. Certainty feels safe. But in a landscape moving this fast, certainty is mostly performance. So hold your values steady while staying genuinely curious about what's unfolding.
Third: Ask yourself the questions that fear won't let you ask.
Fear keeps the conversation small: will I survive this? Will my work still count? Am I safe?
Those questions matter. But staying exclusively in survival mode means you never get to the questions that might change something—for you and for the creative community you're part of.
Here are some I've been sitting with:
If I weren't afraid, what would I want to create next?
What would it look like to bring my full creative authority to this moment, including the parts that are uncertain?
What do I want the next generation of writers and artists to inherit from how we handle this moment?
Am I making decisions about AI based on what I've actually experienced, or based on what I've been told I should feel?
You don't have to answer these today. But I'd encourage you to take one of them on a walk, or into a journal, or into a conversation with someone you trust. See what surfaces when you hold the question instead of reaching for an answer.
“We are processing a collective disruption individually, and it’s making us lonely and rigid.”
What I Know For Sure About AI and Creativity
I don’t know how this ends. But I know what kind of creator I want to be while it’s unfolding.
I know that:
Fear is real and it deserves respect.
Disruption is real and it's going to be painful for a lot of people.
Legitimate concerns about data, labor, and environmental impact need real policy solutions, not just individual resistance.
And I know that the creators who shape what comes next won't be the ones who had the most certainty. They'll be the ones who were willing to stay in the conversation—scared, uncertain, and still showing up—long after it would have been easier to walk away.
The definitions are still being written.
Stay in the room.
I’m Jennifer Lewy, author of the Game of Paradise YA sci-fi series—stories about what happens when humanity rebuilds after collapse under a powerful AI called the NEWRRTH. I’ve been mentioned in the Detroit Free Press for pioneering AI use in the creative process.
If you want thoughtful, grounded writing about AI, creativity, and the future of storytelling, join my newsletter.
Explore the Game of Paradise series:
The One Game | The One Exiled | The One Reborn (coming 2026)
Last updated: April 20, 2026