Published: 12/10/2025
Recently, my publisher informed me that a significant technology company involved in the development of artificial intelligence (AI) wants to use my book, “Stories from Montana’s Enduring Frontier,” for AI training purposes. The representative explained that I would earn $340 for “this one-time use.” But is this one-time use like a wet wipe — disposable, expendable, and easily sacrificed?
“Stories from Montana’s Enduring Frontier” is a collection of essays that I wrote over 20 years, arguing that 20th-century Montanans developed unique views of how nature worked, as captured in the wilderness-adventure and resource-extraction connotations of “the frontier.” The book felt utterly foreign to the world of AI.
All the writers I know feel vulnerable to AI. Most of today’s commercial AI programs are “large language models,” skilled not in logic, reasoning, or math but merely in generating text. This directly threatens writers’ jobs. Worse, replacing a human writer with today’s generative AI is like replacing a wild raspberry with artificially flavored Crystal Light. The error-filled, uncreative products of AI threaten not only writers but also the joy and usefulness of reading.
While most people fulminate abstractly about AI, this query about buying my book presented a clear choice, sharpened by the specificity of “$340.” If I accepted the offer, would the knowledge I’ve poured into these essays become available to an AI, potentially decimating my book sales? If my royalties thus fell to zero because I had signed a death warrant for a book that is like a child to me, was the $340 worth it?
Perhaps $340 was better than nothing. Many technology companies train AI models by stealing from authors. “Stories from Montana’s Enduring Frontier” was among four of my books pirated for the “LibGen” database, which was used to train AI programs from Anthropic and Meta. Although Anthropic recently settled a resulting lawsuit, Meta and others may yet escape punishment.
What is the proper value of my book? Although $340 is not much compensation for all the work I put in, neither is a royalty of $1.19 per book sold. If my main goal was adequate market compensation for my writing, I probably shouldn’t have published a book in the first place. The book is now 12 years old. At current sales rates, it would take a few years to make $340 in royalties.
When I talked with friends about this dilemma, it felt like none of us knew how to think about the situation. Maybe, as with previous technologies, making the book more widely available will stimulate sales — or maybe not. Maybe AI will thwart young people’s ability to engage in intellectual careers — or maybe its perils are overhyped.
Maybe AI will swallow my entire output without fair compensation — we know that Anthropic and Meta have already tried. My publisher wouldn’t say which AI company made the offer, how it arrived at that take-it-or-leave-it price, or how it would use my book. Would I feel differently about the deal if AI contributed to the world’s knowledge rather than merely helping students cheat?
As I thought about this, I realized I was reflecting a distinctly human desire, rather than an AI desire. A large language model consumes a book as data. The model requires ever more data to predict what the next word in a sentence should be. It’s certainly ego-deflating to think of the product of my research, extensive reading, interviewing, thinking, and finally writing as “data.” I’d prefer it to be “knowledge” or even “wisdom” that the AI wants to suck from me. I’d prefer to think that it needs my well-told stories, my keen insights, my brilliant larger points.
But AI doesn’t think in such big-picture terms. It just predicts a word, and then another word, and then another. I realized that this is also a model for how nature works. There’s no grand plan. No knowledge. No story with a satisfying ending. There’s just a single cell reproducing. One leaf reaching for sunlight. A predator seeking its next dinner.