By Peter Pavarini

When I first began reading books a teacher hadn’t assigned to me – when I started to read for pleasure – I remember how disappointed I was to learn that some of my favorite novels had been ghostwritten. It was equally troubling to find out that many best-selling authors delegated the writing of their sequels to underlings and still took full credit for books published under their names.
Fast forward to today – the Age of Artificial Intelligence.
What is Original?
Thousands of years after humans began painting on cave walls and writing on tablets of various kinds, it became hard to determine whether anything was truly original. As recently as 1976, lawyers for Bright Tunes Music, publisher of the late songwriter Ronnie Mack’s “He’s So Fine,” had no difficulty convincing a U.S. court that a musical icon like George Harrison had plagiarized that song when he penned his hit “My Sweet Lord.”[i] Today, by comparison, any one of a million hip-hop artists can routinely borrow from the entire canon of popular music with few, if any, legal consequences.
Perhaps, in just those few decades, originality stopped being the relevant standard – both in intellectual property law and in how the public measures the worth of something new and different.
Mark Twain – always far ahead of his times – once said:
“It takes a thousand men to invent a telegraph, or a steam engine, or a phonograph, or a photograph, or a telephone or other important thing – and the last man gets the credit and we forget the others. He added his little mite – that is all he did. These object lessons should teach us that ninety-nine parts of all things that proceed from the intellect are plagiarism, pure and simple; and the lesson ought to make us modest. But nothing can do that.”[ii]
As a writer and a lawyer, I’ve always been aware of the fine line between originality and theft. Almost everything I write requires a fair amount of research before I put pen to paper.
A quote often mistakenly attributed to Albert Einstein states:
“The secret to creativity is knowing how to hide your sources.”[iii]
Similarly, one of my favorite nonfiction authors – John McPhee – once taught his students at Princeton that:
“Taking things from one source is plagiarism; taking things from several sources is research.”[iv]
What Can Artificial Intelligence Do?
We live at a time when artificial intelligence can easily generate essays, poems, lyrics, artwork, legal documents, even entire business plans in a matter of seconds. If, by borrowing from a vast array of available sources, a machine can produce something that looks or sounds reasonably professional, does originality even matter anymore?
According to experts, there are already (or soon will be) seven categories of AI.[v] The first five depend on pre-existing data needed to “train” their chatbots or other systems.[vi] Only the last two – “theory of mind” AI and “self-aware” AI – are, at least in theory, capable of independent, original thought. Most AI the public now uses, such as internet search, facial recognition, or self-driving cars, is considered “weak AI.”
Originality has never required creating something “out of thin air.” Although we laud those whose genius results in an invention or other innovative concept that revolutionizes how we live, the creative process typically involves reusing older ideas in a new or unexpected way. In other words, geniuses still stand on the broad shoulders of those who came before them.
The Secret Ingredient
And yet, originality always requires more than just avoiding plagiarism. The work of an author or artist also needs to reflect some personal insight, independent reasoning or a fresh perspective on pre-existing knowledge – the secret ingredient behind every breakthrough.
Even if this definition of originality remains the primary way we determine if something is creative, how then should we value the work product of advanced forms of AI once they become more common?[vii]
Defining what’s “original” in the post-AI era is not simply a matter of philosophical debate. How we define originality will also have practical implications for anyone who wishes to create without relying upon these highly evolved forms of AI. In other words, what “human premium” should we give to an original work that continues to draw upon personal experience, spontaneous insight, emotional nuance and, most importantly, the ability to tell a good story?
Once a creative work is finished, should its final form be judged more by its content or by the particular process used by its creator to bring the work to fruition?
Without a doubt, AI is here to stay. It may always prove faster and more accurate than human intelligence at tasks like sorting volumes of data or recognizing overlooked patterns. However, it will never have the benefit of the creator’s “lived experience,” to use that well-worn phrase. It will never know what it’s like to struggle as an artist, to hope against all odds, to dream, and to learn from one’s failures. It will never experience being touched by the Muse.
Appreciating Authenticity Rather Than Originality
Therefore, in the future, the correct way to assess the value of a creative work should not be to measure its originality but rather its authenticity. Something authentic doesn’t need to be original. It can be worthwhile because we find it to be real or genuine – not a copy or imitation of something else. Why does that matter? When all is said and done, I believe we’re all naturally drawn to people and things we find sincere, trustworthy and faithful to universally recognized values.
As I’ve often asked in earlier blogs, is something true? Is it beautiful? Is it good? If those descriptors don’t mean much to you, then you’ll probably be satisfied with AI-generated music, artwork, and everything else produced by AI. But, for the rest of us, I pray we continue to believe in the mystique and grandeur of what only humans can create and deliver.
[i] George Harrison’s 1970 hit “My Sweet Lord” was embroiled in a landmark copyright lawsuit for plagiarizing The Chiffons’ 1963 song “He’s So Fine.” A U.S. court ruled that Harrison had committed “subconscious plagiarism” – unintentionally copying the tune – ordering him to pay nearly $1.6 million in damages.
[ii] Letter to Anne Macy, reprinted in Nella Braddy, Anne Sullivan Macy: The Story Behind Helen Keller, Doubleday, Doran & Co. (1933).
[iii] There is no substantive evidence that Einstein ever said this. More likely, it is a phrase associated with British philosopher C.E.M. Joad, later repurposed to criticize Einstein’s failure to cite his sources in his early papers on special relativity.
[iv] Jane Mount, Bibliophile: An Illustrated Miscellany, Chronicle Books (2018).
[v] Eaton Business School, “Understanding 7 Types of AI with Examples,” blog post, January 20, 2026.
[vi] Artificial narrow intelligence (weak AI), artificial general intelligence (strong AI), artificial superintelligence, reactive machine AI, limited memory AI, theory of mind AI, and self-aware AI.
[vii] See Alexander Pokorny, “Uncovering Generative AI Originality: Can It Deliver?,” Artellico Blog, September 3, 2025.