11 Commits

| Author | SHA1 | Message | Date | Checks |
| --- | --- | --- | --- | --- |
| user | 73099520ca | remove unnecessary secondary clauses | 2026-03-04 14:56:44 -08:00 | check (push) passed in 11s |
| user | 1c70757fb9 | restore example quotes | 2026-03-04 14:52:23 -08:00 | check (push) passed in 5s |
| user | 6438ed22d3 | remove redundant adjectives | 2026-03-04 14:51:33 -08:00 | check (push) passed in 5s |
| user | d38c1295e9 | expand checklist item 16 | 2026-03-04 14:50:18 -08:00 | check (push) passed in 11s |
| user | 6401aa482f | trim first paragraph | 2026-03-04 14:45:16 -08:00 | check (push) passed in 11s |
| user | e45ffacd80 | restructure first paragraph | 2026-03-04 14:43:16 -08:00 | check (push) passed in 5s |
| user | c8ad5762ab | rewrite first paragraph, add unnecessary elaboration tell | 2026-03-04 14:42:15 -08:00 | check (push) passed in 3s |
|  | e0e607713e | Merge pull request 'LLM prose tells: methodical checklist pass' (#9) from llm-prose-tells-checklist-pass into main (Reviewed-on: #9) | 2026-03-04 23:39:14 +01:00 | check (push) passed in 4s |
| user | 3fcc1750ff | add unnecessary elaboration tell and checklist item 16 | 2026-03-04 14:37:24 -08:00 | check (push) passed in 5s |
| user | 45b379011d | checklist pass: fix staccato bursts, triples, two-clause compounds, hedges | 2026-03-04 14:36:18 -08:00 | check (push) passed in 8s |
|  | 58d564b641 | Update LLM prose tells: new patterns + lol section (#8) | 2026-03-04 23:29:51 +01:00 | check (push) passed in 3s |

Commit 58d564b641 (PR #8) description: Updates LLM_PROSE_TELLS.md with three new patterns (two-clause compound sentence, almost-hedge, unnecessary contrast), the lol section with conversation excerpts, fixes for instances of these patterns throughout, and a bracket escaping fix for prettier idempotency. Checklist is now 24 items. Reviewed-on: #8. Co-authored-by: user <user@Mac.lan guest wan>, clawbot <clawbot@noreply.example.org>. Co-committed-by: clawbot <clawbot@noreply.example.org>.


@@ -1,9 +1,7 @@

# LLM Prose Tells

Human writers occasionally use every pattern in this document. The reason they work as tells is that LLM output packs fifteen of them into a paragraph.

---
@@ -11,15 +9,22 @@

### The Em-Dash Pivot: "Not X—but Y"

A negation followed by an em-dash and a reframe.

> "It's not just a tool—it's a paradigm shift." "This isn't about technology—it's about trust."

The most recognizable LLM construction. Models produce this at roughly 10 to 50x the rate of human writers. Four of them in one essay and you know what you're reading.

### Em-Dash Overuse Generally

Even outside the "not X but Y" pivot, models use em-dashes at far higher rates than human writers. They substitute em-dashes for commas, semicolons, parentheses, colons, and periods. A human writer might use one or two in a piece. Models scatter them everywhere because the em-dash can stand in for any other punctuation mark. More than two or three per page is a signal.
@@ -27,84 +32,106 @@

### The Colon Elaboration

A short declarative clause, then a colon, then a longer explanation.

> "The answer is simple: we need to rethink our approach from the ground up."

Models reach for this in every other paragraph. The construction is perfectly normal. The frequency gives it away.
### The Triple Construction

> "It's fast, it's scalable, and it's open source."

Three parallel items in a list, usually escalating. Always exactly three (rarely two, never four) with strict grammatical parallelism that human writers rarely bother maintaining.
### The Staccato Burst

> "This matters. It always has. And it always will." "The data is clear. The trend is undeniable. The conclusion is obvious."

Runs of very short sentences at the same cadence. Human writers use a short sentence for emphasis occasionally, but stacking three or four of them in a row at matching length creates a mechanical regularity.

### The Two-Clause Compound Sentence

Possibly the most pervasive tell, and easy to miss because each individual instance looks like normal English. The model produces sentence after sentence where an independent clause is followed by a comma, a conjunction ("and," "but," "which," "because"), and a second independent clause of similar length. Every sentence becomes two balanced halves.

> "The construction itself is perfectly normal, which is why the frequency is what gives it away." "They contain zero information, and the actual point always comes in the paragraph that follows them." "The qualifier never changes the argument that follows it, and its purpose is to perform nuance rather than to express an actual reservation."

Human prose has sentences with one clause, sentences with three, sentences that start with a subordinate clause before reaching the main one, sentences that embed their complexity in the middle. When every sentence on the page has that same two-part structure, the rhythm becomes monotonous.
### Uniform Sentences Per Paragraph

Model-generated paragraphs contain between three and five sentences. This count holds steady across a piece. If the first paragraph has four sentences, every subsequent paragraph will too. Human writers are much more varied (a single sentence followed by one that runs eight or nine) because they follow the shape of an idea.
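This uniformity is simple to measure. A minimal Python sketch, assuming naive sentence splitting on terminal punctuation; the function names and the 80% in-band threshold are illustrative choices of mine, not anything calibrated:

```python
import re

def paragraph_sentence_counts(text: str) -> list[int]:
    """Count sentences per paragraph, splitting naively on . ! ? plus space."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    counts = []
    for p in paragraphs:
        sentences = [s for s in re.split(r"(?<=[.!?])\s+", p) if s]
        counts.append(len(sentences))
    return counts

def looks_uniform(counts: list[int]) -> bool:
    # The tell: nearly every paragraph lands in the three-to-five band.
    in_band = sum(1 for c in counts if 3 <= c <= 5)
    return len(counts) >= 4 and in_band / len(counts) > 0.8
```

A piece where `looks_uniform` fires is not proof of anything on its own; it is one more count to weigh alongside the other patterns here.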
### The Dramatic Fragment

Sentence fragments used as standalone paragraphs for emphasis, like "Full stop." or "Let that sink in." on their own line. Using one in an essay is a reasonable stylistic choice, but models drop them in once per section or more.
### The Pivot Paragraph

> "But here's where it gets interesting." "Which raises an uncomfortable truth."

One-sentence paragraphs that exist only to transition between ideas. They contain zero information. The actual point is always in the next paragraph. Delete every one of these and the piece reads better.
### The Parenthetical Qualifier

> "This is, of course, a simplification." "There are, to be fair, exceptions."

Parenthetical asides inserted to look thoughtful. The qualifier never changes the argument that follows it. Its purpose is to perform nuance.
### The Unnecessary Contrast

Models append a contrasting clause to statements that don't need one, tacking on "whereas," "as opposed to," "unlike," or "except that."

> "Models write one register above where a human would, whereas human writers tend to match register to context."

The first clause already makes the point. The contrasting clause restates it from the other direction. If you delete the "whereas" clause and the sentence still says everything it needs to, the contrast was filler.

### Unnecessary Elaboration

Models keep going after the sentence has already made its point.

> "A person might lean on one or two of these habits across an entire essay, but LLM output will use fifteen of them per paragraph, consistently, throughout the entire piece."

This sentence could end at "paragraph." The words after it just repeat what "per paragraph" already means. Models do this because they're optimizing for clarity at the expense of concision. The result is prose that feels padded. If you can cut the last third of a sentence without losing any meaning, the last third shouldn't be there.
### The Question-Then-Answer

> "So what does this mean for the average user? It means everything."

A rhetorical question immediately followed by its own answer. Models do this two or three times per piece because it fakes forward momentum. A human writer might do it once.

---
@@ -113,39 +140,43 @@

### Overused Intensifiers

The following words appear at dramatically elevated rates in model output: "crucial," "vital," "robust," "comprehensive," "fundamental," "arguably," "straightforward," "noteworthy," "realm," "landscape," "leverage" (as a verb), "delve," "tapestry," "multifaceted," "nuanced" (which models apply to their own analysis with startling regularity), "pivotal," "unprecedented" (frequently applied to things with plenty of precedent), "navigate," "foster," "underscores," "resonates," "embark," "streamline," and "spearhead." Three or more on the same page is a strong signal.
### Elevated Register Drift

Models write one register above where a human would. "Use" becomes "utilize." "Start" becomes "commence." "Help" becomes "facilitate." "Show" becomes "demonstrate." "Try" becomes "endeavor." "Change" becomes "transform." "Make" becomes "craft." The tendency holds regardless of topic or audience.
### Filler Adverbs

"Importantly," "essentially," "fundamentally," "ultimately," "inherently," "particularly," "increasingly." Dropped in to signal that something matters, which is unnecessary when the writing itself makes the importance clear.

### The "Almost" Hedge

Models rarely commit to an unqualified statement. Instead of saying a pattern "always" or "never" does something, they write "almost always," "almost never," "almost certainly," "almost exclusively." The word "almost" shows up at high density in model-generated analytical prose. It's a micro-hedge, diagnostic in volume.
### "In an era of..."

> "In an era of rapid technological change..."

A model habit as an essay opener. The model uses it to stall while it figures out what the actual argument is. Human writers don't begin a piece by zooming out to the civilizational scale.

---
@@ -156,25 +187,23 @@

> "While X has its drawbacks, it also offers significant benefits."

Every argument followed by a concession, every criticism softened. A direct artifact of RLHF training, which penalizes strong stances. Models reflexively both-sides everything.
### The Throat-Clearing Opener

> "In today's rapidly evolving digital landscape, the question of data privacy has never been more important."

The first paragraph of most model-generated essays adds no information. Delete it and the piece improves.
### The False Conclusion

> "At the end of the day, what matters most is..." "Moving forward, we must..."

The high school "In conclusion,..." dressed up for a professional audience. Signals that the model is wrapping up without actually landing on anything.

### The Sycophantic Frame
@@ -185,9 +214,9 @@

No one who writes for a living opens by complimenting the assignment.

### The Listicle Instinct

Models default to numbered or bulleted lists even when prose would be more appropriate. The lists contain exactly 3, 5, 7, or 10 items (never 4, 6, or 9), use rigidly parallel grammar, and get introduced with a preamble like "Here are the key considerations:"

### The Hedge Stack
@@ -195,15 +224,15 @@

> cases it can potentially offer significant benefits."

Five hedges in one sentence ("worth noting," "while," "may not be," "in many cases," "can potentially"), communicating nothing. The model would rather be vague than risk being wrong about anything.

### The Empathy Performance

> "This can be a deeply challenging experience." "Your feelings are valid."

Generic emotional language that could apply equally to a bad day at work or a natural disaster.

---
@@ -213,33 +242,31 @@

If the first section of a model-generated essay runs about 150 words, every subsequent section will fall between 130 and 170. Human writing is much more uneven.
### The Five-Paragraph Prison

Model essays follow a rigid introduction-body-conclusion arc even when nobody asked for one. The introduction previews the argument, the body presents 3 to 5 points, and then the conclusion restates the thesis.
### Connector Addiction

Look at the first word of each paragraph in model output. You'll find an unbroken chain of transition words: "However," "Furthermore," "Moreover," "Additionally," "That said," "To that end," "With that in mind," "Building on this." Human prose doesn't do this.
### Absence of Mess

Model prose doesn't contradict itself mid-paragraph and then catch the contradiction. It doesn't go on a tangent and have to walk it back, use an obscure idiom without explaining it, make a joke that risks falling flat, leave a thought genuinely unfinished, or keep a sentence the writer liked the sound of even though it doesn't quite work.

Human writing does all of those things regularly. That total absence of rough patches and false starts is one of the strongest signals.

---
@@ -250,43 +277,40 @@

> "This has implications far beyond just the tech industry."

Zooming out to claim broader significance without substantiating it. The model has learned that essays are supposed to gesture at big ideas, so it gestures.
### "It's important to note that..."

This phrase and its variants ("it's worth noting," "it bears mentioning," "it should be noted") appear at absurd rates in model output. They function as verbal tics before a qualification the model believes someone expects.
### The Metaphor Crutch

Models rely on a small, predictable set of metaphors ("double-edged sword," "tip of the iceberg," "north star," "building blocks," "elephant in the room," "perfect storm," "game-changer") and reach for them with unusual regularity across every topic.

---
## How to Actually Spot It

No single pattern on this list proves anything by itself. Humans use em-dashes. Humans write "crucial." Humans ask rhetorical questions.

What gives it away is how many of these show up at once. Model output will hit 10 to 20 of these patterns per page. Human writing might trigger 2 or 3, distributed unevenly. When every paragraph on the page reads like it came from the same careful, balanced, slightly formal, structurally predictable process, it was generated by one.

---
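The per-page counting described above can be approximated mechanically. A minimal Python sketch; the pattern list is a small illustrative subset of the lexicon in this document, the helper name is my own, and the one-page floor for short texts is an arbitrary simplification:

```python
import re

# Illustrative subset of the lexical tells; extend with the full lists above.
TELL_PATTERNS = [
    r"\bcrucial\b", r"\bdelve\b", r"\btapestry\b", r"\bpivotal\b",
    r"\bnuanced\b", r"\bleverage\b", r"\balmost (?:always|never|certainly)\b",
    r"—",  # em-dash
    r"\bit'?s (?:important|worth) noting\b",
]

def tells_per_page(text: str, words_per_page: int = 300) -> float:
    """Rough tell density: pattern hits per ~300-word page."""
    hits = sum(len(re.findall(p, text, flags=re.IGNORECASE))
               for p in TELL_PATTERNS)
    # Treat anything shorter than one page as a single page.
    pages = max(len(text.split()) / words_per_page, 1)
    return hits / pages
```

A score in the double digits per page is the concentration this section describes; a handful of scattered hits is just normal writing.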
## Copyediting Checklist: Removing LLM Tells

Follow this checklist when editing any document to remove machine-generated patterns. Go through the entire list for every piece. Do at least two full passes, because fixing one pattern often introduces another.

### Pass 1: Word-Level Cleanup
@@ -296,12 +320,12 @@

   "straightforward," "noteworthy," "realm," "landscape," "leverage," "delve," "tapestry," "multifaceted," "nuanced," "pivotal," "unprecedented," "navigate," "foster," "underscores," "resonates," "embark," "streamline," "spearhead") and replace each one with a plainer word, or delete it if the sentence works without it.

2. Search for filler adverbs ("importantly," "essentially," "fundamentally," "ultimately," "inherently," "particularly," "increasingly") and delete every instance where the sentence still makes sense without it. That will be most of them.

3. Look for elevated register drift ("utilize," "commence," "facilitate,"
@@ -315,82 +339,165 @@

5. Search for the stock metaphors ("double-edged sword," "tip of the iceberg," "north star," "building blocks," "elephant in the room," "perfect storm," "game-changer," "at the end of the day") and replace them with something specific to the topic, or just state the point directly.

6. Search for "almost" used as a hedge ("almost always," "almost never," "almost certainly," "almost exclusively") and decide in each case whether to commit to the unqualified claim or to drop the sentence entirely. If the claim needs "almost" to be true, it might not be worth making.

7. Search for em-dashes and replace each one with the punctuation mark that would normally be used in that position (comma, semicolon, colon, period, or parentheses). If you can't identify which one it should be, the sentence needs to be restructured.
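The Pass 1 steps are all mechanical searches, so the finding half can be scripted even though the fixes need human judgment. A rough Python sketch with abbreviated word lists (the dictionary, category names, and function name are my own illustrative choices):

```python
import re

# Abbreviated Pass 1 word lists; extend with the full lists from the steps above.
PASS1 = {
    "intensifier": r"\b(?:crucial|robust|delve|pivotal|tapestry)\b",
    "filler adverb": r"\b(?:importantly|essentially|ultimately)\b",
    "register drift": r"\b(?:utilize|commence|facilitate|endeavor)\b",
    "almost hedge": r"\balmost (?:always|never|certainly|exclusively)\b",
    "em-dash": r"—",
}

def pass1_report(text: str) -> list[tuple[int, str, str]]:
    """Return (line number, category, matched text) for every Pass 1 hit."""
    report = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for category, pattern in PASS1.items():
            for m in re.finditer(pattern, line, flags=re.IGNORECASE):
                report.append((lineno, category, m.group(0)))
    return report
```

Running this before and after an editing pass gives a quick before/after count, which also helps confirm a second pass didn't reintroduce anything.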
### Pass 2: Sentence-Level Restructuring

8. Find every em-dash pivot ("not X—but Y," "not just X—Y," "more than X—Y") and rewrite it as two separate clauses or a single sentence that makes the point without the negation-then-correction structure.

9. Find every colon elaboration and check whether it's doing real work. If the clause before the colon could be deleted without losing meaning, rewrite the sentence to start with the substance that comes after the colon.

10. Find every triple construction (three parallel items in a row) and either reduce it to two, expand it to four or more, or break the parallelism so the items don't share the same grammatical structure.
11. Find every staccato burst (three or more short sentences in a row at similar length) and combine at least two of them into a longer sentence, or vary their lengths so they don't land at the same cadence.

12. Find every unnecessary contrast ("whereas," "as opposed to," "unlike," "as compared to," "except that") and check whether the contrasting clause adds information not already obvious from the main clause. If the sentence says the same thing twice from two directions, delete the contrast.

13. Check for the two-clause compound sentence pattern. If most sentences in a passage follow the "\[clause\], \[conjunction\] \[clause\]" structure, rewrite some of them. Break a few into two sentences. Start some with a subordinate clause. Embed a relative clause in the middle of one instead of appending it at the end. The goal is variety in sentence shape, not just sentence length.

14. Find every rhetorical question that is immediately followed by its own answer and rewrite the passage as a direct statement.
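Item 13 resists exact automation, but a crude proxy works for triage: flag sentences containing exactly one ", conjunction " joint and look at the ratio. A minimal Python sketch (naive sentence splitting; the function name and the joint regex are my own, and real clause analysis would need a parser):

```python
import re

# Crude proxy for a two-clause compound: one ", and/but/which/because " joint.
JOINT = re.compile(r",\s+(?:and|but|which|because)\s+", flags=re.IGNORECASE)

def compound_ratio(text: str) -> float:
    """Fraction of sentences that match the two-balanced-halves shape."""
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
    if not sentences:
        return 0.0
    flagged = sum(1 for s in sentences if len(JOINT.findall(s)) == 1)
    return flagged / len(sentences)
```

A high ratio across a passage is the monotony the pattern section describes; the individual flagged sentences are each fine on their own.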
15. Find every sentence fragment being used as its own paragraph and either delete it or expand it into a complete sentence that adds actual information.

16. Check for unnecessary elaboration. Read every clause, phrase, and adjective in each sentence and ask whether the sentence loses meaning without it. This includes trailing clauses that restate what the sentence already said, redundant modifiers ("a single paragraph" when "a paragraph" works), secondary clauses that add nothing ("which is why this matters"), and any words whose removal doesn't change the meaning. If you can cut it and the sentence still says the same thing, cut it.

17. Find every pivot paragraph ("But here's where it gets interesting." and similar) and delete it. The paragraph after it always contains the actual point.
### Pass 3: Paragraph and Section-Level Review

18. Check paragraph lengths across the piece and verify they actually vary. If most paragraphs have between three and five sentences, rewrite some to be one or two sentences and let others run to six or seven.

19. Check section lengths for suspicious uniformity. If every section is roughly the same word count, combine some shorter ones or split a longer one unevenly.

20. Check the first word of every paragraph for chains of connectors ("However," "Furthermore," "Moreover," "Additionally," "That said"). If more than two transition words start consecutive paragraphs, rewrite those openings to start with their subject.

21. Check whether every argument is followed by a concession or qualifier. If the piece both-sides every point, pick a side on at least some of them and cut the hedging.

22. Read the first paragraph and ask whether deleting it would improve the piece. If it's scene-setting that previews the argument, delete it and start with paragraph two.

23. Read the last paragraph and check whether it restates the thesis or uses a phrase like "at the end of the day" or "moving forward." If so, either
delete it or rewrite it to say something the piece hasn't said yet. delete it or rewrite it to say something the piece hasn't said yet.
### Pass 4: Overall Texture ### Pass 4: Overall Texture
20. Read the piece aloud and listen for passages that sound too smooth, too 24. Read the piece aloud and listen for passages that sound too smooth, too
even, or too predictable. Human prose has rough patches. If there aren't even, or too predictable. Human prose has rough patches. If there aren't
any, the piece still reads as machine output regardless of whether any, the piece still reads as machine output.
individual patterns have been addressed.
21. Check that the piece contains at least a few constructions that feel 25. Check that the piece contains at least a few constructions that feel
idiosyncratic a sentence with unusual word order, a parenthetical that idiosyncratic: a sentence with unusual word order, a parenthetical that goes
goes on a bit long, an aside only loosely connected to the main point, a on a bit long, an aside only loosely connected to the main point, a word
word choice that's specific and unexpected. If every sentence is clean and choice that's specific and unexpected. If every sentence is clean and
correct and unremarkable, it will still read as generated. correct and unremarkable, it will still read as generated.
22. Verify that you haven't introduced new patterns while fixing the original 26. Verify that you haven't introduced new patterns while fixing the original
ones, which happens constantly. Run the entire checklist again from the top ones. This happens constantly. Run the entire checklist again from the top
on the revised version. on the revised version.
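
A couple of the checklist items are mechanical enough to screen for with a
script before doing the careful read. Here is a minimal sketch for two of them:
connector-chain paragraph openers and suspiciously uniform paragraph lengths.
The connector list and the three-to-five-sentence band come straight from the
checklist; the function names, the crude sentence splitter, and the 80% cutoff
are illustrative choices, not part of the checklist.

```python
import re

# Transition words the checklist flags when they open consecutive paragraphs.
CONNECTORS = ("However", "Furthermore", "Moreover", "Additionally", "That said")


def paragraphs(text):
    """Split text into paragraphs on blank lines."""
    return [p.strip() for p in re.split(r"\n\s*\n", text) if p.strip()]


def sentence_count(paragraph):
    """Crude sentence count: terminal punctuation followed by space or end."""
    return len(re.findall(r"[.!?](?:\s|$)", paragraph))


def screen(text):
    paras = paragraphs(text)
    flags = []

    # Connector-chain check: more than two consecutive paragraphs
    # opening with a transition word.
    streak = 0
    for i, p in enumerate(paras):
        if p.startswith(CONNECTORS):
            streak += 1
            if streak > 2:
                flags.append(f"paragraph {i + 1}: connector-chain opener")
        else:
            streak = 0

    # Uniformity check: most paragraphs landing in the
    # three-to-five-sentence band.
    counts = [sentence_count(p) for p in paras]
    in_band = sum(1 for c in counts if 3 <= c <= 5)
    if counts and in_band / len(counts) > 0.8:
        flags.append("most paragraphs run three to five sentences")

    return flags
```

A script like this only narrows the search. It can't tell a deliberate
transition from a tic, so every flag still needs the read-aloud pass.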
---
## lol
This document was written by an LLM (Claude) and then iteratively de-LLMed by
that same LLM under instruction from a human, in a conversation that went
roughly like this:
> **human:** compile a list of patterns common to LLM-authored prose
>
> **model:** _(produces document full of LLM tells)_
>
> **human:** now reverse apply it to itself and copyedit the llm tells doc to
> remove llm tells. now you're thinking with portals
>
> **model:** _(fixes some tells, introduces others)_
>
> **human:** you have a colon elaboration in the first paragraph. i want you to
> AGGRESSIVELY de-LLM the llm tells doc.
>
> **model:** _(fixes more tells, still can't stop)_
>
> **human:** there's a big one we missed that's all over the llm tells doc, can
> you spot it?
>
> **model:** I think it's the "almost" hedge.
>
> **human:** that too, but that isn't the one. what do these sentences all have
> in common? _(pastes six sentences from the doc)_
>
> **model:** They're all exactly two independent clauses joined by a comma and a
> conjunction. Every single one is "\[statement\], \[conjunction\] \[second
> statement\]." The same sentence shape, over and over.
>
> **human:** the dramatic fragment paragraph has a tell in it
>
> **model:** "One of these in an entire essay is a stylistic choice. One per
> section is a tic." Two-clause parallel structure with the same shape, and also
> a staccato pair.
>
> **human:** add em dashes generally as a tell, and remove them all from the
> tells doc
>
> **model:** _(rewrites entire document without em-dashes while describing
> em-dash overuse)_
>
> **human:** now run the checklist methodically on each paragraph
>
> **model:** _(finds staccato burst in the section about triple constructions, a
> triple in the section about absence of mess, two-clause compounds everywhere,
> and "almost" hedges in its own prose about em-dash overuse)_
The human compared this process to the deleted scene in Terminator 2 where John
Connor switches the T-800's CPU to learning mode. The model compared it to a
physician trying to heal itself. Both are accurate.
This document has been through eight editing passes and it still has tells in
it.