Start with a loose summary. Then pack in more information — without adding more words. Each round gets denser, forcing better compression and clearer writing.
Most AI summaries are too fluffy. They use lots of words to say very little: "The article discusses an important development in the technology industry..." That's filler, not information.
Chain of Density fixes this with a simple constraint: write a summary, then rewrite it at exactly the same length but with more key information included. Each round, identify 1–3 important facts that are missing and work them in — which means cutting filler to make room. After about 3 rounds, you get summaries that humans actually prefer: packed with information but still readable.
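The loop described above can be sketched in a few lines. Here `call_model` is a hypothetical stand-in for whatever LLM client you use, and the prompt wording is an illustration of the constraint, not the exact wording from the original research:

```python
def chain_of_density(article, call_model, rounds=3):
    """Iteratively densify a summary while holding its length fixed.

    call_model(prompt) -> str is a placeholder for your LLM client;
    swap in a real API call. The prompt text is illustrative.
    """
    summary = call_model(f"Write a short, ~80-word summary of:\n\n{article}")
    for _ in range(rounds):
        prompt = (
            "Identify 1-3 important facts from the article that are missing "
            "from the summary below. Rewrite the summary at EXACTLY the same "
            "length, working those facts in by cutting filler.\n\n"
            f"Article:\n{article}\n\n"
            f"Current summary:\n{summary}"
        )
        summary = call_model(prompt)
    return summary
```

The fixed-length instruction lives in every round's prompt, so each rewrite has to trade filler for facts rather than simply growing.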
This composition builds on two simpler techniques: Extract What Matters and Loop Until Done. Chain of Density combines entity identification (what key facts are missing?) with iterative refinement (keep rewriting until dense enough), constrained by a fixed length to force genuine compression.
Consider summarizing a news article about Apple's iPhone 15 Pro launch.
Sparse version — entities: tech company, smartphone. Almost no specific information.
Dense version — entities: Apple, iPhone 15 Pro, Tim Cook, A17 Pro, titanium, USB-C, $999, 80M units. Packed with specifics.
Same length. Four times the information. All filler removed.
Bars show entity count per summary. Humans prefer Round 3 — informative but still readable.
There's a sweet spot. Too sparse and the summary is useless filler. Too dense and it reads like a telegram. Research shows Round 3 hits the mark — humans consistently prefer it over both sparser and denser versions.
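The "four times the information" claim can be made concrete by measuring entities per word, the density metric the research tracks. A rough sketch, using the entity lists from the example above (real pipelines would extract entities automatically; the 80-word length is illustrative):

```python
def entity_density(entities, summary_word_count):
    """Named entities per word: a proxy for how much a summary packs in."""
    return len(entities) / summary_word_count

# Entities from the sparse and dense example summaries,
# with both summaries held at the same illustrative 80-word length:
sparse = ["tech company", "smartphone"]
dense = ["Apple", "iPhone 15 Pro", "Tim Cook", "A17 Pro",
         "titanium", "USB-C", "$999", "80M units"]
```

At equal length, the dense summary carries four times the entities, so its density is four times higher; tracking this number per round is how you locate the Round 3 sweet spot.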
The fixed-length constraint is the key innovation. When you ask AI to "add more information," it naturally makes the summary longer. By holding length constant, you force it to do something much harder and more valuable: compression. To fit new facts in, it must cut filler, fuse related ideas, and choose more precise words.
This mimics what skilled human editors do: not adding more words, but making every word carry more weight. The iterative approach also prevents the model from getting overwhelmed — adding 1–3 facts at a time is manageable, even if the final density would be hard to achieve in one shot.
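Because models drift from length constraints in practice, it is worth verifying each round's output and retrying when a rewrite runs long. A minimal word-count guard (the 10% tolerance is an arbitrary choice for illustration, not a figure from the research):

```python
def holds_length(previous, rewrite, tolerance=0.10):
    """True if the rewrite stays within `tolerance` of the previous word count.

    The 10% default is an illustrative assumption; tighten or loosen it,
    and re-prompt the round whenever the check fails.
    """
    target = len(previous.split())
    return abs(len(rewrite.split()) - target) <= tolerance * target
```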
Start sparse. Each round, identify missing key facts and rewrite at the same length to include them. The fixed-length constraint forces real compression — every word earns its place.
Chain of Density is an iterative refinement technique, similar in spirit to Recursive Chain-of-Feedback (which iteratively fixes errors) and Reflexion (which iteratively improves through self-critique). The difference is the specific mechanism: a fixed-length constraint that forces compression rather than general improvement.
It also connects to Extract What Matters at the single-prompt level. Where that technique asks the model to identify key information once, Chain of Density does it iteratively, finding deeper and more specific entities with each round.