Calculate Text Entropy
Measure Shannon entropy, information density, and redundancy of your text.
What is Text Entropy?
Shannon entropy is a concept from information theory that measures the average amount of information produced by a source of data. In the context of text, it quantifies the randomness or unpredictability of the characters in a string.
A string with high entropy is highly unpredictable (like a random password), while a string with low entropy has many repeating patterns (like "aaaaa").
The Formula
H(X) = -Σ P(xᵢ) log₂ P(xᵢ)
- H(X): Entropy in bits per character
- P(xᵢ): Probability of character xᵢ
- Σ: Sum over all unique characters
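As a worked example, take the string "aabb": both 'a' and 'b' occur with probability 0.5, so H = -(0.5 · log₂ 0.5 + 0.5 · log₂ 0.5) = 1 bit per character. By contrast, "aaaaa" contains a single character with probability 1, so its entropy is 0 bits per character.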
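The sketch below shows one way this calculation might look in Python, estimating each character's probability from its frequency. The function names and the redundancy definition (1 − H / log₂ k for an alphabet of k unique characters) are illustrative assumptions, not necessarily the exact method this tool uses.

```python
from collections import Counter
import math

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per character: H = -sum p_i * log2(p_i)."""
    if not text:
        return 0.0
    n = len(text)
    # Probability of each unique character, estimated from its frequency.
    return -sum((c / n) * math.log2(c / n) for c in Counter(text).values())

def redundancy(text: str) -> float:
    """Redundancy relative to the maximum entropy log2(k) for k unique characters.

    One common definition (an assumption here): 0 means the text is as
    unpredictable as possible given its alphabet, 1 means fully repetitive.
    """
    k = len(set(text))
    if k <= 1:
        return 1.0
    return 1.0 - shannon_entropy(text) / math.log2(k)

print(shannon_entropy("aaaaa"))  # 0.0 -- fully predictable
print(shannon_entropy("aabb"))   # 1.0 -- two equally likely characters
print(redundancy("aabbaabb"))    # 0.0 -- maximally unpredictable over {a, b}
```

Note that this estimates probabilities from the input string itself, so short strings give noisy entropy values; measuring redundancy against log₂ k of the observed alphabet is only one of several reasonable baselines.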