Use this checklist to evaluate UX text quality. Rate each criterion 0-10.

How to use this checklist

1. Review individual pieces: evaluate specific UI strings, messages, or flows
2. Audit entire experiences: score major screens or user journeys
3. Compare options: rate different versions to choose the best
4. Track improvements: measure before and after edits
5. Build rationale: use scores to explain writing decisions
6. Focus efforts: identify the lowest-scoring areas to improve first

Concise

  • No filler words like “basically”, “actually”, “just”
  • Each word adds meaning or clarity
  • Can’t remove words without losing meaning
Score 0-10: How well does each word earn its place?
  • Maximum meaning per word
  • Efficient phrasing
  • No redundancy
Score 0-10: How efficiently is information conveyed?
  • Optimal line length for readability
  • Breaks appropriately for scanning
  • Not too long (causes focus loss) or too short (causes back-and-forth eye movement)
Score 0-10: Is line length optimized for reading?
  • No more than 3-4 lines per paragraph
  • Sentences vary in length but average ~15-20 words
  • Broken into scannable chunks
Score 0-10: Are sentences and paragraphs scannable?
  • Most important words come first
  • Action words at start of sentences
  • Users see key information immediately
Score 0-10: Is critical information front-loaded?
  • Most critical information first
  • Secondary details follow
  • Nice-to-know information last
Score 0-10: Is information prioritized effectively?

Purposeful

  • Text helps users complete their task
  • Addresses user’s “why am I here?” question
  • Removes barriers to action
Score 0-10: How well does this support user goals?
  • Supports conversion, engagement, or retention
  • Aligns with product objectives
  • Advances organizational goals
Score 0-10: Does this advance business objectives?
  • Consistent with brand personality
  • Recognizable as coming from this product
  • Uses appropriate tone for context
Score 0-10: Is brand voice consistent?
  • User benefit is clear
  • Explains “what’s in it for me?”
  • Shows value before asking for action
Score 0-10: Is user value clear?
  • Written in second person (“you”)
  • Emphasizes outcomes, not features
  • User-centered, not company-centered
Score 0-10: Is the focus on user benefits?
  • Motivates action without being pushy
  • Matches user’s intention and journey stage
  • Appropriate level of urgency
Score 0-10: Does framing invite appropriate action?

Conversational

  • Sounds like something you’d say aloud
  • Flows naturally when read
  • Not stiff or overly formal
Score 0-10: Would you actually say this out loud?
  • Subject performs the action
  • Direct and energetic
  • Use passive only when it’s clearer (rare cases)
Score 0-10: Is active voice used appropriately?
  • Prepositions present (“to”, “from”, “with”)
  • Articles included (“a”, “an”, “the”)
  • Not telegraphic or robotic
Score 0-10: Does it sound human, not robotic?
  • Uses language your users use
  • Based on user research and testing
  • No unnecessary technical jargon
Score 0-10: Does this match user language?
  • Voice shines through when context allows
  • Not overly serious in light contexts
  • Not playful in serious moments
Score 0-10: Is personality appropriate for context?

Clear

  • Specific verbs that describe the action
  • “Delete” not “Remove” for permanent deletion
  • “Save” not “OK” for saving changes
Score 0-10: Are verbs specific and accurate?
  • Active imperative for buttons
  • Clear, direct instructions
  • Action-oriented language
Score 0-10: Are commands clear and direct?
  • 7th grade reading level for general audience
  • 10th grade for professional contexts
  • Avoids complex vocabulary and sentence structures
Score 0-10: Is reading level appropriate?
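One common way to check reading level is the Flesch-Kincaid grade formula. Below is a minimal sketch; the vowel-group syllable counter is a naive heuristic for illustration, not a production-quality readability tool:

```python
import re

def naive_syllables(word: str) -> int:
    # Rough estimate: count groups of consecutive vowels (naive heuristic;
    # overcounts words like "save" with a silent final "e").
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    # Flesch-Kincaid grade level:
    # 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(naive_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

print(round(fk_grade("Save your changes before you leave this page."), 1))
```

A score near 7 suggests 7th-grade readability; short UI strings will score erratically, so this check is most useful on longer passages like onboarding copy or error explanations.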
  • Titles tell you where you are
  • Not generic or vague
  • Provide context and orientation
Score 0-10: Do titles orient users effectively?
  • Same word means same thing throughout
  • UI patterns applied consistently
  • Terminology documented in style guide
Score 0-10: Is terminology consistent?

Scoring guide

  • 9-10: Excellent. Best-practice example.
  • 7-8: Good. Minor improvements possible.
  • 5-6: Adequate. Notable issues to address.
  • 3-4: Needs work. Significant problems.
  • 0-2: Poor. Major revision required.

Example evaluation

Text: “An error occurred while processing your request. Please try again.”

Scoring

  • Concise: 6/10. Wordy; could be “We couldn’t process your request. Try again.”
  • Purposeful: 4/10. Doesn’t help the user fix the problem or explain what happened.
  • Conversational: 5/10. Somewhat robotic; “an error occurred” is system-speak.
  • Clear: 5/10. Vague; doesn’t specify what error or why.

Overall: 5/10. Adequate but needs significant improvement.

Printable checklist

Use this condensed version for quick evaluations:

Concise (0-10 each)

  • Every word has a job
  • High information density
  • 40-60 characters per line
  • Short sentences/paragraphs
  • Front-loaded signal words
  • Ideas ordered by priority

Purposeful (0-10 each)

  • Supports user goals
  • Meets business goals
  • Reflects brand voice
  • Value proposition clear
  • User-benefit focused
  • Active, inviting framing

Conversational (0-10 each)

  • Natural, spoken language
  • Active voice predominates
  • Connecting words included
  • Familiar words/phrases
  • Appropriate personality

Clear (0-10 each)

  • Accurate action words
  • Command forms appropriate
  • Plain language
  • Meaningful titles
  • Consistent terminology
Overall score calculation:
Sum all individual scores and divide by number of criteria (22 total) for an overall 0-10 rating.
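The calculation above can be sketched in a few lines; the scores below are hypothetical values for illustration, grouped to match the 6 + 6 + 5 + 5 criteria in the four sections:

```python
# Average all 22 criterion scores (0-10 each) into one overall 0-10 rating.
scores = {
    "concise": [6, 7, 8, 7, 6, 7],       # 6 criteria
    "purposeful": [4, 5, 6, 5, 6, 5],    # 6 criteria
    "conversational": [5, 6, 5, 6, 5],   # 5 criteria
    "clear": [5, 6, 7, 6, 5],            # 5 criteria
}
all_scores = [s for group in scores.values() for s in group]
overall = sum(all_scores) / len(all_scores)
print(f"{len(all_scores)} criteria, overall {overall:.1f}/10")
```

Keeping the per-section averages as well as the overall number makes it easier to see which of the four qualities drags the score down.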