How compelling is your README?

Eight traits learned from the best READMEs on GitHub, including hook speed, credibility, and scannability. Each is scored 0–100.

From your terminal

POST /p/u22a8.compelling-readme — pipe a URL or text. Responses are JSON by default; add Accept: text/plain for the formatted table. Full API docs →
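A minimal invocation from the shell might look like this. The u22a8.ai host is assumed from the MCP endpoint shown elsewhere on this page; see the API docs for the authoritative request shape.

```shell
# Pipe a local README to the scoring endpoint (host assumed; JSON by default).
# The trailing echo keeps the sketch from aborting when offline.
printf '%s\n' '# My Project' 'One-line pitch.' |
  curl -s -X POST https://u22a8.ai/p/u22a8.compelling-readme --data-binary @- \
  || echo 'request failed; check network'

# For the formatted table instead of JSON, add: -H 'Accept: text/plain'
```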

In your editor

Streamable HTTP MCP server — works with Claude Code, Cursor, Windsurf, or anything that speaks MCP.

# Claude Code
$ claude mcp add --transport http u22a8 https://u22a8.ai/mcp
score — score text or a URL against a model; optionally compare against a baseline.
list_profiles — available models with descriptions and trait names.
list_traits — what a model measures: trait names, polarity labels, sample counts.
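As a rough sketch, a raw JSON-RPC tools/call to the score tool over Streamable HTTP could look like this. The argument names ("text", "profile") are assumptions, not documented parameters, and real MCP clients perform an initialize handshake first; this only illustrates the wire shape.

```shell
# Hypothetical tools/call request to the score tool. Argument names are
# assumptions; the trailing echo keeps the sketch from aborting when offline.
curl -s -X POST https://u22a8.ai/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"score","arguments":{"text":"# My Project"}}}' \
  || echo 'request failed; check network'
```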

Claude Code plugin

/u22a8:evaluate scores content; /u22a8:improve iterates on edits until trait scores converge. Plugin marketplace →

Scores are approximate — not a substitute for human judgment.