§ 01 · free tool

JSON validator. Deep diagnostics, not just a parse check.

Paste JSON. We validate strict syntax, surface parse errors with line + column + visual context, run a common-mistake detector (trailing commas, single quotes, BOMs, comments) even on inputs that parse, and report structure stats (depth, keys, types). Toggle NDJSON mode for streaming / log-file content where each line is independent JSON.


Privacy: JSON is parsed in your browser. Nothing is sent or logged.

§ 02 · what gets checked

Three layers of validation.

Layer 1: Strict syntax. The browser's native JSON.parse against RFC 8259. Pass / fail, with line + column on failure. The same grammar every compliant parser implements — if it parses here, it'll parse on the wire.
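
A minimal sketch of how line + column can be recovered from a JSON.parse failure. It assumes a V8-style error message containing "position N"; other engines phrase errors differently, and `locateParseError` is an illustrative name, not the tool's implementation:

```javascript
// Sketch: derive line + column from a JSON.parse failure.
// Assumes a V8-style message containing "position N" (Chrome, Node);
// SpiderMonkey and JavaScriptCore word their errors differently.
function locateParseError(text) {
  try {
    JSON.parse(text);
    return null; // valid JSON
  } catch (err) {
    const m = /position (\d+)/.exec(err.message);
    if (!m) return { message: err.message }; // engine without offsets
    const pos = Number(m[1]);
    const before = text.slice(0, pos);
    const line = before.split("\n").length;        // 1-based line
    const column = pos - before.lastIndexOf("\n"); // 1-based column
    return { message: err.message, line, column };
  }
}
```

From line + column it is one more slice to show the offending line with a caret underneath — the "visual context" part.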

Layer 2: Common-mistake detector. Six patterns flagged even when the input passes strict parsing: BOM at start, trailing commas, single quotes, embedded comments, unquoted keys, raw control characters. These are the differences between strict JSON and the JS-flavored or JSON5 dialects. An input that parses in your editor (which may be lenient) might still trip a strict server-side parser. The detector flags both pre- and post-parse style issues.
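
The six checks can be sketched as pre-parse pattern tests on the raw text. These regexes are illustrative heuristics, not the tool's exact rules — each can false-positive on string contents (a URL containing "//" trips the comment check, for instance):

```javascript
// Sketch of the six pre-parse checks. Naive patterns: each can
// false-positive on string contents; illustrative only.
function detectCommonMistakes(text) {
  const findings = [];
  if (text.charCodeAt(0) === 0xfeff) findings.push("BOM at start");
  if (/,\s*[}\]]/.test(text)) findings.push("trailing comma");
  if (/'[^'\n]*'\s*:/.test(text)) findings.push("single quotes");
  if (/\/\/|\/\*/.test(text)) findings.push("comment");
  if (/[{,]\s*[A-Za-z_$][\w$]*\s*:/.test(text)) findings.push("unquoted key");
  if (/"[^"\n\r]*[\u0000-\u0008\u000b\u000c\u000e-\u001f]/.test(text))
    findings.push("raw control character in string");
  return findings;
}
```

Because the checks run on the raw text rather than the parse tree, they fire even when a lenient parser would have accepted the input.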

Layer 3: Structure stats. Once the JSON is parsed, we walk the tree and report: max nesting depth, total key count, value-type breakdown, array length distribution, raw + estimated-gzip size. Useful for spotting unexpectedly deep nesting (anything past 8 levels makes API consumers grumpy), denormalized arrays, or null-heavy schemas with wasted overhead.
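
The tree walk behind the stats panel can be sketched in a few lines. Field names (`maxDepth`, `keyCount`, `types`) are illustrative, and the sketch covers depth, key count, and the type breakdown only:

```javascript
// Sketch: walk a parsed value, recording depth, key count,
// and a per-type tally. Field names are illustrative.
function structureStats(value) {
  const stats = { maxDepth: 0, keyCount: 0, types: {} };
  function walk(v, depth) {
    stats.maxDepth = Math.max(stats.maxDepth, depth);
    const type = v === null ? "null" : Array.isArray(v) ? "array" : typeof v;
    stats.types[type] = (stats.types[type] || 0) + 1;
    if (type === "array") v.forEach((item) => walk(item, depth + 1));
    if (type === "object") {
      for (const key of Object.keys(v)) {
        stats.keyCount += 1;
        walk(v[key], depth + 1);
      }
    }
  }
  walk(value, 1);
  return stats;
}
```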

§ 03 · when to use this

Four jobs this tool covers.

Job 1: Webhook payload triage. A webhook from Stripe / Shopify / GitHub just arrived. Paste the body, validate, read the structure stats. The stats tell you the shape without scrolling through the JSON. If validation fails, the line + column points at exactly where to look.

Job 2: Log file scanning. Toggle NDJSON mode. Paste a chunk of a Cloudflare / AWS CloudWatch / Vector / Fluent Bit log. The validator counts valid lines, invalid lines, and shows the first eight failing lines with their parse error. Faster than grepping for a malformed entry across a 10,000-line export.
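
The NDJSON pass described above amounts to parsing each non-blank line independently and collecting the first few failures. A minimal sketch, with `validateNdjson` as an illustrative name and the eight-failure cap mirroring the behaviour described here:

```javascript
// Sketch of line-by-line NDJSON validation: count valid/invalid
// lines, keep the first eight failures with their parse error.
function validateNdjson(text) {
  const result = { valid: 0, invalid: 0, failures: [] };
  text.split("\n").forEach((line, i) => {
    if (line.trim() === "") return; // skip blank lines
    try {
      JSON.parse(line);
      result.valid += 1;
    } catch (err) {
      result.invalid += 1;
      if (result.failures.length < 8) {
        result.failures.push({ line: i + 1, error: err.message });
      }
    }
  });
  return result;
}
```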

Job 3: Pre-publish config check. Before merging a change to a Kubernetes manifest, a Terraform variables file, a GitHub Actions workflow, or any JSON-bearing config — paste the JSON sub-content here, validate, run the common-mistake detector. Catches the trailing comma you didn't see and the YAML-vs-JSON quote mismatch before CI does.

Job 4: AI output cleanup. When ChatGPT or Claude returns a JSON-shaped response embedded in prose, paste the JSON portion here. The common-mistake detector catches the JS-flavored quirks LLMs sometimes produce (unquoted keys, trailing commas, comments) — fix the input or instruct the model differently. Pair with our JSON Formatter for the pretty-print follow-up.

§ 04 · questions

Six questions users ask.

How is this different from JSON Formatter?

JSON Formatter prioritizes pretty-printing and indent options for readable output. JSON Validator prioritizes diagnostics: detailed error messages with surrounding context, common-mistake detection (trailing commas, single quotes, BOMs, comments — even on input that parses successfully but might break in stricter parsers), structure stats (depth, key count, type breakdown), and NDJSON line-by-line validation. Use Formatter when you want to read the JSON; Validator when you need to know if it's correct and what's in it.

What's NDJSON / JSON Lines?

NDJSON (Newline-Delimited JSON, sometimes called JSONL or JSON Lines) is a streaming format where each line is an independent JSON document. Common in log files (Cloudflare logs, AWS CloudWatch, structured application logs), data exports (BigQuery exports, MongoDB dumps, JSONL training data for LLMs), and any context that wants append-only JSON without a wrapping array. Toggle NDJSON mode and we validate each line independently — useful for catching the one bad row in a 10,000-row export.

What does the structure stats panel show?

Five fields: max nesting depth (how many levels deep the deepest leaf sits), total key count (count of all keys across all objects), value-type breakdown (counts of strings, numbers, booleans, nulls, objects, arrays), array length distribution (min, max, average for any arrays present), and approximate gzip-compressed size (a rough estimate of how big the JSON would be on the wire). Useful for spotting unexpectedly deep nesting, denormalized arrays, or wasted overhead from null-heavy schemas.

What 'common mistakes' does it catch?

Six classes: (1) trailing commas after the last element of an object or array — illegal in strict JSON but common in JavaScript-flavored JSON. (2) Single quotes instead of double — legal in JS but not JSON. (3) BOM characters at the start — invisible UTF-8 byte order marks that break some strict parsers. (4) Embedded // or /* */ comments — legal in JSONC and JSON5 but not strict JSON. (5) Unquoted keys ({ key: 'value' }) — JS object literal style, not JSON. (6) Control characters in strings (raw tabs, newlines) — must be escape-encoded. We surface each as a recommendation even if the input passes the strict parser.

Can it validate against a JSON Schema?

Not full JSON Schema (the IETF specification runs to hundreds of pages). For full schema validation, use a real validator like Ajv in your build pipeline. For a quick spot-check, paste a simple type-only schema in the side panel — we validate types, required fields, enum values, minLength, maxLength, minimum, maximum, and pattern (regex). That covers roughly 80% of real-world schemas. Anything more complex (allOf / oneOf / anyOf / $ref) needs Ajv.
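
The spirit of the spot-check can be sketched as a small recursive walk. This is not a standards-compliant validator and `spotCheck` is an illustrative name; it covers only type, required, and enum from the list above:

```javascript
// Minimal schema spot-check: type, required, enum. Not compliant
// JSON Schema; for allOf/oneOf/$ref use a real validator like Ajv.
function spotCheck(schema, data) {
  const errors = [];
  const actual = data === null ? "null" : Array.isArray(data) ? "array" : typeof data;
  if (schema.type && schema.type !== actual) {
    errors.push(`expected ${schema.type}, got ${actual}`);
  }
  if (schema.enum && !schema.enum.includes(data)) {
    errors.push("value not in enum");
  }
  if (actual === "object") {
    for (const key of schema.required || []) {
      if (!(key in data)) errors.push(`missing required key "${key}"`);
    }
    for (const [key, sub] of Object.entries(schema.properties || {})) {
      if (key in data) {
        errors.push(...spotCheck(sub, data[key]).map((e) => `${key}: ${e}`));
      }
    }
  }
  return errors;
}
```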

Is the JSON I paste sent anywhere?

No. We use the browser's native JSON.parse, which runs entirely client-side. The page is static HTML; the only network request is the initial page load. Safe for JSON containing API keys, customer records, internal config, or anything else you would not want to upload to a third party.