How to Format JSON in JavaScript — 5 Methods
JSON is everywhere: API responses, config files, database exports, log pipelines. But raw JSON output is often a compact blob — no line breaks, no indentation, keys in whatever order the serializer spits them out. That makes debugging painful and collaboration harder than it needs to be.
JavaScript gives you several ways to format JSON, from the built-in JSON.stringify to
streaming libraries for files too large to load into memory. This guide walks through five practical
methods, covering the most common cases and the edge cases that catch developers off guard.
Method 1: JSON.stringify with Indentation
The simplest approach is already built into the language. JSON.stringify accepts three
arguments: the value to serialize, a replacer (more on that in Method 2), and a space
parameter that controls indentation.
const data = {
  name: "Alice",
  role: "engineer",
  skills: ["JavaScript", "TypeScript", "Node.js"],
  active: true,
};
// 2-space indent (common default)
console.log(JSON.stringify(data, null, 2));
// 4-space indent
console.log(JSON.stringify(data, null, 4));
// Tab indent
console.log(JSON.stringify(data, null, "\t"));
With null, 2, the output looks like this:
{
  "name": "Alice",
  "role": "engineer",
  "skills": [
    "JavaScript",
    "TypeScript",
    "Node.js"
  ],
  "active": true
}
Two spaces is the most common convention in JavaScript projects — it matches the default in ESLint,
Prettier, and most style guides. Four spaces is common in Python-adjacent tooling. When
space is omitted or set to 0, the output is a compact single-line string —
ideal for wire transmission.
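To see the size difference concretely, compare the compact and pretty forms of the same object (a quick sketch; exact lengths depend on your data):

```javascript
const data = {
  name: "Alice",
  role: "engineer",
  skills: ["JavaScript", "TypeScript", "Node.js"],
};

// Compact: no whitespace at all, smallest payload
const compact = JSON.stringify(data);

// Pretty: newlines plus two spaces per nesting level
const pretty = JSON.stringify(data, null, 2);

console.log(compact.length, pretty.length);
// The pretty form is noticeably larger; whitespace adds up fast
// on deeply nested structures, which is why APIs send compact JSON.
```

Both forms parse back to identical data, so the choice is purely about who reads the output: humans get indentation, machines get bytes.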
You can also format JSON instantly in the browser using the JSON Formatter tool — paste raw JSON, get pretty-printed output, and copy the result without writing any code.
Method 2: Custom Replacer Function
The second argument to JSON.stringify — the replacer — is a function or array that
filters and transforms values before serialization. This is where you get real control over what
ends up in the output.
The array form is the simplest approach for key whitelisting:
const user = {
  id: 42,
  name: "Bob",
  password: "s3cr3t",
  email: "bob@example.com",
  createdAt: new Date(),
};
// Array replacer: include only listed keys
const safeJson = JSON.stringify(user, ["id", "name", "email"], 2);
console.log(safeJson);
// {
//   "id": 42,
//   "name": "Bob",
//   "email": "bob@example.com"
// }
The function form gives you per-value control. Return the value to include it, return
undefined to drop it, or return a transformed value:
function replacer(key, value) {
  // Drop keys that start with underscore (private convention)
  if (key.startsWith("_")) return undefined;
  // Mask sensitive fields
  if (key === "password" || key === "token") return "[REDACTED]";
  // Note: Date.prototype.toJSON runs *before* the replacer, so `value`
  // is already an ISO string here. To detect the original Date, check
  // the holder object (`this`) — which requires a regular function,
  // not an arrow function:
  if (this[key] instanceof Date) return this[key].toISOString();
  return value;
}
const payload = {
  id: 1,
  name: "Carol",
  password: "hunter2",
  _internalFlag: true,
  lastLogin: new Date("2026-05-01"),
};
console.log(JSON.stringify(payload, replacer, 2));
// {
//   "id": 1,
//   "name": "Carol",
//   "password": "[REDACTED]",
//   "lastLogin": "2026-05-01T00:00:00.000Z"
// }

The replacer runs recursively on every nested object and array. This pattern is useful for logging pipelines where you want to strip credentials before writing to disk or sending data to an observability service.
Method 3: Pretty-Print with Sorted Keys
JavaScript object key order is actually specified since ES2015: integer-like keys come first in ascending order, then string keys in insertion order. But insertion order varies between serializers and code paths, so two logically identical objects can stringify differently. When you need deterministic output — for diffing, caching, or canonical representations — sorting keys alphabetically is the right move.
function sortedStringify(value, indent = 2) {
  return JSON.stringify(value, sortReplacer, indent);
}

function sortReplacer(key, value) {
  if (value !== null && typeof value === "object" && !Array.isArray(value)) {
    return Object.keys(value)
      .sort()
      .reduce((sorted, k) => {
        sorted[k] = value[k];
        return sorted;
      }, {});
  }
  return value;
}
const config = {
  version: "1.0",
  author: "Dave",
  dependencies: { typescript: "^5.4", eslint: "^9.0", astro: "^5.0" },
  name: "my-project",
};
console.log(sortedStringify(config));
// {
//   "author": "Dave",
//   "dependencies": {
//     "astro": "^5.0",
//     "eslint": "^9.0",
//     "typescript": "^5.4"
//   },
//   "name": "my-project",
//   "version": "1.0"
// }
Sorted keys mean that git diffs show only the lines that actually changed, rather than arbitrary
reorderings from different serializers. This is particularly useful for package.json
and similar config files checked into version control.
For converting JSON to other formats, the JSON to YAML tool and JSON to CSV tool also handle key ordering in their output.
Method 4: Format from a String (Parse + Stringify)
Real-world JSON usually arrives as a string — from a fetch response, a file read, a clipboard paste, or a database TEXT column. You need to parse it first, then reformat it. The critical piece is proper error handling: invalid JSON will throw, and you want to catch that gracefully.
function formatJsonString(rawString, indent = 2) {
  try {
    const parsed = JSON.parse(rawString);
    return { ok: true, result: JSON.stringify(parsed, null, indent) };
  } catch (err) {
    return { ok: false, error: err.message };
  }
}
const raw = '{"name":"Eve","scores":[100,95,88],"active":true}';
const { ok, result, error } = formatJsonString(raw);

if (ok) {
  console.log(result);
  // {
  //   "name": "Eve",
  //   "scores": [
  //     100,
  //     95,
  //     88
  //   ],
  //   "active": true
  // }
} else {
  console.error("Parsing failed:", error);
}
// Invalid input
const bad = '{"name": "Eve", "broken":}';
const r2 = formatJsonString(bad);
// { ok: false, error: "Unexpected token '}' ..." } — exact wording varies by engine

Wrapping parse errors in a structured return object makes this function safe to use in UI components and build scripts without surrounding every call site in a try/catch. The MDN documentation for JSON.stringify covers the full parameter spec, and RFC 8259 defines what valid JSON looks like at the protocol level.
Method 5: Streaming for Large Files
Methods 1–4 all load the entire JSON structure into memory before formatting. For files in the hundreds of megabytes or multi-gigabyte range, this blocks the Node.js event loop for the duration of the parse and can exhaust the heap — V8 also caps the length of a single string (roughly 512 MB on 64-bit builds), so a large enough file cannot even be read into one string before parsing.
The streaming approach reads the file in chunks and writes formatted output incrementally. For NDJSON
(one JSON object per line, common in log files and database exports), a readline-based
approach works without extra dependencies:
import { createReadStream, createWriteStream } from "node:fs";
import { createInterface } from "node:readline";

async function formatNdjsonFile(inputPath, outputPath) {
  const rl = createInterface({
    input: createReadStream(inputPath),
    crlfDelay: Infinity,
  });
  const output = createWriteStream(outputPath);

  for await (const line of rl) {
    if (!line.trim()) continue;
    try {
      const obj = JSON.parse(line);
      output.write(JSON.stringify(obj, null, 2) + "\n---\n");
    } catch (e) {
      output.write("[Invalid JSON line: " + e.message + "]\n---\n");
    }
  }
  output.end();
}

NDJSON is the simplest streaming format: each line is a valid, complete JSON object. Many export tools support it precisely because it is trivially streamable. If you control the format of large data exports, prefer NDJSON over a single giant JSON array.
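Producing NDJSON is as simple as consuming it: serialize each record compactly and join with newlines. A minimal sketch (the `records` array and `toNdjson` helper are illustrative):

```javascript
// Each record becomes one compact line. The file as a whole is not
// a single JSON document, but every individual line is valid JSON.
function toNdjson(records) {
  return records.map((r) => JSON.stringify(r)).join("\n") + "\n";
}

const records = [
  { event: "login", user: "alice" },
  { event: "logout", user: "bob" },
];

process.stdout.write(toNdjson(records));
// {"event":"login","user":"alice"}
// {"event":"logout","user":"bob"}
```

A consumer can then process the file line by line — exactly what the readline-based formatter above relies on — without ever holding more than one record in memory.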
Edge Cases to Watch Out For
These are the scenarios where standard JSON formatting either fails silently or throws unexpectedly.
Circular References
JSON.stringify throws a TypeError if an object references itself directly
or indirectly. Fix it with a replacer that tracks visited objects using a WeakSet:
function safeStringify(obj, indent = 2) {
  const seen = new WeakSet();
  return JSON.stringify(obj, (key, value) => {
    if (typeof value === "object" && value !== null) {
      if (seen.has(value)) return "[Circular]";
      seen.add(value);
    }
    return value;
  }, indent);
}
const a = { name: "circular" };
a.self = a;
console.log(safeStringify(a));
// {
//   "name": "circular",
//   "self": "[Circular]"
// }
The WeakSet holds references without preventing garbage collection, which avoids memory
leaks in long-running processes.
BigInt Values
JSON.stringify throws a TypeError for BigInt values: JavaScript parses JSON numbers as IEEE-754 doubles, so serializing a BigInt as a plain number could silently lose precision, and the language refuses to guess. Convert to a string in your replacer:
const data = { id: 9007199254740993n, value: 42 };

JSON.stringify(data, (key, value) =>
  typeof value === "bigint" ? value.toString() : value
, 2);
// {
//   "id": "9007199254740993",
//   "value": 42
// }

Map and Set Values
Both Map and Set serialize as empty objects ({}) — their contents live in internal slots, not in enumerable properties, so JSON.stringify sees nothing to serialize. Convert them explicitly in a replacer:
const data = {
  tags: new Set(["json", "javascript"]),
  meta: new Map([["source", "api"]]),
};

JSON.stringify(data, (key, value) => {
  if (value instanceof Set) return [...value];
  if (value instanceof Map) return Object.fromEntries(value);
  return value;
}, 2);
// {
//   "tags": [
//     "json",
//     "javascript"
//   ],
//   "meta": {
//     "source": "api"
//   }
// }

undefined Values
Object properties with undefined values are silently dropped. Array slots with
undefined become null. Use a replacer to convert undefined
to null when you need to preserve all keys:
const obj = { a: 1, b: undefined, c: null };

JSON.stringify(obj, null, 2);
// {
//   "a": 1,
//   "c": null
// }
// "b" is silently dropped

// Fix: convert undefined to null
JSON.stringify(obj, (key, value) =>
  value === undefined ? null : value
, 2);
// {
//   "a": 1,
//   "b": null,
//   "c": null
// }

Format JSON Instantly in the Browser
If you need to format a JSON blob right now without writing any code, the Toova JSON Formatter handles it in one click — paste raw JSON, get pretty-printed output with 2 or 4-space indentation, and copy the result. No signup, no file upload, everything runs locally in your browser.
For conversions between formats, JSON to YAML and JSON to CSV follow the same privacy-first approach — your data never leaves your device.
Conclusion
For most use cases, JSON.stringify(obj, null, 2) is all you need. Add a replacer
function when you need filtering, masking, or sorted keys. Wrap JSON.parse in a
try/catch when handling external input. Reach for streaming only when file size makes synchronous
parsing impractical. And keep the edge cases — circular refs, BigInt,
Map/Set, undefined — in the back of your mind when working
with unusual data shapes.