PureDevTools

JSON to SQL INSERT Generator

Paste a JSON array, get ready-to-run SQL INSERT statements — batch mode, CREATE TABLE, NULL handling, MySQL/PostgreSQL/SQL Server quoting

All processing happens in your browser. No data is sent to any server.


You exported 500 records from an API as a JSON array and need to load them into a PostgreSQL table. Writing INSERT INTO users (id, name, email, created_at) VALUES (1, 'Alice', 'alice@example.com', '2024-03-15T10:30:00Z') for 500 rows — with proper quoting, NULL handling, and dialect-specific escaping — is not a task you should be doing by hand.

Why This Generator (Not a JSON Viewer)

PureDevTools has tools for viewing JSON, formatting JSON, and converting JSON to CSV, TypeScript, Go, Python, and Rust. This tool generates SQL INSERT statements — with MySQL, PostgreSQL, SQL Server, and SQLite quoting conventions, multi-row batch inserts, CREATE TABLE inference, NULL handling, and ISO date detection. Everything runs in your browser; no data is sent anywhere.

What Is a JSON to SQL INSERT Generator?

When migrating data, seeding a database, or converting API responses into persistent records, you need SQL INSERT statements that match your table schema. Writing them by hand is error-prone, especially for large JSON datasets with many columns or nested values.

A JSON to SQL INSERT generator reads your JSON array and automatically produces correct INSERT INTO statements, handling:

  - Identifier quoting in your dialect's style (backticks, double quotes, or square brackets)
  - String escaping and NULL handling
  - Boolean and numeric conversion, plus ISO 8601 date detection
  - Multi-row batching and optional CREATE TABLE inference

This tool runs entirely in your browser — your JSON data never leaves your device.

How to Use This Tool

  1. Paste your JSON array in the JSON Input textarea (or click Load sample to try an example)
  2. Set the Table name to match your target table
  3. Choose the Quote style that matches your SQL dialect (backtick for MySQL, double-quote for PostgreSQL)
  4. Select INSERT mode: multi-row (batch) or single row per INSERT
  5. Configure NULL handling and optionally enable CREATE TABLE
  6. The SQL output updates instantly in the SQL Output panel
  7. Click Copy to copy to clipboard, or Download .sql to save as a file

Understanding the Generated SQL

Multi-Row INSERT (Batch Mode)

By default, the tool generates a single INSERT INTO statement with multiple VALUES rows:

INSERT INTO `users` (`id`, `name`, `email`) VALUES
  (1, 'Alice', 'alice@example.com'),
  (2, 'Bob', 'bob@example.com'),
  (3, 'Carol', 'carol@example.com');

This is significantly faster than individual inserts because:

  - The database parses and plans one statement instead of hundreds
  - Fewer round trips are needed between client and server
  - The whole batch can be committed in a single transaction

Use the Rows per batch setting to split large datasets into chunks. Set to 0 to put all rows in a single statement. A value of 500–1000 rows per batch is generally safe for most MySQL and PostgreSQL configurations.
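The batching behavior described above can be sketched in Python. This is an illustrative sketch, not the tool's actual implementation; the function names and the minimal `sql_literal` encoder are assumptions, and rows are assumed to be dicts sharing the same keys:

```python
def sql_literal(value):
    # Minimal encoder: NULL, 0/1 booleans, bare numbers, quoted strings.
    if value is None:
        return "NULL"
    if isinstance(value, bool):
        return "1" if value else "0"
    if isinstance(value, (int, float)):
        return str(value)
    return "'" + str(value).replace("'", "''") + "'"

def batch_inserts(rows, table, batch_size=100):
    """Split rows into chunks and emit one multi-row INSERT per chunk.
    A batch_size of 0 puts every row into a single statement."""
    cols = list(rows[0].keys())
    col_list = ", ".join(f"`{c}`" for c in cols)
    size = batch_size or len(rows)
    statements = []
    for start in range(0, len(rows), size):
        chunk = rows[start:start + size]
        values = ",\n  ".join(
            "(" + ", ".join(sql_literal(r.get(c)) for c in cols) + ")"
            for r in chunk
        )
        statements.append(
            f"INSERT INTO `{table}` ({col_list}) VALUES\n  {values};"
        )
    return statements
```

With three rows and a batch size of 2, this yields two statements: one with two VALUES rows and one with the remainder.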

Single-Row INSERT Mode

When single-row mode is selected, each row becomes its own statement:

INSERT INTO `users` (`id`, `name`, `email`) VALUES (1, 'Alice', 'alice@example.com');
INSERT INTO `users` (`id`, `name`, `email`) VALUES (2, 'Bob', 'bob@example.com');
INSERT INTO `users` (`id`, `name`, `email`) VALUES (3, 'Carol', 'carol@example.com');

Use single-row mode when you need:

  - Per-row error isolation, so one bad row doesn't abort the whole batch
  - Compatibility with tools or migration runners that execute one statement at a time
  - Easier diffing and selective editing of individual statements

Identifier Quote Styles

Different SQL databases use different syntax to quote identifiers (table names, column names):

Style | Example | Use with
--- | --- | ---
Backtick (`) | `column_name` | MySQL, MariaDB
Double quote (") | "column_name" | PostgreSQL, SQLite, standard SQL
Square bracket ([ ]) | [column_name] | SQL Server (T-SQL), MS Access
None (bare) | column_name | Simple names with no reserved words

Quoting identifiers is important when column names conflict with SQL reserved words (e.g., order, select, group, name, key) or contain special characters.
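The four quote styles amount to a simple lookup. A minimal Python sketch (the style keys and function name are illustrative, not the tool's API):

```python
QUOTE_STYLES = {
    "backtick": lambda n: "`" + n + "`",    # MySQL, MariaDB
    "double":   lambda n: '"' + n + '"',    # PostgreSQL, SQLite, standard SQL
    "bracket":  lambda n: "[" + n + "]",    # SQL Server (T-SQL), MS Access
    "none":     lambda n: n,                # bare identifier
}

def quote_ident(name, style="backtick"):
    """Quote a table or column name for the chosen dialect.
    (A name containing the quote character itself would also
    need that character doubled, omitted here for brevity.)"""
    return QUOTE_STYLES[style](name)
```

For example, a reserved word like `order` becomes `` `order` ``, `"order"`, or `[order]` depending on the style chosen.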

NULL Handling Options

Write NULL (default)

Missing or null values become the SQL NULL keyword:

INSERT INTO `users` (`name`, `bio`) VALUES ('Alice', NULL);

Use this when your column allows NULL and you want to explicitly represent missing data.

Write Empty String

Null values become '' (empty string):

INSERT INTO `users` (`name`, `bio`) VALUES ('Alice', '');

Use this for NOT NULL TEXT columns where you want a placeholder rather than a constraint violation.

Skip Column if Null

Columns with null values are omitted entirely from the INSERT:

INSERT INTO `users` (`name`) VALUES ('Alice');
-- bio column omitted, database default applies

Use this when your table has DEFAULT values you want to apply for missing fields. Note: in this mode each row may have a different column list, so multi-row INSERT falls back to single-row statements.
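Because the column list can differ per row in this mode, each row is emitted independently. A hedged sketch of the per-row logic (function name is illustrative):

```python
def insert_skip_nulls(row, table):
    """Emit a single-row INSERT that omits columns whose value is None,
    letting the database's DEFAULT apply for the missing columns."""
    present = {k: v for k, v in row.items() if v is not None}
    cols = ", ".join(f"`{c}`" for c in present)
    vals = ", ".join(
        str(v) if isinstance(v, (int, float)) and not isinstance(v, bool)
        else "'" + str(v).replace("'", "''") + "'"
        for v in present.values()
    )
    return f"INSERT INTO `{table}` ({cols}) VALUES ({vals});"
```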

CREATE TABLE Type Inference

When Include CREATE TABLE is checked, the tool prepends a CREATE TABLE IF NOT EXISTS statement with inferred column types:

CREATE TABLE IF NOT EXISTS `users` (
  `id` INTEGER,
  `name` TEXT,
  `email` TEXT,
  `age` INTEGER,
  `is_active` INTEGER,
  `registered_at` DATETIME,
  `score` REAL,
  `bio` TEXT
);

Type inference rules:

JSON value types in column | SQL type
--- | ---
All integers | INTEGER
All booleans | INTEGER (0/1)
Any floating-point number | REAL
ISO 8601 date strings only | DATETIME
Any strings (non-date) | TEXT
Objects or arrays | TEXT (JSON string)
All null | NULL

These types follow the SQLite type affinity model and are broadly compatible with MySQL and PostgreSQL (which will implicitly cast them to their native types).
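The inference rules above can be approximated in Python. This sketch checks each column's non-null values against the table's rules; the date regex and function name are assumptions for illustration:

```python
import re

# Loose ISO 8601 check: date, optionally followed by a time part.
ISO_DATE = re.compile(r"^\d{4}-\d{2}-\d{2}([T ]\d{2}:\d{2}(:\d{2})?.*)?$")

def infer_sql_type(values):
    """Infer a SQLite-affinity column type from one column's JSON values."""
    non_null = [v for v in values if v is not None]
    if not non_null:
        return "NULL"
    if all(isinstance(v, bool) for v in non_null):
        return "INTEGER"          # booleans stored as 0/1
    if all(isinstance(v, int) and not isinstance(v, bool) for v in non_null):
        return "INTEGER"
    if all(isinstance(v, (int, float)) and not isinstance(v, bool)
           for v in non_null):
        return "REAL"             # at least one float present
    if all(isinstance(v, str) and ISO_DATE.match(v) for v in non_null):
        return "DATETIME"
    return "TEXT"                 # plain strings, objects, arrays
```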

Value Encoding Rules

JSON value | SQL literal
--- | ---
null | NULL (or '' or omitted; see NULL handling)
true | 1
false | 0
Integer number | Literal number, e.g., 42
Float number | Literal number, e.g., 3.14
Infinity, NaN | NULL (not representable in SQL)
String | Single-quoted, e.g., 'hello'
String with single quote | Escaped by doubling: 'it''s fine'
String with backslash | Escaped: '\\'
ISO 8601 date string | Single-quoted: '2024-01-15'
Object or array | JSON-serialized, single-quoted: '{"key":"val"}'
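The encoding table can be read as a single dispatch function. A Python sketch of those rules (the function name is illustrative; backslash handling is dialect-dependent and omitted here):

```python
import json
import math

def sql_literal(value):
    """Encode a parsed JSON value as a SQL literal per the rules above."""
    if value is None:
        return "NULL"
    if isinstance(value, bool):               # check bool before int
        return "1" if value else "0"
    if isinstance(value, float) and (math.isnan(value) or math.isinf(value)):
        return "NULL"                         # not representable in SQL
    if isinstance(value, (int, float)):
        return str(value)
    if isinstance(value, (dict, list)):
        value = json.dumps(value)             # nested structures become JSON text
    return "'" + str(value).replace("'", "''") + "'"
```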

Performance Tips for Large Datasets

  - Keep batch mode enabled; 500–1000 rows per statement is a safe starting point for most MySQL and PostgreSQL setups
  - Wrap the whole import in a single transaction (BEGIN ... COMMIT) so the database commits once rather than per statement
  - For very large outputs, use Download .sql and load the file with a command-line client rather than pasting it into a GUI query editor

Common Use Cases

Database seeding: Convert fixture JSON files from a test framework into SQL seed scripts for a staging environment.

Data migration: Export data from a REST API (JSON) and import it into a relational database without writing custom ETL code.

Schema prototyping: Use the CREATE TABLE option to quickly scaffold a table structure from a sample API response.

Cross-database replication: Export records from one database as JSON (via SELECT ... FOR JSON in SQL Server or json_agg in PostgreSQL) and re-import them into another.

CSV → JSON → SQL pipeline: First convert your CSV file with the CSV to JSON converter, then paste the resulting JSON here to generate INSERT statements.

Security Note

All string values are sanitized by doubling single quotes before insertion. However, you should always review generated SQL before running it against a production database:

  - Verify the escaping matches your target dialect (for example, MySQL treats backslashes as escape characters by default; standard-conforming PostgreSQL does not)
  - Check inferred column types and NULL handling against your actual schema
  - Run the script against a staging database before production

This tool is intended for development, migration, and seeding workflows — not for generating SQL to be executed from untrusted user input.
