CSV to SQL Converter

Skip the manual SQL writing. Upload your CSV and get a ready-to-run SQL script of properly typed, batched INSERT statements — in the dialect your database actually speaks.

What This Converter Does

This tool transforms CSV data into SQL INSERT statements ready for database import. Each CSV row becomes a row in your database table. You can specify the table name, choose the SQL dialect, and control how many rows per INSERT statement for optimal performance.

SQL Generation Options

Table name: Specify the target table name for INSERT statements. Use schema-qualified names like "public.users" if needed.

SQL dialect: Choose MySQL, PostgreSQL, or SQLite. Each has slightly different escaping rules and syntax conventions.

Batch size: Control how many rows per INSERT statement. Larger batches (100-1000) import faster but use more memory.

NULL handling: Empty CSV cells become SQL NULL values in the INSERT statements.
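
The options above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation — the function name csv_to_inserts and its typing rules are assumptions:

```python
import csv
import io

def csv_to_inserts(csv_text, table, batch_size=100):
    """Sketch: turn CSV text into batched INSERT statements.

    Empty cells become NULL, purely numeric cells are left unquoted,
    and single quotes in strings are doubled per the SQL standard.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]

    def literal(cell):
        if cell == "":
            return "NULL"
        try:
            float(cell)
            return cell  # numeric: insert unquoted
        except ValueError:
            return "'" + cell.replace("'", "''") + "'"

    statements = []
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size]
        values = ",\n".join(
            "(" + ", ".join(literal(c) for c in row) + ")" for row in batch)
        statements.append(
            f"INSERT INTO {table} ({', '.join(header)})\nVALUES\n{values};")
    return statements

sql = csv_to_inserts(
    "name,email,age\nAlice,[email protected],30\nBob,[email protected],25",
    "users")
print(sql[0])
```

With two data rows and the default batch size, this emits a single multi-row INSERT matching the example output below.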

Example Output

Input CSV:

name,email,age
Alice,[email protected],30
Bob,[email protected],25

Output SQL:

INSERT INTO users (name, email, age)
VALUES
('Alice', '[email protected]', 30),
('Bob', '[email protected]', 25);
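
You can sanity-check output like this by running it against an in-memory SQLite database — creating the table manually, since the tool emits INSERTs only:

```python
import sqlite3

# Run the example INSERT from above against a throwaway SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT, age INTEGER)")
conn.execute("""
INSERT INTO users (name, email, age)
VALUES
('Alice', '[email protected]', 30),
('Bob', '[email protected]', 25);
""")
print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # → 2
```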

When to Use This

Database seeding: Populate development or test databases with sample data from CSV exports.

Data migration: Import data from legacy systems that export to CSV into modern relational databases.

Bulk inserts: Insert large datasets more efficiently than row-by-row application inserts.

Backup restoration: Restore data from CSV backups when binary dumps aren't available.

Quick imports: Get data into a database quickly without writing import scripts or using GUI tools.

SQL Dialect Differences

MySQL: Uses backtick escaping for identifiers and single quotes for string literals (double quotes also work for strings unless ANSI_QUOTES mode is enabled). Supports multi-row INSERT efficiently.

PostgreSQL: Uses double-quotes for identifiers, single-quotes for strings. COPY command is faster for large imports but requires file access.

SQLite: Similar to PostgreSQL for INSERT syntax. Use .import command for file-based imports, or run INSERT statements directly.

Batch Size Considerations

Small batches (1-10 rows): Easier to debug, smaller transactions, but slower for large datasets.

Medium batches (10-100 rows): Good balance of performance and manageability. Recommended for most cases.

Large batches (100-1000 rows): Fastest for bulk imports, but may hit packet size limits or cause memory issues.

Very large batches (1000+): May exceed max_allowed_packet in MySQL or cause timeout issues. Use with caution.
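
One way to reason about the trade-off is to size batches against a byte budget rather than a fixed row count. A rough sketch — pick_batch_size and the 4 MB budget are illustrative (MySQL's max_allowed_packet default ranges from 4 MB in older versions to 64 MB in 8.0):

```python
def pick_batch_size(avg_row_bytes, budget_bytes=4 * 1024 * 1024, cap=1000):
    """Sketch: rows per INSERT that fit a byte budget, capped for safety."""
    # Integer division leaves headroom for the column list and VALUES keyword.
    fit = max(1, budget_bytes // max(1, avg_row_bytes))
    return min(fit, cap)

print(pick_batch_size(avg_row_bytes=512))  # rows of ~0.5 KB → capped at 1000
```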

SQL Escaping and Safety

The tool properly escapes values for SQL:

Single quotes: Escaped by doubling (' becomes '') per SQL standard.

Backslashes: Escaped for MySQL (\ becomes \\), since MySQL treats backslash as an escape character. PostgreSQL and SQLite treat backslashes literally in standard string literals.

NULL values: Empty cells become NULL (unquoted) in SQL.

Numbers: Numeric values are inserted unquoted for proper type handling.
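
These rules fit in a short function. A sketch — sql_string_literal is a hypothetical name, and only the quote/backslash/NULL rules above are modeled:

```python
def sql_string_literal(value, dialect="postgresql"):
    """Sketch of the escaping rules above.

    Empty values map to NULL; single quotes are doubled everywhere;
    MySQL additionally treats backslash as an escape character, so it
    must itself be escaped.
    """
    if value == "":
        return "NULL"
    escaped = value.replace("'", "''")
    if dialect == "mysql":
        escaped = escaped.replace("\\", "\\\\")
    return f"'{escaped}'"

print(sql_string_literal("O'Brien"))            # 'O''Brien'
print(sql_string_literal("C:\\temp", "mysql"))  # 'C:\\temp'
print(sql_string_literal(""))                   # NULL
```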

Limitations

No schema creation: This tool generates INSERT statements only. You must create the table structure separately.

No data validation: Values aren't validated against your schema. Type mismatches cause SQL errors during import.

Large files: Files over 20MB may generate very large SQL files. Consider splitting or using database-native import tools.

Binary data: CSV can't represent binary data. BLOB columns require different import methods.

Frequently Asked Questions

How do I run the generated SQL?

Save the output as a .sql file and run it with your database client: mysql, psql, sqlite3, or through a GUI tool like phpMyAdmin, pgAdmin, or DBeaver.
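
If you'd rather run the script programmatically than through a CLI client, Python's sqlite3 module can execute a multi-statement file. A sketch — out.sql is a hypothetical file name, and the script content here stands in for real converter output:

```python
import sqlite3

# Stand-in for a saved converter script; normally this file already exists.
script = """
CREATE TABLE users (name TEXT, email TEXT, age INTEGER);
INSERT INTO users (name, email, age) VALUES ('Alice', '[email protected]', 30);
"""
with open("out.sql", "w") as f:
    f.write(script)

# executescript handles multi-statement SQL files in one call.
conn = sqlite3.connect(":memory:")
with open("out.sql") as f:
    conn.executescript(f.read())
print(conn.execute("SELECT name FROM users").fetchone()[0])  # → Alice
```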

Can this handle special characters?

Yes. Quotes, backslashes, and other special characters are properly escaped for SQL syntax.

What about auto-increment IDs?

If your CSV has an ID column, it will be inserted. For auto-increment, either omit the ID column from CSV or from the INSERT statement column list.
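
A quick illustration with SQLite: omit the ID column from the INSERT column list and the database numbers the rows itself. The auto-increment column syntax differs per dialect (SQLite's INTEGER PRIMARY KEY shown here; MySQL uses AUTO_INCREMENT, PostgreSQL uses SERIAL or IDENTITY):

```python
import sqlite3

# ID column omitted from the column list → database assigns the values.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('Alice'), ('Bob')")
print(conn.execute("SELECT id, name FROM users").fetchall())
# → [(1, 'Alice'), (2, 'Bob')]
```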

Can I convert SQL back to CSV?

Yes, use the SQL to CSV tool. It extracts row data from INSERT statements and converts to CSV format.