SQLite Forum

Loading a large file into a sqlite database

(1) By Kevin Youren (KevinYouren) on 2021-08-28 01:49:48

If you need to analyze a really large file,
the following .import technique may be useful.

Note that I don't use a .mode sub-command, since my data is not in CSV format.

A 2.5 GB Linux text file with about 67 million lines
loaded into a 3 GB SQLite database table in about 80 seconds.
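For comparison, the same line-per-row load can be sketched with Python's standard sqlite3 module. The file and database names below are illustrative stand-ins, not the ones from my session:

```python
import sqlite3

# Create a small demo input file standing in for the large trace file.
with open("demo.trace", "w") as f:
    f.write("line one\nline two\nline three\n")

con = sqlite3.connect("demo.sqlite")
con.execute("CREATE TABLE trace_blob(trace_line BLOB)")

# Stream the file line by line; executemany batches all inserts
# inside one transaction, which is what makes bulk loads fast.
with open("demo.trace", "rb") as f:
    con.executemany(
        "INSERT INTO trace_blob(trace_line) VALUES (?)",
        ((line.rstrip(b"\n"),) for line in f),
    )
con.commit()

print(con.execute("SELECT count(*) FROM trace_blob").fetchone()[0])  # 3
con.close()
```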

The sqlite3 terminal command works quite well with this database.

The GUI database viewer I use hangs when I try to load the database.



my_directory$ ls -lai lsof_with_trace.trace
2475135568 Aug 27 11:58 lsof_with_trace.trace
my_directory$ rm lsof_with_trace_short_blob.sqlite
my_directory$ date;
Sat 28 Aug 2021 10:46:57 AEST

my_directory$ sqlite3 lsof_with_trace_short_blob.sqlite
SQLite version 3.37.0 2021-08-03 23:23:23
Enter ".help" for usage hints.
sqlite> create table trace_blob(trace_line BLOB);
sqlite> .schema
CREATE TABLE trace_blob(trace_line BLOB);
sqlite> .import "lsof_with_trace.trace" trace_blob
sqlite> select count(*) from trace_blob;
66920485
sqlite> .quit
my_directory$ date;
Sat 28 Aug 2021 10:48:14 AEST

my_directory$ ls -lai lsof_with_trace.trace
2475135568 Aug 27 11:58 lsof_with_trace.trace
my_directory$ ls -lai lsof_with_trace_short_blob.sqlite
3044913152 Aug 28 10:48 lsof_with_trace_short_blob.sqlite
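Once loaded, ordinary SQL works for ad-hoc analysis of the lines. A minimal sketch, with made-up sample rows rather than real trace data (note that values inserted as text stay text even in a BLOB-declared column, since BLOB affinity applies no conversion, so LIKE works as expected):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trace_blob(trace_line BLOB)")
con.executemany(
    "INSERT INTO trace_blob(trace_line) VALUES (?)",
    [("open /etc/hosts",), ("read fd=3",), ("open /etc/passwd",)],
)

# Count lines matching a pattern, one row per raw line.
n = con.execute(
    "SELECT count(*) FROM trace_blob WHERE trace_line LIKE 'open %'"
).fetchone()[0]
print(n)  # 2
```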