I think you'll need to show a bit more of what you are trying to do. I just wrote this simple-minded Perl program, mkbigger.pl, to create mondo TSV files:

```
use strict;
use warnings;

# Usage: perl mkbigger.pl <row_count> < small.tsv > big.tsv
my $count = shift;
my $head  = <>;            # pass the header line through unchanged
my $row   = <>;            # use the first data row as a template
my @cols  = split("\t", $row);
print $head;
for (my $i = 1; $i <= $count; ++$i) {
    $cols[0] = sprintf("%d", $i);   # overwrite the id column
    print join("\t", @cols);        # last column still carries the newline
}
```

then had this SQLite3 shell session:

```
SQLite version 3.31.1 2020-01-27 19:55:54
Enter ".help" for usage hints.
Connected to a transient in-memory database.
Use ".open FILENAME" to reopen on a persistent database.
sqlite> create table Biggy7 (id integer primary key autoincrement, a text, b text, c real, d real, e blob, f blob);
sqlite> insert into Biggy7 (a,b,c,d,e,f) values ('a','b',3e8,1.38e-23,x'01',x'02');
sqlite> .mode tabs
sqlite> .headers on
sqlite> .once small.tsv
sqlite> select * from Biggy7;
sqlite> .q
```

Then, at the OS shell:

```
> perl mkbigger.pl 600000 < small.tsv > vbiggy.tsv
> sqlite3
```

Then, in the SQLite shell again:

```
SQLite version 3.31.1 2020-01-27 19:55:54
Enter ".help" for usage hints.
Connected to a transient in-memory database.
Use ".open FILENAME" to reopen on a persistent database.
sqlite> .mode tabs
sqlite> .import vbiggy.tsv Bigger
sqlite> .headers on
sqlite> select count(*) as rowCount from Bigger;
rowCount
600000
sqlite>
```

No problem arose with this. Of course, I was not taxing any machine memory limits with this, or working the filesystem much. You might count the rows you did get in, then go look at your data and see whether something unusual appears just past the last row that was .import'ed successfully.
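If the import does stop partway, the usual culprit is a row with a stray tab, an unbalanced quote, or an embedded newline. As a quick sanity check, something like this awk one-liner (a sketch; `vbiggy.tsv` stands in for whatever file you are actually importing) will flag any row whose tab-separated field count differs from the header's:

```shell
# Report the line number of every row whose field count differs from the header row's.
awk -F'\t' 'NR == 1 { n = NF; next }
            NF != n { printf "line %d: %d fields (expected %d)\n", NR, NF, n }' vbiggy.tsv
```

Rows it reports are good candidates for the "something special" just past where .import gave up.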