SQLite Forum

sqlite .archive can't handle big files?

(1) By Uwe Schmitz (schmitzu) on 2021-10-14 15:01:32 [source]

Just recently I saw that sqlite has a built-in archiver. I gave it a try and got an error when adding a big (>~700MB) file (platform was a recent Win10, executable from the official sqlite download page):

> sqlite3 tst.sqlar
SQLite version 3.36.0 2021-06-18 18:36:39
Enter ".help" for usage hints.
sqlite> .ar -c --verbose C:/Temp/gnuSource.fossil
C:/Temp/gnuSource.fossil
ERROR: out of memory

Adding a much bigger file (~7GB) made things worse: the command returned immediately with no error message, but nothing was added to the archive.

Under Linux (self-built executable, a bit older than the Windows one) there is no problem with ~700MB files. But a 2GB file gives:

> sqlite3 tst.sqlar
SQLite version 3.31.1 2020-01-27 19:55:54
Enter ".help" for usage hints.
sqlite> .ar -c --verbose fossilRepos.tar.gz
fossilRepos.tar.gz
ERROR: string or blob too big

Question: is the sqlite archiver not designed to work with bigger files?

(2) By Stephan Beal (stephan) on 2021-10-14 15:06:53 in reply to 1 [link] [source]

Question: is the sqlite archiver not designed to work with bigger files?

My recollection is that the 2GB limit is currently baked into sqlite blobs. It's not an artificial limit of sqlar (fossil, which is built on sqlite, has the same limit).
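The limit Stephan mentions can be reproduced without any large file at all. Here is a small sketch using Python's stdlib sqlite3 module: `zeroblob(N)` asks SQLite to allocate an N-byte blob, and requesting one byte past the hard 2^31 - 1 ceiling triggers the same "string or blob too big" error seen in the Linux transcript above (many builds cap it even lower, at the default SQLITE_MAX_LENGTH of 1,000,000,000 bytes).

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# zeroblob(N) allocates an N-byte blob of zeros. Asking for more
# than the build's SQLITE_MAX_LENGTH (1,000,000,000 bytes by
# default, hard compile-time ceiling 2^31 - 1) fails with the
# same error shown in the transcript above.
err_msg = None
try:
    conn.execute("SELECT length(zeroblob(2147483648))")  # 2 GiB, one past the ceiling
except sqlite3.Error as exc:
    err_msg = str(exc)

print(err_msg)  # the SQLITE_TOOBIG error text
conn.close()
```

The limit can be lowered at runtime but never raised above the compile-time maximum, which is why no stock sqlite build can store a single blob of 2GB or more.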

(3) By Uwe Schmitz (schmitzu) on 2021-10-14 15:27:49 in reply to 2 [link] [source]

OK, so it's not useful as a full zip/7zip replacement. I conclude that the sqlar program uses the same code and has the same limitations.

I mean (naive as I am ;-)), it can't be that hard to split files into chunks if they are bigger than the max blob size...
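For what it's worth, the chunking idea is easy to sketch. The following is a minimal illustration, not the real sqlar format (which stores each file as a single compressed blob in an `sqlar` table); the `chunked_sqlar` table name, the helper functions, and the chunk size are all made up for the example.

```python
import sqlite3

# Illustrative only; a real tool would pick a much larger chunk size,
# somewhere below SQLite's blob limit.
CHUNK_SIZE = 64 * 1024


def store_chunked(conn, name, data, chunk_size=CHUNK_SIZE):
    """Split `data` into rows small enough to fit under the blob limit."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS chunked_sqlar"
        "(name TEXT, seq INTEGER, data BLOB, PRIMARY KEY(name, seq))"
    )
    for seq, off in enumerate(range(0, len(data), chunk_size)):
        conn.execute(
            "INSERT INTO chunked_sqlar VALUES (?, ?, ?)",
            (name, seq, data[off:off + chunk_size]),
        )
    conn.commit()


def load_chunked(conn, name):
    """Reassemble a file from its chunks, in sequence order."""
    rows = conn.execute(
        "SELECT data FROM chunked_sqlar WHERE name = ? ORDER BY seq", (name,)
    )
    return b"".join(row[0] for row in rows)


conn = sqlite3.connect(":memory:")
payload = bytes(range(256)) * 1000  # 256,000 bytes of sample data
store_chunked(conn, "example.bin", payload)
assert load_chunked(conn, "example.bin") == payload
```

The catch is that sqlar's on-disk format is deliberately simple (one row per file), so any chunked scheme like this would no longer be readable by stock sqlar tools.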

But thanks for clarifying.