SQLite Forum


2 forum posts by user schmitzu

15:27 Reply: sqlite .archive can't handle big files? (artifact: 91ecd7cf73 user: schmitzu)

Ok, so it's not usable as a full zip/7zip replacement. I conclude that the standalone sqlar program uses the same code and has the same limitations.

I mean (naive as I am ;-)), it shouldn't be that hard to split files into chunks if they are bigger than the max blob size...
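
To illustrate the chunking idea (this is not what sqlar actually does; it stores each file as a single blob in the `sqlar` table, and the `sqlar_chunks` table below is made up for this sketch):

```python
import sqlite3

def archive_chunked(db_path, name, src_path, chunk_size=512 * 1024 * 1024):
    """Store a file as numbered chunks so no single blob hits the size limit.

    The default chunk_size stays well under SQLite's default SQLITE_MAX_LENGTH
    of 1,000,000,000 bytes. Table name and schema are hypothetical.
    """
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS sqlar_chunks("
        "name TEXT, seq INTEGER, data BLOB, PRIMARY KEY(name, seq))"
    )
    with open(src_path, "rb") as f:
        seq = 0
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            con.execute(
                "INSERT INTO sqlar_chunks(name, seq, data) VALUES(?, ?, ?)",
                (name, seq, chunk),
            )
            seq += 1
    con.commit()
    con.close()
```

Extraction would then concatenate the chunks in `seq` order; of course, the real cost is that every tool reading the archive would have to agree on this scheme.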

But thanks for clarifying.

15:01 Post: sqlite .archive can't handle big files? (artifact: 7d167e2a80 user: schmitzu)

Just recently I saw that SQLite has a built-in archiver. I gave it a try and got an error when adding a big (>~700MB) file (platform was a recent Win10, executable from the official SQLite download page):

> sqlite3 tst.sqlar
SQLite version 3.36.0 2021-06-18 18:36:39
Enter ".help" for usage hints.
sqlite> .ar -c --verbose C:/Temp/gnuSource.fossil
ERROR: out of memory

Adding a much bigger file (~7GB), things got worse: the command returned immediately (no error message), but nothing was added to the archive.

Under Linux (self-built executable, a bit older than the Windows exe) there is no problem with ~700MB files, but a 2GB file gives:

> sqlite3 tst.sqlar
SQLite version 3.31.1 2020-01-27 19:55:54
Enter ".help" for usage hints.
sqlite> .ar -c --verbose fossilRepos.tar.gz
ERROR: string or blob too big

Question: is the SQLite archive not designed to work with bigger files?