
Dump and online backup: exclude tables
The sqlite3_backup API operates on the page level, and I'd wager it treats the data within each page as opaque rather than parsing any of it. As such, I'd expect it doesn't _know_ which table a page belongs to, or anything about tables at all.
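As a point of reference, the CLI exposes this same page-level copy through the `.backup` dot-command, a thin wrapper over sqlite3_backup; a minimal sketch, with illustrative file names:

```
# Page-level copy of the whole file; there is no option to filter tables.
sqlite3 live.db ".backup backup.db"
```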

In theory it wouldn't be a huge effort for SQLite to infer which pages are relevant for a given set of tables/indices and set up a partial backup based on that subset. However, unless you start rewriting page data (and the embedded references to other pages), you'd still end up with a database of the same size, just with a bunch of empty pages (which is still potentially advantageous if you use a compressed filesystem).
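The "same size, just empty pages" effect is easy to observe on a live database; a minimal sketch, assuming a hypothetical big_table and the default auto_vacuum=off:

```
-- In the sqlite3 shell, against the hypothetical live.db:
DROP TABLE big_table;   -- its pages move to the freelist, not out of the file
PRAGMA page_count;      -- total pages in the file: unchanged
PRAGMA freelist_count;  -- now non-zero: the pages are empty, not reclaimed
```

A page-filtered backup would look much the same: an identical page count, with the filtered-out pages simply left empty.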

OTOH, if you do start parsing and rewriting page data, this sounds less like a backup and more like a VACUUM INTO.
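VACUUM INTO does exactly that kind of rewrite: it reads the logical content and writes the whole database into a fresh, compact file, though it still copies every table rather than a subset (output name is illustrative):

```
sqlite3 live.db "VACUUM INTO 'compact-backup.db'"
```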

Of course, with any approach that subsets, you end up with a backup that cannot be directly swapped back in to replace the live database, i.e. you must have some recovery procedure in place to regenerate whatever was filtered out of the backup. As such, what is the downside to the .dump {relevant-tables} approach?
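For concreteness, a sketch of that approach with hypothetical table names; the dump is plain SQL text, so the recovery procedure is just replaying it into an empty database:

```
# Dump only the named tables (plus their indexes and triggers) as SQL...
sqlite3 live.db ".dump users orders" > partial-backup.sql
# ...and recover later by replaying into a fresh database.
sqlite3 restored.db < partial-backup.sql
```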