SQLite Forum

recommendation for reclaiming the db when disk is full

(1) By vgimonka on 2022-01-07 23:27:29

Hello, I have an embedded device on which I have an in-memory SQLite3 database. I am using the C APIs to connect to it and perform operations on it.

I was doing a stress test by adding entries into this db until I got a disk full error.

In such a case, once the disk is full, the only operation I can do is open the database.

I have tried to just read the db using the handle I get from open, but even that fails: rc = sqlite3_exec(db, sql, read_callback_print, (void*)data, &err);

Basically any operation using sqlite3_exec fails.

What is the recommended way to reclaim the db by removing items from the database?

I read that in such cases, I need to maintain an external process to keep track of the memory and not let additional entries into the db when memory is close to full. But I am not sure if that is what is recommended here?

thank you

(2) By Simon Slavin (slavin) on 2022-01-08 07:38:27 in reply to 1

I'll answer the problem you described in two parts:

Copy your database and make sure your copy is not corrupt.

You can't do anything useful on a computer with full memory. If you are using an 'in memory sqlite3 database' then SQLite is only one contributor to it getting a 'disk full error'. You have a number of things taking up memory. Make one of them take up less memory until you have some free memory to rescue yourself from your situation. Once you have done that you can proceed.

Can your embedded device use anything but memory for storage? For instance, can it address a Flash drive plugged into it? If so, make a copy of the database and any associated files SQLite has made (any file in the same folder which has a similar name) on the Flash drive. Take the Flash drive, plug it into a proper computer, then use that computer to check and correct the database. Once it is uncorrupted, do something to replace the 'in memory' database with this new uncorrupt copy.
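
For an in-memory database, one way to make such a copy is SQLite's online backup API. A minimal sketch (an illustration only, not from the post; the destination path is a hypothetical Flash mount point) might be:

#include <sqlite3.h>

/* Copy the live in-memory database into a file on external storage,
   e.g. "/mnt/flash/rescue.db" (a hypothetical mount point). */
static int backup_to_file(sqlite3 *mem_db, const char *dest_path)
{
    sqlite3 *file_db = NULL;
    int rc = sqlite3_open(dest_path, &file_db);
    if (rc == SQLITE_OK) {
        sqlite3_backup *bak = sqlite3_backup_init(file_db, "main", mem_db, "main");
        if (bak) {
            sqlite3_backup_step(bak, -1);   /* copy every page in one pass */
            sqlite3_backup_finish(bak);
        }
        rc = sqlite3_errcode(file_db);
    }
    sqlite3_close(file_db);
    return rc;
}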

If your device has no way to address external storage you're going to have to write C code to do something equivalent, unless your device can run the SQLite command line shell tool, which seems unlikely on an embedded device.

Prevent this happening in future.

Just as you wrote. Figure out how much memory you'd need to get yourself out of a 'full memory' situation, double it, and write some code that monitors the amount of free memory and does something when free memory drops below that amount. This is not specifically to do with SQLite, it's something you do with any computer which continually uses up more memory. Since the monitoring process would not involve SQLite (though the part which frees up memory might), I have no advice for you on how to do it.
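
For illustration only, one possible shape for such a check is sketched below. It happens to use SQLite's own memory accounting via sqlite3_memory_used(), although, as noted above, the monitor need not involve SQLite at all; the threshold value and the cleanup hook are hypothetical placeholders.

#include <sqlite3.h>

typedef void (*cleanup_fn)(sqlite3 *db);

/* Call periodically, e.g. from a timer or between batches of inserts.
   If SQLite's outstanding memory exceeds the threshold, run a cleanup
   routine (delete or archive old rows, flush to storage, etc.). */
static void check_memory(sqlite3 *db, sqlite3_int64 limit_bytes, cleanup_fn cleanup)
{
    if (sqlite3_memory_used() > limit_bytes) {
        cleanup(db);
    }
}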

If your embedded device is logging data into the SQLite database, or doing anything else which will continually make the database bigger, you should think through how to have this happen in an orderly and convenient way. Alternatively write your database to storage, presumably Flash storage, rather than memory.

(3) By vgimonka on 2022-01-08 21:18:01 in reply to 2

Thank you Simon for your response.

So essentially you are saying that once I hit such a case, there is no way I can recover using sqlite3 alone, unless I have some external factors like:

  1. doing periodic backups to flash storage

  2. a monitoring service which monitors the space and informs my sqlite process not to further populate the db if it is close to getting full

(4.1) By Keith Medcalf (kmedcalf) on 2022-01-08 21:34:20 edited from 4.0 in reply to 3

Once you have relieved the problem, it might be wise to calculate the "maximum size" of the database file (one which still permits you to clean up) and then limit the database file size, on all connections that update the database, using pragma max_page_count: https://sqlite.org/pragma.html#pragma_max_page_count

The default is to permit the database file to grow until the underlying storage is full. You are, however, free to specify at what point the report of "disk full" occurs (before it is actually full), and it might be wise to take advantage of that built-in capability. Note that you will need to use the pragma after opening every connection to the database file on which you wish to limit the number of pages -- the limit is not "sticky".
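
A minimal C sketch of what that might look like (an illustration only, not from the post; the database path and page limit passed in are placeholder values):

#include <stdio.h>
#include <sqlite3.h>

/* Open a connection and immediately impose the page limit, since
   max_page_count is per-connection and must be re-issued each time. */
static int open_with_page_limit(const char *path, int max_pages, sqlite3 **out_db)
{
    char sql[64];
    sqlite3 *db = NULL;
    int rc = sqlite3_open(path, &db);
    if (rc != SQLITE_OK) {
        fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        sqlite3_close(db);
        return rc;
    }
    snprintf(sql, sizeof sql, "PRAGMA max_page_count=%d;", max_pages);
    rc = sqlite3_exec(db, sql, NULL, NULL, NULL);
    if (rc != SQLITE_OK) {
        fprintf(stderr, "pragma failed: %s\n", sqlite3_errmsg(db));
        sqlite3_close(db);
        return rc;
    }
    *out_db = db;
    return SQLITE_OK;
}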

(6) By Keith Medcalf (kmedcalf) on 2022-01-08 22:12:34 in reply to 4.1

Note that once you detect "disk full" you can reset the page limit to 0 (unlimited) and perform your "cleanup", then re-impose the limit (if necessary).

Example:

>sqlite
SQLite version 3.38.0 2022-01-07 00:43:25
Enter ".help" for usage hints.
Connected to a transient in-memory database.
Use ".open FILENAME" to reopen on a persistent database.
sqlite> pragma max_page_count=2;
┌────────────────┐
│ max_page_count │
├────────────────┤
│ 2              │
└────────────────┘
sqlite> create table x(x);
sqlite> insert into x select value from wholenumber where value between 1 and 100;
sqlite> insert into x select value from wholenumber where value between 101 and 200;
sqlite> insert into x select value from wholenumber where value between 201 and 300;
sqlite> insert into x select value from wholenumber where value between 301 and 400;
sqlite> insert into x select value from wholenumber where value between 401 and 500;
Error: stepping, database or disk is full (13)
sqlite> delete from x order by rowid limit 25;
sqlite> insert into x select value from wholenumber where value between 401 and 500;
sqlite>

(7) By Keith Medcalf (kmedcalf) on 2022-01-08 22:22:28 in reply to 6

Oops. Setting the max_page_count to 0 (or negative) does not reset the count to unlimited. You have to set it to the value you want.
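
For illustration (not part of the post), a rough C sketch of the corrected flow might look like the following; the table name x and the two limits are placeholders, and the DELETE uses a rowid subquery so it does not rely on the shell's LIMIT-on-DELETE build option.

#include <stdio.h>
#include <sqlite3.h>

static void set_page_limit(sqlite3 *db, int pages)
{
    char sql[64];
    snprintf(sql, sizeof sql, "PRAGMA max_page_count=%d;", pages);
    sqlite3_exec(db, sql, NULL, NULL, NULL);
}

/* Try an insert; on SQLITE_FULL, raise the limit (0 does NOT mean
   unlimited), delete the oldest rows, re-impose the limit, and retry. */
static int insert_with_cleanup(sqlite3 *db, const char *insert_sql,
                               int normal_limit, int cleanup_limit)
{
    int rc = sqlite3_exec(db, insert_sql, NULL, NULL, NULL);
    if (rc == SQLITE_FULL) {
        set_page_limit(db, cleanup_limit);   /* temporary headroom for cleanup */
        sqlite3_exec(db,
            "DELETE FROM x WHERE rowid IN"
            " (SELECT rowid FROM x ORDER BY rowid LIMIT 25);",
            NULL, NULL, NULL);
        set_page_limit(db, normal_limit);    /* re-impose the normal limit */
        rc = sqlite3_exec(db, insert_sql, NULL, NULL, NULL);
    }
    return rc;
}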

(11) By vgimonka on 2022-01-10 22:59:29 in reply to 7

thank you Keith! this is very helpful!

(5) By Ryan Smith (cuz) on 2022-01-08 21:52:24 in reply to 3

While "1. - Doing periodic backup to flash" is always a good idea, it won't prevent or circumvent a filled-up memory, but it will prepare a good backup in the case that (or any other loss) happens.

As to the other point, about monitoring that informs SQLite to "not further populate the db if it is close to getting full", I don't see it having any reliable success rate. SQLite uses memory for all kinds of things; DB storage is only one of those things (in an in-memory DB). All manner of DB management, even SELECT queries, will use some amount of memory, and sometimes lots of it, up to double the size of the DB itself for certain operations.

There is no amount of monitoring and forewarning of SQLite that seems helpful to me. What is helpful is making sure there is always an amount of memory free that is equal to or more than the size (or expected size) of the DB itself, over and above the memory used by the DB. As soon as that is no longer the case, warn a user, the DB admin, or the systems administrator.

Some systems, such as MSSQL or MySQL and the like, eat as much memory as you allow them. Do not run these on the same system, or if you do, ensure their memory usage is limited to a point where you know there will be enough left for double the SQLite DB's expected final size. Then, once such a size is decided, limit SQLite itself by setting the max page count so it too knows to fail once it grows beyond that.

If you keep adding water into a bucket it WILL overflow, but you can decide at which point that is by deciding the size of the bucket to use, and deciding what to do and who to warn when the high-water mark is reached in the bucket, before it starts spilling. There is no way to automate this without committing to disk, unless part of maintenance is deleting (losing) old data from the DB, and even then size reduction may need to be forced with a VACUUM - which is exactly an operation that may require twice the DB size in memory to complete.
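
As a rough illustration of that last point (an addition for clarity, not from the original post): deleting rows only moves pages onto the database's internal freelist; PRAGMA freelist_count shows how many such pages there are, and VACUUM is what actually returns the space, at the cost of the extra working space mentioned above.

#include <stdio.h>
#include <sqlite3.h>

static void show_free_pages_then_vacuum(sqlite3 *db)
{
    sqlite3_stmt *stmt = NULL;
    /* How many pages are free inside the database but not yet given back. */
    if (sqlite3_prepare_v2(db, "PRAGMA freelist_count;", -1, &stmt, NULL) == SQLITE_OK) {
        if (sqlite3_step(stmt) == SQLITE_ROW)
            printf("free pages awaiting reclamation: %d\n", sqlite3_column_int(stmt, 0));
        sqlite3_finalize(stmt);
    }
    /* Rebuild the database to actually release that space. */
    sqlite3_exec(db, "VACUUM;", NULL, NULL, NULL);
}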

I'm of the belief that if you are not using at least 80% of your available memory, then you've wasted your money buying it. But allocating 2GB for a 1GB database counts precisely as "using" it, even though it may not be occupied fully at all times. Running out of memory is much worse, especially if you start swapping to disk, or worse still, if there is no disk to swap to.

I suppose what I tried to say in far too many words is this: Please "Plan" your memory usage with enough margin and set everything up to respect those limits. This is much wiser than trying to measure/monitor the usage and trying to put reactive measures in place for when it overfills.

(9) By vgimonka on 2022-01-10 16:45:24 in reply to 5

Thank you Ryan, I will try out the page count method, it looks promising.

(8) By Simon Slavin (slavin) on 2022-01-08 23:30:41 in reply to 3

Backups are only about a rescue. You shouldn't need a rescue. If your device requires a rescue it is a failing device and you are a bad designer. The device should not fail in the first place, which is your point (2).

This is not specific to SQLite, it's a general problem with any storage device on any computer. If you let the device get full you need human interaction to proceed, because a computer-controlled process of releasing space requires temporarily using more space, so it will fail when there is no more space to use. And then your only way to solve the problem is to have a human do it.

Because of this you don't let a storage device get completely full. You write your program, or write another special monitoring program, so that it spots that the device is nearly full and does something about it.

Otherwise your device will hang or crash, and rebooting it will not cure the problem, because as soon as the device is booted it writes to a log file, and one day that writing will fail because storage for the log file is full.

Programming is simple. It can be done by idiots, and often is. Designing something which works long-term requires intelligence and experience. Experience is what you're getting now. This is what makes the difference between earning $20,000 a year and earning $80,000 a year.

(10) By vgimonka on 2022-01-10 16:45:44 in reply to 8

thank you Simon for your inputs! very helpful.