Best practice with gradle and sqlite?
(1) By daddyjones on 2025-01-27 16:28:46 [link] [source]
I have a smallish gradle project (Java) that I've been working on using a sqlite database.
Trying to follow the Gradle conventions, I put my db file for testing in src/test/resources and didn't think much more about it. I then accessed the db via the classpath. This, of course, was a mistake. During the build the file was copied to the build directory, and my code was therefore interacting with that copy, though I thought it was interacting with the original db file I created.
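To illustrate what happened (a minimal sketch; "test.db" is just a stand-in name for my resource):

    // Minimal illustration of the mistake described above; "test.db" is a
    // stand-in resource name. The classpath lookup resolves to Gradle's
    // processed copy under the build directory, not the original file.
    java.net.URL url = getClass().getResource("/test.db");
    System.out.println(url);
    // Prints something like:
    // file:/.../build/resources/test/test.db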
So, I was wondering what advice the community might have on how to handle this. Should I simply put the file in the same location that the program runs in? Should I stick with what I've got, but make sure to use the correct db file for checks etc.? Somewhere else?
I'm very keen to avoid having to use a full path to locate the file and it kind of needs to be relative to the original location.
Anyway - thanks in advance for any/all of your thoughts!
(2) By Donal Fellows (dkfellows) on 2025-02-04 09:43:07 in reply to 1 [link] [source]
For testing purposes in these sorts of setups (Maven in my case, but gradle follows maven conventions and has the same issues) I've typically not had actual databases in my test data. Instead, I just keep SQL scripts that can create the test setup that I want, and then I apply those to the temporary DB that I make for the test (using an in-memory DB if I can get away with it, and one in a temporary directory otherwise; the path is either generated within the per-test setup or the test itself). This works fine for correctness testing, and has the bonus that the SQL scripts are much nicer VCS artefacts than a binary SQLite DB ever would be.
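A minimal sketch of that setup, assuming JUnit 5 and the xerial sqlite-jdbc driver; "/schema.sql" is a hypothetical script name, and the semicolon split is naive (fine for simple scripts):

    import java.nio.charset.StandardCharsets;
    import java.nio.file.Path;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    import org.junit.jupiter.api.AfterEach;
    import org.junit.jupiter.api.Assertions;
    import org.junit.jupiter.api.BeforeEach;
    import org.junit.jupiter.api.Test;
    import org.junit.jupiter.api.io.TempDir;

    class SchemaScriptTest {

        @TempDir
        Path tempDir;      // JUnit 5 supplies a fresh directory per test

        Connection conn;

        @BeforeEach
        void setUp() throws Exception {
            // Throwaway database file in the temporary directory.
            Path dbFile = tempDir.resolve("test.db");
            conn = DriverManager.getConnection("jdbc:sqlite:" + dbFile);

            // Apply the SQL script that builds the desired test state.
            String script = new String(
                    getClass().getResourceAsStream("/schema.sql").readAllBytes(),
                    StandardCharsets.UTF_8);
            try (Statement stmt = conn.createStatement()) {
                // Naive statement split; enough for scripts without
                // semicolons inside literals or triggers.
                for (String sql : script.split(";")) {
                    if (!sql.isBlank()) {
                        stmt.execute(sql);
                    }
                }
            }
        }

        @AfterEach
        void tearDown() throws Exception {
            conn.close();
        }

        @Test
        void schemaWasCreated() throws Exception {
            try (var rs = conn.createStatement()
                    .executeQuery("SELECT count(*) FROM sqlite_master")) {
                Assertions.assertTrue(rs.next());
            }
        }
    }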
Another way of thinking about it is that there's no guarantee at all that the resources of the test-enabled build for Java (or another JVM-based language) are expanded into individual files on disk; they might still be inside a JAR at the point that the test starts. SQLite definitely can't read its database without extraction in such a situation; it typically only works when the DB is hosted directly on a real, local file system. Your tests shouldn't require write access to the test resources, though the resources may be copied to create temporary files.
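If you do need a pre-built database file, the pattern that follows from this is to copy it out first. A rough sketch, again assuming the xerial sqlite-jdbc driver; "/seed.db" is a hypothetical resource name:

    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardCopyOption;
    import java.sql.Connection;
    import java.sql.DriverManager;

    final class TestDbs {
        // Copies a classpath resource (which may live inside a JAR) to a
        // real temporary file that SQLite can open.
        static Connection openCopy(String resource) throws Exception {
            Path dbFile = Files.createTempFile("testdb", ".db");
            try (InputStream in = TestDbs.class.getResourceAsStream(resource)) {
                Files.copy(in, dbFile, StandardCopyOption.REPLACE_EXISTING);
            }
            return DriverManager.getConnection("jdbc:sqlite:" + dbFile);
        }
    }

    // Usage in a test: Connection conn = TestDbs.openCopy("/seed.db");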
(3) By ralf (ralfbertling) on 2025-02-06 14:41:25 in reply to 1 [source]
Hi,
we don't use Java, but that shouldn't make much of a difference:
We create the unit test database in our CI pipeline, i.e. during builds that affect the schema. For repeated testing, the file is copied. We are considering speeding things up by using in-memory DBs (i.e. not writing the changes of successful tests to the file system) or even savepoints (opening a top-level transaction, issuing a ROLLBACK after the tests, and using only savepoints for transactions inside of tests). That, of course, requires some abstraction to ensure no part of the code still changes the file or tampers with the savepoints.
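A rough sketch of the ROLLBACK idea (not our actual code), assuming JUnit 5 and the xerial sqlite-jdbc driver; "test.db" is a stand-in file name:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    import org.junit.jupiter.api.AfterAll;
    import org.junit.jupiter.api.AfterEach;
    import org.junit.jupiter.api.BeforeAll;
    import org.junit.jupiter.api.Test;

    class RollbackIsolationTest {

        static Connection conn;

        @BeforeAll
        static void open() throws Exception {
            conn = DriverManager.getConnection("jdbc:sqlite:test.db");
            conn.setAutoCommit(false);   // keep one top-level transaction open
        }

        @AfterEach
        void discardChanges() throws Exception {
            conn.rollback();             // undo everything the test changed
        }

        @AfterAll
        static void close() throws Exception {
            conn.close();
        }

        @Test
        void insertIsRolledBackAfterwards() throws Exception {
            try (Statement stmt = conn.createStatement()) {
                stmt.execute("CREATE TABLE IF NOT EXISTS t (x INTEGER)");
                stmt.execute("INSERT INTO t VALUES (1)");
            }
        }
    }

The code under test would then use SAVEPOINT/RELEASE rather than BEGIN/COMMIT, so it never closes the outer transaction.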
regards, ralf
(4) By Donal Fellows (dkfellows) on 2025-02-06 15:08:24 in reply to 3 [link] [source]
In-memory databases are good for unit tests in many ways, notably being very fast and easy to put into a known state. I'm also pleased to note (from reading the docs while preparing this answer) that one can now have multiple connections accessing the same in-memory database, which is something I've needed in some cases (when testing an internally multi-threaded component).
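For reference, a small sketch of the multiple-connections case, assuming the xerial sqlite-jdbc driver (which accepts SQLite URI filenames); a named shared in-memory database lives as long as at least one connection to it is open:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SharedMemDb {
        public static void main(String[] args) throws Exception {
            // Named in-memory database shared between connections.
            String url = "jdbc:sqlite:file:memdb1?mode=memory&cache=shared";

            try (Connection writer = DriverManager.getConnection(url);
                 Connection reader = DriverManager.getConnection(url)) {

                try (Statement stmt = writer.createStatement()) {
                    stmt.execute("CREATE TABLE t (x INTEGER)");
                    stmt.execute("INSERT INTO t VALUES (42)");
                }

                // The second connection sees the same in-memory database.
                try (ResultSet rs = reader.createStatement()
                        .executeQuery("SELECT x FROM t")) {
                    rs.next();
                    System.out.println(rs.getInt(1));   // prints 42
                }
            }
        }
    }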
Integration tests might use a copy of a database file. They ought not to use the database file in the build output directly (were that even possible), as those sorts of resources should be read-only so that they can be used multiple times in a row without rebuilding.