You are exactly correct: one table, with some extra columns to distinguish dates and stocks. With appropriate indexing, this one table will probably be faster than any other combination of multiple tables you could consider. As for limits: SQLite's maximum size depends on your choice of page size, and the row-count limit is a 64-bit integer, so it can store the rows you described for longer than the age of the known universe. The practical size limit depends on how much data you have per row, which sounds like it isn't a lot.

Rows are stored on pages, and in the default SQLite setup the page size is 4K, which typically matches most OSes' I/O page size, making it efficient. There is a limit on the number of pages a database can contain, so the real technical limiting factor is how big a page is; you can easily raise the page size to a maximum of 64K, which per SQLite's documented limits allows databases of up to roughly 281 terabytes - [see here how to change it](https://sqlite.org/pragma.html#pragma_page_size).

There are other considerations:

- If you add another index over a billion rows, expect a large additional space requirement.
- If you want to `VACUUM` a database (reclaiming the free space inside it after some use), you will need free space on your hard drive of more than twice the size of the database.

Best advice I can give: start with the default page size and try to import it all. If it runs into size errors, note at what percentage of the job it failed (if at all); that should give you an indication of how much more room you will need, which will help you decide on an appropriate page size.

Also - ensure you import each section (folder) in a single transaction. That makes the import as fast as possible without postponing failure conditions by much, and it also makes it easy to just add on new folders as they come.
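To make the two tuning points concrete, here is a minimal sketch in Python's built-in `sqlite3` module. The table name, column names, and sample rows are made up for illustration; the two things it actually demonstrates are (a) `PRAGMA page_size` must be issued before the first table is created (on an existing file it only takes effect after a `VACUUM`), and (b) loading a whole folder's worth of rows inside one transaction rather than committing per `INSERT`.

```python
import sqlite3

# Hypothetical one-table schema for per-stock, per-date prices.
conn = sqlite3.connect(":memory:")  # use a file path for a real DB

# Must run before any table exists for it to take effect immediately.
conn.execute("PRAGMA page_size = 65536")

conn.execute("""
    CREATE TABLE prices (
        symbol     TEXT NOT NULL,
        trade_date TEXT NOT NULL,
        price      REAL NOT NULL
    )
""")
# One composite index covering the typical lookup pattern.
conn.execute("CREATE INDEX idx_prices ON prices (symbol, trade_date)")

def import_folder(rows):
    # One transaction per folder: a single commit at the end instead of
    # one per row, which is dramatically faster for bulk loads.
    with conn:  # commits on success, rolls back on error
        conn.executemany("INSERT INTO prices VALUES (?, ?, ?)", rows)

import_folder([("AAPL", "2024-01-02", 185.64),
               ("AAPL", "2024-01-03", 184.25)])
print(conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0])  # → 2
```

The `with conn:` block is what gives you the single-transaction behavior per folder; if any row in the batch fails, the whole folder rolls back and can simply be retried.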