SQLite Forum

A newbie's question about database structure design
I have three years of historical stock data, stored in directories named by date, like 20180102, 20180103, and so on. There are about 750 directories so far, each containing more than 4000 CSV files, where each file name corresponds to a stock code. Each CSV file has 50000-100000 rows and 7 columns.

My current method of processing these files uses Python's os.walk to traverse the directory tree (a rough sketch of that code follows the list below). This is very inefficient and slow, so I want to convert the files into an SQLite database, but I am new to databases and don't know how to design a reasonable structure. Here are the three designs I have thought of:

1. Build a single database with 4000 tables, one table per stock. Each table would hold all of that stock's historical data: I would use Python to merge the three years of CSV data for one stock and import the merged result into its table.

2. Build 4000 databases (one per stock), each with 750 tables, where each table directly corresponds to the contents of one CSV file.

3. Build 750 databases (one per directory), each with 4000 tables, where each table directly corresponds to the contents of one CSV file. (Similar to solution 2.)
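For reference, this is roughly what my current code looks like. It is a simplified sketch, not my exact script: the root directory name stock_data is made up, and I parse the files with the standard csv module.

```python
import csv
import os

ROOT = "stock_data"  # made-up path; the real root holds the 750 date directories

def walk_all_files(root):
    """Walk every date directory and yield (date, stock_code, rows)."""
    for dirpath, _dirnames, filenames in os.walk(root):
        date = os.path.basename(dirpath)  # directory name, e.g. "20180102"
        for name in filenames:
            if not name.endswith(".csv"):
                continue
            stock_code = os.path.splitext(name)[0]  # file name is the stock code
            with open(os.path.join(dirpath, name), newline="") as f:
                rows = list(csv.reader(f))  # 50000-100000 rows of 7 columns
            yield date, stock_code, rows

for date, code, rows in walk_all_files(ROOT):
    ...  # filtering/analysis happens here, file by file
```

Scanning all ~750 x 4000 files this way takes a very long time, which is why I want to move everything into SQLite.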

My question is: which design is best, method 1 or the latter two (2 and 3), considering factors such as execution efficiency, speed, and stability?
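For concreteness, here is a minimal sketch of what design 1 might look like with Python's sqlite3 module. The database file name and the column names c1..c7 are invented placeholders; this is only to illustrate the structure I have in mind, not a finished importer.

```python
import sqlite3

con = sqlite3.connect("stocks.db")  # made-up database file name

def create_stock_table(code):
    # Design 1: one table per stock code. The table name is quoted because
    # stock codes may start with a digit. trade_date comes from the
    # directory name; c1..c7 stand in for the 7 CSV columns.
    con.execute(
        f'CREATE TABLE IF NOT EXISTS "{code}" ('
        "trade_date TEXT, c1 TEXT, c2 TEXT, c3 TEXT,"
        " c4 TEXT, c5 TEXT, c6 TEXT, c7 TEXT)"
    )

def insert_day(code, date, rows):
    # rows: the parsed 7-column rows from one CSV file
    con.executemany(
        f'INSERT INTO "{code}" VALUES (?, ?, ?, ?, ?, ?, ?, ?)',
        ([date] + row for row in rows),
    )

# ... loop over all files, then:
# con.commit()
# con.close()
```

Designs 2 and 3 would look the same except that the tables would be spread across many separate database files.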

Any information will be appreciated.