Wow, what a wake-up call! I see now that what you're describing is simply how a database works. I'm ashamed I even said "build 4000 databases" — such a rookie move. Let me repeat your point to check I've understood: since all the files share the same field (column) structure, I just need to create one table, load all the data into it, and then rely on the database's own query and sort capabilities, right? But those CSV files don't contain 'Date' or 'Stock code' fields, so I need to add two columns, 'Date' and 'StockCode', and then I can query the data on those two columns. Am I right?

In that case, though, the table will be very large: assuming each CSV file averages 70,000 rows, it will have 70,000 × 750 × 4000 = 210,000,000,000 rows (210 billion). Can SQLite withstand such a large amount of data? Or should I use one database with several tables? But wouldn't that make searching inconvenient? Thanks for your advice.
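To make sure I understand the single-table idea, here is a minimal sketch of how I imagine the import working. The table name `quotes`, the column names, and the file-naming convention `StockCode_Date.csv` are all my own assumptions, not anything you specified:

```python
import csv
import sqlite3
from pathlib import Path

def load_csvs(db_path, csv_dir):
    """Load every CSV in csv_dir into one 'quotes' table, adding
    StockCode and Date columns parsed from the file name.
    Assumes (hypothetically) file names like '600000_20230105.csv'."""
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    first = True
    for path in sorted(Path(csv_dir).glob("*.csv")):
        stock_code, date = path.stem.split("_")  # hypothetical naming scheme
        with open(path, newline="") as f:
            reader = csv.reader(f)
            header = next(reader)
            if first:
                # Create the table from the first file's header,
                # with the two extra lookup columns in front.
                cols = ", ".join(f'"{c}" TEXT' for c in header)
                cur.execute(
                    f"CREATE TABLE IF NOT EXISTS quotes "
                    f"(StockCode TEXT, Date TEXT, {cols})"
                )
                # Index the two lookup columns so queries stay fast.
                cur.execute(
                    "CREATE INDEX IF NOT EXISTS idx_code_date "
                    "ON quotes (StockCode, Date)"
                )
                first = False
            placeholders = ", ".join("?" * (len(header) + 2))
            cur.executemany(
                f"INSERT INTO quotes VALUES ({placeholders})",
                ((stock_code, date, *row) for row in reader),
            )
    conn.commit()
    conn.close()
```

Then a query like `SELECT * FROM quotes WHERE StockCode = '600000' AND Date = '20230105'` would replace opening one CSV file — if I've got the idea right.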