Download .TAB file into SQLite3 table
Hi, I want to download the data in the following NASA .TAB file into a SQLite3 table using Python. I'm not sure of the best way to connect and download this data easily; it would just be a one-off download initially.
I guess I would need to scrape the data, treating each line break as a new entry, into a newly defined table in SQLite3. I am using Python to do this. Any help much appreciated.
Take a look at pandas.read_csv(). Once you have a pandas dataframe, it is easy to write it out to a DB table.
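A minimal sketch of that approach. The file name, column names, and table name below are placeholders, not the actual NASA file (many PDS .TAB files are whitespace-delimited with no header row, which is what the `sep` and `names` arguments assume here; check the file's accompanying label for the real layout). `read_csv()` also accepts a URL directly, so no separate download step is needed.

```python
import sqlite3
import pandas as pd

# Write a tiny sample .TAB file so this example is self-contained;
# in practice, point read_csv at the downloaded NASA file or its URL.
with open("sample.tab", "w") as f:
    f.write("2021-08-16 1.25 42\n2021-08-17 1.31 43\n")

# Column names here are made up -- take the real ones from the file's
# description. sep=r"\s+" handles whitespace-delimited columns.
df = pd.read_csv(
    "sample.tab",
    sep=r"\s+",
    header=None,
    names=["obs_date", "value", "count"],
)

# Write the dataframe straight into a SQLite table in one call.
conn = sqlite3.connect("nasa.db")
df.to_sql("observations", conn, if_exists="replace", index=False)
conn.close()
```

`if_exists="replace"` drops and recreates the table on each run, which suits a one-off import; use `"append"` if you load the data in batches.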
Thanks, this appears to be the way to go.
However, when trying to copy the 15,000 rows of data from my pandas dataframe to a SQLite table, it cuts out before copying it all, with the following error. Is this a common thing and easy to fix? 15,000 rows isn't particularly big, I will be dealing with much larger numbers than this, and my PC has plenty of memory and disk space, so it should cope. I would prefer to store the data permanently in SQLite instead of working with dataframes constantly.
IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.
To change this limit, set the config variable
`--NotebookApp.iopub_data_rate_limit`.
Current values:
NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
NotebookApp.rate_limit_window=3.0 (secs)
(4) By Gunter Hick (gunter_hick) on 2021-08-16 14:30:01 in reply to 3 [link] [source]
This appears to be a limit imposed by the sender of the data, not related to anything in SQLite.
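For what it's worth, the message itself tells you the knob to turn: the limit is a Jupyter notebook-server setting, not a SQLite one, and can be raised at launch, e.g.:

```shell
# Start the notebook server with a higher IOPub rate limit
# (value is in bytes/sec; the default is 1,000,000).
jupyter notebook --NotebookApp.iopub_data_rate_limit=10000000
```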
This came up when I tried to print out the contents of the table to the screen, so I guess it hit some output limit. I think my save from the dataframe to the SQLite table went as planned and all rows are in there. I'm not sure of the code to count the number of rows, which would let me confirm the data is all there. I think it is, and it is just not printing to the screen for some reason.
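One way to check without printing everything: ask SQLite for the row count, so only a single number goes through the notebook's output channel. The database and table names below are assumptions; the demo uses an in-memory DB with dummy rows so it runs as-is.

```python
import sqlite3

# Demo setup: an in-memory DB with 15,000 dummy rows. In your case,
# connect to your real database file (e.g. "nasa.db") and use your
# real table name in place of "observations".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE observations (value REAL)")
conn.executemany("INSERT INTO observations VALUES (?)",
                 ((float(i),) for i in range(15000)))

# COUNT(*) returns one row containing one number, so nothing large
# is pushed to the client.
(row_count,) = conn.execute("SELECT COUNT(*) FROM observations").fetchone()
print(row_count)  # -> 15000
conn.close()
```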
I opened the database using a free tool called DB Browser and all rows are in it as expected. The error only comes up when I try to print all rows to the screen, which I don't need to do now that I know they are in the DB. I think I will just use DB Browser from now on for checks on the DB contents rather than printing from Python in the notebook.