SQLite Forum

Download .TAB file into SQLITE3 Table
Thanks, this appears to be the way to go.

However, when I try to copy the 15,000 rows of data from my pandas DataFrame into an SQLite table, it cuts out before copying everything, with the error below. Is this a common thing and easy to fix? 15,000 rows isn't particularly big, I will be dealing with much larger datasets than this, and my PC has plenty of memory and disk space, so it should be able to cope. I would prefer to store the data permanently in SQLite rather than working with DataFrames constantly.

IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.
To change this limit, set the config variable
`--NotebookApp.iopub_data_rate_limit`.

Current values:
NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
NotebookApp.rate_limit_window=3.0 (secs)
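For what it's worth, that error is about the notebook's output channel, not SQLite: the IOPub limit trips when a cell streams too much output to the browser, which suggests the copy loop is printing rows as it goes. Writing the DataFrame straight to the database with `to_sql` produces no per-row output, so the limit is never hit. A minimal sketch (the table name `mytable` and the sample data are placeholders, not from the original post):

```python
import sqlite3
import pandas as pd

# Stand-in for the 15,000-row DataFrame loaded from the .TAB file
df = pd.DataFrame({"id": range(15000), "value": [i * 2 for i in range(15000)]})

conn = sqlite3.connect(":memory:")  # use a file path for permanent storage

# Write in batches; nothing is printed, so the IOPub rate limit is irrelevant
df.to_sql("mytable", conn, if_exists="replace", index=False, chunksize=1000)

count = conn.execute("SELECT COUNT(*) FROM mytable").fetchone()[0]
print(count)
```

Raising `--NotebookApp.iopub_data_rate_limit` as the message suggests would also work, but silencing the per-row printing is the simpler fix and scales to larger datasets.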