SQLite Forum

Csv or Vsv parsing blob column

(1) By anonymous on 2021-07-20 23:39:24

I have some tables with columns that contain CSV or VSV blobs. Does anyone know how to make these extensions read from a blob column instead of a file or the data attribute, or has anyone modified them to do so? In my testing, sending the column blob to the data attribute using eval only works on small chunks.

(2) By MBL (RoboManni) on 2021-07-21 06:09:13 in reply to 1

It is not easy to understand your request, but I tried some basics with 'eval', which here is one of my extension functions:

sandbox>sqlite3.exe
SQLite version 3.36.0 2021-06-18 18:36:39
Enter ".help" for usage hints.
Connected to a transient in-memory database.
Use ".open FILENAME" to reopen on a persistent database.
sqlite> select typeof(X'010203');
blob
sqlite> select eval('select 3.1415');
Error: no such function: eval
sqlite> .load sqlite3.dll
sqlite> select eval('select 3.1415');
3.1415
sqlite> select eval('select '||X'010203');
unrecognized token: ""
sqlite> select eval('select typeof('||X'010203'||')');
unrecognized token: ""
sqlite> select eval('select typeof("'||X'010203'||'")');
text
sqlite>

(3) By anonymous on 2021-07-21 11:59:59 in reply to 2

I should have been clearer: it's actually in a TEXT column already, but the files are megabytes in size, which I assume is hitting buffer limits when using eval. I was wondering if there is a way to parse a large CSV stored in a column using the existing csv and tsv extensions.

drop table if exists csvfile;
CREATE TABLE csvfile
(
  name TEXT,
  csv TEXT
);
INSERT INTO csvfile VALUES
('myfile.csv', 'mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows');
select eval('CREATE VIRTUAL TABLE temp.t1 USING csv(data="' || csv || '")') from csvfile where name='myfile.csv';
SELECT * FROM t1;

Is there a way to parse larger CSVs stored in columns without exporting and re-importing?

(4) By Simon Slavin (slavin) on 2021-07-21 14:34:06 in reply to 3

Can I ask for more clarification? Your text

csv stored in a column using the existing csv and tsv extensions

doesn't ring any bells.

Are you storing an entire CSV file, encoded as a BLOB, in one field of an SQLite table?

Are you using https://www.sqlite.org/csv.html or something else?

(5) By anonymous on 2021-07-21 14:50:46 in reply to 4

Yes, I have rows with an entire CSV file stored as TEXT in a field of an SQLite table.

(6) By Simon Slavin (slavin) on 2021-07-22 15:07:08 in reply to 5

In that case you want a CSV-parsing library, and there isn't one inside SQLite. The CSV parsing done by the SQLite shell tool is built into the tool itself, not the SQLite library it calls.
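
For illustration, a rough sketch of that approach in Python, parsing the stored column with the standard csv module outside of SQLite. The database file name is a placeholder; the table and column names follow the csvfile example above.

import csv
import io
import sqlite3

# Fetch the CSV document stored as TEXT in the column
# ("example.db" is a placeholder database file).
con = sqlite3.connect("example.db")
(text,) = con.execute(
    "SELECT csv FROM csvfile WHERE name = ?", ("myfile.csv",)
).fetchone()

# Parse it with Python's csv module; csv.reader streams row by row,
# so a multi-megabyte document never has to pass through a SQL string.
for row in csv.reader(io.StringIO(text)):
    print(row)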

(7) By Ryan Smith (cuz) on 2021-07-22 16:04:43 in reply to 5

Simon is correct about the CSV capability you've seen being in the CLI, not the library itself.

You may be able to write the field to a file and then use the library's CSV import capabilities on it, but some coding is going to be involved somewhere, so you may as well use a CSV importer or library right in your code from the get-go to pack the CSV into the tables you want. This will give you more control, too.
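
One possible shape of that workaround, again sketched in Python under a few assumptions: the csv extension is available as a run-time loadable library, and your Python sqlite3 build permits loading extensions. The idea is simply to write the stored TEXT out to a temporary file and point the extension's filename= argument at it instead of passing the whole document through a SQL string.

import sqlite3
import tempfile

con = sqlite3.connect("example.db")  # placeholder database file
(text,) = con.execute(
    "SELECT csv FROM csvfile WHERE name = ?", ("myfile.csv",)
).fetchone()

# Write the stored CSV out to a real file so the csv extension can
# read it via filename= instead of an enormous data= string.
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as f:
    f.write(text)
    path = f.name

con.enable_load_extension(True)
con.load_extension("./csv")  # placeholder path to the loadable csv extension
con.execute("CREATE VIRTUAL TABLE temp.t1 USING csv(filename='%s')" % path)
for row in con.execute("SELECT * FROM t1"):
    print(row)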