SQLite Forum

Csv or Vsv parsing blob column
I should have been clearer: it's actually in a TEXT column already, but the files are megabytes in size, which I assume hits buffer limits when using eval. I was wondering whether there is a way to parse a large CSV stored in a column using the existing csv and vsv extensions.

drop table if exists csvfile;
CREATE TABLE csvfile
(
  name TEXT,
  csv TEXT
);
INSERT INTO csvfile VALUES
('myfile.csv', 'mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows
 mycsv file with, comma, delimited, columns and linebreaks for rows');
select eval('CREATE VIRTUAL TABLE temp.t1 USING csv(data="' || csv || '")') from csvfile where name = 'myfile.csv';
SELECT * FROM t1;

Is there a way to parse larger CSVs stored in columns without exporting and re-importing?
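For what it's worth, one workaround that avoids both eval and the csv extension's `data=` argument (and so sidesteps any SQL statement length limits) is to pull the column value out in application code and parse it there. A minimal sketch in Python, assuming the same `csvfile` schema as above (the three-column `t1` layout here is just illustrative dummy data, not from the original post):

```python
import csv
import io
import sqlite3

# Recreate the schema from the post, with small dummy CSV content.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE csvfile (name TEXT, csv TEXT)")
con.execute(
    "INSERT INTO csvfile VALUES (?, ?)",
    ("myfile.csv", "a,b,c\n1,2,3\n4,5,6"),
)

# Fetch the stored CSV text; a bound parameter avoids quoting problems
# that the string-concatenated eval approach runs into.
(text,) = con.execute(
    "SELECT csv FROM csvfile WHERE name = ?", ("myfile.csv",)
).fetchone()

# Parse in memory with the csv module; io.StringIO gives csv.reader a
# file-like view of the column value without writing a temp file.
rows = list(csv.reader(io.StringIO(text)))

# Load the parsed rows into a real table for further querying.
con.execute("CREATE TABLE t1 (c0 TEXT, c1 TEXT, c2 TEXT)")
con.executemany("INSERT INTO t1 VALUES (?, ?, ?)", rows)
```

Because the CSV text never passes through SQL statement text, its size is bounded only by memory, not by `SQLITE_MAX_SQL_LENGTH`.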