SQLite Forum

Setting non-binary columns to null in a table with a binary column populated with data creates free space and a larger file
Note: this is probably why it is recommended to store large fields (BLOB/TEXT values that consume overflow pages) in a separate table and access them by reference, rather than mixing such large items into the same table as the small associated data.

For example:

```
-- the small columns, with a reference to the row holding the blob
create table Data1A
(
  Data_id integer not null unique references Data1B(Data_ID),
  Items_id INTEGER,
  Key TEXT
);
-- the (presumed large) blob, alone in its own table
create table Data1B
(
  Data_id integer primary key,
  data blob
);
-- a view that looks like the original combined table
create view Data1
         as select Data,
                   Items_id,
                   Key
              from Data1A, Data1B
             where Data1A.Data_id == Data1B.Data_id
;
create trigger Data1Insert instead of insert on Data1
begin
  insert into Data1B (Data) values (new.Data);
  insert into Data1A values (last_insert_rowid(), new.Items_id, new.Key);
end;
-- only rewrite the blob row when the blob itself has actually changed
create trigger Data1Update instead of update on Data1
begin
  update Data1B set data=new.data
   where Data_id == (select data_id from Data1A where Items_id IS old.items_id and Key IS old.key)
     and new.data is not old.data;
  update Data1A set Items_id=new.items_id, key=new.key
   where Items_id IS old.items_id and Key IS old.key;
end;
-- deleting through the view removes both halves of the row
create trigger Data1Delete instead of delete on Data1
begin
  delete from Data1B
   where Data_id == (select data_id from Data1A where Items_id IS old.items_id and Key IS old.key);
  delete from Data1A
   where Items_id IS old.items_id and key IS old.key;
end;
```

This splits the blob out (on the assumption that it will be large) into its own table and provides a view that looks and works like the original table, but does not rewrite the blob unless you are actually updating it.
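A quick way to exercise the view and its triggers is from Python's `sqlite3` module. This is a sketch, not part of the original post: the schema above is repeated verbatim so the script is self-contained, and the sample values (`blob`, `Items_id = 7`, keys `'alpha'`/`'beta'`) are made up for illustration.

```python
import sqlite3

# Schema from the post, repeated verbatim so this sketch is self-contained.
SCHEMA = """
create table Data1A
(
  Data_id integer not null unique references Data1B(Data_ID),
  Items_id INTEGER,
  Key TEXT
);
create table Data1B
(
  Data_id integer primary key,
  data blob
);
create view Data1
         as select Data,
                   Items_id,
                   Key
              from Data1A, Data1B
             where Data1A.Data_id == Data1B.Data_id;
create trigger Data1Insert instead of insert on Data1
begin
  insert into Data1B (Data) values (new.Data);
  insert into Data1A values (last_insert_rowid(), new.Items_id, new.Key);
end;
create trigger Data1Update instead of update on Data1
begin
  update Data1B set data=new.data
   where Data_id == (select data_id from Data1A where Items_id IS old.items_id and Key IS old.key)
     and new.data is not old.data;
  update Data1A set Items_id=new.items_id, key=new.key
   where Items_id IS old.items_id and Key IS old.key;
end;
create trigger Data1Delete instead of delete on Data1
begin
  delete from Data1B
   where Data_id == (select data_id from Data1A where Items_id IS old.items_id and Key IS old.key);
  delete from Data1A
   where Items_id IS old.items_id and key IS old.key;
end;
"""

con = sqlite3.connect(":memory:")
con.executescript(SCHEMA)

# Insert through the view: the instead-of trigger routes the blob to Data1B.
blob = bytes(1024)
con.execute("insert into Data1 (Data, Items_id, Key) values (?, ?, ?)",
            (blob, 7, "alpha"))

# Update only a small column: new.data IS old.data, so Data1B is untouched.
con.execute("update Data1 set Key = 'beta' where Items_id = 7 and Key = 'alpha'")

row = con.execute("select Items_id, Key, length(Data) from Data1").fetchone()
print(row)  # → (7, 'beta', 1024)

# Delete through the view removes both halves.
con.execute("delete from Data1 where Items_id = 7 and Key = 'beta'")
print(con.execute("select count(*) from Data1B").fetchone()[0])  # → 0
```

The key observation is the `Key` update: because `new.data is not old.data` is false, the `Data1Update` trigger skips the `Data1B` statement entirely, so the blob row (and its overflow pages) are never rewritten.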