JSON_Object() Argument Limit
(1) By dCipher on 2022-03-14 19:57:32 [link] [source]
Hi
I have a table with 102 fields that I want to retrieve as JSON. However, when I pass all 204 arguments to JSON_Object() it reports that there are too many arguments. Can this limit be removed from this function, so that more than 50 fields can be retrieved?
Thanks, Simon
(2) By Alex Garcia (alexgarciaxyz) on 2022-03-15 03:28:15 in reply to 1 [link] [source]
Hey Simon,
Unfortunately it seems like you're hitting a known implementation limit of SQLite, which says:
The number of arguments to a function is sometimes stored in a signed character. So there is a hard upper bound on SQLITE_MAX_FUNCTION_ARG of 127.
(The default build limit is lower still - SQLITE_MAX_FUNCTION_ARG defaults to 100 arguments, i.e. 50 key/value pairs, which matches the 50-field ceiling you're seeing.)
A few ideas if you need to work around it:
- group related columns together as separate objects (note the keys are single-quoted string literals and the values are bare column references), like:
select json_object(
  'group1', json_object(
    'item1', item1,
    'item2', item2
    /* ... */
  ),
  'group2', json_object(
    'item3', item3,
    'item4', item4
    /* ... */
  )
);
- Use json_group_object to create objects with more than 127 keys. But if you're getting these 100+ values from 100+ columns in the same table, it will be a bit awkward.
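For reference, a minimal sketch of the json_group_object route (the key/value names here are made up): because it consumes one key/value pair per input row rather than per argument, the per-call argument limit never comes into play.

```sql
-- json_group_object takes one key/value pair per row,
-- so SQLITE_MAX_FUNCTION_ARG does not constrain the object size.
SELECT json_group_object(k, v)
FROM (
  SELECT 'item1' AS k, 1 AS v
  UNION ALL SELECT 'item2', 2
  UNION ALL SELECT 'item3', 3
);
```

The resulting object holds one entry per row, however many rows the subquery produces.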
I'm sure someone with better sqlite-fu can think of something better!
(3) By midijohnny on 2022-03-15 12:36:10 in reply to 2 [link] [source]
Maybe use repeated applications of the JSON_INSERT function. Here's a static example:
WITH
a(j) AS (VALUES(JSON_OBJECT('pk',1))),
b(j) AS (SELECT JSON_INSERT(j,'$.fname','joe') FROM a),
c(j) AS (SELECT JSON_INSERT(j,'$.lname','bloggs') FROM b)
-- [...]
SELECT j FROM c;
Output:
{"pk":1,"fname":"joe","lname":"bloggs"}
It's not as nice as the JSON_GROUP_OBJECT approach above, for sure - but it is perhaps more amenable to building a recursive solution (since we don't have to count up the number of columns beforehand), which would need to rely on dynamic SQL, since we need a list of "column_names".
See this forum post for suggested ways of building the dynamic SQL.
Actually, thinking about it: the JSON_GROUP_OBJECT method would work just as well in a recursive setting (one object per group, with no need to count them up in advance) - and it would take just a single call at the end to combine the objects.
(4) By dCipher on 2022-03-18 16:25:42 in reply to 3 [link] [source]
Thank you both for your suggestions. Too bad there is not a general way to get all fields from a table, something like "Select JSON_Object('cltrn','*') from cltrn", so that it automatically returned all the columns and values as JSON.
But I found the following works:
select json_group_array(
  json_insert(
    json_insert(
      json_object('trnno',[trnno],'total',[total]),
      '$.date',[date],'$.qty',[qty]
    ),
    '$.price',[price],'$.authby',[authby]
  )
) from clgrdtrn
(5) By midijohnny on 2022-03-18 18:15:31 in reply to 4 [link] [source]
I agree - it would be nice somehow to have a variation of the JSON_OBJECT function that took a '*'.
In the meantime - in your example - are you just selecting all the columns from a single table? (i.e. not doing any other joining or more complicated expressions?)
If so: then the dynamic SQL example and a combination of the JSON functions could do this automatically - so long as you don't mind having a temporary file and/or running a second copy of the SQLite process.
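As a rough sketch of what that generated SQL could look like (purely illustrative - the cltrn table name comes from Simon's example, and you would still need to chunk the pairs if the column count exceeds the argument limit):

```sql
-- Build the text of a json_object() query from the table's schema.
-- %Q quotes the name as an SQL string literal (for the JSON key);
-- "%w" quotes it as an identifier (for the column reference).
SELECT 'SELECT json_group_array(json_object('
       || group_concat(printf('%Q, "%w"', name, name), ', ')
       || ')) FROM cltrn;'
FROM pragma_table_info('cltrn');
-- Feed the resulting text back into a second sqlite3 invocation,
-- or save it to a file and .read it from the CLI.
```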
(6) By dCipher on 2022-03-19 14:21:27 in reply to 5 [source]
Hi
I am only selecting from a single table. My preferred solution would be:
Select * from cltrn as JSON.
The "as JSON" clause would work no matter what the SQL select statement was doing, because once the result set was obtained all that would remain is to return it as a JSON array.
If the select statement returned no data, the JSON result would be "[]". A single row would give [{...}], and multiple rows [{...},{...},...].
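For what it's worth, the json_group_array/json_object combination already behaves much like the proposed "as JSON" once the columns are spelled out (the names below stand in for Simon's table; it also appears to yield [] when no rows match, which is worth verifying on your build):

```sql
-- Approximates: Select * from cltrn as JSON, for a known column list.
SELECT json_group_array(json_object('trnno', trnno, 'total', total))
FROM cltrn;
```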
Simon