Over the last two years I've used JSON in almost every project I've been involved in.
For example, with APEX Office Print our plugin sends JSON to the backend; you can actually see this JSON yourself in the Remote Debug screen of your AOP Dashboard.
Another example is the wearables (IoT) project I showed at KScope 16; the wearable sends data to a smartphone or tablet, which in its turn calls our backend (in ORDS) and sends JSON across.
At the end of the day we want the data in the Oracle Database, so our APEX apps can work with that data.
Since Oracle DB 12c, JSON is supported straight from the database. I've written a number of blog posts on how to read JSON from SQL within the database. Here's a quick demo of JSON in the database:
SQL> create table tbl_with_json (
2 json_clob clob,
3 constraint json_clob_chk check (json_clob is json)
4 );
Table TBL_WITH_JSON created.
SQL>
SQL> insert into tbl_with_json (json_clob) values ('{
2 "items": [{
3 "client_id": -1,
4 "registration_date": "2016-07-29T07:46:09.941Z",
5 "question": "My Question",
6 "description": "My huge clob"
7 }]
8 }');
1 row inserted.
SQL>
SQL> select a.json_clob.items.question as question, a.json_clob.items.description as description
2 from tbl_with_json a;
QUESTION
----------------------------------------------------------------------------------------------------------------------------------------------------------------------
DESCRIPTION
----------------------------------------------------------------------------------------------------------------------------------------------------------------------
My Question
My huge clob
Now to the reason for this blog post: what if your JSON contains some very big text (>32K) in a single node, e.g. in the description field?
If you want to follow along in your own test case, open the description record in SQL Developer, for example, and paste a large text (>32K) into the description node (so replace "My huge clob" with some other big text).
Tip: for my test cases I typically use a Lorem Ipsum generator where I can specify the number of characters, for example 33,000 characters.
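If you'd rather build the big test value with a script than paste it by hand, a minimal PL/SQL sketch along these lines should do it (the "Lorem ipsum " filler and the 33,000-character target are just placeholders; the filler must not contain double quotes so the document stays valid JSON):
declare
  l_big clob := empty_clob();
begin
  dbms_lob.createtemporary(l_big, true);
  -- grow the CLOB until it passes roughly 33,000 characters
  while dbms_lob.getlength(l_big) < 33000 loop
    dbms_lob.writeappend(l_big, 12, 'Lorem ipsum ');
  end loop;
  -- swap the small placeholder for the big text inside the stored JSON
  update tbl_with_json
     set json_clob = replace(json_clob, 'My huge clob', l_big);
  dbms_lob.freetemporary(l_big);
  commit;
end;
/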
How can we parse this JSON and store, for example, the content of that node in a CLOB column?
As I'm on 12c, that should be simple, right? The database supports reading JSON from SQL, so I first tried JSON_TABLE, but there you can only define VARCHAR2 or NUMBER as the data type, no CLOB, so I went with VARCHAR2.
Here's the result:
SQL> select jt.question, jt.description
2 from tbl_with_json,
3 json_table(json_clob, '$.items[*]'
4 columns (
5 question varchar2 path '$.question',
6 description varchar2 path '$.description'
7 )
8* ) as jt;
QUESTION
----------------------------------------------------------------------------------------------------------------------------------------------------------------------
DESCRIPTION
----------------------------------------------------------------------------------------------------------------------------------------------------------------------
My Question
Oracle just returns null (nothing - blank) for the description!
But it's definitely not blank; the large text really is in the description node.
Next I tried the query from my initial example, but the result was the same:
SQL> select a.json_clob.items.question as question, a.json_clob.items.description as description
2 from tbl_with_json a;
QUESTION
----------------------------------------------------------------------------------------------------------------------------------------------------------------------
DESCRIPTION
----------------------------------------------------------------------------------------------------------------------------------------------------------------------
My Question
So the database returns a value when it's less than 4K (or possibly 32K, depending on the varchar2 size setting of your database) and returns null when it's over this limit.
Hopefully Oracle Database 12.2 fixes this issue, but at the moment there's no native way to get to that data using the Oracle-supplied JSON functions.
Edit 4-AUG: returning null is the default behaviour of Oracle, but you can specify that you want an error instead; see the comments of Alex and Beda.
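Combining both suggestions from the comments, a sketch of the query with an explicit column size and ERROR ON ERROR would look roughly like this (note: VARCHAR2(32767) assumes your database runs with MAX_STRING_SIZE=EXTENDED; with the standard setting you're limited to 4000):
select jt.question, jt.description
from tbl_with_json,
     json_table(json_clob, '$.items[*]'
       columns (
         question    varchar2(4000)  path '$.question',
         -- explicit length plus ERROR ON ERROR instead of the silent NULL ON ERROR default
         description varchar2(32767) path '$.description' error on error
       )
     ) as jt;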
Ok, what can we try next?...
Since Oracle Application Express 5, APEX comes with a very nice package for working with JSON: APEX_JSON. This package has been heaven for us, especially with AOP.
So I thought I'd try APEX_JSON.PARSE and store the value in a temporary JSON document so I can read it with the get_clob_output method:
SQL> declare
2 l_data clob;
3 l_json apex_json.t_values;
4 l_return clob;
5 begin
6 select json_clob
7 into l_data
8 from tbl_with_json;
9 apex_json.parse(l_json, l_data) ;
10 apex_json.initialize_clob_output(dbms_lob.call, true, 0) ;
11 apex_json.open_object;
12 apex_json.write(l_json, 'items[1].description') ;
13 apex_json.close_object;
14 l_return := apex_json.get_clob_output;
15 apex_json.free_output;
16 end;
17 /
Error starting at line : 1 in command -
declare
l_data clob;
l_json apex_json.t_values;
l_return clob;
begin
select json_clob
into l_data
from tbl_with_json;
apex_json.parse(l_json, l_data) ;
apex_json.initialize_clob_output(dbms_lob.call, true, 0) ;
apex_json.open_object;
apex_json.write(l_json, 'items[1].description') ;
apex_json.close_object;
l_return := apex_json.get_clob_output;
apex_json.free_output;
end;
Error report -
ORA-20987: Error at line 6, col 18: value exceeds 32767 bytes, starting at Lorem ipsum dolor sit amet, consectetuer adipiscin
ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 928
ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 993
ORA-06512: at line 9
But as you can see, there's a 32K limit there as well; the error is raised by apex_json.parse itself (line 9 of the block). So APEX 5 doesn't return null, it raises an error instead. Hopefully a future version of APEX removes this limit ;)
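For comparison, as long as every value stays under 32K (e.g. with the original "My huge clob" content still in place), the same parse-and-read approach works fine; a minimal sketch:
declare
  l_data clob;
  l_json apex_json.t_values;
begin
  select json_clob into l_data from tbl_with_json;
  -- parsing succeeds when no single value exceeds 32767 bytes
  apex_json.parse(l_json, l_data);
  dbms_output.put_line(
    apex_json.get_varchar2(p_path => 'items[1].description', p_values => l_json));
end;
/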
When I work with data, I prefer to do it straight in the database, but now I was stuck. At those moments you have to go for a walk, get some sleep and talk to others to get more ideas... My preferred development languages (in this order) are APEX, SQL, PL/SQL, JavaScript, Node.js, ... (and then all others)
Then I remembered a blog post by Kris Rice explaining that SQLcl can run JavaScript too, because SQLcl includes Nashorn (a next-generation JavaScript engine for the JVM). So after looking at some SQLcl script examples, I wrote my own little SQLcl script that reads out the CLOB and puts it in a variable called "content":
SQL> script
2 var Types = Java.type("java.sql.Types")
3 var BufferedReader = Java.type("java.io.BufferedReader")
4 var InputStreamReader = Java.type("java.io.InputStreamReader")
5
6 var GET_CLOB = "declare " +
7 " l_clob CLOB; " +
8 " begin " +
9 " select json_clob " +
10 " into l_clob " +
11 " from tbl_with_json; " +
12 " ? := l_clob;" +
13 " end;";
14
15 var cs = conn.prepareCall(GET_CLOB);
16 cs.registerOutParameter(1, Types.CLOB);
17 cs.execute();
18 var clob = cs.getClob(1);
19 cs.close();
20
21 var r = new BufferedReader(new InputStreamReader(clob.getAsciiStream(), "UTF-8"))
22 var str = null;
23 var content = "";
24 while ((str = r.readLine()) != null) { content = content + str; }
25 ctx.write(content);
26 /
{ "items": [{ "client_id": -1, "registration_date": "2016-07-29T07:46:09.941Z", "question": "My Question", "description": "Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fring
So the above reads the content of the clob which contains the JSON.
As we're in JavaScript, I thought we could parse this JSON and navigate to the description field. Once we have it, we can store it in another table or do whatever else we want with it.
Cool if it works, no? And it did! :)
So let's finish this example. First we create a table to store the description field (the very big text).
SQL> create table tbl_with_description (description clob);
Table TBL_WITH_DESCRIPTION created.
Here's the final script that stores the description node in the other table:
- the ctx.write calls are there to print debug output
- obj.items[0].description is how we get to the description node; we store it in a bind variable and execute an insert statement to save the description value:
SQL> script
2
3 try {
4 var Types = Java.type("java.sql.Types")
5 var BufferedReader = Java.type("java.io.BufferedReader")
6 var InputStreamReader = Java.type("java.io.InputStreamReader")
7
8 var GET_CLOB = "declare " +
9 " l_clob CLOB; " +
10 " begin " +
11 " select json_clob " +
12 " into l_clob " +
13 " from tbl_with_json; " +
14 " ? := l_clob;" +
15 " end;";
16
17 var cs = conn.prepareCall(GET_CLOB);
18 cs.registerOutParameter(1, Types.CLOB);
19 cs.execute();
20 var clob = cs.getClob(1);
21 cs.close();
22
23 var r = new BufferedReader(new InputStreamReader(clob.getAsciiStream(), "UTF-8"))
24 var str = null;
25 var content = "";
26 while ((str = r.readLine()) != null) { content = content + str; }
27 //ctx.write(content);
28
29 var obj = JSON.parse(content);
30 ctx.write("Question: " + obj.items[0].question + "\n");
31 ctx.write("Description: " + obj.items[0].description + "\n");
32
33 var binds = {};
34 binds.description = obj.items[0].description;
35
36 var ret = util.execute("insert into tbl_with_description (description) values (:description)", binds);
37
38 if (ret) {
39 ctx.write("Insert done!\n");
40 } else {
41 ctx.write("Error :(\n");
42 var err = util.getLastException();
43 ctx.write("\nERROR:" + err + "\n");
44 }
45
46 } catch(e){
47 ctx.write(e +"\n")
48 e.printStackTrace();
49 }
50
51 /
Question: My Question
Description: Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis ... eu,
Insert done!
Testing:
SQL> select count(*) from tbl_with_description;
COUNT(*)
----------
1
SQL> select substr(description,1,50) from tbl_with_description;
SUBSTR(DESCRIPTION,1,50)
--------------------------------------------------------------------------------
Lorem ipsum dolor sit amet, consectetuer adipiscin
SQL>
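As an extra check (not part of the run above), you can also verify that the full value, well over 32K, made it into the new table:
select dbms_lob.getlength(description) as description_length
from tbl_with_description;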
I was blown away by this... and I see a lot of potential in being able to run JavaScript against the database.
There's actually a way to load Nashorn into your database too, so you can run JavaScript, Node.js etc. straight from your database. Nashorn came with Java 8, but it should run on Java 7 too; as the default version of Java in the Oracle Database is 6, there are some extra steps to get it to work. Running JavaScript from the database is something I have on my list to do R&D on, and I actually submitted an abstract to KScope17 where I will present my results on this topic (if it gets accepted!) :)
So to recap this (longer) blog post:
1) JSON is being used a lot these days and being able to work with JSON in the Oracle Database is very nice, but as we've seen in the example above, it can't do everything yet; it has a real issue with large nodes.
2) Knowing other languages and thinking out of the box can come in handy; I would even say that JavaScript is becoming more and more important for an APEX developer.
3) SQLcl is a great tool; if you don't use it yet, I'd definitely recommend looking into it.
4) Oracle Nashorn opens up an entire new set of possibilities.
To close this blog post I want to thank Kris Rice for his help in understanding SQLcl scripting. Although there are many examples, it took me some time to get going, and I struggled, for example, to understand how to get to the error messages. Although a script is mostly JavaScript, having some Java knowledge makes it easier. Time to refresh that a bit; it's been 15 years since I did any real Java coding.
Hope this blog post will help you work with JSON and JavaScript within an Oracle context.
Edit 4-AUG: read the comments section for a way to get the CLOB out with PL/JSON.
7 comments:
Great post Dimitri.
I'm seeing more opportunities for JSON every day, both in the front end and also in the database.
Thanks for highlighting the problem, and some of the options available.
Don't forget you could also try PL/JSON as a fallback, possibly after some customization.
Good stuff, didn't know about getting Nashorn inside the database.
The reason you got NULL when using JSON_TABLE in your first query is because of the default settings. If you add ERROR ON ERROR you will get ORA-40478: output value too large
select jt.question, jt.description
from tbl_with_json,
json_table(json_clob, '$.items[*]'
columns (
question varchar2 path '$.question',
description varchar2 path '$.description'
error on error) --<------- Add this :)
) as jt;
Thanks for that comment Alex. I learn something new every day :)
Hi Mark,
Thanks for the suggestion! Since the database and APEX support JSON I've not used PL/JSON anymore, but because of your comment I gave that a shot again.
At first sight PL/JSON doesn't have a get_clob function, only get_string, get_number, get_boolean and get_date: https://github.com/pljson/pljson/blob/master/src/json_ext.sql
But in APEX you have to do it a bit differently when there are CLOBs, so I thought to search for that in PL/JSON and actually found a way to get it working by using get_json_value!
Here's a pure PL/SQL way with PL/JSON to get the large CLOB out of a single node:
--
declare
l_obj json;
l_json_value json_value;
l_clob clob;
l_result clob;
begin
select json_clob
into l_clob
from tbl_with_json;
l_obj := json(l_clob);
-- print JSON
-- l_obj.print;
l_json_value := json_ext.get_json_value(l_obj, 'items[1].description');
l_result := empty_clob();
dbms_lob.createtemporary(l_result, true);
l_json_value.to_clob(l_result, true);
dbms_output.put_line(substr(l_result,1,50));
-- as json_value returns " and a carriage return we have to get rid of those
l_result := trim(trailing '"' from trim(trailing CHR(13) from trim(trailing CHR(10) from trim(leading '"' from l_result))));
insert into tbl_with_description (description) values (l_result);
dbms_lob.freetemporary(l_result);
end;
/
--
So thanks again for the suggestion!
Dimitri
Dimitri,
thanks for blogging about the JSON support in Oracle.
A few comments:
1) Currently, JSON_VALUE, JSON_QUERY and JSON_TABLE return string values as VARCHAR2 only. CLOB is not yet supported because of performance reasons (it would require creation of temp lobs).
2) If you just specify VARCHAR2, the default of VARCHAR2(4000) is used. 4000 is always the default, regardless of the 'extended varchar' setting.
If you don't like the default of 4000 you can specify the desired length, for example VARCHAR2(20) or VARCHAR2(32767).
3) If the selected JSON value cannot be returned as the selected SQL return type then we have an error. The cause could be, like in your case, that the length does not fit. But it could also be that a number conversion failed if you specified RETURNING NUMBER and the selected value is not a number.
4) NULL ON ERROR is the default error handler. If you want the query execution to be aborted with an error message then you should specify ERROR ON ERROR. This behavior is by design and defined in the SQL standard, so there is no issue in the product. Maybe you can clarify this concept in your blog so that readers are not misled.
Thanks
Or you can use as_json: https://technology.amis.nl/2015/12/22/as_json-relational-to-json-in-oracle-database/
With a SQL query:
select as_json.qget_string( j.column_value, 'question' )
, as_json.qget_string_clob( j.column_value, 'description' )
from tbl_with_json a
, table( as_json.qget_array(as_json.qjson( a.json_clob ), 'items' ) ) j
Or in PL/SQL:
declare
t_tmp clob;
t_json as_json.tp_json_value;
begin
select json_clob into t_tmp from tbl_with_json where rownum <= 1;
t_json := as_json.json( t_tmp );
t_tmp := as_json.get_string_clob( t_json, 'items[1].description' );
dbms_output.put_line( length( t_tmp ) || ' ' || substr( t_tmp, 1, 50 ) );
as_json.free;
end;
Great, thanks Anton!