Sunday, October 27, 2019

Native Oracle DB JSON functionality as alternative for using cursor() in AOP (and APEX_JSON)

When using external (web/REST) services, you often communicate in JSON, so it's important to be able to generate JSON in the format the external service expects.

In the case of APEX Office Print (AOP), we made it super simple to communicate with the AOP server from the database through our PL/SQL API. You just have to enter a SQL statement, and the AOP PL/SQL API, which uses APEX_JSON behind the scenes, generates the necessary JSON that the AOP server understands.
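A call looks something like this (a minimal sketch; the parameters mirror the test script further down in this post, and the SQL statement and output filename are just examples):

declare
  l_return blob;
begin
  -- pass a SQL statement; AOP generates the JSON behind the scenes
  -- and returns the rendered document as a blob
  l_return := aop_api_pkg.plsql_call_to_aop (
                p_data_type       => aop_api_pkg.c_source_type_sql,
                p_data_source     => q'[select 'file1' as "filename",
                                               cursor(select c.cust_first_name as "cust_first_name",
                                                             c.cust_last_name  as "cust_last_name",
                                                             c.cust_city       as "cust_city"
                                                        from demo_customers c
                                                       where c.customer_id = 1) as "data"
                                          from dual]',
                p_template_type   => aop_api_pkg.c_source_type_aop_template,
                p_output_type     => 'docx',
                p_output_filename => 'customer_report');
end;
/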

Here's an example of the Order data in JSON: a customer with multiple orders and multiple order lines:
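Roughly, it has this shape (a sketch with illustrative sample values; the keys match the queries further down in this post):

[
  {
    "filename": "file1",
    "data": [
      {
        "cust_first_name": "John",
        "cust_last_name": "Dulles",
        "cust_city": "Sterling",
        "orders": [
          {
            "order_total": 1200,
            "order_name": "Order 1",
            "order_lines": [
              { "product_name": "Business Shirt", "quantity": 3, "unit_price": 50 }
            ]
          }
        ]
      }
    ]
  }
]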


As we are living in the Oracle database, we have to generate this JSON. The data is coming from different tables and is hierarchical. In SQL you can create hierarchical data by using the cursor() syntax.

Here's an example of the SQL statement that you would typically use in AOP (note the nested cursor() expressions):

select
  'file1' as "filename", 
  cursor(
    select
      c.cust_first_name as "cust_first_name",
      c.cust_last_name  as "cust_last_name",
      c.cust_city       as "cust_city",
      cursor(select o.order_total      as "order_total", 
                    'Order ' || rownum as "order_name",
                cursor(select p.product_name as "product_name", 
                              i.quantity     as "quantity",
                              i.unit_price   as "unit_price"
                         from demo_order_items i, demo_product_info p
                        where o.order_id = i.order_id
                          and i.product_id = p.product_id
                      ) "order_lines"
               from demo_orders o
              where c.customer_id = o.customer_id
            ) "orders"
    from demo_customers c
    where customer_id = 1
  ) "data"
from dual

From AOP 19.3 onwards, the AOP PL/SQL API supports not only this cursor() syntax but also the native JSON functionality of the Oracle Database (version 12c and upwards).

The query above can also be written as follows, using the native JSON functions in the Oracle Database:

select 
  json_arrayagg( 
    json_object( 
      'filename' value 'file1', 
      'data'     value (
          select 
            json_arrayagg(
              json_object( 
                'cust_first_name' value c.cust_first_name, 
                'cust_last_name'  value c.cust_last_name,
                'cust_city'       value c.cust_city, 
                'orders'          value (
                    select 
                      json_arrayagg(
                        json_object(                               
                          'order_total' value o.order_total, 
                          'order_name'  value 'Order ' || rownum,
                          'order_lines' value (
                              select 
                                json_arrayagg(
                                  json_object(                               
                                    'product_name' value p.product_name, 
                                    'quantity'     value i.quantity,
                                    'unit_price'   value i.unit_price
                                  )
                                returning clob)      
                                from demo_order_items i, demo_product_info p
                               where o.order_id = i.order_id
                                 and i.product_id = p.product_id
                            )
                        )
                      returning clob)      
                      from demo_orders o
                    where o.customer_id = c.customer_id
                  )
              )
            returning clob)  
            from demo_customers c
            where c.customer_id = 1
          )
    )
  returning clob) as aop_json
  from dual 

This syntax takes some getting used to, and you have to think a bit differently. With the cursor syntax you select the column first and give it an alias; with the JSON functions you define the JSON object and its attributes first and then map each attribute to the correct column.
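To make that concrete, here's a minimal sketch of the two styles for a single column, using the demo tables from above:

-- cursor()/APEX_JSON style: select the column first, the alias becomes the JSON key
select c.cust_city as "cust_city"
  from demo_customers c
 where c.customer_id = 1;

-- native JSON style: name the JSON key first, then map it to a column
select json_object('cust_city' value c.cust_city) as cust_json
  from demo_customers c
 where c.customer_id = 1;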

I find the cursor syntax really elegant; especially in combination with APEX_JSON, it's a really cool solution to generate the JSON you need. In the end it's a personal choice, and I must admit that the more I use the native JSON way, the more I like it. If performance is important, you most likely want to use native database functionality as much as possible, but I go into more detail on that further in this post. Lino also found an issue with the cursor syntax in Oracle Database 19c, so if you are on that database release you want to look at the support document.

Before I move on with my test case: if you need more info on JSON in the database, Carsten did a nice blog post about parsing JSON in APEX, and although that post is about parsing JSON while this one is about generating it, the conclusions are similar. You can read more about APEX_JSON and the native JSON database functions in Tim's write-up on Oracle-Base.

As I was interested in the performance of both implementations, I ran a few test cases. There are different ways to test performance, e.g. dbms_profiler, Method R Workbench, tracing, timing the results, ... Below I use Tom Kyte's runstats script to compare the two PL/SQL implementations. The interesting thing about the script is that it compares not only timings but also latches, which give you an idea of how hard the database has to work. You can download it from AskTom under the resources section:


Here's my test script:

declare
  l_sql             clob;
  l_return          blob;
  l_output_filename varchar2(100);  
  l_runs            number(5) := 1;
begin
  runStats_pkg.rs_start;
  
  -- sql example with cursor
  for i in 1..l_runs
  loop
      l_output_filename := 'cursor';
      l_sql := q'[
                select
                'file1' as "filename",
                cursor
                (select 
                    c.cust_first_name as "cust_first_name",
                    c.cust_last_name  as "cust_last_name",
                    c.cust_city       as "cust_city"
                   from demo_customers c
                  where c.customer_id = 1 
                ) as "data"
                from dual   
               ]';
      l_return := aop_api_pkg.plsql_call_to_aop (
                    p_data_type       => aop_api_pkg.c_source_type_sql,
                    p_data_source     => l_sql,
                    p_template_type   => aop_api_pkg.c_source_type_aop_template,
                    p_output_type     => 'docx',
                    p_output_filename => l_output_filename,
                    p_aop_remote_debug=> aop_api_pkg.c_debug_local); 
  end loop;  
  
  runStats_pkg.rs_middle;  
  
  -- sql example with native JSON database functionality
  for i in 1..l_runs
  loop
      l_output_filename := 'native_json';
      l_sql := q'[
                select 
                  json_arrayagg( 
                    json_object( 
                      'filename' value 'file1', 
                      'data'     value (select 
                                          json_arrayagg(
                                            json_object( 
                                              'cust_first_name' value c.cust_first_name, 
                                              'cust_last_name'  value c.cust_last_name,
                                              'cust_city'       value c.cust_city 
                                            )
                                          )  
                                          from demo_customers c
                                         where c.customer_id = 1
                                       )  
                    )
                  ) as aop_json
                  from dual 
               ]';
      l_return := aop_api_pkg.plsql_call_to_aop (
                    p_data_type       => aop_api_pkg.c_source_type_sql,
                    p_data_source     => l_sql,
                    p_template_type   => aop_api_pkg.c_source_type_aop_template,
                    p_output_type     => 'docx',
                    p_output_filename => l_output_filename,
                    p_aop_remote_debug=> aop_api_pkg.c_debug_local);                     
  end loop;    

  runStats_pkg.rs_stop;   
end;
/

I ran the script (with different l_runs settings) a few times on my 18c database, and with the above use case on my system the native JSON implementation consistently outperformed the cursor (and APEX_JSON) implementation.

Run1 ran in 3 cpu hsecs
Run2 ran in 2 cpu hsecs
run 1 ran in 150% of the time

Name                                  Run1        Run2        Diff
STAT...HSC Heap Segment Block           40          41           1
STAT...Heap Segment Array Inse          40          41           1
STAT...Elapsed Time                      4           3          -1
STAT...CPU used by this sessio           4           3          -1
STAT...redo entries                     40          41           1
STAT...non-idle wait time                0           1           1
LATCH.simulator hash latch              27          26          -1
STAT...non-idle wait count              13          12          -1
STAT...consistent gets examina          41          43           2
LATCH.redo allocation                    1           3           2
STAT...active txn count during          21          23           2
STAT...cleanout - number of kt          21          23           2
LATCH.transaction allocation             1           3           2
LATCH.In memory undo latch               1           3           2
LATCH.JS Sh mem access                   1           3           2
STAT...consistent gets examina          41          43           2
LATCH.keiut hash table modific           3           0          -3
STAT...calls to kcmgcs                  64          69           5
STAT...dirty buffers inspected           6           0          -6
STAT...workarea executions - o           2          12          10
STAT...free buffer requested            71          52         -19
STAT...lob writes unaligned             80          60         -20
STAT...lob writes                       80          60         -20
STAT...sorts (rows)                      0          20          20
STAT...execute count                    91          71         -20
STAT...sorts (memory)                    0          20          20
LATCH.active service list                0          25          25
STAT...consistent gets                 183         156         -27
STAT...consistent gets from ca         183         156         -27
STAT...consistent gets pin (fa         142         113         -29
STAT...consistent gets pin             142         113         -29
STAT...lob reads                       160         130         -30
LATCH.JS queue state obj latch           0          42          42
LATCH.object queue header oper         151         103         -48
STAT...workarea memory allocat          66          -6         -72
STAT...db block changes                431         358         -73
STAT...consistent changes              390         315         -75
LATCH.parameter table manageme          80           0         -80
STAT...undo change vector size       8,748       8,832          84
LATCH.enqueue hash chains                1          88          87
STAT...parse count (total)             100          10         -90
STAT...session cursor cache hi         171          71        -100
STAT...opened cursors cumulati         171          71        -100
STAT...free buffer inspected           126           0        -126
STAT...calls to get snapshot s         470         330        -140
STAT...db block gets from cach         958         744        -214
STAT...hot buffers moved to he         220           0        -220
STAT...redo size                    12,016      12,248         232
STAT...db block gets                 1,039         806        -233
STAT...db block gets from cach       1,029         796        -233
STAT...session logical reads         1,222         962        -260
STAT...file io wait time             5,865       6,279         414
STAT...recursive calls                 561         131        -430
LATCH.cache buffers chains           3,224       2,521        -703
STAT...session uga memory          196,456           0    -196,456
STAT...session pga memory        1,572,864           0  -1,572,864
STAT...logical read bytes from   9,928,704   7,798,784  -2,129,920

Run1 latches total versus runs -- difference and pct
        Run1        Run2        Diff       Pct
       3,853       3,180        -673    121.16%

There are many different iterations of this test, using bind variables, etc. It seems "logical" that a native DB implementation performs better than a combination of PL/SQL (APEX_JSON) and SQL (cursor), but I always recommend you run the test in your own environment. What is true today might be different tomorrow, and a lot comes into play, so if there's one thing I learned from Tom Kyte, it's: don't take things for granted, but test in your unique situation.

So, in real life using AOP, will you see a big difference? It depends on the complexity of your SQL statement and data, how many times you call a report, etc., but my guess is that in most cases it won't make much of a difference in user experience.

A simple test would be to do "set timing on" and compare the implementations:
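For example, in SQL*Plus or SQLcl you could run the two small queries from the test script above with timing enabled (a sketch; the actual timings on your system will differ):

set timing on

-- cursor() / APEX_JSON based version
select
  'file1' as "filename",
  cursor(select c.cust_first_name as "cust_first_name",
                c.cust_last_name  as "cust_last_name",
                c.cust_city       as "cust_city"
           from demo_customers c
          where c.customer_id = 1) as "data"
  from dual;

-- native JSON version
select
  json_arrayagg(
    json_object(
      'filename' value 'file1',
      'data'     value (select json_arrayagg(
                                 json_object(
                                   'cust_first_name' value c.cust_first_name,
                                   'cust_last_name'  value c.cust_last_name,
                                   'cust_city'       value c.cust_city))
                          from demo_customers c
                         where c.customer_id = 1)
    )
  ) as aop_json
  from dual;

set timing off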


Or, if you are using AOP on an Oracle APEX page, you can run your page in Debug mode and see exactly how long the generation of the JSON for the data part took:


Happy JSON'ing :)
