Thursday, April 29, 2021

Set up the APEX Office Print (AOP) Server with the Oracle Cloud APEX Service


This post is part of the series My spin on the new Oracle APEX Application Development Service.

To print and export data from the APEX Service in the Oracle Cloud, you can use APEX Office Print (AOP).

AOP allows you to generate documents based on templates created in Word, Excel, PowerPoint, HTML, and Text.
For example, people use AOP to create an invoice, send a packaging slip, export an Excel spreadsheet with multiple sheets, or have a monthly PowerPoint presentation with the latest sales numbers.

AOP is the go-to print engine for Oracle APEX and the Oracle database, and it offers both a cloud service and an on-premises solution. Having your APEX Service talk to the AOP Cloud is straightforward, as there's really no installation to do. You can find more info about that in this blog post. But what if you want your own dedicated AOP server for your APEX Service? Many people like to have their own instance so they can print as much as they want, with full control and maximum speed, as it runs closest to the database. In this post, I will show one way to set up your own APEX Office Print (AOP) Server that works with the Oracle APEX Application Development Service.

The special bit with the Oracle APEX Service is that it's a fully managed service, which means the database, ORDS, and APEX are pre-configured, auto-patched, auto-scalable, and optimized. I think it's awesome, but the downside is that you don't have direct access to the machine and the network. 

The installation is a three-step process:
1. Import the AOP Server image in your own Oracle Cloud
2. Create a Compute instance based on this image
3. Create a Gateway to this instance

The following is a step-by-step guide:

(1) Log in to apexofficeprint.com, go to the Downloads section, and click AOP Gateway with Compute.
This will copy the URL of the AOP image, which we will import into your own Oracle Cloud. (FYI: we are focusing heavily on the ability to let you run your own AOP instance in the Oracle Cloud, so in the Oracle Cloud section on our website, you will find more ways to run AOP.)


Go to your own Oracle Cloud dashboard and click Compute - Custom Images:


Click the Import Image button:


Give it a name and paste the link from your clipboard into the Object Storage URL field:


Hit the Import Image button:


(2) It might take a few minutes to create the image, but once it's green, we can Create Instance from this image:


Give the instance a name, select where you want the Compute instance to be, and select an Image and shape:



The choice of shape really depends on how much you print. For example, I like the flexible shape
AMD Rome: VM.Standard.E3.Flex (2 OCPUs, 20 GB memory), which costs around $60 a month.

Select the network and assign a public IP address. Note that the public IP is not strictly necessary, but it makes activating AOP easier, as you can connect directly to the Compute instance:


Finally hit the Create button:


Now the Compute Instance will be created:


A few minutes later, the pre-configured AOP instance is up and running. 
Write down the Private IP Address and click on Subnet:


Next click on the name of the Security Lists:


And Add Ingress Rules:


Add the port AOP is running on. By default, AOP runs on port 8010 (you could change that in the instance if you wanted). Then hit Save Changes:


The Compute instance is now fully configured.

(3) The issue with the APEX Service is that it only allows HTTPS calls. Instead of dealing with SSL certificates, we will use the Gateways feature in the Oracle Cloud, which you find under Developer Services - API Management - Gateways.


Create Gateway:


Give it a name and hit Create:


Once the Gateway is created go to the Deployments section:


Create Deployment:


Give it the name AOP and path prefix /aop and hit the Next button:


The path can be /, and for the URL, use the private IP of the Compute instance (e.g. http://<private IP>:8010).
Make sure to check "Disable SSL Verification": our Compute instance doesn't have SSL, but the Gateway does, and the APEX Service will only talk to the Gateway.


In the overview screen, hit the Create button:


And you will see that AOP is being deployed and becomes Active:


Click the Copy Endpoint in the Deployments section - this is your AOP Server URL!
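To verify the whole chain works (APEX Service to Gateway to AOP Compute instance), you can call the endpoint from SQL Workshop > SQL Commands. This is a minimal sketch: the URL is a placeholder for the endpoint you copied, and it assumes your AOP version exposes the /marco health-check route, which answers "polo".

```sql
declare
  l_response clob;
begin
  -- Replace the URL with the Gateway endpoint copied from the Deployments section
  l_response := apex_web_service.make_rest_request(
                  p_url         => 'https://<your-gateway>.apigateway.eu-amsterdam-1.oci.customer-oci.com/aop/marco',
                  p_http_method => 'GET');
  dbms_output.put_line(l_response);  -- a healthy server answers: polo
end;
/
```

If this call times out, double-check the ingress rule on port 8010 and the backend URL of the Gateway deployment.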

Those are basically the steps to get your own AOP Server running with the APEX Service.
In fact, this technique isn't limited to the APEX Service; you can use it for any Oracle Cloud service, like the Autonomous Database or a database on another Compute instance.

To manage the AOP Server, here are some commands you can use when connected to it:

  • start AOP: systemctl start aop
  • stop AOP: systemctl stop aop
  • check AOP status: systemctl status aop
  • autostart at boot: systemctl enable aop
  • remove autostart at boot: systemctl disable aop

Happy printing!

Thursday, April 22, 2021

Which one to pick: Free APEX Service vs Free Oracle Autonomous DB

I'm a big fan of the Free Oracle Cloud as it allows Oracle APEX, Oracle Database, web developers, or more generally, everybody, to get started doing their thing. If you want to get going, I previously did a series you may want to read called Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

A few days ago, Oracle announced the free offering for the Oracle APEX Service. I previously started a series on this new service, My spin on the new Oracle APEX Application Development Service. I still have to continue that series, but now that the service is available for free, I first thought to write about which one to pick: the Free Oracle Autonomous DB, which comes with APEX, or the Free APEX Service, which comes with the Oracle DB?

All in all, both services are very similar, but the decision is very easy for me. Ask yourself these two questions:

Do I want to have SQL*Net access (e.g. connect with SQLcl, SQL Developer, or Visual Studio Code)?

If the answer is yes, you need to go with the Oracle Autonomous Database, as at the time of writing SQL*Net access isn't available for the APEX Service.

Do I want to upgrade my free service to a paid service at some point?

The APEX Service is unbelievable value for money, at a third of the price of the Autonomous Database. So if you ever want to upgrade to a paid version of the APEX Service in one click, you want to start from the free APEX Service. You can't go from the Autonomous DB to the APEX Service, so the one you pick determines what you can upgrade to.

Combining both ...

If you don't mind a bit of hassle, you could start with the Always Free Autonomous Database, export the database and APEX apps, and import them into the APEX Service. It takes some effort, but from that moment onwards you work in the APEX Service... or you treat the APEX Service as your production environment and your Always Free Oracle Autonomous DB as your development environment.

If you wonder why you would even care about SQL*Net access: I work file-based, e.g. all my PL/SQL objects (packages, procedures, ...) live in files in Git. I edit them there, see very nicely what I changed, can easily commit, etc. I won't mess up other people's code and I can always go back to a previous version. So, I really like to work in Visual Studio Code linked to my Git repository. Now, I could just copy/paste when I'm done editing and compile it in SQL Workshop or SQL Developer Web, but it's one extra step.

Starting with the Always Free APEX Service and upgrading to Paid.

For most smaller projects, I will probably advise people to go with the Always Free APEX Service. Most time is spent in APEX anyway, and SQL Workshop or SQL Developer Web is good enough to maintain the database objects for those small projects, as typically not many people work on the same package at the same time. So I thought I'd see how well upgrading from Free to Paid would work... and I must say I was impressed! It took about 15 minutes. Here's what I did:

In the Oracle Cloud console, go to the APEX Application Development - APEX Instances and hit the Create APEX Service button:

Follow the wizard to Create the APEX Service; give it a name


Add a password and provide the email addresses of people you want to be notified of maintenance. This notification part is new and really useful!


The Always Free APEX Service is ready and you can play with it. Now, let's hope at some point you are so successful with your APEX app, that you will switch to a Paid instance. Here's what it takes to upgrade.
Click More Actions > Upgrade Instance to Paid:


Confirm you want to upgrade:


It will take a few minutes to upgrade:


And presto, there it becomes available:


Easy! Now, one of the best features of the APEX Service is that it can Auto Scale, which means that under heavy traffic the service can use up to 3 times the OCPUs you have. I find that super cool :) You pay for it, but you don't have to do anything, and it's only for the time it needs the extra resources.

When you upgrade from a free tier, the auto-scaling is not enabled by default. Here's how to enable it: Click More Actions > Scale:


Hit the Auto Scaling checkbox:


This takes a bit of time, and for me it didn't automatically tell me it was done...


... at some point I refreshed the browser and saw it was done:

I like the APEX Service a lot. It's still missing a few features before I can move some customers over, e.g. the ability to have custom domain names, to name one.

But since January 13th, I've actually been using the new APEX Service to host a public website.
It's a fun app to predict the curve of COVID-19 cases in Belgium. There's a static page in front of it, so the custom domain name toogviroloog.be works, but once you click Predict (Voorspel zelf), you will see the APEX app hosted on the APEX Service.


Hope this post helps to pick the Free Oracle Cloud service you need.

Tuesday, February 02, 2021

The Best Rich Text / HTML Editor for Oracle APEX

One of the first bigger public applications I built in Oracle APEX was the website and backend for a flight school (in 2006). It was a little Content Management System (CMS), so the secretary of the flight school could update the website and the trainers could create their online course material and exams straight from an administration page. I used a Rich Text (HTML) Editor, so they could make some text bold, include pictures, add some color and add tables to the text.

In many other applications, I started to use the Rich Text Editor so people could write their own emails and send them straight from their APEX apps. 

In the last 5 years, I didn't only come across the use of a Rich Text Editor in our own applications, but we saw that many people use it when they use APEX Office Print (AOP) to generate documents. AOP has the unique feature that it understands HTML and converts this to native Word/PDF styling. For example, the bold tag in HTML or the use of a span with a certain style is translated as real bold and, for example, that color in the generated document. People love it as they can make very dynamic documents.

Although we and many others use the Rich Text Editor in our Oracle APEX apps, two things were hard to do (read: not native, so you have to write extra code):

  • dealing with over 32K of data
  • upload, manipulate, and include images in the text 

As part of our Plug-ins Pro offering, we wanted to create the best Rich Text Editor (RTE) possible for Oracle APEX. So, a year ago, Bartosz and I did some research on what would be the best way to address this. We thought about enhancing the current RTE of APEX, but while doing the research we found the component was quite old (based on CKEditor 4 in APEX 20.1 and before). So we looked at many different RTEs, and ended up with our top 3:

  • TinyMCE: an open-source editor, with which Bartosz had done a lot of work
  • CKEditor: this is what is included in Oracle APEX. Before APEX 20.2 it was CKEditor 4; from 20.2 onwards the default is CKEditor 5. The new editor is also open-source and looks nice, but it was still undergoing a lot of changes and enhancements and wasn't backward compatible.
  • Froala: used a lot by AOP customers, I have always been impressed by their offering, but it's paid.
They are all good, and although we needed to pay to use this editor in an APEX plug-in, with the rights to distribute it without people needing their own license, we went with the Froala WYSIWYG Editor.

And I couldn't be happier with what Bartosz came up with and how easy it is now to use the Rich Text Editor.
Here's an animated gif to give you an idea of how it looks:



To implement a plug-in in your own app, you can follow this step-by-step guide, which will cover how to download the Oracle APEX plug-ins from the Plug-ins Pro website and import and register the plug-in.

In short, the steps to use the Rich Text Editor Pro plug-in in your own app are:
  1. Log in, download, and unzip the Rich Text Editor Pro plug-in from plug-ins-pro.com
  2. Upload and run, via SQL Workshop > SQL Scripts: ddl_unitedcodes_rich_text_editor_pro.sql and ddl_unitedcodes_rich_text_editor_pro_sample_rest.sql
  3. Import the plug-ins into your app (Rich Text Editor Pro consists of 3 plug-ins): Shared Components > Plug-ins > Import: dynamic_action_plugin_unitedcodes_froala_rte_da_conf.sql, item_type_plugin_unitedcodes_froala_rte.sql, and process_type_plugin_unitedcodes_froala_rte_process.sql



  4. Register the plug-in by going to SQL Workshop > SQL Commands, copying the script from the README.md file, and entering your API key, which you can find in the Plug-ins Pro Portal


Now we are ready to use it in our application. I will use the Sample Database Application that comes with Oracle APEX, and change the Product description item on Page 6, which is a Textarea, to be the new Rich Text Editor Pro:


Edit page 6, click the P6_PRODUCT_DESCRIPTION item and:
- change the item type to be the United Codes Rich Text Editor Pro [Plug-in]


- give it a height of 300px
- change the Appearance - Template: Optional - Above
- change the Source - Form Region: - Select - 

The reason we unselect the Form Region (so the item value becomes null) is that Oracle APEX doesn't support more than 32K in an item. So we have to fill the item in another way. This is why Rich Text Editor Pro comes with its own process, which supports CLOBs and data over 32K.

We will add a Process by going to Pre-Rendering - After Header - Process - Right Click and Create Process. Call the process Fetch RTE and as Type select United Codes Rich Text Editor Pro (Process) [Plug-in]. The process supports three different ways of loading the CLOB value into the item: SQL Query, Function Returning CLOB, and Table. The most declarative one is the Table, so we will go with that. Select the item and fill in the columns:


In order to save the value back to the database, we also need a custom process, for the same reason we needed one to fetch the value: to circumvent the 32K limit.

So, go to Processing - Processes - Right Click and Create Process which will add a new Process.
Move it after the Process Row process. Call the new process Process RTE and select as Type United Codes Rich Text Editor Pro (Process) [Plug-in]. The process supports two different ways of saving the item value into the CLOB column in the table: PL/SQL Code and Table. The most declarative one is the Table, so we will go with that. Select the item and fill in the columns:


That's it! 

Finally, I changed the height of the Modal to be 750px and changed the Labels of the other items to use the Floating template. When you open a product now, the result looks like this:


Now you can make the text bold, upload, drag-drop and manipulate pictures, add videos to the product description, etc. A plethora of options are now available through this new Rich Text Editor!

So, Rich Text Editor Pro consists of 3 plug-ins: an item type plug-in and a process plug-in, both of which we saw in use above, plus a dynamic action plug-in. The dynamic action plug-in allows you to customize the Rich Text Editor further. Here are a couple of things you can do with the DA plug-in:
  • ability to change the toolbar: add and remove buttons
  • change settings: allow spell checker, copy/paste images, image resizing, configure the sticky toolbar
  • enable/disable image as base64
  • upload CLOB
You can also customize the editor via JavaScript, but having this extra DA plug-in makes the life of a low-code developer easier 😀


To see the full power of this amazing plug-in, I strongly recommend installing the sample application that comes with it. You also find the sample application on the Plug-ins Pro website.

Sunday, January 24, 2021

Import an existing database and APEX app into the Oracle APEX Application Development Service

This post is part of the series My spin on the new Oracle APEX Application Development Service


In the previous post, we set up the environment which provided us everything to start building Oracle APEX apps. In this post, I want to take the Oracle APEX apps I've built before and move them to this new service.

There are different ways of accomplishing this. Here are the steps I will follow:
  1. In the current environment
    1. Export the database (schema)
    2. Export the APEX app
  2. On the new Oracle APEX Application Development Service
    1. Import the database (schema)
    2. Import/Deploy the APEX app
I will start by moving a very small app I built a few years ago: the reservation of meeting rooms we use in the Innovation and Incubation Center where our office is located.

In our current environment, we will export the data and the APEX app as explained below.

Export the database schema

There are different ways to get your data: export to flat files, make your data available through REST web services, or use other tools to move the data. We export all our apps every night to a different location with Oracle Data Pump. In case you are new to Data Pump, here's the command you can use to export a specific schema:

expdp apex_backup/xxx@apex_pdb directory=DATAPUMP dumpfile=db_20210121_ini.dmp logfile=db_20210121_ini.log schemas=INI exclude=statistics

or export the entire database:

expdp apex_backup/xxx@apex_pdb directory=DATAPUMP dumpfile=db_20210121_full.dmp logfile=db_20210121_full.log full=y exclude=statistics 

I like to use Data Pump as it will export everything in one .dmp file and it's really fast to do.


Export the APEX application

There are different ways to export the APEX app. You can export through the GUI in Oracle APEX, use the external APEX Export tool, or use SQLcl.

Just like with the data, we also export all APEX workspaces and applications every night. We use the APEX Export tool. This is especially useful if you want to move your entire environment to the cloud as you don't have to export workspaces and apps one by one.

The command to export all workspaces:

/usr/bin/java oracle.apex.APEXExport -db localhost:1521/APEX_PDB -user apex_backup -password xxx -expWorkspace > workspaces.txt

The command to export all applications:

/usr/bin/java oracle.apex.APEXExport -db localhost:1521/APEX_PDB -user apex_backup -password xxx -instance > applications.txt
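If you only want to move a single application instead of the whole instance, SQLcl (mentioned above as an alternative) can export just that app. A minimal sketch, using application 263 (the app imported later in this post) as an example:

```sql
-- From a SQLcl session connected to the source database:
-- writes the export to f263.sql in the current directory
apex export -applicationid 263
```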


Now that we have the database data pump file (.dmp) and the APEX export file (.sql), we are ready to go to the new APEX Application Development Service to import the database and Oracle APEX app.

Import the database schema

The first thing we will do is import the database into the new service.

Log in to the Oracle Cloud and go to the APEX Application Development > APEX Instances.
From there click on the database link:


Here you arrive in the database. There's a DB Connection button:


Here you can normally download the connection details (credentials/wallet) for your database, which would let us follow the same technique I described in my blog series on the Always Free Oracle Cloud, more specifically in the post on using SQLcl and Data Pump with the Oracle ATP. However, although this Oracle APEX Service is based on the same technology as the Always Free ATP, there are some limitations; one is that you can't connect directly to the database from external tools, e.g. SQLcl.


I thought, as a workaround, I would upgrade the service, import the data, and downgrade again, but that is not going to fly: once you upgrade, it's irrevocable.


Luckily, there's another way to load our data: we can use the Oracle Data Pump API from PL/SQL.

First, we will make the dump file available to the Data Pump API. The database in the Oracle Cloud cannot read a file from your local filesystem directly, so you have to stage the file somewhere the database can reach it. In the Oracle Cloud, we can use Object Storage to store our dump file. In my blog post Setup Object Storage and use for File Share and Backups, you will find step-by-step instructions for uploading both through the GUI and from the command line.

As this is a very small database, I will just use the GUI. For anything less than 500 MB you can most likely use the OCI Console, but for larger dump files the GUI will time out. In case your dump file is bigger, use the OCI REST API, CLI, or SDK.

Go to Object Storage :


Create a Bucket:


I call it db_import:


Click on the new bucket, db_import:


In the bucket, click the Upload button:


Drag-drop the files you want to upload. In my case I will just upload one file db_20210119_ini.dmp:


The uploaded files will appear in the bucket under Objects:


Make sure to copy the URL of the dump file. To do so, click the three dots next to the file and click View Object Details.

And copy the URL Path (URI):


Now that the file is available, let's create the database (schema).

First, I typically create the user with a script:

create user ini identified by Thisisrealsecure123;
grant connect to ini;
grant create job to ini;
grant create dimension to ini;
grant create indextype to ini;
grant create operator to ini;
grant create type to ini;
grant create materialized view to ini;
grant create trigger to ini;
grant create procedure to ini;
grant create sequence to ini;
grant create view to ini;
grant create synonym to ini;
grant create cluster to ini;
grant create table to ini;
grant create session to ini;
grant execute on ctxsys.ctx_ddl to ini;
grant execute on dbms_crypto to ini;
alter user ini quota unlimited on data;

On the APEX Service dashboard, click the Launch SQL Developer Web, log in and copy and run the above script:
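After creating the user, a quick sanity check never hurts. This is just a sketch; log in to SQL Developer Web as the new INI user and run:

```sql
-- List the privileges the new user effectively has in this session
select * from session_privs order by privilege;

-- Confirm the quota on the DATA tablespace (-1 means unlimited)
select tablespace_name, max_bytes from user_ts_quotas;
```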


Next, import the data pump file which we stored earlier in Object Storage.

In order to connect to Object Storage from SQL Dev Web we need some Cloud Credentials. Just like in the first post where we created the SMTP credentials, we will now go to that screen to create some Auth Credentials.

Go to Identity > Users, select your user, and click Auth Tokens:


Click the Generate Token button, give the token a name, and click Generate Token again:


Copy the created token and hit the close button:


Note that if you lose your token, you can create another token from this screen (and delete the old one):


Go back to SQL Developer Web and create the credentials. This will allow us to connect to other cloud services like the Object Storage:

begin
  dbms_cloud.create_credential(
    credential_name => 'YourName',      -- any label; you reference it later in DBMS_CLOUD calls
    username        => 'YourUser',      -- your Oracle Cloud username
    password        => 'YourPassword'); -- the Auth Token you just generated, not your login password
end;
/

In my case it looks like this:


Now we load the file from Object Storage into the DATA_PUMP_DIR directory, where Data Pump can find the dump file for the import. We use the DBMS_CLOUD package: pass the credentials you created before, the URL you copied, and DATA_PUMP_DIR as the directory name:

begin
   DBMS_CLOUD.GET_OBJECT(
     credential_name => 'dgielis',
     object_uri => 'https://objectstorage.eu-amsterdam-1.oraclecloud.com/n/ax54uj3ardud/b/db_import/o/db_20210119_ini.dmp',
     directory_name => 'DATA_PUMP_DIR'); 
end;
/

I took a screenshot of some more commands I used to check my environment:
  • The first command to check that I had credentials (dba_credentials)
  • The second to query the Object Storage bucket
  • The third to put the object from the Object Storage onto the database machine
  • The fourth command to make sure the dump file was in the directory
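In SQL form, those checks look roughly like this (a sketch reusing the credential name and bucket URL from my example; adjust them to your own):

```sql
-- 1) Check the credential exists
select credential_name, username, enabled
  from dba_credentials;

-- 2) Query the Object Storage bucket
select object_name, bytes
  from dbms_cloud.list_objects(
         'dgielis',
         'https://objectstorage.eu-amsterdam-1.oraclecloud.com/n/ax54uj3ardud/b/db_import/o/');

-- 3) Copy the object from Object Storage onto the database machine
begin
  dbms_cloud.get_object(
    credential_name => 'dgielis',
    object_uri      => 'https://objectstorage.eu-amsterdam-1.oraclecloud.com/n/ax54uj3ardud/b/db_import/o/db_20210119_ini.dmp',
    directory_name  => 'DATA_PUMP_DIR');
end;
/

-- 4) Make sure the dump file is in the directory
select object_name, bytes
  from dbms_cloud.list_files('DATA_PUMP_DIR');
```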

Ok - now we can start our Data Pump import by using the DBMS_DATAPUMP API.

declare
  ind NUMBER;              -- Loop index
  h1 NUMBER;               -- Data Pump job handle
  percent_done NUMBER;     -- Percentage of job complete
  job_state VARCHAR2(30);  -- To keep track of job state
  le ku$_LogEntry;         -- For WIP and error messages
  js ku$_JobStatus;        -- The job status from get_status
  jd ku$_JobDesc;          -- The job description from get_status
  sts ku$_Status;          -- The status object returned by get_status
begin
  -- Create a Data Pump job to do a "full" import (everything in the dump file without filtering).
  h1 := DBMS_DATAPUMP.OPEN('IMPORT','FULL',NULL,'INI');
  
  -- Add the file to the job handler
  DBMS_DATAPUMP.ADD_FILE(h1,'db_20210119_ini.dmp','DATA_PUMP_DIR');

  -- As on the previous system the tablespace was different, I remap the tablespace 
  DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_TABLESPACE','INI','DATA');

  -- In case you migrate from a legacy Schema Service, in addition to remapping the
  -- tablespace, you may also want to remap the schema to a more user-friendly name
  --
  DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','DBPTTTBDEG1JF','MYNEWSCHEMA');

  -- Start the job. An exception is returned if something is not set up properly.
  DBMS_DATAPUMP.START_JOB(h1);

  -- The import job should now be running. In the following loop, the job is 
  -- monitored until it completes. In the meantime, progress information is displayed. 
  percent_done := 0;
  job_state := 'UNDEFINED';
  while (job_state != 'COMPLETED') and (job_state != 'STOPPED') 
  loop
    dbms_datapump.get_status(h1,
            dbms_datapump.ku$_status_job_error +
            dbms_datapump.ku$_status_job_status +
            dbms_datapump.ku$_status_wip,-1,job_state,sts);
    js := sts.job_status;

    -- If the percentage done changed, display the new value.
    if js.percent_done != percent_done
    then
      dbms_output.put_line('*** Job percent done = ' || to_char(js.percent_done));
      percent_done := js.percent_done;
    end if;

    -- If any work-in-progress (WIP) or Error messages were received for the job, display them.
    if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
    then
      le := sts.wip;
    else
      if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
      then
        le := sts.error;
      else
        le := null;
      end if;
    end if;
    if le is not null
    then
      ind := le.FIRST;
      while ind is not null loop
        dbms_output.put_line(le(ind).LogText);
        ind := le.NEXT(ind);
      end loop;
    end if;
  end loop;

  -- Indicate that the job finished and gracefully detach from it. 
  dbms_output.put_line('Job has completed');
  dbms_output.put_line('Final job state = ' || job_state);
  dbms_datapump.detach(h1);
end;
/

And here's the result, everything nicely imported:


I switch users and see the tables, and querying the data of a table gives me the data:


Pretty neat 😃

When executing the import script in SQL Developer Web (SDW), it is possible that for large imports you will be timed out of SDW before the import job finishes. To monitor the status of the job, you can open another SDW session and attach to the job:

declare
  ind NUMBER;              -- Loop index
  h1 NUMBER;               -- Data Pump job handle
  percent_done NUMBER;     -- Percentage of job complete
  job_state VARCHAR2(30);  -- To keep track of job state
  le ku$_LogEntry;         -- For WIP and error messages
  js ku$_JobStatus;        -- The job status from get_status
  jd ku$_JobDesc;          -- The job description from get_status
  sts ku$_Status;          -- The status object returned by get_status
begin
  -- Attach to the Data Pump job
  h1 := DBMS_DATAPUMP.ATTACH('IMPORT','INI');

  -- Get the Job Status
  dbms_datapump.get_status(h1,
            dbms_datapump.ku$_status_job_error +
            dbms_datapump.ku$_status_job_status +
            dbms_datapump.ku$_status_wip,-1,job_state,sts);
  js := sts.job_status;

  -- check percent done and output to console
  percent_done := js.percent_done;
  dbms_output.put_line('*** Job percent done = ' || to_char(percent_done));

  -- Detach
  DBMS_DATAPUMP.DETACH(h1);
end;
/

In case the status is going bananas, here's a short script you can execute to kill the job if needed. 

declare
  ind NUMBER;              -- Loop index
  h1 NUMBER;               -- Data Pump job handle
  percent_done NUMBER;     -- Percentage of job complete
  job_state VARCHAR2(30);  -- To keep track of job state
  le ku$_LogEntry;         -- For WIP and error messages
  js ku$_JobStatus;        -- The job status from get_status
  jd ku$_JobDesc;          -- The job description from get_status
  sts ku$_Status;          -- The status object returned by get_status
begin
  -- Attach to the Data Pump job
  h1 := DBMS_DATAPUMP.ATTACH('IMPORT','INI');

  -- Stop the job
  dbms_datapump.stop_job(h1, 1);

  -- Detach
  DBMS_DATAPUMP.DETACH(h1);
end;
/

Credit to Todd Bottger (update on 30-APR-2021):
-- //start
There are two different ways to get the actual import job done in PL/SQL. Each has tradeoffs. 
  
1) Use DBMS_CLOUD.GET_OBJECT followed by a DBMS_DATAPUMP import from the local file, as described above: this approach will probably be the fastest for large data sets.
The downsides are a slightly more complex script (more lines of code) and extra storage consumption, because GET_OBJECT makes a full local copy first. Note that customers pay for this storage space as part of their storage bill. Of course, the price per TB in the APEX Service is quite low, so perhaps it will not be a big concern for most folks.

2) Alternative approach using DBMS_DATAPUMP *only*.
Note that DBMS_DATAPUMP in ADB-S now supports importing directly from OCI Object Storage.
This was implemented in DB 21c and backported to DB 19c via an internal patch applied to ADB-S services (including the APEX Service). Here and here are some DB 21c docs that cover it.
I have it on good authority from folks here at Oracle that this operation reads data over the network and imports in real time. There are some internal optimizations like read-ahead caching, but it does not make an additional copy to local disk, which is nice. This approach results in a simpler script (fewer lines of code; no need for DBMS_CLOUD.GET_OBJECT) and also avoids consuming the extra storage space.
However, there are performance points to be aware of, i.e. network latencies and some overheads in Object Storage itself, that probably make it slower than option 1.
It still might be a good approach when users want the simplest possible PL/SQL script and they are importing a small data set:

DBMS_DATAPUMP.ADD_FILE(h1, 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/<objectstoragenamespace>/<bucketname>/<dumpfilename>', 'MYCREDENTIAL', NULL, dbms_datapump.ku$_file_type_uridump_file);

Side note: I believe you can use the <filename>_%U.dmp syntax at the end of such URLs when the export was split across multiple files (like export_01.dmp, export_02.dmp, etc.) to automatically import them all.
-- //end
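Putting option 2 together, a minimal end-to-end sketch (keeping the placeholders from Todd's example, and assuming MYCREDENTIAL is the credential created earlier) could look like this:

```sql
declare
  h1 number;  -- Data Pump job handle
begin
  h1 := dbms_datapump.open('IMPORT', 'FULL', null, 'INI');

  -- Point the job straight at Object Storage instead of a local dump file
  dbms_datapump.add_file(
    h1,
    'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/<objectstoragenamespace>/<bucketname>/<dumpfilename>',
    'MYCREDENTIAL',
    null,
    dbms_datapump.ku$_file_type_uridump_file);

  -- Same tablespace remap as in the full script above
  dbms_datapump.metadata_remap(h1, 'REMAP_TABLESPACE', 'INI', 'DATA');

  dbms_datapump.start_job(h1);
  dbms_datapump.detach(h1);
end;
/
```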

Now that we have our database (all objects and data), let's import the Oracle APEX app.

First, we have to create the workspace. I tried to find an easy way to run my workspace sql file, but couldn't. For example in SQL Developer on the desktop, I can just run files from a URL, but in SQL Dev Web this is not available yet:


So, I went with the conventional way of importing my APEX Workspace through APEX itself.
Log in to the INTERNAL workspace and click on Manage Workspace - Import Workspace:


Select the workspace file and hit next:


Follow the wizard and select to reuse the schema we already created when we imported the database objects:


Enable the checkbox:


Finally hit the Install Workspace button:


And there we go, done:


The reason I don't create a new workspace, is because I want to take all workspace settings and users from the other system.

Log in to the workspace we just imported. Note that you log in with the ADMIN user, same as INTERNAL:


APEX recognizes it's the first time you log in and it asks to set a password in this workspace:


You can choose one and hit Apply Changes:


And that is it, we are now in our imported Workspace.


Next up: import the APEX application. We could go to the App Builder and follow the import wizard.
If you have only a handful of applications, this is definitely a good way. However, if you are moving hundreds of apps to this environment, you will probably want to script it.

Here's a way you could do it. I also uploaded the APEX app export files to Object Storage.
In order to use this file in APEX, we create a Pre-Authenticated Request: for a specific amount of time, we can access our file without needing a password or other credentials.


Follow the wizard and set the expiration of the pre-authenticated request to, for example, one day from now:


Copy the URL:


Now, go back to your APEX workspace and click SQL Workshop > SQL Commands and use the following script:

declare
    l_source apex_t_export_files;
begin
    l_source := apex_t_export_files (
                  apex_t_export_file (
                    name     => 'f263.sql',
                    contents => apex_web_service.make_rest_request (
                                   p_url         => 'https://objectstorage../f263.sql',
                                   p_http_method => 'GET' )));

    apex_application_install.install(p_source => l_source );
end;
/

And presto, the APEX app is installed:


Go to the App Builder and enjoy seeing the APEX app:


It turned out to be quite a long blog post to move my database and APEX app to this new environment.
There are probably more ways to do it, so feel free to comment on this blog post with an improved flow.

I also expect this to become easier in the future. One of the focus points of Oracle APEX 21.1 and beyond is to make the integration with other Oracle Cloud services easier. If, for example, Object Storage were known to APEX_APPLICATION_INSTALL or similar, we would have fewer steps.

That's it for now, next up, configure printing and exporting.

Update 30-APR-2021
Todd Bottger sent me a few commands which I incorporated in the blog post. Thank you, Todd!