Generative AI Comes to APEX
APEX 24.1 APEX_AI API and Open AI Assistant Dynamic Action
Introduction
With the release of APEX 24.1, Generative AI has come to APEX. Several new AI features are immediately useful to anyone looking to incorporate Generative AI into their APEX Apps. More importantly, the release represents a statement of intent that Oracle plans bigger things for APEX in the AI space. This can only be good news for us developers and our users.
This post focuses on the new APEX_AI PL/SQL API and the Open AI Assistant Dynamic Action. I will describe how I use the APEX_AI API to classify blog posts on my APEX Developer Blogs website and how I use the Open AI Assistant Dynamic Action to ask questions about blog posts.
Use Case
The APEX Developer Blogs site has an APEX Automation that runs every evening to capture new posts from registered blogs. Because people don't always blog about the same topics, not every captured post gets published. The Automation brings the posts into a holding table, where they are assessed for relevance to APEX developers. Before AI, this meant manually scoring each blog post. I score each post on a 0-5 relevance rating (5 being most relevant) for these five categories:
Oracle Application Express (APEX)
Oracle REST Data Services (ORDS)
Oracle Cloud Infrastructure (OCI)
Oracle SQL (SQL)
Oracle PL/SQL (PLSQL)
A blog post is excluded if the total score for these categories is less than three.
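For example, a post scoring 2 for APEX and 1 for SQL (and 0 for everything else) totals 3 and makes the cut, while a post scoring just 1 for SQL and 1 for PL/SQL totals 2 and is excluded.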
The Initial Solution
My initial approach was to write PL/SQL code (about 200 lines) to call the Open AI Chat Completions API for each new post, along with a system prompt telling Open AI how I wanted it to score the post for relevance. This section describes this initial approach.
System Prompt
One of the first things to master when dealing with Generative AI is the art of prompt engineering. The system prompt is passed to the Open AI API to guide Open AI on what you want it to do with the user message. In this scenario, the user message is the Blog Post content.
After some trial and error, this is the system prompt I came up with.
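Broadly, it instructs the AI along these lines (a paraphrase, not the verbatim prompt):

You are an Oracle Technology blog analyzer assistant. Score the blog post provided by the user
from 0 (not relevant) to 5 (most relevant) for each of these categories: APEX, ORDS, OCI, SQL,
PLSQL. Return a JSON object containing only those five keys, for example:
{"APEX": 5, "ORDS": 0, "OCI": 0, "SQL": 1, "PLSQL": 0}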
Code
I built three main APIs: one to build the JSON payload, one to call Open AI, and one to parse the response and score the blog post. This produced good results, and the solution was elegant (IMHO).
Build Open AI API JSON Payload
FUNCTION ai_payload_json
(p_model IN VARCHAR2,
p_system_msg IN VARCHAR2,
p_user_msg IN VARCHAR2) RETURN CLOB IS -- Blog Post
l_payload_obj json_object_t := json_object_t();
l_format_obj json_object_t := json_object_t();
l_message_obj json_object_t := json_object_t();
lt_messages json_array_t := json_array_t ();
BEGIN
-- Build JSON Payload.
l_format_obj.put('type', 'json_object');
l_payload_obj.put('model', p_model); -- e.g. gpt-3.5-turbo, gpt-4o
l_payload_obj.put('response_format', l_format_obj);
l_message_obj.put ('role', 'system');
l_message_obj.put ('content', p_system_msg); -- System Prompt
lt_messages.append (l_message_obj);
-- Start a fresh object for the user message rather than reusing the system message object.
l_message_obj := json_object_t();
l_message_obj.put ('role', 'user');
l_message_obj.put ('content', p_user_msg); -- Blog Post Text Content
lt_messages.append (l_message_obj);
l_payload_obj.put('messages', lt_messages);
-- Convert Payload JSON to CLOB.
RETURN (l_payload_obj.to_clob());
END ai_payload_json;
Sample JSON Payload sent to Open AI:
{
"model": "gpt-4o",
"response_format": { "type": "json_object" },
"messages": [
{
"role": "system",
"content": "You are a blog analyzer assistant..."
},
{
"role": "user",
"content": "Blog Post Content Goes Here"
}
]
}
- Important: Setting response_format.type to json_object ensures we get a JSON response from Open AI. In addition to setting this value, you must use the word JSON somewhere in your system prompt.
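For reference, here is a minimal sketch of calling the payload builder on its own, assuming it lives in the cn_ai_utl_pk package used later in this post and using placeholder prompt and post text:

DECLARE
  l_payload CLOB;
BEGIN
  -- Build the Chat Completions payload for a hypothetical post.
  l_payload := cn_ai_utl_pk.ai_payload_json
    (p_model      => 'gpt-4o',
     p_system_msg => 'You are a blog analyzer assistant... Return a JSON object.',
     p_user_msg   => 'Blog post text goes here');
  -- Print the first part of the generated JSON.
  dbms_output.put_line(dbms_lob.substr(l_payload, 500, 1));
END;
/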
Call Open AI API
PROCEDURE call_open_ai
(p_json_payload IN CLOB,
x_response OUT CLOB) IS
BEGIN
-- Set HTTP Headers
apex_web_service.set_request_headers
(p_name_01 => 'Content-Type',
p_value_01 => 'application/json',
p_reset => true);
-- Call Open AI API
x_response := apex_web_service.make_rest_request
(p_url => 'https://api.openai.com/v1/chat/completions',
p_http_method => 'POST',
p_body => p_json_payload,
p_transfer_timeout => 30,
p_credential_static_id => 'OPEN_AI'); -- APEX Credential with Open AI Token
-- Handle errors.
END call_open_ai;
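The error handling is left as a comment above; one rough approach (a sketch, not my actual handler), placed where the '-- Handle errors.' comment sits, is to check the HTTP status code that apex_web_service exposes after the call:

-- After apex_web_service.make_rest_request returns:
IF apex_web_service.g_status_code != 200 THEN
  raise_application_error
    (-20001,
     'Open AI call failed with HTTP status ' || apex_web_service.g_status_code ||
     ': ' || dbms_lob.substr(x_response, 1000, 1));
END IF;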
Sample Open AI Response:
{
"id": "chatcmpl-9e7zWon0C2rwGAJf1YbFQSqq1B2gN",
"object": "chat.completion",
"created": 1719350866,
"model": "gpt-4o-2024-05-13",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "{\n \"APEX\": 5,\n \"ORDS\": 0,\n \"OCI\": 0,\n \"SQL\": 1,\n \"PLSQL\": 0\n}"
},
"logprobs": null,
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 1631,
"completion_tokens": 40,
"total_tokens": 1671
},
"system_fingerprint": "fp_efc58689b0"
}
- The JSON containing the scores is in the message.content field:
{"APEX": 5, "ORDS": 0, "OCI": 0, "SQL": 1, "PLSQL": 0}
Parse Response and Score Blog Post
PROCEDURE categorize_post
(p_plain_content IN cnba_blog_posts.plain_content%TYPE, -- Blog Post
x_total_score OUT NUMBER) IS -- Total Relevance Score.
CURSOR cr_parse_response (cp_response IN CLOB) IS
SELECT *
FROM JSON_TABLE(cp_response, '$.choices[*]'
COLUMNS
(choice_index NUMBER PATH '$.index',
message_role VARCHAR2(4000) PATH '$.message.role',
message_content VARCHAR2(4000) PATH '$.message.content',
logprobs VARCHAR2(4000) PATH '$.logprobs',
finish_reason VARCHAR2(4000) PATH '$.finish_reason'
)
);
CURSOR cr_scores (cp_scores_json IN VARCHAR2) IS
SELECT apex_score
, ords_score
, oci_score
, sql_score
, plsql_score
, apex_score + ords_score + oci_score + sql_score + plsql_score total_score
FROM JSON_TABLE(cp_scores_json, '$'
COLUMNS
(apex_score NUMBER PATH '$.APEX',
ords_score NUMBER PATH '$.ORDS',
oci_score NUMBER PATH '$.OCI',
sql_score NUMBER PATH '$.SQL',
plsql_score NUMBER PATH '$.PLSQL'
)
);
l_response_json CLOB;
l_payload_clob CLOB;
l_system_message cnai_prompts.prompt_value%TYPE;
lr_parse_response cr_parse_response%ROWTYPE;
lr_scores cr_scores%ROWTYPE;
BEGIN
-- Default Score to 0.
x_total_score := 0;
-- Get System Prompt.
l_system_message := 'You are an Oracle Technology blog analyzer assistant...';
-- Build Open AI Payload (See above).
l_payload_clob := cn_ai_utl_pk.ai_payload_json
(p_model => 'gpt-4o',
p_system_msg => l_system_message,
p_user_msg => p_plain_content);
-- Call Open AI API (See above).
cn_ai_utl_pk.call_open_ai (p_json_payload => l_payload_clob, x_response => l_response_json);
-- Parse the message content out of the Open AI Response JSON.
OPEN cr_parse_response (cp_response => l_response_json);
FETCH cr_parse_response INTO lr_parse_response;
CLOSE cr_parse_response;
-- Parse the message content to get the scores JSON.
OPEN cr_scores (cp_scores_json => lr_parse_response.message_content);
FETCH cr_scores INTO lr_scores;
CLOSE cr_scores;
x_total_score := lr_scores.total_score;
END categorize_post;
Enter APEX_AI
While the above solution worked well, the APEX_AI API reduced my custom code by two-thirds, making it much easier to maintain and to adopt new features in the future. This section focuses on the current solution, which uses the APEX_AI API.
Create an APEX Credential
We need an APEX Web Credential to store the Open AI API key. Navigate to Workspace Utilities > Web Credentials:
Credential Details:
I am using Open AI. I won't go into how to create an API key for Open AI; suffice it to say, it is pretty straightforward. Here is a link to the Open AI Quick Start Tutorial for APIs.
The Credential Secret should be entered as 'Bearer <Open AI Key>', e.g. Bearer rt-apex-WgqYOhbU0SyoFDDv3weioi4FJMkf289WlxMtMnWqwr0v9
- Don't forget to add the space after the word Bearer and before the token.
Configure an APEX Generative AI Service
With the Open AI Credentials in place, we can create the APEX Generative AI Service definition. Navigate to Workspace Utilities > Generative AI:
Select the AI Provider, then enter a Name and Static ID. I have named this service 'Open AI JSON' because this service will only work for generating JSON responses (as we will see shortly).
- Used by App Builder - Important: We are asking Open AI only to return a JSON response, so we cannot use this Generative AI Service in the APEX App Builder.
- Credential - Select the credential you created in the previous step.
- AI Model - This will change over time. You can find the current Open AI Models here.
- Additional Attributes - This setting allows us to pass additional objects into the JSON APEX sends to the AI. In this case, I request that Open AI return a JSON object in the response. Important: You must also include the word 'JSON' in your system prompt to ensure you get a JSON response from Open AI. In my prompt (shown above), I have the text 'Return a JSON object' to ensure a JSON response.
- HTTP Headers - This property allows us to pass HTTP Headers to the AI Service.
Update the Code
Now that we have an AI Service configured, we can use the APEX_AI PL/SQL API to do most of the heavy lifting. The final categorization procedure now looks something like this:
PROCEDURE categorize_post
(p_plain_content IN cnba_blog_posts.plain_content%TYPE,
x_total_score OUT NUMBER) IS
CURSOR cr_scores (cp_scores_json IN VARCHAR2) IS
SELECT apex_score + ords_score + oci_score + sql_score + plsql_score total_score
FROM JSON_TABLE(cp_scores_json, '$'
COLUMNS
(apex_score NUMBER PATH '$.APEX',
ords_score NUMBER PATH '$.ORDS',
oci_score NUMBER PATH '$.OCI',
sql_score NUMBER PATH '$.SQL',
plsql_score NUMBER PATH '$.PLSQL'
)
);
l_response_json CLOB;
l_system_message VARCHAR2(32000);
lr_scores cr_scores%ROWTYPE;
l_messages apex_ai.t_chat_messages := apex_ai.c_chat_messages;
BEGIN
-- Default Score to 0.
x_total_score := 0;
-- Get System Prompt.
l_system_message := 'You are an Oracle Technology blog analyzer assistant...';
-- Call Open AI to Analyze the Post.
-- l_response_json will contain just the JSON scores.
l_response_json := apex_ai.chat
(p_prompt => p_plain_content,
p_system_prompt => l_system_message,
p_service_static_id => 'OPEN_AI_JSON',
p_messages => l_messages);
-- Parse the JSON returned from apex_ai.chat
-- Note APEX takes care of getting the message.content from the Open AI Response
-- Because of this, we can expect a response like this:
-- {"APEX": 5, "ORDS": 0, "OCI": 0, "SQL": 1, "PLSQL": 0}
OPEN cr_scores (cp_scores_json => l_response_json);
FETCH cr_scores INTO lr_scores;
CLOSE cr_scores;
x_total_score := lr_scores.total_score;
END categorize_post;
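A minimal sketch of how the nightly Automation might call this procedure and apply the threshold described earlier (the inline text and the unqualified call are placeholders; in practice the content comes from the holding table):

DECLARE
  l_total_score NUMBER;
BEGIN
  categorize_post
    (p_plain_content => 'Plain-text content of the captured blog post goes here...',
     x_total_score   => l_total_score);
  -- Exclude posts with a total relevance score of less than 3.
  IF l_total_score < 3 THEN
    NULL; -- e.g. leave the post unpublished in the holding table
  END IF;
END;
/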
Open AI Assistant Dynamic Action
The APEX_AI PL/SQL API is not the only way to incorporate AI into your APEX Apps. The Open AI Assistant Dynamic Action is also available. It lets you declaratively launch a chat session with an AI from APEX.
Generative AI Service
This time, we must create an APEX Generative AI Service without the JSON response restrictions.
- I selected Used by App Builder this time so that I can use this Generative AI Service in the APEX App Builder.
Use Case
This example will allow users to ask questions about a blog post. I created a button with an On Click Dynamic action which triggers the 'Open AI Assistant' Action:
- Service - I have selected the 'Open AI General' APEX Generative AI Service defined in the previous section.
- System Prompt - I have provided some basic instructions to the AI along with the content of the Blog Post from the page item P50_BLOG_POST (see the sketch after this list).
- Welcome Message - This message will be displayed to the user when the chat session starts. I have included the blog title from the page item P50_BLOG_POST_TITLE.
- Display As - Allows you to show the chat session Inline in a page region or as a popup Dialog.
- Title - Allows you to specify a title for the Dialog.
- Use Response - Allows you to return the chat response to a page item or pass it to some JavaScript to handle the response.
- Quick Actions - Allows you to specify a number of pre-defined questions to help the user get started.
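For the System Prompt, one way to include the page item content is APEX's substitution syntax; a rough sketch (assuming the attribute supports &ITEM. substitutions, and with illustrative wording rather than my exact prompt):

You are a helpful assistant. Answer the user's questions about the following blog post.
Blog Post Title: &P50_BLOG_POST_TITLE.
Blog Post Content: &P50_BLOG_POST.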
Result
- The user selects a Blog Post and clicks the 'Blog AI' button, which opens a Dialog. The dialog contains the welcome message and a pre-defined question that the user can click to start the conversation.
The user can also type their own questions:
Tracking Tokens
What is a token? I, of course, asked ChatGPT:
'Prompt Tokens' are consumed based on the length of the text you pass to the AI API. 'Completion Tokens' are consumed based on the length of the text the AI API generates in response. In the AI world, tokens cost money. At the time of writing this post, the Open AI 'gpt-4o' model costs $5.00 per 1M input tokens and $15.00 per 1M output tokens. Open AI API Pricing.
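For example, the sample request shown earlier used 1,631 prompt tokens and 40 completion tokens, which at these rates works out to roughly (1,631 / 1,000,000 x $5.00) + (40 / 1,000,000 x $15.00) ≈ $0.0088, i.e. less than a cent to score one post.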
Determining how many tokens are used is essential to understanding your AI costs. APEX has you covered (mostly). Every time the APEX_AI PL/SQL API is called, APEX adds a record to the Web Service Activity log. New in APEX 24.1, this log includes the total number of tokens used in each request (prompt tokens + completion tokens).
You can access the Web Service Activity log from APEX Builder:
or from the APEX view:
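For example, a query along these lines will show the logged calls (a sketch; describe the view in your environment to confirm the exact column names, including where the token counts appear):

SELECT *
  FROM apex_webservice_log;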
Other Use Cases
Another use case for the Open AI Assistant Dynamic Action could be providing help to users of an APEX Application via a chat interface.
The APEX_AI PL/SQL API could also be used to analyze data from an Interactive Grid (IG). Using Anton Nielsen's get_ig_data plugin, you could get the data from an IG in JSON format and send it to an AI with the APEX_AI PL/SQL API for analysis, as sketched below.
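A rough sketch of that idea, assuming the IG data has already been retrieved as a JSON document (how to obtain it from the get_ig_data plugin is not shown here) and that a general-purpose Generative AI Service with static ID 'OPEN_AI_GENERAL' exists:

DECLARE
  -- Placeholder IG data; in practice this would come from the get_ig_data plugin.
  l_ig_json  CLOB := '{"rows":[{"region":"EMEA","sales":100},{"region":"APAC","sales":250}]}';
  l_messages apex_ai.t_chat_messages := apex_ai.c_chat_messages;
  l_answer   CLOB;
BEGIN
  l_answer := apex_ai.chat
    (p_prompt            => 'Summarize the key trends in this data: ' || l_ig_json,
     p_system_prompt     => 'You are a data analyst assistant. Answer concisely.',
     p_service_static_id => 'OPEN_AI_GENERAL',
     p_messages          => l_messages);
  dbms_output.put_line(dbms_lob.substr(l_answer, 4000, 1));
END;
/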
Conclusion
Even though this is a fairly simple use case, I hope this post helps you visualize other use cases where the combination of APEX and AI can bring value to your users.