If you are reading this blog, it is most likely because you have seen my LinkedIn video in which I'm talking to Siri on my Apple Watch and getting a response from the Autonomous Database and Select AI. In case you haven't seen it, here is the video before I detail how I did it:

I can imagine your first question: how did you manage to access the Internet from your Apple Watch? The answer is the native Apple application "Shortcuts". A Shortcut is a quick way to get one or more tasks done with your apps.

One of the possibilities is to execute REST calls 😁!! You can imagine the trick now!

The first thing I did was to configure Select AI with a profile over some tables. Once it is configured, I can use natural language to query the data. For example, the question I would like to ask my Autonomous Database is "which are the top 3 products by sales". In this case I will select the action "narrate". With this option, the output of the query is explained in natural language by the chosen LLM (in this case OpenAI).
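For reference, the profile setup looks roughly like this. This is a sketch, not my exact configuration: the credential name `OPENAI_CRED` and the `SH.SALES`/`SH.PRODUCTS` tables are placeholders, and you would swap in your own credential and tables.

```sql
-- Sketch: create a Select AI profile that points at OpenAI
-- (OPENAI_CRED, SH.SALES and SH.PRODUCTS are placeholder names)
BEGIN
  DBMS_CLOUD_AI.CREATE_PROFILE(
    profile_name => 'OPENAI',
    attributes   => '{"provider":        "openai",
                      "credential_name": "OPENAI_CRED",
                      "object_list":     [{"owner": "SH", "name": "SALES"},
                                          {"owner": "SH", "name": "PRODUCTS"}]}'
  );
  -- Make the profile active for this session
  DBMS_CLOUD_AI.SET_PROFILE('OPENAI');
END;
/

-- Ask in natural language; the "narrate" action returns a prose answer
SELECT AI narrate which are the top 3 products by sales;
```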

One of the great advantages of Select AI is that it runs inside the Autonomous Database, so I can integrate it easily with the full Oracle ecosystem. For example, I can expose this Select AI query as a REST service:
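One way to wire this up is with ORDS. The following is a minimal sketch, assuming the profile from earlier is named `OPENAI`; the module and path names (`selectai`, `weeklyreport`) are placeholders I made up for illustration.

```sql
-- Sketch: expose the narrated answer as a GET endpoint via ORDS
-- (module and path names are placeholders)
BEGIN
  ORDS.ENABLE_SCHEMA;                 -- REST-enable the schema (once)

  ORDS.DEFINE_MODULE(
    p_module_name => 'selectai',
    p_base_path   => '/selectai/');

  ORDS.DEFINE_TEMPLATE(
    p_module_name => 'selectai',
    p_pattern     => 'weeklyreport');

  ORDS.DEFINE_HANDLER(
    p_module_name => 'selectai',
    p_pattern     => 'weeklyreport',
    p_method      => 'GET',
    p_source_type => ORDS.source_type_collection_feed,
    p_source      => q'[SELECT DBMS_CLOUD_AI.GENERATE(
                          prompt       => 'which are the top 3 products by sales',
                          profile_name => 'OPENAI',
                          action       => 'narrate') AS answer
                        FROM dual]');
  COMMIT;
END;
/
```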

If I query the REST endpoint generated by the Autonomous Database, I can see the response narrated by Select AI. The response is inside a JSON structure, so we need to parse it to get a natural response from Siri.
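The Shortcut does this parsing with its "Get Dictionary Value" action, but the idea is easy to show in a few lines of Python. The JSON shape below is an assumption on my part (an ORDS-style collection feed with the narrated text in an `answer` field); the actual structure isn't shown in the post, so check your own endpoint's output.

```python
import json

# Hypothetical response shape (assumed, not taken from the post):
# an ORDS-style collection feed whose first item holds the narration.
raw = '{"items": [{"answer": "The top 3 products by sales are A, B and C."}]}'

def extract_answer(body: str) -> str:
    """Pull the narrated text out of the JSON envelope."""
    payload = json.loads(body)
    return payload["items"][0]["answer"]

print(extract_answer(raw))
```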

Below you can see what my Shortcut looks like. Something very important is the title: "My weekly report". Once it is saved, I can use Siri to call the Shortcut using the word "run". So when I say "run my weekly report", I'm telling Siri to run this specific Shortcut.

The main task of this Shortcut is to call the REST service and filter the JSON response to get the answer I want. The last step is to tell Siri to speak the result, as you can see below.

In this very simple example I have shown you how to do this configuration, but you can go further! For example, you can collect the question via Siri and pass it as a parameter to the REST service. You can have a real conversation with your Autonomous Database!
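If you want to try that variation, an ORDS handler can take the spoken question as a bind variable. Again a sketch under the same assumptions as before (module `selectai`, profile `OPENAI` are placeholder names):

```sql
-- Sketch: accept the dictated question as a bind variable (:question)
-- so the Shortcut can pass whatever you said to Siri
BEGIN
  ORDS.DEFINE_TEMPLATE(
    p_module_name => 'selectai',
    p_pattern     => 'ask/:question');

  ORDS.DEFINE_HANDLER(
    p_module_name => 'selectai',
    p_pattern     => 'ask/:question',
    p_method      => 'GET',
    p_source_type => ORDS.source_type_collection_feed,
    p_source      => q'[SELECT DBMS_CLOUD_AI.GENERATE(
                          prompt       => :question,
                          profile_name => 'OPENAI',
                          action       => 'narrate') AS answer
                        FROM dual]');
  COMMIT;
END;
/
```

The Shortcut would then URL-encode the dictated text and append it to the endpoint path before making the request.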

I hope you find it interesting!

2 comments

Kris November 13, 2024 - 2:00 pm

How do you build this one from end to end?

Javier November 13, 2024 - 2:58 pm

Hi Kris,

There are only two things that are not explained in the blog. The first thing you need is an Autonomous Database with some data inside. Then you need to configure Select AI with your favorite LLM. It's just a few SQL commands; here is a blog post: https://blogs.oracle.com/machinelearning/post/introducing-natural-language-to-sql-generation-on-autonomous-database

You should be able to use Select AI from SQL Developer Web. Then you need to expose it as a REST endpoint. We have a page with free workshops, including REST for Autonomous Database, here: https://apexapps.oracle.com/pls/apex/r/dbpm/livelabs/view-workshop?wid=815&clear=RR,180&session=11582783738142
Follow Step 5 of the LiveLab and just add the code you see in the blog.

