AILink for Wolfram and plugins for ChatGPT

hiAlloy = AISpeech["Привет Хабр!", "Voice" -> "alloy"]
hiNova = AISpeech["Привет Хабр!", "Voice" -> "nova"]

AudioPlot[{hiAlloy, hiNova}]
Yes, OpenAI itself pronounced the word Habr in such a way that it was ultimately impossible to restore it. All that remains is to be glad that it's not "habrober"

Generating Images

The last additional function in the review will be image generation. It's very simple:

AIImageGenerate["wolf and ram in the village"]
For clarity, the full description and address are “collapsed” in the answer

Chat with LLM

Now let's move on to the main part of the article! Chat bot! To create an empty chat object we use the function:

chat = AIChatObject[]
Representing a mutable chat object in Mathematica

Let's send the first message to OpenAI. After the request executes, the message is saved to the chat, and the chat object is returned as the function's result:

AIChatComplete[chat, "Привет!"]; 

chat["Messages"]
There are two messages in the chat so far

Well, you can keep talking to the LLM simply by calling the AIChatComplete function again. But that's too easy!
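For example, to continue the dialogue, just call the same function again with the same chat object; each call appends the new messages to its history (the follow-up prompt below is my own illustration, not from the original session):

AIChatComplete[chat, "How are you?"]; 

chat["Messages"] // Length

After the second exchange, the message list grows accordingly.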

Plugin functions

The AIChatComplete function has several important options:

  • “Model” – specifies the model to use. Defaults to “gpt-4o”

  • “Temperature” – sets the sampling temperature; not all models support it

  • “Tools” – enables plugin functions; again, not all models support them

These options can be passed either to the completion function or as parameters to the chat constructor:

chat = AIChatObject["Tools" -> {tool1, tool2}]; 
AIChatComplete[chat, "Model" -> "gpt-4o"]; 

The choice of model is fairly straightforward:

  • “gpt-3.5-turbo” – fast and not the smartest

  • “gpt-4o” – smarter and slower

  • “o1-preview” – can reason and does not support tools
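Putting the first two options together, a call might look like this (the prompt and temperature value here are my own example):

AIChatComplete[chat, "Summarize our conversation in one sentence", 
  "Model" -> "gpt-3.5-turbo", "Temperature" -> 0.0]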

But what should be passed as tool1, tool2, …? These must be functions! Functions created with a few restrictions, but very simple ones:

  1. They return a string as the response

  2. They have a description in their usage message

  3. The argument types are explicitly specified and can be String, Real, or Integer

Let's create a very simple function like this:

time::usage = "time[] returns current time in string format."; 

time[] := DateString[]
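To check the tool without a chat object, it can be passed in a one-off completion, following the same pattern used later in the article for the search tool (the prompt is my own):

AIChatComplete["What time is it now?", 
   "Tools" -> {time}
]["Messages"][[-1, "content"]]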
Now LLM always knows the exact time

Let's create another function with parameters. For example, getting the current temperature in a specified locality. In order to find out the temperature I will use WeatherAPI.

While writing the article, I registered there and looked at how the request was executed in the documentation. In the Wolfram Language it would be like this:

SystemCredential["WEATHERAPI_KEY"] = "<api key>";

wheather::usage = "wheather[lat, lon] returns info about the current weather for specific geo coordinates. Result is a JSON object with coordinates, text description, temperature, wind speed, etc.";

wheather[lat_Real, lon_Real] := 
 Module[{
   endpoint = "http://api.weatherapi.com/v1/current.json",
   apiKey = SystemCredential["WEATHERAPI_KEY"],  
   request, response
   }, 
  request = URLBuild[endpoint, {
     "q" -> StringTemplate["``,``"][lat, lon], 
     "key" -> apiKey
     }]; 
  
  response = URLRead[request]; 
  response["Body"]
]
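Before wiring the function into the chat, it can be tested directly; the coordinates below are approximately those of Saratov (my own example values):

wheather[51.53, 46.03] (* returns the raw JSON response body from WeatherAPI *)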

Plus I need one more function to determine the coordinates of a settlement by name. I will make it using Wolfram Alpha:

geoPosition::usage = "geoPosition[city] returns the geo position of the given city. The city parameter is the name of the city in English only."; 

geoPosition[city_String] := 
 StringTemplate["lat = ``, lon = ``"] @@ 
  First @ WolframAlpha[city <> " GeoPosition", "WolframResult"]
Two new plugin functions

Now I'll just add these functions to the list of LLM functions and see what happens:
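The original showed this call as a screenshot; it presumably looked something like the following, by analogy with the search example later in the article (the exact prompt is my reconstruction):

AIChatComplete[
   "What is the weather in Saratov now?", 
   "Tools" -> {time, geoPosition, wheather}, 
   "Model" -> "gpt-4o"
]["Messages"][[-1, "content"]]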

Now LLM can find out the weather in almost any city!

To make the information fit in the notebook window, I deleted a lot, but in short, the following happened:

  1. A user asked GPT for the current weather in Saratov

  2. GPT returned a message asking to call geoPosition[“Saratov”]

  3. The function sent coordinates to the GPT

  4. GPT again returned a function call with parameters – this time wheather[lat, lon]

  5. The function sent weather information to the GPT

  6. The GPT returned formatted, readable text to the user.

  7. Yes, while I’m writing this article in Saratov, it’s really about 6-7 degrees and drizzling – here it is in the photo below

    In fact, this is Engels, but the clouds on the horizon are right in Saratov

The chatbot now knows three functions with which it can access the outside world. But they are quite narrowly focused: time, coordinates, and weather. What if we gave the bot a more general tool for communicating with the world – say, searching the internet? It's actually quite easy to do! The implementation may not be the best or most optimal on the first try, but I spent literally a few minutes on it. I remembered that DuckDuckGo freely allows the use of its search, then looked at what options were available and found that there is a lite search. This is what the web search tool looks like:

duckDuckGoSearch::usage = "duckDuckGoSearch[query] call DuckDuckGo search engine. .."; 

duckDuckGoSearch[query_String] := 
  Module[{url = "https://lite.duckduckgo.com/lite/", request, 
    response, responseBody}, 
   request = HTTPRequest[url, 
     <|
      Method -> "POST", 
      "Body" -> {
        "q" -> query
        }, 
      "ContentType" -> "application/x-www-form-urlencoded"
      |>
     ]; 
   response = URLRead[request]; 	
   responseBody = response["Body"];
   ImportString[ExportString[responseBody, "String"], "HTML"] <> 
    "\nhttps://duckduckgo.com?" <> URLQueryEncode[{"q" -> query}]
]; 
Web search code. The text is returned, although not the most relevant one

Well, let's try it and see what happens:

AIChatComplete[
   "Try to search in the web what the last Donald Trump speech", 
   "Tools" -> {time, geoPosition, wheather, duckDuckGoSearch}, 
   "Model" -> "gpt-4o"
]["Messages"][[-1, "content"]]
These are approximately the search result headlines that LLM read

Hmm… there are links there. How about teaching the LLM to read pages via links? It's very simple and lies right on the surface: just add urlRead!

urlRead::usage = "urlRead[url] reads text from the HTML page at the given url. Use this function when you need to get data from a specific address or site on the internet. The result is plain text."; 

urlRead[url_String] := ImportString[URLRead[url]["Body"], "HTML"]; 
We repeat the question about Donald Trump
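The repeated call presumably just adds urlRead to the tool list from the previous example (a reconstruction, since the original showed it as a screenshot):

AIChatComplete[
   "Try to search in the web what the last Donald Trump speech", 
   "Tools" -> {time, geoPosition, wheather, duckDuckGoSearch, urlRead}, 
   "Model" -> "gpt-4o"
]["Messages"][[-1, "content"]]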

Now let’s ask LLM to read in more detail what is written in the article at the first link!

Yes, this is a very real article: https://time.com/7026339/donald-trump-speech-app-comment-public-reaction-kamala-harris-campaign/ rather than a generated URL that just looks like the real thing.

In principle, you can keep adding plugins for a very long time and implement the most daring and interesting ideas. Beyond the examples shown, I also built the following for myself:

  • Search local files in a directory

  • Compiling a piece-by-piece description of a large file to serve as a “table of contents”

  • Make your own CoPilot

  • Give access to the kernel and the ability to create and edit notebook cells

  • Execute code

  • Use other useful services – exchange rates, online stores, news, etc.

  • Add to group chat and give access to all messages and search for them

  • etc.
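As an illustration of the “execute code” idea, a minimal tool takes only a couple of lines; this sketch is my own and obviously needs sandboxing and restrictions before real use:

codeEval::usage = "codeEval[code] evaluates the given Wolfram Language code and returns the result as a string."; 

codeEval[code_String] := ToString[ToExpression[code], InputForm]

Passed in "Tools", this lets the model run computations directly in the kernel, which is powerful but unsafe without limits on what it may evaluate.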

Conclusion

I very often see things like what was hacked together in this article being offered as products, but not often as actual working businesses. All sorts of influencers, cryptocurrency experts, and tarot readers talk about a new incredible neural network with the coolest capabilities – the main thing is to buy a subscription to their Telegram. Maybe I'm looking in the wrong place, and I need to go not to the store but to the factory. In any case, I wanted to share my experience with programmatically using the OpenAI API and creating plugins for it. Thank you all for your attention!
