In early summer, OpenAI introduced GPT-3, a language model built to write coherent text from a given prompt. It was trained on 570 gigabytes of web content, Wikipedia, and fiction, nearly 15 times the size of GPT-2's dataset. The model is excellent at writing poetry and prose, can translate between several languages, solve anagrams, and answer questions about text it has read. The output of language models is becoming increasingly difficult to distinguish from human writing, and GPT-3 is no exception. For example: How I, an AI specialist, bought AI text.
But what sets GPT-3 apart is not just its fluent handling of text; other models can do that too. Its possibilities seem almost endless, and the examples are striking. Judge for yourself: given a simple imperative request as input, GPT-3 can write code, build page layouts, compose queries, keep records, look up information, and much more.
It is enough to feed the model a couple of training examples, and it can produce small structures such as page layouts or descriptions of ML models:
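To make the mechanism concrete, here is a minimal sketch of how such few-shot priming works: the prompt simply concatenates two or three example pairs, and the model is asked to continue the pattern. The example descriptions, the markup, and the prompt format below are my own illustration, not taken from the demo itself.

```python
# Sketch of few-shot prompting: prime the model with a couple of
# description -> markup pairs, then let it continue the pattern.
# All example content here is invented for illustration.

EXAMPLES = [
    ("a red button that says Stop",
     '<button style="background: red">Stop</button>'),
    ("a large heading that says Welcome",
     '<h1 style="font-size: 3em">Welcome</h1>'),
]

def build_prompt(request: str) -> str:
    """Concatenate the example pairs, then the new request,
    leaving the markup for the model to fill in."""
    parts = []
    for description, markup in EXAMPLES:
        parts.append(f"Description: {description}\nHTML: {markup}")
    parts.append(f"Description: {request}\nHTML:")
    return "\n\n".join(parts)

prompt = build_prompt("a green link that says Home")
print(prompt)
# In the real demos, a prompt like this is sent to the GPT-3
# completion API, and the model's continuation is the markup.
```

The whole trick is that nothing is fine-tuned: the two examples in the prompt are the entire "training".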
This demo was written by Sharif Shameem as a prototype of a front-end builder. You can access it by filling out this form…
A similar version from the same author. It seems the era of WYSIWYG editors will end soon.
The rise of self-replicating machines is just around the corner!
The model is unlikely to have mastered the full power of SQL, but it parses simple queries with ease, which is already something.
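Since the model occasionally emits queries that do not actually parse, a practical wrapper would sanity-check its output before use. One way to do that, sketched below with Python's built-in sqlite3 and a toy schema of my own invention (this guard is not part of the demo):

```python
import sqlite3

def is_valid_sql(query: str) -> bool:
    """Check that a model-generated query at least parses against a
    toy schema, using sqlite's EXPLAIN (which prepares the statement
    without executing it)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT, age INTEGER)")
    try:
        conn.execute("EXPLAIN " + query)
        return True
    except sqlite3.Error:
        return False
    finally:
        conn.close()

print(is_valid_sql("SELECT name FROM users WHERE age > 30"))  # True
print(is_valid_sql("SELEKT name FROM users"))                 # False
```

A check like this catches syntax errors but not semantic ones, which matches the article's point: simple queries come out fine, the full power of SQL does not.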
This demo does not yet work entirely correctly, but a fix is expected soon. You can sign up for the test here…
A plug-in for Figma that has already made the rounds online: it assembles an application design from a text request.
“Overall, I am very impressed that GPT-3 models are capable of producing syntactically correct JSON based on just two or three examples. Despite the complexity of JSON itself, GPT-3 works correctly over 90% of the time.”
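"Correct over 90% of the time" still means roughly one failure in ten, so in practice you would validate the output and retry. A minimal sketch of that loop, with a toy stand-in generator instead of a real GPT-3 call (the `generate_fn` hook and the sample outputs are assumptions of mine):

```python
import json

def get_valid_json(generate_fn, prompt: str, max_attempts: int = 3):
    """Call a text generator until its output parses as JSON.
    generate_fn stands in for a GPT-3 completion call."""
    for _ in range(max_attempts):
        raw = generate_fn(prompt)
        try:
            return json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed output: ask again
    raise ValueError(f"no syntactically valid JSON after {max_attempts} attempts")

# Toy generator that fails once before producing valid JSON,
# mimicking the ~90% success rate quoted above.
outputs = iter(['{"name": "Alice", "age":', '{"name": "Alice", "age": 30}'])
result = get_valid_json(lambda p: next(outputs), "describe Alice as JSON")
print(result)  # {'name': 'Alice', 'age': 30}
```

With a 90% per-attempt success rate, three attempts push the failure probability down to about 0.1%, which is usually enough for a demo.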
Here you can transform your text to fit the styles of different writers, or even turn it into a clickbait headline.
The model adapts to the author’s style and responds to the points you specify. For those who have no time for formalities.
=GPT3() is a multitool for spreadsheets. At the time the demo was recorded, it could look up the population of US states, people's Twitter handles and places of work, and perform various calculations.
People are already joking about the next-level function, =DOMYWORK()… Robots work in …
You describe your transactions (including fairly complex ones), and the model turns them into a Python script that fills in or updates a Google spreadsheet. I'm not an accountant, but in my opinion this is very cool, especially for a pet project written over a weekend.
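To give a feel for what such a generated script might look like, here is a purely illustrative sketch: the phrasing patterns, field names, and parsing logic are my assumptions, not the demo's actual output.

```python
import re

# Illustrative only: the kind of tiny script the demo might generate
# from a plain-language transaction description.
def parse_transaction(text: str) -> dict:
    """Turn 'spent 20 on coffee' or 'received 100 from client'
    into a spreadsheet row (amount signed by direction)."""
    m = re.match(r"(spent|received)\s+(\d+(?:\.\d+)?)\s+(?:on|from)\s+(.+)", text)
    if not m:
        raise ValueError(f"could not parse: {text!r}")
    kind, amount, counterparty = m.groups()
    sign = -1 if kind == "spent" else 1
    return {"amount": sign * float(amount), "description": counterparty}

rows = [parse_transaction(t) for t in
        ["spent 20 on coffee", "received 100 from client"]]
print(rows)
# In the demo, rows like these would then be appended to a
# Google Sheet via the Sheets API.
```

The interesting part, of course, is that in the demo GPT-3 writes this kind of code itself from your description, rather than a human maintaining the regexes.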
Sometimes the results are genuinely good! For the first time since Lebedev's logos, AI-generated memes can compete with hand-made ones.
The functionality of GPT-3 is not limited to machine-generated text. Developers from a wide variety of fields are embedding the model in their tools, and a new wave of startups built on it can already be expected. Unfortunately, GPT-3 is currently available only by invitation from OpenAI as part of a beta test, but it will later be opened up for commercial use. Some demos are available by signing up, so you can play with the model now.
Horror stories about artificial intelligence replacing programmers keep multiplying, and it seems that sooner or later it will happen. We shall wait and see.