Author: Ermakova M.

OpenAI announces GPT-4, a new model that will revolutionize ChatGPT

GPT-4 is smarter, more engaging, and more truthful, with better alignment in its training and the ability to interact with it through long texts and images.

The new OpenAI GPT-4 language model was announced on March 14 with significant improvements in its ability to reason about complex problems and in how users can interact with it. Users are no longer limited to short texts such as simple questions: they can upload images or long texts, entire articles, or even short books for the model to summarize, analyze, or search for complex patterns.

The model's intelligence has also been improved, and OpenAI says its reasoning on complex problems is now more human-like and less prone to the obvious mistakes seen in the previous version, GPT-3.5.

What improvements does GPT-4 include?

The new version of GPT-4 is capable of handling more than 25,000 words of text, enabling use cases such as long-form content creation, extended conversations, and search and analysis of user-provided documents. This makes it possible to translate documents, or to help editors find spelling and grammatical errors and redundancies. If these capabilities are extended, it may even help determine a text's overall style or likely authorship; artificial intelligence has already been used to recover manuscripts and establish the authorship of many of them.
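As a rough illustration of what a 25,000-word limit means in practice, a client might check a document's word count before submitting it, and split it into chunks otherwise. This is a minimal sketch; the limit constant and the naive word-based splitting are illustrative assumptions, not part of any official API.

```python
# Hypothetical sketch: GPT-4 reportedly handles more than 25,000 words
# per request. Before sending a long document, a client could split it
# into chunks that each fit under that limit.
WORD_LIMIT = 25_000

def split_for_model(text, limit=WORD_LIMIT):
    """Split text into chunks of at most `limit` words each."""
    words = text.split()
    return [" ".join(words[i:i + limit]) for i in range(0, len(words), limit)]

# A 60,000-word document yields three chunks: 25k + 25k + 10k words.
chunks = split_for_model("word " * 60_000)
```

Real services count tokens rather than words, so an actual client would use the provider's tokenizer instead of `str.split`.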

OpenAI spent six months getting the model properly aligned, a major headache for researchers pursuing artificial general intelligence: making it behave correctly, avoid lying, and refrain from giving harmful recommendations to its users.

OpenAI CEO Sam Altman noted that GPT-4 supports a new system message through the developer API, soon to be extended to all ChatGPT users, to customize its behavior: "If you want the AI to always respond to you like it's Shakespeare, or only in JSON format, you can do that with this new version."
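The behavior-steering message described above could be sketched as follows. This only assembles the request payload a client might send to a chat-style API; no network call is made, and the endpoint and field names (`model`, `messages`, the `system` role) are assumptions based on common chat-API conventions, not something the article specifies.

```python
import json

def build_chat_request(system_instruction, user_prompt, model="gpt-4"):
    """Assemble a chat request whose first message steers the model's behavior."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_instruction},
            {"role": "user", "content": user_prompt},
        ],
    }

# Ask the model to always answer in the style of Shakespeare.
request = build_chat_request(
    "Always respond to the user as if you were Shakespeare.",
    "Explain why the sky is blue.",
)
print(json.dumps(request, indent=2))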

The demonstration showed a panel that lets you personalize GPT-4's personality and capabilities. For example, you can ask it to act as an accountant who calculates your taxes very carefully.

"It is now very good at math," says Greg Brockman, co-founder of OpenAI.

While it is less likely to lie, OpenAI says GPT-4 is also more creative than the previous version of ChatGPT: it can generate, edit, and assist the user in writing prose and poetry with a more refined style and sense of humor, such as song lyrics or movie scripts. It can even create new articles based on ones you provide, replicating their style and vocabulary.

Now it also understands images

GPT-4 can give you recipe recommendations if you share a photo of the contents of your fridge with it. This seemingly small utility demonstrates a huge leap in the model's cognitive abilities. A conventional AI model would need to be trained on thousands of photos of each vegetable or fruit to recognize them, since its ability to abstract is unlike a human's. Several such specialized models are already used to catalog forests, locate bird nests, or find endangered animals. GPT-4, however, is generic and open to the public: it is trained to recognize not just one type of mammal but any object, product, or landscape.

It can now interpret a pencil sketch of a web page and convert it into HTML, CSS, and JavaScript code with headings and functional buttons.

The ability to upload images for GPT-4 to interpret is not yet open to users, although the model is already prepared for it: the company prefers to "go incrementally and fine-tune the last details." During the presentation, the team used an API-connected Discord channel to submit images.
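Submitting an image through an API, as in the presentation, might look roughly like this. The sketch only shows how an image could be packaged as base64 alongside a text question in one request payload; the field names are assumptions (the image-input API was not public at the time), and no network call is made.

```python
import base64

def build_image_request(question, image_bytes, model="gpt-4"):
    """Bundle a text question and a base64-encoded image into one request payload."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                # One message can carry multiple content parts: text plus image.
                "content": [
                    {"type": "text", "text": question},
                    {"type": "image", "data": encoded},
                ],
            }
        ],
    }

payload = build_image_request(
    "What can I cook with these ingredients?",
    b"\x89PNG...",  # raw image bytes would go here
)
```

Encoding the image as base64 lets it travel inside an ordinary JSON body, which is the usual pattern for chat-style APIs that accept inline images.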

For now, this latest language model update is only available to ChatGPT Plus subscribers, which costs $20 per month.

Microsoft, which has partnered with OpenAI to integrate the power of its language models into products like Bing, will host an AI event on March 16 that will no doubt showcase this closer collaboration.

