Microsoft’s AI assistant is about to be boosted by the advanced features of OpenAI’s upgraded model.
Perhaps not everyone knows that the technology behind Copilot, Microsoft’s AI assistant introduced in the latest versions of Windows 11, is based on the model OpenAI developed for ChatGPT. This means that with OpenAI’s upgrade to GPT-4 Turbo, Copilot is also about to enjoy a significant technological leap, bringing many new features. Let’s take a look at some juicy new features that are about to land on Windows PCs.
COPILOT GETS A TURBO BOOST
A few days ago, Mikhail Parakhin, CEO of Advertising and Web Services at Microsoft, announced on social media that the free tier of Copilot was upgrading its underlying model from GPT-4 to GPT-4 Turbo. This is not an experimental or limited-time feature but a permanent upgrade, and one that requires no outlay from the user, since it is integrated directly into the free tier. It is also available in both Creative mode and Precise mode.
After some work, GPT4-Turbo has replaced GPT-4 in the free Copilot tier. Pro users can still choose the older model if they prefer it (there is a toggle).
– Mikhail Parakhin (@MParakhin) March 12, 2024
The GPT-4 Turbo model brings significant improvements over the base version. In addition to processing much longer text per prompt, it adds new text-to-speech and text-to-image features thanks to the integration of DALL-E 3, an image-generation model also developed by OpenAI. Technological improvements also make the new version more cost-effective, meaning it costs less to run than GPT-4. In addition, its dataset is more up to date: the knowledge base of the previous version only goes up to September 2021, so it cannot help with tasks involving events of the past two years. Now things are about to change, and in particular three new features will make all regular Copilot users happy.
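To give a concrete idea of what the DALL-E 3 integration looks like under the hood, here is a minimal sketch that calls the same image model through OpenAI’s public Python SDK. This is only an illustration of the text-to-image capability, not Copilot’s internal plumbing, and the prompt is invented for the example:

# pip install openai
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Ask DALL-E 3 (the image model that handles text-to-image requests)
# to generate a single picture from a plain-language description.
result = client.images.generate(
    model="dall-e-3",
    prompt="a watercolor sketch of a lighthouse at dawn",  # invented example prompt
    size="1024x1024",
    n=1,
)

# The API returns a temporary URL pointing to the generated image.
print(result.data[0].url)

In Copilot, none of this is exposed to the user: the assistant simply hands image requests off to DALL-E 3 behind the scenes.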
COPILOT: 3 NEW FEATURES COMING SOON
The first novelty is, precisely, the updated dataset: GPT-4 Turbo’s knowledge base now extends to April 2023, almost two years more data than the previous version. This makes Copilot significantly more “on the ball” than before when it comes to current events. Second, the new OpenAI model greatly expands its context window, that is, the amount of text, measured in tokens, that the LLM can take into account at once while generating a response. Put simply, the amount of text accompanying the main question that the new model can process increases tremendously over the previous version.
A larger context window lets the model accept much larger inputs and, as a result, build a more coherent semantic picture of them: responses can be more detailed and analytical, the model keeps better track of how different sentences depend on one another, and it stays anchored to the main content of the query while still enriching the answer with premises, supporting arguments, explanations, and examples. To put it in simple terms: the new model can process up to roughly 300 pages of text in a single prompt!
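For anyone curious where the “300 pages” figure comes from, here is a rough sketch using the open-source tiktoken library, which provides the tokenizer family used by GPT-4-era models. The words-per-token and words-per-page numbers are ballpark assumptions, not official figures:

# pip install tiktoken
import tiktoken

# cl100k_base is the tokenizer used by GPT-4-class models.
enc = tiktoken.get_encoding("cl100k_base")

text = "Copilot can now reason over much longer documents in a single prompt."
print(len(enc.encode(text)), "tokens")  # how much of the window this sentence uses

# Back-of-the-envelope estimate for GPT-4 Turbo's 128,000-token window
# (assumption: ~0.75 words per token and ~320 words per printed page).
context_window_tokens = 128_000
approx_words = context_window_tokens * 0.75
approx_pages = approx_words / 320
print(f"roughly {approx_pages:.0f} pages of text fit in one prompt")

Counting tokens rather than words is how the model actually measures its limit, which is why estimates like “300 pages” are always approximate.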
Finally, the new model is “less lazy” than the previous one. This is no joke; OpenAI itself has acknowledged that its algorithm occasionally gave the impression of exhibiting a certain “laziness,” that is, providing incomplete or very partial responses to some user prompts. No official explanation has been given as to the cause of this problem, although some of the user base suspects that it is a matter of available bandwidth to handle the requests of all users. In any case, the company had publicly acknowledged the problem last December, explaining that it had identified ways to correct it and that the next iteration of the model would fix it:
“This new model can complete tasks such as code generation more thoroughly than the previous generation and is designed to reduce instances of ‘laziness’ that occur when the model does not complete the entire request.” – New embedding models and API updates, January 25, 2024
In short, goodbye to laziness: the new Copilot promises to be more diligent than ever and even better at making itself useful for any user request, increasingly becoming our everyday AI assistant.