Microsoft is detailing a number of new features coming to its Copilot service soon, including OpenAI’s latest models. Copilot will get support for GPT-4 Turbo, alongside an updated DALL-E 3 model, a new code interpreter feature, and Deep Search functionality inside Bing.

Copilot will soon be able to respond using OpenAI’s latest GPT-4 Turbo model, which essentially means it will “see” more data thanks to a 128K-token context window. This larger context window should allow Copilot to better understand queries and offer better responses. “This model is currently being tested with select users and will be widely integrated into Copilot in the coming weeks,” explains Yusuf Mehdi, EVP and consumer chief marketing officer at Microsoft.

While you’re waiting on the GPT-4 Turbo model to appear in Copilot, Microsoft is already using an improved DALL-E 3 model in Bing Image Creator and Copilot. “You can now use Copilot to create images that are even higher quality and more accurate to the prompt with an updated DALL-E 3 model,” says Mehdi.

Microsoft Edge, which includes Copilot in a sidebar, is also getting an inline compose feature that can rewrite text you’ve typed into text fields on most websites. You can also now use Copilot in Microsoft Edge to summarize videos you’re watching on YouTube.

Coders and developers might be interested in a new code interpreter feature coming to Copilot soon. It will allow Copilot users to get more accurate calculations, data analysis, and even code from the AI chatbot.
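For readers curious what a 128K-token context window means in practice, here is a minimal sketch using OpenAI’s public Python SDK with the GPT-4 Turbo preview model (gpt-4-1106-preview). Copilot’s own integration is not publicly exposed, so the model name, file name, and summarization prompt below are illustrative assumptions, not Microsoft’s implementation.

```python
# Minimal sketch: sending a long document to GPT-4 Turbo via OpenAI's public API.
# Copilot's internal integration is not public; the model name and prompt here
# are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("long_report.txt", encoding="utf-8") as f:
    long_document = f.read()  # GPT-4 Turbo accepts up to ~128,000 tokens of input context

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # GPT-4 Turbo preview with the 128K context window
    messages=[
        {"role": "system", "content": "Summarize the following report in five bullet points."},
        {"role": "user", "content": long_document},
    ],
)

print(response.choices[0].message.content)
```

The practical upshot of the larger window is that an entire long document can be passed in a single request, rather than being chunked and summarized piecewise.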
Full report: Microsoft’s Copilot is getting OpenAI’s latest models and a new code interpreter.