Microsoft Ignite 2023: 11 takeaways for CIOs

The observational study revealed that Copilot users found information 27% faster and caught up on missed meetings almost four times faster.

[…]

Perhaps a future iteration of Copilot could coach users on what to do with the time it saves them: While around half of those saving more than 30 minutes a day with it said they spent the time saved on focused work, one-sixth said they spent it in … more meetings.

7. Generative AI credentials

The domain is so new that it’s hard to evaluate who knows what, so Microsoft is stepping in with new AI credentials in its Microsoft Applied Skills program. They’ll cover developing generative AI with Azure OpenAI Service; creating document-processing systems with Azure AI Document Intelligence; building natural language processing tools with Azure AI Language; and building Azure AI Vision systems.
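As a flavor of what these credentials involve, here is a minimal sketch of the document-processing skill: extracting pages and tables from a PDF with the Azure AI Document Intelligence (formerly Form Recognizer) Python SDK. The endpoint, key, and file name are placeholders, not details from the announcement.

```python
# A minimal sketch of the document-processing skill the credential covers.
# The endpoint, key, and file name are placeholders.
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com",
    credential=AzureKeyCredential("<DOCUMENT_INTELLIGENCE_KEY>"),
)

# Analyze a local PDF with the prebuilt layout model (extracts text, tables, structure).
with open("invoice.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-layout", document=f)
result = poller.result()

print(f"Pages: {len(result.pages)}, tables: {len(result.tables)}")
for table in result.tables:
    print(f"- table with {table.row_count} rows x {table.column_count} columns")
```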

8. Streamlining generative AI operations on Azure

At Build in May, Microsoft announced Azure AI Studio, a unified system for building generative AI applications, and six months later it’s finally launching a preview of the platform. (Generative AI technology is advancing fast but not, it seems, all that fast.) Developers will be able to select from a range of proprietary and open-source LLMs; choose data sources, including Microsoft Fabric OneLake and Azure AI Search for vector embeddings, enabling responses to be grounded in real-time data without having to retrain the whole model; and monitor their models’ performance once deployed.
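That grounding pattern (often called retrieval-augmented generation) can be sketched outside Azure AI Studio itself: retrieve a few passages from an Azure AI Search index, then pass them to an Azure OpenAI chat deployment so the answer reflects current data without retraining. The endpoints, index name, field name, deployment name, and API version below are illustrative placeholders.

```python
# A minimal retrieval-augmented generation sketch: ground an Azure OpenAI answer in
# passages fetched from an Azure AI Search index. All names below are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search = SearchClient(
    endpoint="https://<your-search>.search.windows.net",
    index_name="product-docs",                      # hypothetical index
    credential=AzureKeyCredential("<SEARCH_KEY>"),
)
llm = AzureOpenAI(
    azure_endpoint="https://<your-openai>.openai.azure.com",
    api_key="<AZURE_OPENAI_KEY>",
    api_version="2023-05-15",                       # example API version
)

question = "What changed in the latest pricing tier?"

# Keyword retrieval here; a vector query against embeddings stored in the same
# index is the other common option Azure AI Search supports.
hits = search.search(search_text=question, top=3)
context = "\n\n".join(doc["content"] for doc in hits)   # assumes a 'content' field

answer = llm.chat.completions.create(
    model="<your-gpt-deployment>",                  # the deployment name, not the model name
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```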

9. New Azure chips for enterprise AI workloads

Microsoft is updating its Azure infrastructure with new chips tailored for AI workloads. To accelerate AI model training and generative inferencing, the ND MI300 v5 virtual machines will soon run on AMD’s latest GPU, the Instinct MI300X, while the new NVL variant of Nvidia’s H100 chip will power the NC H100 v5 VMs, currently in preview. These will offer more memory per GPU to improve data processing efficiency.

But Microsoft is also adding custom chips of its own. It designed Azure Maia to accelerate AI training and inferencing for workloads such as OpenAI models, GitHub Copilot, and ChatGPT. Maia has a companion, Azure Cobalt, an Arm-based CPU for general (non-AI) workloads.

10. Easier development of small gen AI apps with Windows AI Studio

Azure AI Studio focuses on LLMs, but there’s growing interest in less resource-intensive generative AI models trained for specific tasks — and small enough to run locally on a PC or mobile device. To help developers customize and deploy such small language models (SLMs), Microsoft will soon release Windows AI Studio, which will provide the option of running models in the cloud or on the network edge, and include prompt-orchestration capabilities to keep things in sync wherever they run.
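For a sense of what running such a model locally looks like today (Windows AI Studio itself is not yet available), here is a minimal sketch using the open-source Hugging Face transformers library; the model shown is just one example of an SLM, not something named in the announcement.

```python
# A minimal sketch of running a small language model locally rather than calling a
# cloud-hosted LLM. The model is one illustrative example of an SLM.
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="microsoft/phi-1_5",     # ~1.3B parameters, small enough for a laptop
    trust_remote_code=True,
)

prompt = "Explain in one sentence what a vector index is."
output = generate(prompt, max_new_tokens=60, do_sample=False)
print(output[0]["generated_text"])
```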

11. Using generative AI for knowledge management

Microsoft’s Viva Engage enterprise communication tool offers a way for employees to learn from their peers by searching a database of answers to frequently asked questions provided by subject matter experts. An update to Answers in Viva, due to roll out before year end, will add an option to generate those answers — and even the questions they respond to — using AI, based on training files imported from other sources. This could offer enterprises a quick way to switch from a legacy knowledge management platform, or to share resources held in another system.
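The underlying idea, turning imported source files into question-and-answer pairs with a model, can be sketched generically; this is not Viva’s implementation, and the deployment name, file name, and prompt are hypothetical.

```python
# A generic sketch of generating FAQ-style Q&A pairs from a source document with an
# LLM. Not Viva's implementation; every name here is a placeholder.
import json
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-openai>.openai.azure.com",
    api_key="<AZURE_OPENAI_KEY>",
    api_version="2023-05-15",
)

with open("benefits_policy.txt") as f:     # a file exported from a legacy knowledge base
    source_text = f.read()

resp = client.chat.completions.create(
    model="<your-gpt-deployment>",
    messages=[
        {"role": "system",
         "content": "From the document, return only a JSON array of objects with "
                    "'question' and 'answer' fields covering its key points."},
        {"role": "user", "content": source_text},
    ],
)

# In practice you would validate the output or use a JSON-constrained response mode.
qa_pairs = json.loads(resp.choices[0].message.content)
for pair in qa_pairs:
    print(f"Q: {pair['question']}\nA: {pair['answer']}\n")
```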
