From Software Giant To AI Powerhouse

Satya Nadella, CEO and Chairman, Microsoft
At Ignite 2023, Microsoft unveiled a complete vision for its end-to-end AI stack, showcasing innovations that span from cloud infrastructure to AI-driven applications and security measures.

For Microsoft and its ecosystem, this year's Ignite conference proved to be unique and different. Historically, Ignite has been a conference focused on infrastructure and operations, while Microsoft's flagship event, Build, typically catered to developer audiences. At Ignite 2023, however, announcements about generative AI aimed at developers and ML engineers took center stage. The event was not restricted to developers or IT professionals; it became a watershed moment for the entire Microsoft ecosystem.

Microsoft wants to be a major force in the AI ecosystem, and CEO and Chairman Satya Nadella made this clear in his keynote address. From creating its own AI accelerator chips to launching a marketplace for copilots, Microsoft has a long-term strategy in place.

Here is a detailed analysis of how Microsoft is capitalizing on AI to retain its leadership and dominance in the industry:

Azure is the new AI operating system, and copilots are the new apps
Microsoft has an extremely successful track record of building platforms. The earliest version of the platform was built on Windows, where developers leveraged OLE and COM to build applications through Visual Basic. Announced in the early 2000s, Microsoft .NET and Visual Studio created a new platform that rekindled developer interest in building web services. Last decade, Microsoft successfully launched another platform in the form of Azure.

When Microsoft creates a platform, it gives rise to a new ecosystem of independent software vendors and solution providers that help enterprises leverage it. This was evident from the success of Microsoft Windows, Office, Visual Studio and, most recently, Azure.

With AI, Microsoft wants to repeat the magic of creating a brand-new platform that results in a thriving ecosystem of developers, ISVs, system integrators, enterprises and customers.

This season, Azure becomes the operating system, providing the runtime and platform services, while the apps are the AI assistants that Microsoft calls copilots. So, Azure is the new Windows and copilots are the new applications. The foundation models, such as GPT-4, form the kernel of this new OS. Similar to Visual Studio, Microsoft has invested in a set of developer tools in the form of AI Studio and Copilot Studio. This stack closely resembles Windows, .NET and Visual Studio, which dominated the developer landscape for decades.

Microsoft's approach clearly demonstrates a sense of urgency. This is evident given the current market dynamics and the lessons learned from its failed attempts to build an ecosystem around the mobile platform. Satya is deeply committed to ensuring that Microsoft becomes the company that pioneers artificial intelligence by bringing the capabilities of generative AI closer to its customers. He does not want the company to miss the next big thing in technology, as it did with search and mobile.
In just a few months, the company has shipped several copilots for its products, ranging from the Bing search engine to Microsoft 365 to the Windows operating system. It has also added numerous capabilities to the Edge browser, enhancing the user experience. The speed with which Microsoft has embraced generative AI in recent months is astounding, making it one of the leading AI platform companies.
Microsoft invests in creating its own CPU, GPU and DPU
For decades, CPUs set the rules for software architecture and shaped its evolution. Now, AI software is shaping the development of chips, giving rise to purpose-built processors.
Microsoft formally announced that it will begin manufacturing its own silicon and processors, including CPUs, AI accelerators and data processing units.
Let us start with the CPU. Azure Cobalt, Microsoft's own CPU, is based on the Arm architecture for optimal performance-per-watt efficiency, and it powers common Azure cloud workloads. The first generation of the series, Cobalt 100, is a 64-bit, 128-core chip that improves performance by up to 40% over the current generation of Azure Arm chips and powers services such as Microsoft Teams and Azure SQL. After the Neoverse N1, the first Arm-based CPU sourced from Ampere Computing, Cobalt 100 becomes the second Arm-based processor available on Azure.
Then there is Azure Maia, the first in a series of custom AI accelerators designed to run cloud-based training and inference for AI workloads like OpenAI models, Bing, GitHub Copilot and ChatGPT. With 105 billion transistors, the Maia 100 is the first generation in the series and one of the largest chips built on 5nm process technology. It features numerous innovations in the areas of silicon, software, networking, racks and cooling. The new AI accelerator becomes an alternative to the GPU by optimizing Azure AI's end-to-end systems to run state-of-the-art foundation models like GPT.
Finally, Azure Boost, Microsoft's own DPU, became generally available. Microsoft acquired Fungible, a DPU company, earlier this year in order to improve the efficiency of Azure data centers. With Azure Boost, software functions such as virtualization, network management, storage management and security are offloaded to dedicated hardware, allowing the CPU to devote more cycles to workloads rather than systems management. Because the heavy lifting moves to a purpose-built processor, this offloading significantly improves the performance of cloud infrastructure.
Apart from bringing its own silicon to the mix, Microsoft has partnered with AMD, Intel and NVIDIA to bring the latest CPUs and GPUs to Azure. Azure will offer the latest NVIDIA H200 Tensor Core GPU next year to run larger foundation models with reduced latency. AMD's new MI300 accelerator will also become available on Azure early next year.
Less reliance on OpenAI with homegrown and open source foundation models
While Azure continues to be the preferred platform for enterprises to run inference on OpenAI models, Microsoft is investing in training its own foundation models to complement the existing models available in Azure OpenAI and Azure ML.
Phi-1.5 and Phi-2 are small language models that are lightweight and need fewer resources than traditional large language models. Phi-1.5 has 1.3 billion parameters, while Phi-2 has 2.7 billion, making them much smaller than Llama 2, which starts at 7 billion parameters and goes up to 70 billion. These SLMs are ideal for embedding within Windows to provide a local copilot experience without the round trip to the cloud. Microsoft is releasing an extension for Visual Studio Code that allows developers to fine-tune these models in the cloud and deploy them locally for offline inference.
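A rough back-of-the-envelope calculation shows why parameter count matters for on-device inference. The sketch below assumes fp16 weights (2 bytes per parameter) and ignores activation and KV-cache overhead, so the figures are lower bounds, not vendor-published numbers:

```python
def fp16_weight_footprint_gb(n_params: float) -> float:
    """Approximate weight memory in GB, assuming 2 bytes per parameter (fp16)."""
    return n_params * 2 / 1e9

# Phi-2 (2.7B parameters) vs. the smallest Llama 2 variant (7B parameters)
phi2_gb = fp16_weight_footprint_gb(2.7e9)     # ~5.4 GB
llama2_gb = fp16_weight_footprint_gb(7e9)     # ~14.0 GB
print(f"Phi-2 weights: ~{phi2_gb:.1f} GB, Llama 2 7B weights: ~{llama2_gb:.1f} GB")
```

At roughly 5 GB of weights (less with quantization), Phi-2 fits comfortably in the memory of a typical laptop, which is what makes a local, offline copilot plausible.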
Microsoft Research has developed Florence, a foundation model that brings multimodal capabilities to Azure Cognitive Services. The model allows users to analyze and understand images, video and language, offering customizable options for building computer vision applications. Florence is already available in Azure.
Azure ML now supports additional open source foundation models, including Llama, Code Llama, Mistral 7B, Stable Diffusion, Whisper V3, BLIP, CLIP, Falcon and NVIDIA Nemotron.
Azure ML, Microsoft's ML PaaS, offers models-as-a-service to consume foundation models through an API without the need to provision GPU infrastructure. This significantly simplifies the integration of AI into modern applications.
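Models-as-a-service means a client only assembles a request and posts it to a hosted endpoint; no GPU is provisioned on the caller's side. The sketch below builds a request body in the OpenAI-style chat-completion schema that such hosted endpoints commonly expose; the deployment name and exact payload fields are illustrative assumptions, not a specific Azure ML contract:

```python
import json

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble an OpenAI-style chat-completion request body, the schema
    commonly exposed by hosted model-as-a-service endpoints (illustrative)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# The client would POST this JSON to a serverless endpoint with an API key;
# all GPU scheduling happens on the provider's side.
body = build_chat_request("mistral-7b", "Summarize Ignite 2023 in one sentence.")
print(json.dumps(body, indent=2))
```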
The combination of Azure OpenAI and the Azure model catalog delivers the most comprehensive and widest range of foundation models to customers, which becomes the key differentiating factor for Azure.
Microsoft Graph and Fabric at the Core of the Data Platform
AI requires a substantial amount of data for pre-training, fine-tuning and retrieval. Microsoft Fabric and Microsoft Graph are two key products that contribute significantly to Microsoft's generative AI efforts.
Microsoft Fabric, which was announced at Microsoft Build 2023, is a significant addition to Microsoft's data product line. Satya emphasized its importance by comparing it to the release of SQL Server, implying a fundamental shift in Microsoft's data management and analytics strategy.
At Ignite 2023, Microsoft announced the general availability of Fabric. It includes a component named OneLake, a transformative data lakehouse platform. OneLake is integrated into Azure Machine Learning and Azure AI Studio, representing a major enhancement of Azure Machine Learning's data management capabilities. The platform is designed to handle large and varied datasets in a unified and efficient manner, optimizing data storage and retrieval for AI applications. Its integration with Azure AI platforms is particularly important for scenarios that require high-volume data processing and complex computational tasks, which are common in advanced AI and machine learning projects. What is fascinating about OneLake is the concept of shortcuts, which bring data from external sources, including Amazon S3 and Databricks, into the fold of Fabric.
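Conceptually, a shortcut is just a named pointer from a OneLake path to an external location, so the data is exposed inside Fabric without being copied. The descriptor below is a minimal sketch of that idea; the field names, bucket and paths are invented for illustration and do not reflect the actual Fabric API schema:

```python
def s3_shortcut(name: str, bucket: str, prefix: str) -> dict:
    """Describe a OneLake shortcut pointing at data that stays in Amazon S3.
    Field names are illustrative, not the real Fabric schema."""
    return {
        "name": name,
        "target": {"type": "AmazonS3", "location": f"s3://{bucket}/{prefix}"},
    }

# The shortcut surfaces external data inside Fabric without moving or copying it.
shortcut = s3_shortcut("clickstream", "acme-data", "raw/clickstream/")
print(shortcut["target"]["location"])
```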
Microsoft Graph, a powerful tool in Microsoft's arsenal, plays a pivotal role in the realm of AI copilots, offering a unified API to access diverse data across Microsoft 365 services. Microsoft Graph allows copilots to provide personalized assistance by aggregating data from emails, calendar events and team interactions. This integrated approach ensures a contextual understanding of users' professional environments, which is essential for making intelligent suggestions. Microsoft Graph supports real-time data access, which is crucial for timely copilot responses, and its compliance with Microsoft 365's security standards ensures the safe handling of sensitive data.
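The aggregation step can be sketched as folding several Microsoft 365 signals into one grounding string that a copilot prepends to its prompt. In a real copilot the records would come from Graph endpoints such as `/me/messages` and `/me/events`; here they are mocked dictionaries, and the output format is an invented example:

```python
def build_copilot_context(mail: list, events: list) -> str:
    """Fold Microsoft 365 signals (mocked mail and calendar records here)
    into a single grounding string a copilot could prepend to a prompt."""
    lines = ["Recent mail:"]
    lines += [f"- {m['from']}: {m['subject']}" for m in mail]
    lines.append("Upcoming events:")
    lines += [f"- {e['start']}: {e['subject']}" for e in events]
    return "\n".join(lines)

# Mocked records standing in for Graph /me/messages and /me/events responses
mail = [{"from": "alice@contoso.com", "subject": "Q4 budget review"}]
events = [{"start": "2023-11-20 10:00", "subject": "Ignite recap sync"}]
print(build_copilot_context(mail, events))
```

Because the context is rebuilt from live Graph data on each request, the copilot's suggestions stay current with the user's actual inbox and calendar.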
Microsoft Fabric and Microsoft Graph become the foundation for building copilots based on real-time data available through APIs.
Overall, Microsoft's strategy at Ignite 2023 demonstrates a clear focus on leading the AI revolution, leveraging its platform heritage and innovating in hardware and software to maintain industry dominance.

https://www.forbes.com/sites/janakirammsv/2023/11/19/microsofts-ai-transformation-from-software-giant-to-ai-powerhouse/
