This post has been republished via RSS; it originally appeared at: Windows Blog.

With the latest Windows 11 update on Sept. 26, we released a host of developer features in the core of the Windows OS with the intent of making every developer more productive on Windows. Today we are excited to announce Windows AI Studio, a new AI experience that helps enterprises and developers jumpstart local AI development and deployment on Windows, along with new updates to Dev Home and new enterprise features in Windows Subsystem for Linux (WSL) that enhance security and simplify deployment.
Windows AI Studio simplifies generative AI app development

Many developers and enterprises want to bring AI-differentiated experiences to their apps, and we have heard from these developers that they need an easier, trusted way to get started with local AI development. With so many tools, frameworks and open-source models available, it is difficult to pick the right set of tools to test, fine-tune and optimize models, or to select the most trusted models that best fit diverse business needs. That’s why we are thrilled to announce Windows AI Studio, a new experience for developers that extends the tooling of Azure AI Studio to jumpstart AI development locally on Windows.
Getting started with AI development locally on Windows is easier and faster than ever

Windows AI Studio simplifies generative AI app development by bringing together cutting-edge AI development tools and models from Azure AI Studio and other catalogs like Hugging Face, enabling developers to fine-tune, customize and deploy state-of-the-art small language models, or SLMs, for local use in their Windows apps. This includes an end-to-end guided workspace setup with a model configuration UI and guided walkthroughs to fine-tune popular SLMs like Phi. Developers can then rapidly test their fine-tuned models using the Prompt Flow and Gradio templates integrated into the workspace.

Windows AI Studio brings us closer to supporting Hybrid Loop development patterns and enabling hybrid AI scenarios across Azure and client devices. This gives developers the choice to run their models in the cloud on Azure, locally at the edge on Windows, or across the two, to meet their needs. Prompt Flow makes it easier than ever to implement this hybrid pattern by switching between local SLMs and cloud LLMs.

In a typical fine-tuning workflow, developers bring their own datasets for fine-tuning; see our fine-tuning guide for details on how to get started. Note that the fine-tuning and model-evaluation steps are iterated until the model meets the developer’s evaluation criteria.

In the coming weeks, developers will be able to access Windows AI Studio as a VS Code extension, a familiar and seamless interface to help you get started with AI development. The guided interface lets you focus on what you do best, coding, while we do the heavy lifting of setting up your development environment with all the tools needed. Learn more about Windows AI Studio.
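The local-versus-cloud choice described above can be pictured as a simple routing layer in application code. The sketch below is illustrative only: the HybridRouter class and its backends are hypothetical stand-ins, not a Windows AI Studio or Prompt Flow API, and the lambdas stand in for calls to a local fine-tuned SLM (such as Phi) or an Azure-hosted LLM.

```python
from typing import Callable, Dict


class HybridRouter:
    """Illustrative router that dispatches a prompt to either a local
    SLM backend or a cloud LLM backend, mirroring the hybrid pattern
    described above. Backends are plain callables: prompt -> response."""

    def __init__(self) -> None:
        self._backends: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, fn: Callable[[str], str]) -> None:
        # e.g. name="local" for an on-device SLM, name="cloud" for Azure
        self._backends[name] = fn

    def generate(self, prompt: str, target: str = "local") -> str:
        if target not in self._backends:
            raise KeyError(f"no backend registered for '{target}'")
        return self._backends[target](prompt)


if __name__ == "__main__":
    router = HybridRouter()
    # Stand-in backends; a real app would invoke a local fine-tuned
    # SLM and an Azure-hosted LLM here.
    router.register("local", lambda p: f"[local SLM] {p}")
    router.register("cloud", lambda p: f"[cloud LLM] {p}")
    print(router.generate("Summarize this document", target="local"))
```

Because the backends are just callables, the same application code can switch between edge and cloud execution by changing only the target, which is the essence of the hybrid pattern.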
Windows optimized state-of-the-art models

In addition to fine-tuning capabilities, Windows AI Studio will also highlight state-of-the-art (SOTA) models optimized specifically for Windows GPUs and NPUs in the future, starting with Llama 2-7B, Mistral-7B, Falcon-7B, and Stable Diffusion XL. Earlier this year, we talked about how ONNX Runtime is the gateway to Windows AI. DirectML is the native Windows machine learning API, and together they give developers access to a simplified yet highly performant AI development experience. With Olive, a powerful optimization tool for ONNX models, developers can ensure that their models run as performantly as possible with the DirectML + ONNX Runtime combo.

At Inspire this year we shared details on how developers will be able to run Llama 2 with DirectML and the ONNX Runtime, and we have been hard at work to make this a reality. We now have a sample showing our progress with Llama 2-7B: after an Olive optimization pass, the sample shows how developers can run this versatile LLM locally and performantly on varied Windows hardware. We’re excited about this milestone, and this is only a first peek; stay tuned for future enhancements to support even larger models, fine-tuning and lower-precision data types. Learn more.

Windows Subsystem for Linux (WSL) offers a robust platform for AI development on Windows by making it easy to run Windows and Linux workloads simultaneously. Developers can easily share files, GUI apps, GPUs and more between environments with no additional setup. WSL has now been enhanced to meet enterprise-grade security requirements, so enterprise customers can confidently deploy WSL for their developers to take advantage of both Windows and Linux on the same Windows device and accelerate AI development efficiently.
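To make the DirectML + ONNX Runtime path above concrete, here is a minimal, hedged sketch. The pick_providers helper is a hypothetical illustration (not a Windows AI Studio API); the execution-provider names DmlExecutionProvider and CPUExecutionProvider, and ort.get_available_providers(), are real ONNX Runtime identifiers, and on Windows the DirectML provider ships in the onnxruntime-directml package.

```python
from typing import List


def pick_providers(available: List[str]) -> List[str]:
    """Prefer the DirectML execution provider when present, falling
    back to CPU. Returns the preferred providers, in order, that are
    actually available on this machine."""
    preferred = ["DmlExecutionProvider", "CPUExecutionProvider"]
    return [p for p in preferred if p in available]


# Typical usage with onnxruntime on Windows (an Olive-optimized
# model file is assumed; filename is illustrative):
#
# import onnxruntime as ort
# providers = pick_providers(ort.get_available_providers())
# session = ort.InferenceSession("llama2_optimized.onnx",
#                                providers=providers)
```

An Olive optimization pass produces the ONNX model that the session loads; at runtime, the provider list simply tells ONNX Runtime to try DirectML first and fall back to the CPU provider elsewhere.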
Windows Subsystem for Linux now offers new enterprise features that enhance security and simplify deployment

It’s now easier than ever to securely deploy WSL in your company with the latest enterprise features. These include:
- Microsoft Defender for Endpoint released a new plug-in for WSL that enables security teams to continuously monitor events in all running distributions – delivering unparalleled visibility into systems once considered a critical blind spot.
- Access to WSL and its key security settings is now controllable with Intune. Admins can allow or block access to WSL entirely, or control specific security settings like custom kernels and nested virtualization, to ensure WSL is used securely.
- Advanced networking controls in WSL let you specify firewall rules that apply to the WSL virtual machine and improve network compatibility in complex enterprise environments.

Learn more to get started with WSL today!
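The networking behavior mentioned above is driven by the global .wslconfig file in the user’s %UserProfile% directory. The fragment below is a sketch assuming the WSL 2.0-era setting names; availability of individual keys varies by WSL and Windows version, so treat it as illustrative rather than a complete reference.

```ini
# %UserProfile%\.wslconfig — global settings for the WSL 2 VM
[wsl2]
# Mirror the host's network interfaces into the VM for better
# compatibility with enterprise networks and VPNs
networkingMode=mirrored
# Apply Windows (Hyper-V) firewall rules to WSL traffic
firewall=true
# Resolve DNS via the Windows host, helping in locked-down networks
dnsTunneling=true
# Pick up the host's HTTP proxy configuration inside WSL
autoProxy=true
```

After editing the file, restarting WSL (for example with `wsl --shutdown`) applies the new settings to subsequently started distributions.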