A code-first experience for building a copilot with Azure AI

This post has been republished via RSS; it originally appeared at: Microsoft Tech Community - Latest Blogs.

Generative AI applications are transforming the user experience and accelerating adoption of AI tools and solutions in the enterprise. Developers now face new challenges in building such applications end-to-end, going from prompt engineering to LLM Ops. And they need new tools, platforms, and guidance to help them navigate this rapidly-evolving ecosystem.  


In this article, we’ll look at how Azure AI helps developers tackle these challenges with a code-first approach that makes it easier to build, run, test, and deploy copilot applications using their own data. Let’s dive in. 

Understanding the LLM App Lifecycle 

Traditional AI applications focused on building and deploying custom machine learning models, training them on custom datasets with the goal of generating predictions. The end-to-end application lifecycle was then defined by three phases - experimentation, development, and operationalization. 

By contrast, generative AI applications involve large language models that are pre-trained on massive quantities of data with the goal of generating content. End-to-end development must now deal with new concepts like prompts (natural language inputs), fine-tuning (to improve model performance), retrieval augmented generation (to get responses grounded in our data), evaluation (testing response quality) and responsible AI. 
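To make the retrieval augmented generation idea concrete, here is a minimal, self-contained sketch of the pattern: retrieve relevant documents, then assemble a prompt grounded in them. The toy keyword retriever and product snippets below are illustrative stand-ins; a real copilot would query an Azure AI Search index and send the grounded prompt to a deployed chat model.

```python
# Toy illustration of retrieval augmented generation (RAG).
# The document list and keyword matching are stand-ins for an
# Azure AI Search index; the assembled prompt would normally be
# sent to a deployed chat-completion model.

DOCUMENTS = [
    "TrailWalker hiking shoes are waterproof and cost $110.",
    "CampBuddy tent sleeps four people and weighs 4.2 kg.",
]

def retrieve(question: str, docs: list[str]) -> list[str]:
    """Return documents that share at least one word with the question."""
    q_words = {w.strip("?.,").lower() for w in question.split()}
    return [d for d in docs
            if q_words & {w.strip("?.,$").lower() for w in d.split()}]

def build_grounded_prompt(question: str) -> str:
    """Assemble a prompt that asks the model to answer from context only."""
    context = "\n".join(retrieve(question, DOCUMENTS))
    return ("Answer using only the context below.\n"
            f"Context:\n{context}\n"
            f"Question: {question}")

print(build_grounded_prompt("How much do the TrailWalker shoes cost?"))
```

The key point is that the model's response is grounded in your data rather than in its pre-training alone, which is what makes evaluation of response quality against your documents meaningful.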


This has resulted in a paradigm shift from MLOps to LLMOps, where the three phases of the application lifecycle now focus on ideation (build and validate the basic app), development (evaluate it for quality, iterate), and operationalization (deploy and use it in production) – with iterations continuing until the desired application requirements are met. This has created demand for tooling and frameworks that help developers streamline the end-to-end experience from prompt engineering to production deployment. 




Enter Azure AI Studio! 

Azure AI Studio addresses these challenges with a unified platform for building generative AI applications and custom copilot experiences. It was released in public preview last November with tools, services and guidance to help developers explore models (from both Microsoft and the community), build AI projects (that deploy models and integrate AI services), and manage AI resources for their end-to-end solutions. It also offers tools and guidance to help developers deploy safe and responsible AI solutions.  



In this context, a copilot is a generative AI application that takes advantage of large language models (LLMs) and natural language processing (NLP) to assist your users in completing complex cognitive tasks through a conversational “chat” experience. With the Azure AI platform, you can build a copilot grounded in your own data, allowing customers to ask questions about your products or services and receive more relevant responses. 

A Code-First Experience 

The Azure AI Studio platform provides a rich UI-driven experience for achieving these objectives in the browser, making it perfect for low-code developers and learners. But if you are a developer who wants to dive deeper into the underlying code, build custom functions, collaborate using source control, or bring in existing code in various programming languages, then our code-first experience is for you!  


Let’s review the process for building a basic copilot application in Azure AI Studio. The typical end-to-end development workflow involves these steps: 


  1. Provision Azure resources - This includes your Azure AI project and services 
  2. Create Azure AI Search index - This is then populated with your custom data 
  3. Create model deployments - For chat completion, text embedding & evaluation 
  4. Create Azure AI connections - This allows your project to access Azure data & models 
  5. Build AI chat function - Write the Python code for your chat AI function 
  6. Run AI chat with test question - Validate that the basic copilot function works 
  7. Evaluate custom function for quality - Assess performance based on evaluation metrics 
  8. Deploy AI solution to Azure - Make chat API available as endpoint for integrations 
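Steps 5 and 6 above can be sketched in a few lines of Python. The function below only echoes a placeholder answer - a real implementation would call your chat-completion deployment with context retrieved from your index - and its shape loosely follows the Azure AI copilot samples rather than any exact SDK signature.

```python
# Step 5: a minimal "chat function". It takes the conversation so far
# (a list of role/content messages) and returns an assistant reply.
# The placeholder answer stands in for a real call to a deployed
# chat-completion model grounded in your search index.

def chat_completion(messages: list[dict]) -> dict:
    """Return an assistant message responding to the latest user turn."""
    question = messages[-1]["content"]
    answer = f"(model response to: {question})"  # placeholder for the LLM call
    return {"role": "assistant", "content": answer}

# Step 6: run the chat function with a test question to validate it.
reply = chat_completion([{"role": "user", "content": "What tents do you sell?"}])
print(reply["content"])
```

Once this function behaves correctly locally, the later steps evaluate it for quality and deploy it as an endpoint.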

Now let’s explore how the Azure AI SDK and Azure AI CLI can streamline your experience. 


  1. The Azure AI SDK for Python (preview) provides core packages to help manage your Azure AI resources and build generative AI apps.  
  2. The Azure AI CLI (preview) is a cross-platform, language-agnostic way to achieve similar objectives from the command line.  

We’ll look at these in more detail next. But first, check out the following two demos with members of the Azure AI Studio team to see the code-first approach in action with an end-to-end development workflow. The first is a recent Global AI Notes session, and the second is a streamlined live walkthrough.


Azure AI CLI – Cross-platform, language-agnostic 

The Azure AI CLI is a powerful cross-platform command-line utility that can connect your application to Azure AI services and execute control-plane and data-plane operations without requiring you to write any code. You can install the CLI (on Windows, macOS, and Linux devices) or experiment with it in a pre-configured Docker container (using VS Code). The screenshot below shows the Azure AI CLI in action, using the "ai init" command to set up and initialize your Azure AI project and services. 




The table below lists the Azure AI CLI commands that support the key steps required for building a copilot experience from the command line. Note that the CLI is under active development – so the commands, options and UI/UX may evolve over time. 




ai init - Provision and configure your Azure AI project with one (interactive) tool

ai service - Manage your connections to services and resources

ai search - Interact with Azure AI Search (manage search indexes)

ai dev - Create ".env" and populate it with environment variables for local development

ai config - Manage configuration information (stored in local ".ai" folder configuration files)

ai chat - Test your chat model deployment (interactively or non-interactively) to validate it

ai flow - Work with prompt flows in an interactive manner (from creation to deployment)

You can use "ai chat" throughout development to test your index, chat function, or deployment using interactive or non-interactive conversation modes. The screenshot below shows the rich set of command line options supported by this command today. 
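Put together, a command-line pass through the workflow might look like the sketch below. The subcommands come from the table above, but the specific options shown are assumptions based on the preview CLI and may change; this is a setup sequence that requires a configured Azure subscription, not a runnable script.

```shell
# Illustrative sequence only: the options shown are assumptions
# against the preview CLI and may differ in your installed version.
ai init                 # provision and configure the Azure AI project (interactive)
ai dev new .env         # write connection settings to a local .env file
ai search index update --files "data/*.md" --index-name "product-info"  # build the index
ai chat --interactive   # validate the copilot in an interactive session
ai chat --question "What tents do you sell?"   # or run a single test question
```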



Azure AI SDK – Python Code  

The Azure AI Studio platform also has an Azure AI SDK for Python with two distinct components, each serving a key purpose in the end-to-end developer workflow. 


  • The azure-ai-resources package provides the functionality required to connect to, and manage, your Azure AI resources and projects programmatically from apps. 
  • The azure-ai-generative package provides the functionality required for you to build, evaluate, and deploy Generative AI applications that leverage Azure AI services. 

In simpler terms, you use the resources library to manage the data, indexes, models, and deployments used in your AI projects, and the generative library to build indexes and evaluate your apps in the local development environment. The generative package also includes a promptflow extra if you want to use that framework during your ideation phase. These ‘extra’ packages can be optionally removed if you don’t require the functionality. 
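As a minimal sketch, connecting to your project with the resources package might look like the following. The import path and `from_config` pattern reflect the preview SDK, so treat these names as assumptions that may change; the imports are done lazily inside the function so the sketch loads even before the preview packages are installed.

```python
# Sketch: obtain a client for your Azure AI project (preview SDK;
# package and class names are assumptions and may change).

def get_ai_client():
    """Connect to the Azure AI project described by a local config.json."""
    # Lazy imports: requires `pip install azure-ai-generative azure-identity`.
    from azure.identity import DefaultAzureCredential
    from azure.ai.resources.client import AIClient

    # from_config reads project and subscription details from config.json.
    return AIClient.from_config(credential=DefaultAzureCredential())
```

Calling `get_ai_client()` requires the preview packages plus a config.json downloaded from your project in Azure AI Studio, after which the client exposes your project's data, indexes, models, and deployments.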


Dive into samples

Now that you’ve learned about the Azure AI code-first capabilities and seen them in action, it’s time to try them out yourself! The table below provides a list of quickstart options for building a copilot application - from using the basic Azure AI SDK (Python code) to combining it with other frameworks (like prompt flow) for advanced capabilities. Start by forking the sample of interest and completing the steps to set up, build, evaluate, and deploy the copilot. Then try extending it further to suit your application requirements or your custom data.  


You will need an active Azure subscription and access to the relevant Azure OpenAI Service and models to complete the tutorials provided by the samples.  Also note that these samples are actively evolving to match updates to underlying tools or libraries and are not meant for production use.  





Directory for Azure AI samples 

  • Build a basic copilot using Azure AI SDK & CLI (built-in support for LangChain, Semantic Kernel, and prompt flow included)
  • Build a basic copilot using prompt flow 
  • Build a basic copilot using LangChain 
  • Build a customer support agent with prompt flow 

Contoso Chat – End-to-End Application Sample 

These quickstart samples provide the foundational knowledge you need about the tools and workflow steps required to build a copilot application. Want to try a more advanced sample that tells the LLMOps story from provisioning to deployment? Fork the Contoso-Chat sample and explore the step-by-step workshop to build a customer support copilot application with Azure AI Studio and prompt flow, evaluate it, and deploy it to Azure. This illustrated guide outlines the workflow and the steps involved in each stage. 




You can then integrate the customer support chat capability into your Contoso-Outdoors application, using the deployed endpoint, to drive customer interactions like this. 




This was a quick look at the code-first experience in building a copilot with your data, using the Azure AI Studio platform. Want to keep learning? Check out the resources below – and revisit the Collection periodically for updates to relevant samples and training resources. 


Related Resources 
