Integrating Azure Data Factory and the Microsoft Intune Export API


Microsoft Intune is a cloud-based service that helps organizations manage and secure their mobile devices, apps, and data. Intune provides a variety of reports that help administrators monitor and troubleshoot the health and compliance of their devices and apps. These reports include device inventory, app inventory, device compliance, app protection, app configuration, and device configuration reports. Intune also offers a dashboard that shows the overall status and trends of the Intune service.

 

In addition to the built-in reports, Intune provides an export API that allows administrators to export Intune data to external systems for further analysis and reporting. The export API is a RESTful web service that supports the OData query syntax and returns data in JSON or .csv format. It can be used to retrieve data on devices, apps, policies, assignments, and events, and to create custom reports that meet specific business needs or integrate with other reporting tools.

 

For example, customers have used the export API to build custom reports for the following scenarios:

  • Create a device inventory dataset for all devices or a subset of devices.
  • Create a catalog of apps present on the devices for security and/or license compliance.
  • Create a list of all users running mobile apps with app protection policies and the associated apps/versions.

 

You can read more about the Intune Export API and watch some great overview videos here:

 

The export API is great for getting started on custom reporting; however, enterprise customers frequently have centralized custom reporting solutions. In many cases they have existing data stores that they’d like to use and want to import Intune data exports into these solutions. They also don’t want to manually export and import data files.

 

One possible solution for importing Intune data into existing data stores is to use Azure Data Factory, a cloud-based data integration service that orchestrates and automates the movement and transformation of data. Azure Data Factory can connect to various sources and destinations, including the Intune export API, and create data pipelines that run on a specified schedule or trigger. Azure Data Factory also supports data transformation activities, such as mapping data columns, filtering data rows, or joining data sets. Azure Data Factory can enable administrators to create custom reports on Intune data by using their preferred data store and reporting tool.

 

You can read more about Azure Data Factory here:

 

This blog will show you how to combine Azure Data Factory and the Intune Export API to automate exporting reports and store them in Azure. You could then import this data into your data lakes or consume it directly via tools like Power BI.

 

Prerequisites

You’ll need a few prerequisites to follow this guide and complete configuration:

 

Microsoft Entra tenant ID
The Entra tenant ID is needed to send requests to your tenant.

To find your Entra tenant ID, you can use one of the following methods:

  1. Use PowerShell to run the command: Get-AzureADTenantDetail | Select ObjectId
  2. Use the Azure portal to navigate to Microsoft Entra ID > Overview and copy the Tenant ID value.
  3. Use the Microsoft Graph Explorer to make a GET request to https://graph.microsoft.com/v1.0/organization and look for the id field in the response.

    The Entra tenant ID is a 32-digit hexadecimal number that looks like this: 01234567-89ab-cdef-0123-456789abcdef.  Save this tenant ID for later.
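
If you prefer the newer Microsoft Graph PowerShell SDK over the older AzureAD module shown above, a minimal sketch (this assumes the Microsoft.Graph module is installed and your account can read the organization object):

Connect-MgGraph -Scopes "Organization.Read.All"   # sign in interactively
(Get-MgOrganization).Id                           # returns the tenant ID (a GUID)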


An Enterprise app registration in Entra with Read permissions to relevant Intune data
The Entra app registration creates an application that allows Azure Data Factory activities to access your Intune data.  To create an enterprise app registration in Entra with read permissions to Intune, follow these steps:

  1. Sign in to the Microsoft Entra portal with an account that has the Global Administrator or Cloud Application Administrator role.
  2. Go to Identity > Applications > App Registration > New registration.
  3. Enter a name for your app, such as Intune Export API, and select Register.
  4. Copy the Application (client) ID and save it for later use.
  5. Go to API permissions > Add a permission > Microsoft Graph > Application permissions.
  6. Search and select the following permissions:
    • DeviceManagementConfiguration.Read.All
    • DeviceManagementManagedDevices.Read.All
    • DeviceManagementRBAC.Read.All
    • DeviceManagementServiceConfig.Read.All
    • DeviceManagementApps.Read.All
  7. Select Add permissions.
  8. Ensure you grant admin consent for the app permissions.

    AzureDataFactory-1.png

     

  9. Your app is now registered and has read permissions to Intune data.

    AzureDataFactory-2.png

     

Create a secret for your Entra app
A secret is a string that the app uses to prove its identity when requesting a token.  To create a secret, follow these steps:

  1. In App registrations change the view to “All applications”
  2. Select the application that was just created.
  3. Go to Certificates & secrets and select New client secret.
  4. Enter a description and an expiration date for the secret and select Add.
  5. Copy the Secret Value and the Secret ID and save them for later use. Note: You won't be able to see the secret value again after you leave this page.

    AzureDataFactory-3.png

Azure Blob Storage and a container

Azure Blob Storage will be used to store the exported data; the container provides a location within the storage account to hold the files.  You can use an existing storage account or create a new account by following the steps in Create a storage account – Azure Storage.

Note: For the purpose of this tutorial, you can use the default settings when configuring the account.
 

After your storage account is created, create a container by navigating to the Resource groups page and selecting the resource group that contains your storage account.

  1. Select your storage account to open its overview page.
  2. On the left menu, under Data storage, click on Containers.
  3. Click on the + Container button at the top of the page.
  4. Enter a name for your container. The name must be lowercase and must be unique within the storage account.
  5. For this tutorial, accept the default anonymous access level of private (no anonymous access).
  6. Click on the Create button to create your container.


Once the Azure Blob Storage account and a container have been created, you can upload, download, and manage your blobs in the container using the Azure portal or other tools (for example with PowerShell, as sketched below).
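
If you'd rather check the container from PowerShell than the portal, here is a minimal sketch using the Az.Storage module (the storage account name, key, and container name are placeholders for your own values):

$ctx = New-AzStorageContext -StorageAccountName "<storageaccountname>" -StorageAccountKey "<accountkey>"
Get-AzStorageBlob -Container "<containername>" -Context $ctx |
    Select-Object Name, Length, LastModified   # lists the blobs the pipeline will write later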


AzureDataFactory-4.png

 

Now that you’ve completed all the prerequisites you should have the following documented:

  • Entra tenant ID
  • Entra app registration and secret
    • Entra Application (client) ID
    • Secret ID
    • Secret Value
  • Azure storage account
    • Azure Blob Storage and a container name

With these completed we can now start working within the Azure Data Factory!

 

Azure Data Factory configuration

Using the steps below, we’ll create an Azure Data Factory pipeline that will perform the following activities:

  1. Request an authorization token using the Entra App ID and Secret Information.
  2. Make a call to the Intune Export API and request a report, in this case “Devices with Inventory”.
  3. Monitor the Export API endpoint until the report is complete and a download URL is provided.
  4. Use the download URL to download a compressed JSON file, extract the JSON file, and copy it to a container in Azure Blob Storage.

 

Follow each of the steps below. Note that most configuration issues are a result of typos or changes to object names. Throughout the tutorial we’ll use the “Debug” capability to test our pipeline end to end.

 

  1. Create a new pipeline in ADF portal and name it “IntuneExportAPItoAzure”
    1. Open https://adf.azure.com and either create a new data factory or use an existing one.
    2. On the home screen select “New” and then “Pipeline” and name it “IntuneExportAPItoAzure”
      AzureDataFactory-5.png

       

      AzureDataFactory-6.png

       

  2. Select the “Author” icon, select your pipeline in the middle nav bar, and search for Web in Activities. Drag the Web activity into the pipeline.
    AzureDataFactory-7.png

     

  3. Change the name to GetToken
    AzureDataFactory-8.png

     

  4. First we’ll set the URL and add our tenant ID. Click on Settings and enter the URL, substituting your tenant ID:
    https://login.microsoftonline.com/[TenantID]/oauth2/v2.0/token 
  5. In the post body we’ll add our tenant ID, the Entra App ID, and the App secret. 
    Change “Method” to POST and enter:
    tenant=[TenantID]&client_id=[EntraAPPid]&client_secret=[EntraAPPsecret]&grant_type=client_credentials&scope=https://graph.microsoft.com/.default
  6. Click on “+New” in Headers and add the following:
    1. Name: Content-Type
    2. Value: application/x-www-form-urlencoded
  7. When completed your configuration should look like this:
    AzureDataFactory-9.png
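
Before moving on, you can optionally sanity-check the app registration and secret outside of ADF. The sketch below reproduces the same token request in PowerShell; the tenant ID, app ID, and secret placeholders are the values you saved in the prerequisites:

$tenantId = "<TenantID>"
$clientId = "<EntraAppID>"
$secret   = "<EntraAppSecret>"

# Same endpoint and form-encoded body the GetToken web activity posts.
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -ContentType "application/x-www-form-urlencoded" `
    -Body @{
        client_id     = $clientId
        client_secret = $secret
        grant_type    = "client_credentials"
        scope         = "https://graph.microsoft.com/.default"
    }

$tokenResponse.access_token   # a long JWT string confirms the app can authenticate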

     

  8. Click on “Publish All” then click on “Debug” to test.
    AzureDataFactory-10.png

     


    You should see “GetToken” with an activity status of “Succeeded”
    AzureDataFactory-11.png

     

  9. Search for “Set Variable” in activities and drag it into the pipeline
    AzureDataFactory-12.png

     

    Then connect the GetToken web activity to the new Set Variable activity by clicking on the “on success” green check mark and dragging the line to the Set Variable activity.
    AzureDataFactory-13.png

     

  10. Click on “General” for the Set Variable activity and change the name to “Token”.
    AzureDataFactory-14.png

     

  11. Click on “Settings” for the Token Set variable activity and click on “+ New”
    AzureDataFactory-15.png

     


    Set the name to “Token” and the type to “String” and click on “Confirm”.
    AzureDataFactory-16.png

     

  12. Now under “Settings” for the Token Set Variable activity click in the “Value” box and then select “Add dynamic content”.  Enter “@activity('GetToken').output.access_token” and check for any validation errors. If you have validation errors confirm the activity name is correct (GetToken).
    AzureDataFactory-17.png

     

      AzureDataFactory-18.png

     

  13. Publish and Debug again to confirm the token is being stored in the Token variable.
    AzureDataFactory-19.png

     

  14. If you hover over the Token activity, you can click on output and confirm the token contents.
    AzureDataFactory-20.png


    AzureDataFactory-21.png

     

  15. Now we will add a Web Activity to initiate a call to the Export API.
    1. Search for “Web” and drag the new activity into the pipeline.
      AzureDataFactory-22.png

       

    2. Rename the new web activity to “InitiateExportAPI”, and then drag from the “on success” check mark of the Token activity to connect the two activities.
      AzureDataFactory-23.png

       

    3. Click on Settings for the InitiateExportAPI activity and enter the following URL:
      https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs
    4. Set the Method to “Post”.
    5. To specify which report and attributes to export, insert the payload details into the body.  In this example we’ll export the “DevicesWithInventory” report with no filtering.  You can change the report by updating the reportName value.  This is also where you could change the format to “format”: “csv” to get a .csv file:

      {
          "reportName": "DevicesWithInventory",
          "filter": "",
          "select": [],
          "format": "json"
      }

       

      AzureDataFactory-24.png
  16. Click on the “+” in Headers in Settings for the InitiateExportAPI activity, enter “Authorization” for the name, and then click on “Add dynamic content”.
     AzureDataFactory-25.png

     

    In the expression builder enter “@concat('Bearer ',variables('Token'))”.  Note there is a space after Bearer before the single quote.  Ensure it passes validation and click OK.

    AzureDataFactory-26.png

     

    Completed configuration should look like this:

    AzureDataFactory-27.png
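
If you want to see what this activity does behind the scenes, the following PowerShell sketch submits the same export job directly. It reuses the $tokenResponse from the earlier token sketch, and the body matches the payload above:

$headers = @{ Authorization = "Bearer $($tokenResponse.access_token)" }

$jobBody = @{
    reportName = "DevicesWithInventory"
    filter     = ""
    select     = @()
    format     = "json"
} | ConvertTo-Json

$job = Invoke-RestMethod -Method Post `
    -Uri "https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs" `
    -Headers $headers -ContentType "application/json" -Body $jobBody

$job.id       # the export job ID the pipeline will poll
$job.status   # "notStarted" immediately after submission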

     

  17. Publish and Debug your pipeline again. Hovering over the output for InitiateExportAPI will show the export job details. In this output you can see “status”: “notStarted”, which indicates the job was submitted.
    AzureDataFactory-28.png

     

    AzureDataFactory-29.png

     

  18. Next, search for and add an “Until” activity to the pipeline.  This activity will monitor the job until the export is completed and then pass the download URL to another variable.
    1. Search for “until” and drag the activity into the pipeline.
      AzureDataFactory-30.png

       

    2. Connect the success output of InitiateExportAPI to the Until activity.
      AzureDataFactory-31.png

       

    3. Select the Until activity and under General change the name to “UntilExportIsComplete”
      AzureDataFactory-32.png

       

    4. Click on the “+” inside the UntilExportIsComplete activity and select “Web”.
      AzureDataFactory-33.png

       

      AzureDataFactory-34.png

       

    5. Click on the new web activity and under General change the name to “GetBlobURL”.
      AzureDataFactory-35.png

       

    6. Click on Settings for the “GetBlobURL” web activity, then click in ‘URL’ and select ‘Add dynamic content’.
      AzureDataFactory-36.png

       

    7. In the Pipeline expression builder enter the following: @concat('https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs(''',activity('InitiateExportAPI').output.id,''')')
      Ensure that the expression validates. If you have errors confirm the activity name is correct.
      AzureDataFactory-37.png

       

    8. Change “Method” to “GET”.  Then, under Headers, click on “+”, set the name to “Authorization”, and click on “Add dynamic content”.
      AzureDataFactory-38.png

       

    9. In the Expression builder enter the Authorization value with the Bearer prefix, the same expression used for the InitiateExportAPI header: @concat('Bearer ',variables('Token'))

If you have validation errors, check the naming of the GetToken activity and the Token variable.

AzureDataFactory-39.png

 

  19. Inside the UntilExportIsComplete Until activity click on the “+” and add a “Wait” activity.
    AzureDataFactory-40.png

     

    AzureDataFactory-41.png

     

  20. Rename the Wait activity to “Wait30” and then under Settings change the value to “30” seconds.
    AzureDataFactory-42.png

     

    AzureDataFactory-43.png

     

  21. Select the “UntilExportIsComplete” parent Until activity and click on “Settings”.  Select Expression and then “Add dynamic content”. 

    Enter “@equals(activity('GetBlobURL').output.status,'completed')” then click on OK
    AzureDataFactory-44.png

     

    AzureDataFactory-45.png

     

    AzureDataFactory-46.png
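
For reference, this loop is equivalent to the following PowerShell sketch, which continues from the earlier job-submission sketch and polls the same exportJobs endpoint every 30 seconds:

do {
    Start-Sleep -Seconds 30   # mirrors the Wait30 activity
    $jobStatus = Invoke-RestMethod -Method Get `
        -Uri "https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs('$($job.id)')" `
        -Headers $headers
    Write-Host "Export job status: $($jobStatus.status)"
} while ($jobStatus.status -ne "completed")

$jobStatus.url   # the download URL for the zipped report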

     

  22. Publish and debug.  You will see the job execute and loop until “GetBlobURL” returns a “completed” status.  If you hover over the output for “GetBlobURL” you can see the status and zip file location.
    AzureDataFactory-47.png

     

    AzureDataFactory-48.png

     

  23. Now we’ll take the URL for downloading the zip file and assign it to a variable called “ZipDownloadURL”.
    1. Search for variable in activities and drag it into your pipeline
      AzureDataFactory-49.png

       

    2. Under General rename the Set Variable activity to “ZipDownloadURL” and drag the success output of the UntilExportIsComplete activity to connect it.
      AzureDataFactory-50.png

       

    3. Select the ZipDownloadURL variable and click on settings and then “+ New” to add a variable
      AzureDataFactory-51.png

       

    4. Enter “ZipDownloadURL” as name and set the type to “String”.
      AzureDataFactory-52.png

       

    5. Then click on Value and select “Add dynamic content”.
      AzureDataFactory-53.png

       

    6. In the expression builder enter: @activity('GetBlobURL').output.url
      If you have validation errors confirm the GetBlobURL activity name, then click OK to continue.
      AzureDataFactory-54.png

       

    7. Publish the pipeline and debug.  Ensure that the ZipDownloadURL variable output contains the download URL for the zip file.
      AzureDataFactory-55.png

       

      AzureDataFactory-56.png

       

  24. Now we’ll use the ZipDownloadURL to copy and extract the zip file to Azure Blob Storage.
    1. Search for “copy data” and drag the activity into your pipeline.
      AzureDataFactory-57.png

       

    2. Rename the copy data activity to “CopyZipToBlob” and connect it from the success of ZipDownloadURL.
      AzureDataFactory-58.png

       

    3. Now click on “Source” and click on “+New”.
      AzureDataFactory-59.png

       

    4. Search and select “http” and click on “continue”, then select JSON as the format.
      AzureDataFactory-60.png

       

      AzureDataFactory-61.png

       

    5. In the Set properties name the Source ExportJSON and then click on “+ New” under Linked service.
      AzureDataFactory-62.png

       

    6. Set the name for the linked service to “HttpZipDownloadURL”, change the authentication to “Anonymous”, and add a parameter called “ZipDownloadURL” with the type of “string”.
      AzureDataFactory-63.png

       

    7. Then click on “Base URL”, select “Add dynamic content”, select the parameter you created ZipDownloadURL, click OK, then “Create”
      AzureDataFactory-64.png

       

      AzureDataFactory-65.png

       

    8. Completed configuration should look like this.  After you click create, ensure that “import schema” is set to “none” and then click OK on the final page.
      AzureDataFactory-66.png


      AzureDataFactory-67.png

       

  25. In the Factory Resources blade expand your datasets and select the ExportJSON dataset.  We need to add a parameter, configure the dataset to deflate the zip file, set the URL value, then update the dataset in the Copy data activity.
    1. Click on parameters then click on “+ New” and add “ZipDownloadURL” with type string.
      AzureDataFactory-68.png

       

    2. Click on the ‘Connection’ tab and change the compression type to “ZipDeflate”.
      AzureDataFactory-69.png

       

    3. Select the value for “ZipDownloadURL” and click “Add dynamic content”.
      AzureDataFactory-70.png

       

    4. In the Expression builder select the parameter “ZipDownloadURL”.  This will automatically populate the expression with: @dataset().ZipDownloadURL.
      AzureDataFactory-71.png

       

    5. Completed configuration should look like this:
      AzureDataFactory-72.png

       

    6. In the pipeline select the CopyZipToBlob activity, select Source, and in the dataset properties click in the value for ZipDownloadURL and click on “Add dynamic content”.
      AzureDataFactory-73.png

       

    7. In the expression builder select “Variables” and then click on the “ZipDownloadURL” variable.
      AzureDataFactory-74.png

       

  26. Now we’ll configure the Sink location, which determines where the extracted JSON is stored. In this case we’ll extract it to Azure Blob Storage.
    1. Select the CopyZipToBlob activity and select “Sink” then click on “+ New”.
      AzureDataFactory-75.png

       

    2. Select “Azure Blob Storage” and click on Continue, then select JSON and click Continue.
      AzureDataFactory-76.png

       

      AzureDataFactory-77.png

       

    3. On the set properties page set the name to “StoreJSON” and click on “+New” under Linked service
      AzureDataFactory-78.png

       

    4. On the New linked service page set the name to “JSONBlobStorage”, ensure authentication type is set to “account key”, select your existing Azure Subscription and the storage account name with the container for your export jobs.  Then click “Create”.
      AzureDataFactory-79.png

       

    5. On the Set Properties screen click on the folder icon and then select the storage container.
      AzureDataFactory-80.png

       

      Enter an export file name like “Export.json” and confirm the Import Schema setting is “None”.
      AzureDataFactory-81.png
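
The copy activity we just configured downloads the zip from the job URL, deflates it, and writes the JSON to the container. The manual equivalent, continuing the earlier PowerShell sketches and using hypothetical local paths, looks roughly like this:

Invoke-WebRequest -Uri $jobStatus.url -OutFile ".\DevicesWithInventory.zip"
Expand-Archive -Path ".\DevicesWithInventory.zip" -DestinationPath ".\DevicesWithInventory" -Force
Get-ChildItem ".\DevicesWithInventory"   # the extracted report JSON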

       

  27. Publish the pipeline and debug to confirm the Export API job is submitted and the download URL is used to download and extract the zip file to Azure Storage.
    1. Debug the entire pipeline, then click on the details icon to confirm the file copy.
      AzureDataFactory-82.png

       

      AzureDataFactory-83.png

       

    2. You can also view the file in your storage container in the Azure portal.

      AzureDataFactory-84.png

       

Optional: Importing JSON from Blob Storage to SQL Server

Now that the extracted JSON is in Azure Blob Storage, we can use another Copy data activity to import the data into an Azure SQL server. Note that this step could vary based on your environment. Your team may already have SQL servers or another solution that they’d like to utilize. Check with your reporting teams to confirm the solution(s) for your environment and whether Azure Data Factory supports it via the Copy activity. This article has a full table of supported configurations: Copy activity - Azure Data Factory & Azure Synapse.

 

Prerequisites

To complete the following steps you will need:

  • Access to an Azure Subscription with an existing Azure SQL Server
  • A database in the SQL Server
  • A table created to hold the JSON data
  • Credentials to access the database/table

It’s recommended that you provide your database team with a sample of the JSON export so they can design a table that matches the expected fields and data types that are in your report(s).  This conversation should include how you plan on using the data and if you need historical trending, or just snapshot reporting.
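
A quick way to gather that sample is to open a downloaded export in PowerShell and list its fields. This is only a sketch; it assumes you’ve extracted a report locally as Export.json and that the export exposes a values collection, as the mapping step later in this tutorial also assumes:

$sample = Get-Content -Path ".\Export.json" -Raw | ConvertFrom-Json

$sample | Get-Member -MemberType NoteProperty | Select-Object Name   # top-level properties
$sample.values | Select-Object -First 1                              # first record, for field names and example values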


For this example (DevicesWithInventory) we’ve used this sample script to build the table, set data types/lengths, and add a column that shows the date/time that the records were inserted to assist with historical reporting.  You will need to create different tables with unique columns/data types if you choose a different report:

 

 

CREATE TABLE [DevicesWithInventory](
	[TimeStamp] [datetime2] NOT NULL DEFAULT (GetDate()),
	[DeviceId] [nvarchar](100) NOT NULL,
	[DeviceName] [nvarchar](100) NULL,
	[CreatedDate] [datetime2](7) NULL,
	[LastContact] [datetime2](7) NULL,
	[ReferenceId] [nvarchar](100) NULL,
	[OSVersion] [nvarchar](100) NULL,
	[GraphDeviceIsManaged] [bit] NULL,
	[EasID] [nvarchar](100) NULL,
	[SerialNumber] [nvarchar](100) NULL,
	[Manufacturer] [nvarchar](100) NULL,
	[Model] [nvarchar](100) NULL,
	[EasActivationStatus] [bit] NULL,
	[IMEI] [nvarchar](100) NULL,
	[EasLastSyncSuccessUtc] [datetime2](7) NULL,
	[EasStateReason] [nvarchar](100) NULL,
	[EasAccessState] [nvarchar](100) NULL,
	[InGracePeriodUntil] [datetime2](7) NULL,
	[AndroidPatchLevel] [nvarchar](100) NULL,
	[WifiMacAddress] [nvarchar](100) NULL,
	[MEID] [nvarchar](100) NULL,
	[SubscriberCarrierNetwork] [nvarchar](100) NULL,
	[StorageTotal] [int] NULL,
	[StorageFree] [int] NULL,
	[ManagedDeviceName] [nvarchar](100) NULL,
	[CategoryName] [nvarchar](100) NULL,
	[UserId] [nvarchar](100) NULL,
	[UPN] [nvarchar](100) NULL,
	[UserEmail] [nvarchar](100) NULL,
	[UserName] [nvarchar](100) NULL,
	[WiFiIPv4Address] [nvarchar](100) NULL,
	[WiFiSubnetID] [nvarchar](100) NULL,
	[CompliantState] [nvarchar](100) NULL,
	[ManagedBy] [nvarchar](100) NULL,
	[Ownership] [nvarchar](100) NULL,
	[DeviceState] [nvarchar](100) NULL,
	[DeviceRegistrationState] [nvarchar](100) NULL,
	[SupervisedStatusString] [nvarchar](100) NULL,
	[EncryptionStatusString] [nvarchar](100) NULL,
	[OS] [nvarchar](100) NULL,
	[SkuFamily] [nvarchar](100) NULL,
	[JoinType] [nvarchar](100) NULL,
	[PhoneNumber] [nvarchar](100) NULL,
	[JailBroken] [nvarchar](100) NULL,
	[ICCID] [nvarchar](100) NULL,
	[EthernetMAC] [nvarchar](100) NULL,
	[CellularTechnology] [nvarchar](100) NULL,
	[ProcessorArchitecture] [nvarchar](100) NULL,
	[EID] [nvarchar](100) NULL,
	[SystemManagementBIOSVersion] [nvarchar](100) NULL,
	[TPMManufacturerId] [nvarchar](100) NULL,
	[TPMManufacturerVersion] [nvarchar](100) NULL,
	[ProductName] [nvarchar](100) NULL,
	[CertExpirationDate] [nvarchar](100) NULL
)

 

 

 

Open your existing pipeline and follow the steps below:

  1. Create a new Copy data activity by selecting the “CopyZipToBlob” activity and clicking on the blue arrow.  Then select “Copy data”.
    AzureDataFactory-85.png

  2. Select the new Copy data activity, select general and change the name to “CopyJSONtoSQL”.
    AzureDataFactory-86.png

     

  3. Select the CopyJSONtoSQL then click on “Source” and change the source dataset to “StoreJSON”.
    AzureDataFactory-87.png

     

  4. Click on the CopyJSONtoSQL activity and select “Sink” then click on “+New”.
    AzureDataFactory-88.png

     

  5. On the New dataset screen search for SQL and select “Azure SQL”.
    AzureDataFactory-89.png

     

  6. On the Set properties page change the name to “JSONtoSQL” and click on “+New”.
    AzureDataFactory-90.png

     

  7. On the New linked service page set the name to “ExportExportAPISQL”.  Select the subscription, server name, database and credential type you’d like to use.  In this example we’ll be using SQL Server authentication. Your database administrator may want to provide another credential type.
    AzureDataFactory-91.png

     

  8. Confirm you see the new linked service, select the table that was created for imports, select to import schema from connection/store and click “Ok”.
    AzureDataFactory-92.png

     

  9. Select the CopyJSONtoSQL activity and click on Mapping and select “Import schemas”.
    AzureDataFactory-93.png

     

  10. Now we need to import the schema and complete field mapping.
    1. Change the collection reference to “$['values']”.
    2. Then clear any columns with errors or that you don’t want to include.
    3. Click Import schemas again.  Check your entire mapping list and deselect any columns that do not resolve to a column name (see screenshot below).
      AzureDataFactory-94.png

       

      The names should resolve and now match the column names:
      AzureDataFactory-95.png

       

  11. Click on Publish and then debug your pipeline.  You should see the full end-to-end pipeline succeed; you can also click “details” to see the number of rows imported.
    AzureDataFactory-96.png
     AzureDataFactory-97.png

     

    You can also view the data in the SQL table using SQL Server Management Studio.
    AzureDataFactory-98.png
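
If you'd rather verify the import from PowerShell than SQL Server Management Studio, a minimal sketch using the SqlServer module and SQL authentication (server, database, and credential values are placeholders):

Invoke-Sqlcmd -ServerInstance "<yourserver>.database.windows.net" `
    -Database "<yourdatabase>" -Username "<sqluser>" -Password "<sqlpassword>" `
    -Query "SELECT COUNT(*) AS RowsImported, MAX([TimeStamp]) AS LastImport FROM [DevicesWithInventory];"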

     

Now that you have the full pipeline running, you can add a trigger that will cause the pipeline to run on a schedule; for example, to run a device inventory export automatically at 8 AM each day.

  1. Select your pipeline in the Azure Data Factory Portal and then click “Add trigger”.
    AzureDataFactory-99.png

     

    Click on “+New”.
    AzureDataFactory-100.png

     

  2. Configure the schedule.  In this case we’re setting the job to run daily and choosing to start the trigger once it’s created.  After you create the trigger, click on Publish.
    AzureDataFactory-101.png
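
Scheduled triggers cover the recurring case; if you also need to run the pipeline on demand from a script, here is a minimal sketch using the Az.DataFactory module (the resource group and factory names are placeholders):

Connect-AzAccount
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "<resourcegroup>" `
    -DataFactoryName "<datafactoryname>" `
    -PipelineName "IntuneExportAPItoAzure"   # returns a run ID you can monitor in the ADF portal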

     

Conclusion

The Intune Export API is the recommended solution for easy bulk exports of data from your Intune tenant. It provides a quick method to extract a series of reports and is generally easier and more reliable than attempting to scrape the data via repeated direct Graph calls to the same endpoint.


Azure Data Factory provides a solution that allows you to automate calls to the Export API and then extract, store, and import that data into destinations such as Azure Blob Storage and SQL Server.


Once the data is in SQL or other backend systems, you can join it with other data sources and consume it via analytics tools like Power BI.


Share this tutorial with your teams to drive a discussion around the data that you need, how you’d like to structure it, where you’d like to store it, and then how you might consume the data.

 

If you have any questions leave a comment below or reach out to us on X @IntuneSuppTeam!
