Azure Sentinel Data Enrichment – Walk-through with Scripting, KQL and Playbooks

Howdy Everyone!

 

Organizations are migrating to Azure Sentinel as their primary cloud SIEM solution, and they're looking for ways to enrich the data they're connecting via service-to-service connectors, such as the Azure Activity log, Office data, or Azure Active Directory, which are tenant-wide feeds, and then to filter that information based on some variable.

 

Primary goal of this walk-through:

We need a way to differentiate data when investigating an issue. For example, we have a user in our company based out of US - Texas, and that location has been added in AAD as an ExtensionProperty. My primary example will be using that information to split the data up, then send the collected data to the correct department location. This could be its own workspace or its own custom table within the same workspace.

 

Special Thanks!

  - Thank you for the assistance on creating the automation scripts with Azure Functions.

 

Requirements:

Workspace and Log Permissions:

You'll need API permissions for the Log Analytics workspace (more details here); I'll personally be using the Log Analytics Contributor role. For my example I'm pulling from AAD, so please make sure you have the appropriate permissions for whatever data you're pulling. See AAD roles here if you're pulling data from Azure AD.

PS modules I'm using with PowerShell:

  • AzureAD
  • OMSIngestionAPI (provides the Send-OMSAPIIngestionFile cmdlet used below)
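
If you don't already have them, both modules can be pulled from the PowerShell Gallery. A minimal sketch (assuming PowerShell 5.1+ with PowerShellGet available):

Install-Module -Name AzureAD -Scope CurrentUser          # AAD cmdlets (Connect-AzureAD, Get-AzureADUser)
Install-Module -Name OMSIngestionAPI -Scope CurrentUser  # provides Send-OMSAPIIngestionFile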

Walking through the test AAD Ingestion:

Based on the goal above, I'm looking at getting all the tenant's AAD attributes into Azure Sentinel to enrich my current dataset. As there is no native "data connector" for AAD attributes, we'll have to build our own collector.

 

We have a few options for ingesting data into Log Analytics, all centered on the APIs Log Analytics offers: a data connector with Logic Apps, a PS module, or even the raw APIs with scripting. We'll be using PowerShell manually first and work on automation after. Playbooks can be a simple option as well and could even automate pulling this information, although Azure Automation or scripting can be cheaper if you're expecting to query, format, and push lots of data into a Log Analytics workspace. I'm planning on submitting unformatted information into Log Analytics from AAD.

 

Starting with PowerShell:

 

This section walks through the process of manually running PowerShell to ingest data. The scripts mentioned in this walk-through will be added to the Azure Sentinel GitHub community: Link to Community

 

A 3-minute video explaining the process through PowerShell:

To get started, we'll need to connect to Azure Active Directory via the AzureAD PowerShell module.

 

 

Connect-AzureAD -AccountId <UPN> -TenantId <Tenant GUID>

 

 

Connect.png

 

I'm going to collect all the Azure AD users and store them in a variable, $UsersCollected.

 

 

$UsersCollected = Get-AzureADUser -All $True

 

 

 

Feel free to look at the variable; we need this to be in JSON format to ingest the data cleanly, so I'm going to make another variable to keep my future scripting easier.

 

 

 

$JsonUserFormat = ConvertTo-Json $UsersCollected
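
Before pushing anything, it can be worth a quick sanity check on what came back. A few illustrative lines (using the variables above; nothing here is required):

$UsersCollected.Count                  # how many users came back from the tenant
$UsersCollected[0].ExtensionProperty   # the attribute bag we'll enrich with later
$JsonUserFormat.Substring(0, 500)      # peek at the start of the JSON payload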

 

 

 

Now that all the information has been collected from the Azure AD tenant, we need to test the ingestion into the Log Analytics workspace Azure Sentinel is using. You'll need the following information:

  • $CustomerID : <Log Analytics WORKSPACE ID>
  • $SharedKey : <Log Analytics Primary Key>
  • $JsonUserFormat
  • $logType = "AADLogTest"
    • Important: this is the custom table the logs will be ingested into (Log Analytics appends "_CL" to it).

(Optional)

  • $TimeStampfield = Get-Date
    • Will show the time of the ingestion

Copy and paste the below, updating it with your workspace information:

 

 

$CustomerID = '<WORKSPACE ID>'
$SharedKey = '<WORKSPACE KEY>'
$logType = '<WORKSPACE CUSTOM TABLE NAME>'
$TimeStampfield = Get-Date

 

 

Variables.png
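
If you'd rather look these values up programmatically than copy them from the portal, the Az.OperationalInsights module can fetch both. A sketch, assuming you're signed in with Connect-AzAccount; the resource group and workspace names are placeholders:

# Look up the workspace GUID and its primary shared key
$Workspace = Get-AzOperationalInsightsWorkspace -ResourceGroupName '<RG NAME>' -Name '<WORKSPACE NAME>'
$Keys = Get-AzOperationalInsightsWorkspaceSharedKey -ResourceGroupName '<RG NAME>' -Name '<WORKSPACE NAME>'
$CustomerID = $Workspace.CustomerId
$SharedKey = $Keys.PrimarySharedKey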

 

Now, using the OMSIngestionAPI module, send the data to the Log Analytics workspace.

 

 

Send-OMSAPIIngestionFile -customerId $CustomerID -sharedKey $SharedKey -body $JsonUserFormat -logType $logType -TimeStampField $TimeStampfield

 

 

OMSAPI.png

Accepting the data will be very quick, but the very first time it can take 20-30 minutes for the table to be built in the Log Analytics workspace.
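
For reference, here's the whole manual flow in one place, exactly as covered above (placeholders for the account and workspace values):

# Connect, collect, convert, and push the AAD users into the custom table
Connect-AzureAD -AccountId '<UPN>' -TenantId '<Tenant GUID>'

$UsersCollected = Get-AzureADUser -All $True
$JsonUserFormat = ConvertTo-Json $UsersCollected

$CustomerID = '<WORKSPACE ID>'
$SharedKey = '<WORKSPACE KEY>'
$logType = 'AADLogTest'
$TimeStampfield = Get-Date

Send-OMSAPIIngestionFile -customerId $CustomerID -sharedKey $SharedKey -body $JsonUserFormat -logType $logType -TimeStampField $TimeStampfield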

Looking at the data within Azure Sentinel's Log Analytics workspace:

 

AADLogTest.png

You'll see above that the Custom Logs table has been created, and I was able to query the information.

 

An example user in my domain, Adele Vance, has been preconfigured with an ExtensionProperty that tells us her home office location. We're interested in enriching our Azure Sentinel alerts with this property.

Looking up her name, we find her Azure Active Directory attribute information, then filter the query down to the fields we're currently interested in:

 

 

AADLogTest_CL
| where UserPrincipalName_s contains "adelev"
| project UserPN = UserPrincipalName_s,
          ObjectId = ObjectId_g,
          UserType = UserType_s,
          CompanyLocation = ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s

 

 

 

AdelevTest.png

Now that we're seeing the information we want, let's join the "CompanyLocation" property, via the UPN, with other data sets. We're going to use SigninLogs, as she's just complained about multiple login failures to her account with the wrong password.

 

Here is an example of joining the data together with SigninLogs:

 

 

 

SigninLogs
| where UserPrincipalName contains "adelev"
| project UserPrincipalName_s = UserPrincipalName, Status, Location, IPAddress, Identity, ResultDescription
| join kind= inner (
    AADLogTest_CL
) on UserPrincipalName_s
| project Username_S = UserPrincipalName_s, Status, Location, IPAddress, Identity, ResultDescription,
          ObjectId = ObjectId_g,
          UserType = UserType_s,
          CompanyLocation = ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s

 

 

AdelevTestwSigninLogs.png

Going one step further, we could look at multiple data sets and query out the information based on the UPN's location:

 

An example would be looking at SigninLogs, OfficeActivity, and AuditLogs.

 

 

let UserLocation = "US - Texas";
let T1SigninLogs = SigninLogs
    | extend UserPrincipalName_s = UserPrincipalName
    | join kind= inner (
        AADLogTest_CL
        | where TimeGenerated >= ago(7d)
        | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == UserLocation
    ) on UserPrincipalName_s;
let T2AuditLogs = AuditLogs
    | extend UserPrincipalName_s = tostring(parse_json(tostring(InitiatedBy.user)).userPrincipalName)
    | join kind= inner (
        AADLogTest_CL
        | where TimeGenerated >= ago(7d)
        | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == UserLocation
    ) on UserPrincipalName_s;
let T3OfficeActivity = OfficeActivity
    | extend UserPrincipalName_s = UserId
    | join kind= inner (
        AADLogTest_CL
        | where TimeGenerated >= ago(7d)
        | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == UserLocation
    ) on UserPrincipalName_s;
union T1SigninLogs, T2AuditLogs, T3OfficeActivity

 

 

 

Azure Sentinel Playbook:

A 4-minute video walking through the Playbook design:

 

Now that we're able to see the data, let's build an Azure Sentinel playbook to send the data into the US - Texas location's Log Analytics workspace. This could be accomplished via Azure Automation or PS with the APIs, but an easier way to visually see what's going on is a Playbook (Logic Apps) with Azure Sentinel.
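
As a point of comparison, here's roughly what that same pull-filter-push loop could look like in PowerShell, using Invoke-AzOperationalInsightsQuery from the Az.OperationalInsights module to run the query and the same Send-OMSAPIIngestionFile cmdlet to forward the results. This is only a sketch with placeholder values, not part of the playbook build:

# Sketch: run the enrichment query, then forward matching rows to a custom table.
# Assumes Connect-AzAccount has been run; workspace IDs/keys are placeholders.
$WorkspaceId = '<SOURCE WORKSPACE ID>'
$Location = 'US - Texas'

$Query = @"
SigninLogs
| where TimeGenerated >= ago(30m)
| extend UserPrincipalName_s = UserPrincipalName
| join kind= inner (
    AADLogTest_CL
    | where TimeGenerated >= ago(7d)
    | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == "$Location"
) on UserPrincipalName_s
"@

$Results = Invoke-AzOperationalInsightsQuery -WorkspaceId $WorkspaceId -Query $Query

# Push the filtered rows into the destination workspace's custom table
$Body = ConvertTo-Json $Results.Results
Send-OMSAPIIngestionFile -customerId '<DEST WORKSPACE ID>' -sharedKey '<DEST WORKSPACE KEY>' -body $Body -logType 'USTexasData' -TimeStampField (Get-Date)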

Go into Azure Sentinel Playbooks, create a new playbook, and decide how you want to start the playbook. Popular choices are "Recurrence", "HTTP request", and "Alert Triggered with Azure Sentinel".

For this example, let's use Recurrence, every 30 minutes.

Recurrence.png

 

In this example we're not going to dynamically fill the Location value, although this could be accomplished through some type of pull, alert, or HTTP request. Skipping that part, let's set up a "Location Value" with the Variables connector: select "Initialize Variable".

Configure the following:

  • Name : Location (the query below references it as variables('Location'))
  • Type : String
  • Value : US - Texas
 

Now, to collect the information, we could use the Log Analytics API, although I'm going to utilize the Log Analytics connector to simplify the process. Find "Azure Monitor Logs" and select "Run Query and List Results".

Configure the following, using the workspace the AAD information has been sent to:

  • Subscription: Workspace Sub
  • ResourceGroup: Workspace Resource Group
  • ResourceType : Log Analytics Workspace
  • ResourceName: Workspace Name
  • Query Example:

 

 

 

SigninLogs
| where TimeGenerated >= ago(30m)
| extend UserPrincipalName_s = UserPrincipalName
| join kind= inner (
    AADLogTest_CL
    | where TimeGenerated >= ago(7d)
    | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == "@{variables('Location')}"
) on UserPrincipalName_s

 

 

 

  • Time Range: Set in query

LogAQuery (2).png

The query is configured to only look at SigninLogs to test the login failures; as I showed earlier, this could be expanded to include multiple tables. We don't care which user it is, only whether the account's ExtensionProperty_UserLocation matches "US - Texas". I'm planning on updating my AAD table only once a week, which is why the join looks back 7 days, while I only pull the past 30 minutes of sign-in information to limit my window. That window could be bigger or smaller, although matching it to the recurrence interval prevents pulling the same information over twice.

 

Now, compose the data to be ingested into Log Analytics again; the information we want is the "value" collected. It'll look something like this:

 

Compose.png

Now to send the data over to another workspace, or the same workspace: we're going to build a new custom table for the enriched information called "USTexasData" and send the outputs we've filtered into it.

Configure the following, using the workspace you want to send the filtered information to:

  • JSON Request Body : Outputs from Compose
  • Custom Log Name: Whatever Custom log name you’re wanting to use
  • Connect to : Confirm the connection you want to use; make sure whatever account you're using has the correct access to write to this workspace when building that connection.

 

Example:

SendData.png

To create some noise, attempt some unsuccessful logins to the account we're monitoring (Adele Vance's account in this example). After the 30-minute window, we should automatically see a new collection of data in the table configured in the logic app (USTexasData_CL):

 

USTexasData.png

 

We've now successfully filtered out our users that are at the US - Texas location and sent those logs over to a new custom table.

 

If anyone has any other requests for what should be blogged about, please feel free to reply to this post.

 

Thanks!

 

Chris Boehm

Customer Experience Engineering - Azure Sentinel Team.

LinkedIn Profile
