Operationalize against your archived logs in Azure Storage
externaldata() lookups on archived data the easy way via a PowerShell script
In a previous post we looked at using Data Export rules to archive Azure Sentinel logs to Azure Storage. Towards the end of that article I discussed how you can use the getschema and externaldata() KQL operators to build a query that brings the archived logs back into memory for searching. One of the additional prerequisites was generating a SAS URI for each archived log blob you wanted to bring in.
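For reference, the schema portion of the generated query comes from the table's schema, which you can inspect in the live workspace with the getschema operator. A minimal example, using the EmailEvents table from the walkthrough below:

```kql
// List each column and its data type for the EmailEvents table
EmailEvents
| getschema
| project ColumnName, ColumnType
```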
Operationalizing this manual method across a team can be cumbersome, so I built a mini project here to help your team review archived Azure Sentinel logs with less friction.
Once logs are archiving into the Azure Storage account, you can use the following script to operationalize external data lookups: it generates the base KQL query, including the table schema and the SAS URIs needed for each blob within the start and end time range.
Generate Storage Lookup KQL Query PowerShell Script
Example input into the script:
StorageAcctName : siempipestorage
LAWorkspaceName : azulabs
TableName : emailevents
StartDate : 09/11/2021 02:00 AM
EndDate : 09/12/2021 12:00 PM
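For context, the core of what the script automates can be sketched roughly as follows. This is a simplified, hypothetical outline (not the actual script): it assumes the Az.Storage module is installed and you are already signed in with Connect-AzAccount, and the account, container, and dates come from the example input above:

```powershell
# Sketch only: enumerate exported blobs in a time range and mint read-only SAS URIs.
# Assumes Connect-AzAccount has already been run and Az.Storage is available.
$ctx    = New-AzStorageContext -StorageAccountName 'siempipestorage' -UseConnectedAccount
$start  = Get-Date '09/11/2021 02:00'
$end    = Get-Date '09/12/2021 12:00'
$expiry = (Get-Date).AddDays(2)   # SAS lifetime; pick what suits your review window

# Data Export writes one PT1H.json blob per hour under y=/m=/d=/h= folders,
# so LastModified is a reasonable (if rough) filter for the requested range.
$blobs = Get-AzStorageBlob -Container 'am-emailevents' -Context $ctx |
    Where-Object { $_.LastModified -ge $start -and $_.LastModified -le $end }

# -FullUri returns the complete blob URL with the SAS token appended
$uris = foreach ($b in $blobs) {
    New-AzStorageBlobSASToken -Container 'am-emailevents' -Blob $b.Name `
        -Permission r -ExpiryTime $expiry -Context $ctx -FullUri
}

# Each URI then becomes one h@"..." line inside the externaldata() list
$uris | ForEach-Object { "h@`"$_`"," }
```

The real script additionally pulls the table schema and assembles the full externaldata() query shown below.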
The PowerShell script currently runs on Windows; it generates the KQL query as a .yaml file and opens the file in notepad.exe.
externaldata(TenantId:string, AttachmentCount:int, ConfidenceLevel:string, Connectors:string, DetectionMethods:string, DeliveryAction:string, DeliveryLocation:string, EmailClusterId:long, EmailDirection:string, EmailLanguage:string, EmailAction:string, EmailActionPolicy:string, EmailActionPolicyGuid:string, OrgLevelAction:string, OrgLevelPolicy:string, InternetMessageId:string, NetworkMessageId:string, RecipientEmailAddress:string, RecipientObjectId:string, ReportId:string, SenderDisplayName:string, SenderObjectId:string, SenderIPv4:string, SenderIPv6:string, SenderMailFromAddress:string, SenderMailFromDomain:string, Subject:string, ThreatTypes:string, ThreatNames:string, TimeGenerated:datetime, Timestamp:datetime, UrlCount:int, UserLevelAction:string, UserLevelPolicy:string, SourceSystem:string, Type:string)
[
h@"https://siempipestorage.blob.core.windows.net/am-emailevents/WorkspaceResourceId=/subscriptions/f77542d9-6668-/resourcegroups/rgoperations/providers/microsoft.operationalinsights/workspaces/azulabs/y=2021/m=09/d=11/h=21/m=00/PT1H.json?sv=2019-07-07&sr=b&sig=&se=2021-09-14T03%3A29%3A16Z&sp=r",
h@"https://siempipestorage.blob.core.windows.net/am-emailevents/WorkspaceResourceId=/subscriptions/f77542d9-6668-/resourcegroups/rgoperations/providers/microsoft.operationalinsights/workspaces/azulabs/y=2021/m=09/d=12/h=06/m=00/PT1H.json?sv=2019-07-07&sr=b&sig=&se=2021-09-14T03%3A29%3A16Z&sp=r",
h@"https://siempipestorage.blob.core.windows.net/am-emailevents/WorkspaceResourceId=/subscriptions/f77542d9-6668-/resourcegroups/rgoperations/providers/microsoft.operationalinsights/workspaces/azulabs/y=2021/m=09/d=12/h=11/m=00/PT1H.json?sv=2019-07-07&sr=b&sig=%&se=2021-09-14T03%3A29%3A16Z&sp=r"
]
with(format="json")
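Once the generated query is pasted into Log Analytics, it behaves like any tabular expression, so filters and aggregations can be appended directly. A hypothetical example, where the placeholder stands in for one of the generated SAS URIs and only a few columns are declared for brevity:

```kql
externaldata(TimeGenerated:datetime, RecipientEmailAddress:string, DeliveryAction:string)
[
    h@"<SAS URI from the generated query>"
]
with(format="json")
| where TimeGenerated between (datetime(2021-09-11 02:00) .. datetime(2021-09-12 12:00))
| summarize Emails = count() by DeliveryAction
```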
To see this in action, click the link below:
https://swiftsolvesblog.blob.core.windows.net/images/genstoragectxkql-ps1-animation.gif
FUTURES:
1. Add parallelism to the script to improve performance.
2. Add a function that prompts for a custom KQL query to run after the externaldata() lookup, executes it, and exports the results to CSV.
3. Add error handling for when the schema is not found: generate a generic KQL query instead.
4. Accept an output directory as a parameter so the script can be run on Windows or Linux.
5. Create an Azure Sentinel notebook that looks up a Sentinel incident, prompts for historical dates, and brings the data back into the notebook for searching.
6. Create a similar option with an ADX (Azure Data Explorer) cluster: a script to create the database, pull in the schema, and ingest the data into the ADX cluster database.