This is a follow-up to my initial blog post On-demand Windows Diagnostic Logs via Intune. To make life easier when dealing with the uploaded DiagLog archives, I’ve created a small Logic App which is triggered by an Azure Blob Storage event when a new blob is created. The Logic App sends an email to a defined address and deletes the DiagLog after x days. This way I, as an administrator, am informed about a new DiagLog archive via an email which includes the download link, and I don’t need to do any cleanup on the blob storage, as the DiagLog archive is cleaned up automatically by the Logic App. To implement this solution, follow these simple steps:
- Make sure your storage account is a StorageV2 (general-purpose v2) account:

- On your subscription, make sure the resource provider Microsoft.EventGrid is registered; if it is not, register it.

- Go to your storage account, click on Events, and create the Logic App from the “When a new blob is uploaded” template.

- Create the necessary connections to Azure Blob Storage and Azure Event Grid

- Reorganize the elements by drag and drop and add a few more steps. First, here is my complete sequence so you can see the simple structure, then I’ll walk through it step by step:
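In outline form (the exact step names may differ slightly in your designer):

When a resource event occurs (trigger) → Initialize variable DaysToKeep → Initialize variable SAS → Send an email (V2) → Delay → Compose → Delete blob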

- Edit the step “When a resource event occurs” and make sure it is subscribed to the Microsoft.Storage.BlobCreated event type.
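For orientation, this trigger hands the Logic App an Event Grid event describing the new blob. A trimmed example payload (subscription, container and file names are made up here) looks roughly like this:

{
  "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
  "subject": "/blobServices/default/containers/diaglogs/blobs/DiagLogs-DESKTOP01.zip",
  "eventType": "Microsoft.Storage.BlobCreated",
  "data": {
    "api": "PutBlob",
    "url": "https://<account>.blob.core.windows.net/diaglogs/DiagLogs-DESKTOP01.zip"
  }
}

The subject and data.url fields are the ones the expressions further down pick apart.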

- Add two variables for later use: DaysToKeep, and SAS for a SAS signature (generated in the same way as for the DiagnosticLog CSP – learn more about SAS here: ‘Using shared access signatures (SAS)’). This time we need it in the form shown below, as it will be appended to the download URL to grant access to the DiagLog blob:
?sv=2018-03-28&ss=b&srt=o&sp=rl&se=2020-04-25T19:44:30Z&st=2019-04-24T11:44:30Z&spr=https&sig=xxx
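In code view the two Initialize variable actions would look roughly like this (a sketch – 14 days is just my example retention, and the SAS value is the token from above):

"Initialize_variable_DaysToKeep": {
  "type": "InitializeVariable",
  "inputs": {
    "variables": [ { "name": "DaysToKeep", "type": "Integer", "value": 14 } ]
  }
},
"Initialize_variable_SAS": {
  "type": "InitializeVariable",
  "inputs": {
    "variables": [ { "name": "SAS", "type": "String", "value": "?sv=2018-03-28&ss=b&srt=o&sp=rl&se=...&sig=xxx" } ]
  }
}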

- Add the “Send an email (V2)” step and use the variables to construct the body text. For the download SAS URL construction use the following expression, which will then be displayed as data.url in the designer:
triggerBody()?['data']?['url']
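Note that data.url on its own is just the plain blob URL; in the email body you place the SAS variable directly after it so the link works without further authentication. Written as a single expression (assuming the SAS variable from above), that would be:
concat(triggerBody()?['data']?['url'], variables('SAS'))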
- For the Subject field use the expression:
split(triggerBody()?['subject'], '/')?[6]
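To see why index 6 is the right one: splitting the subject on / yields an array in which element 4 is the container name and element 6 is the file name. With the example event from above,
split(triggerBody()?['subject'], '/')?[6]
evaluates to DiagLogs-DESKTOP01.zip – the file name of the uploaded archive.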

- Next, modify the Delay step so it waits for your variable DaysToKeep:
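In code view the Delay step is a Wait action that takes the variable as its interval count – roughly:

"Delay": {
  "type": "Wait",
  "inputs": {
    "interval": {
      "count": "@variables('DaysToKeep')",
      "unit": "Day"
    }
  }
}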

- In the Compose step you need to fill in two split expressions, joined by a literal /, to build the container/filename path of the blob:
split(triggerBody()?['subject'], '/')?[4]/split(triggerBody()?['subject'], '/')?[6]
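An equivalent single expression, if you prefer one token in the Compose field, would be:
concat(split(triggerBody()?['subject'], '/')?[4], '/', split(triggerBody()?['subject'], '/')?[6])
With the example event from above this resolves to diaglogs/DiagLogs-DESKTOP01.zip.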

- The Delete blob step then just takes the output of the Compose step as the blob to delete.
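For completeness, the generated Delete blob action in code view should look roughly like this (the connection name depends on your Azure Blob Storage connection, and note the double encodeURIComponent the designer puts around the path):

"Delete_blob": {
  "type": "ApiConnection",
  "inputs": {
    "host": {
      "connection": { "name": "@parameters('$connections')['azureblob']['connectionId']" }
    },
    "method": "delete",
    "path": "/datasets/default/files/@{encodeURIComponent(encodeURIComponent(outputs('Compose')))}"
  }
}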

If you have done everything right, you will receive an email with a direct download link to the DiagLog archive whenever a new diagnostic archive blob is uploaded:

In your Logic App run history you will see runs waiting for the specified number of days before the final deletion of the blob occurs:

I hope this helps in dealing with DiagLog requests from an operational perspective.
Thanks for this, really great stuff.
Managed to set this up and make it work in a lab, to the point that I get the email when the log package is uploaded to the blob. But I’m struggling with the Compose part of the sequence that should be chained to the Delete blob step – it looks like you have two (split) expressions there in the picture? Any hints would be appreciated. 🙂
Just to follow up – I found another easy way of doing the last part with the built-in “Delete old Azure blobs” template in Logic Apps. I just created a new Logic App from that template that deletes the files. Really fun playing around with this, thanks again. 🙂
I’m glad you got it working, and you are totally right – I forgot to add the last two steps; it was probably too late at night while writing :-). I’ve added them now to show my complete Logic App. Thanks for making me aware!
best,
Oliver