Integrating with Azure Sentinel

Edinburgh, UK – 5th October 2020 – Calum Finlayson, Cyber Security Analyst at Satisnet, shares his experience of integrating vulnerability scan data into Azure Sentinel for unified threat and vulnerability management.


This blog is the first in what will be a series discussing how we at Satisnet have attempted to integrate vulnerability scan data from Tenable Nessus into the Azure Sentinel security platform. This first entry will primarily discuss the challenges faced with trying to ingest currently unsupported log sources into Sentinel and will provide a few options for doing so. The series will then go on to discuss specifically how we have used the data from Tenable to build out visualizations and to enrich incidents and aid in investigation.

Azure Sentinel comes with connectors for various security products which allow for easy integration with Log Analytics. Once data is in Log Analytics it can be queried and interrogated to populate dashboards known as ‘workbooks’, and to generate alerts and incidents that can be investigated by security analysts. The connectors currently supplied cover a variety of common solutions, but we were recently faced with the question of how best to ingest logs from a source that isn’t yet supported natively in Sentinel. Microsoft provide a useful list of supported connectors, as well as a discussion of custom connectors, on their Tech Community site.

The rest of this post aims to explain the thought process behind how we accomplished the goal of integrating data from Tenable vulnerability scans, but it could easily be applied to other non-natively supported log sources. We will discuss each of the identified solutions individually and then provide an overview of the pros and cons associated with each method.


The three main approaches that we were able to identify that would meet our needs were:

  • Azure Logic App ingestion

  • A script feeding a Logic App

  • A script feeding Log Analytics via Logstash or Sentinel Agent


Azure Logic App Ingestion

The first method that we investigated was to use a Logic App to pull in the data from Tenable and send it to Log Analytics. A Logic App is an automated workflow that lets you build out automated actions without having to write code, and for the most part it proved very intuitive to use, offering powerful functionality without any need to grapple with code.

The workflow for this approach ended up looking like this: Tenable API → Logic App → Log Analytics.

The Logic App can be scheduled to run on a periodic basis or through other triggers, so you can run it after a scan has completed and be sure to pull in the new data. In our example the Logic App can then either use a generic HTTP callout block to query the API and retrieve a list of vulnerabilities seen since the query was last run, or call out via a custom connector set up from the API specification. The Logic App then uses a built-in action to send the data returned from the Tenable API to a custom log in Log Analytics. There is also plenty of flexibility within the Logic App to parse and transform the data before sending it on to Log Analytics.
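Under the hood, the built-in send-data action uses the Log Analytics HTTP Data Collector API, which signs each request with the workspace’s shared key. As a rough illustration of what that involves (the function and parameter names here are illustrative, not taken from our Logic App), a minimal Python sketch of the request signing might look like:

```python
import base64
import hashlib
import hmac


def build_signature(workspace_id: str, shared_key: str,
                    content_length: int, date_rfc1123: str) -> str:
    """Build the SharedKey Authorization header value used by the
    Log Analytics HTTP Data Collector API."""
    # The canonical string the service expects to be signed
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(shared_key),   # the workspace key is base64-encoded
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"
```

The resulting header accompanies a POST to the workspace’s `ods.opinsights.azure.com/api/logs` endpoint, along with a `Log-Type` header naming the custom log table — all of which the Logic App action handles for you.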


A Script Feeding a Logic App

In this solution we combined the power of Logic Apps with a Python script that queries the Tenable API. This allows data parsing to be performed before the data ever reaches Azure.

The flow of this approach looks like: Tenable API → Python script → Logic App → Log Analytics.

The Python app can be configured to run on a schedule via a cron job or another scheduling method, and can interrogate the Tenable API for relevant data. This data can then be parsed in the Python script, and the formatted results pushed to a Logic App that uses the HTTP endpoint trigger to await data. Alternatively, the script can save the data returned from Tenable into blob storage, which can itself trigger a Logic App run. The data is then pushed into Log Analytics through the same method as in the pure Logic App approach.
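As a sketch of what such a script might look like — the endpoint choice, field names and Logic App URL below are illustrative assumptions rather than the exact script we used — assuming the Tenable.io workbench endpoint and a Logic App with an HTTP request trigger:

```python
import json
import urllib.request

TENABLE_URL = "https://cloud.tenable.com/workbenches/vulnerabilities"
# Hypothetical HTTP-trigger URL, copied from the Logic App designer
LOGIC_APP_URL = "https://prod-00.uksouth.logic.azure.com/workflows/..."


def fetch_vulnerabilities(access_key: str, secret_key: str) -> list:
    """Query the Tenable.io workbench for the current vulnerability list."""
    req = urllib.request.Request(
        TENABLE_URL,
        headers={"X-ApiKeys": f"accessKey={access_key}; secretKey={secret_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("vulnerabilities", [])


def to_sentinel_records(vulns: list) -> list:
    """Flatten each entry down to the fields we want in the custom log."""
    return [
        {
            "PluginId": v.get("plugin_id"),
            "PluginName": v.get("plugin_name"),
            "Severity": v.get("severity"),
            "Count": v.get("count"),
        }
        for v in vulns
    ]


def push_to_logic_app(records: list) -> None:
    """POST the parsed records to the Logic App's HTTP request trigger."""
    req = urllib.request.Request(
        LOGIC_APP_URL,
        data=json.dumps(records).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)
```

The ‘When a HTTP request is received’ trigger provides the URL to post to; from there the workflow forwards the records into Log Analytics as in the first approach.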


A Script Feeding Log Analytics via Logstash or Sentinel Agent

The final approach we considered avoids Logic Apps entirely. Instead, either the Sentinel Agent or Logstash pushes data from a script into Sentinel Log Analytics. The Python script discussed in the previous example can be repurposed for this, as all that is typically required is to save the data from the Tenable API to a file in a directory monitored by either Logstash or the Sentinel Agent.

The flow for this method looks like this: Tenable API → Python script → file → Logstash / Sentinel Agent → Log Analytics.

The Python script calls out to the API for the relevant data and stores it in a file. Both Logstash and the Sentinel Agent can be configured to monitor a directory for new files, and upon seeing one will transform the data and send it to Log Analytics directly. Logstash in particular provides excellent parsing and transformation capabilities, thanks to its built-in grok parsing and a variety of plugins for interpreting data from different sources.


Pros and Cons

Each method comes with pros and cons that may vary depending on whether you’re pushing in data from an on-premises source or a cloud-based API.

Costs can also vary between the methods.

The key takeaway for the first method, which relies solely on a Logic App, is that it doesn’t require any on-premises infrastructure, as all the processing is handled in the Azure cloud. It does, however, mean that you are paying for all of that processing, which can add up if you perform large amounts of parsing and transformation, since Logic Apps are billed on a per-action basis. The last method is the opposite: you don’t end up doing any computing in the cloud, but you still pay data ingestion costs. Both of the Python script options require some sort of computing power to run on, so if you aren’t running them on-premises it may require spinning up a virtual machine or using a Function App.


Conclusion

We have explored three different ways to ingest data into Sentinel Log Analytics. For our specific business needs the pure Logic App approach seems to be the best method for the time being. Since the Tenable API is cloud-based, querying the API and ingesting the data natively in the Azure cloud means we don’t need to dedicate any on-premises resources to the parsing and/or storage of data. The Logic Apps also log to our Azure workspace, so any issues with log ingestion can be seen in the same place as the logs themselves, rather than having to access a machine and investigate why a script may or may not have run. Furthermore, having access to the vulnerability data through Logic Apps is an exciting prospect, as Logic Apps can be run in response to incidents, allowing for the automated enrichment of Sentinel incidents with relevant data about the host and/or vulnerabilities.


Useful Links

Tenable Nessus

Sentinel Connectors

Logstash Sentinel Plugin

Parsing JSON in Logic Apps

Logic App Pricing