Alert on VM conditions with Azure Functions and Python

Ovidiu Borlean
5 min read · Jun 19, 2022

Although Azure Monitor with Azure Container Insights is a great solution for monitoring different resource conditions inside a cluster, we often need custom solutions, at least for performance testing on the node.

The solution described here consists of the following components:

1. Azure Function — Implements an HTTP endpoint for receiving data from the sender.

For our PoC we have implemented the following fields:

- date: Unformatted text, used for sending date/time information from the sender. There is no format checker on this field; feel free to add one, or use your own parser.

- host: For multiple senders, this can be used to differentiate between hosts.

- message: In this context, it is used to send the output of different commands executed on the selected nodes. In the example sender provided, it carries the output of the ps aux command.

2. Sender Application — A simple Python application that uses the requests library to make an HTTP POST request towards the Azure Functions endpoint.

The alerting rule is implemented at the application level by comparing a threshold value with the value read from the free -m command.
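To make that comparison concrete, here is a small sketch (not part of the original sender) that parses the text output of free -m and computes the used-memory percentage; the helper name memory_used_percent and the sample output are our own:

```python
def memory_used_percent(free_output: str) -> float:
    """Compute the used-memory percentage from the text output of `free -m`.

    In `free -m` output, the Mem row carries total memory in column 2
    and used memory in column 3.
    """
    for line in free_output.splitlines():
        if line.startswith("Mem"):
            parts = line.split()
            total, used = float(parts[1]), float(parts[2])
            return used * 100 / total
    raise ValueError("no Mem row found in free output")

# Example with a captured `free -m` output (values are illustrative):
sample = (
    "              total        used        free\n"
    "Mem:           7972        3124        1200\n"
    "Swap:          2048           0        2048"
)
print(round(memory_used_percent(sample), 1))
```

Parsing a captured string rather than shelling out on every call also makes the threshold logic easy to unit-test.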


1. We will start by creating the Azure Function:

  • In the Azure Portal, search for Function App and select the corresponding blade
  • Select a Resource Group, a unique app name, and Python as the runtime stack. We can leave the other settings as default for now and choose Review + Create

As the transmitted data will be saved in a Storage Account, we need to connect our Storage to the Azure Function (binding).

We select our existing Storage Account and copy its Connection String in order to use it for writing blobs into containers.

In the Function App blade, choose Configuration and add a setting name together with the Connection String from our Azure Storage Account:

After this operation, we will have a record containing the details for storage connectivity.

It is time to create our first function.

We choose the HTTP trigger function template, and as soon as the Function is created, we open the Code + Test blade as follows:

Add the following code in the function body:

import logging
import azure.functions as func

def main(req: func.HttpRequest, outputBlob: func.Out[str]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    # Read the three fields from the query string first
    date = req.params.get('date')
    host = req.params.get('host')
    message = req.params.get('message')

    # Fall back to the JSON request body for any missing field
    if not date or not host or not message:
        try:
            req_body = req.get_json()
        except ValueError:
            req_body = {}
        date = date or req_body.get('date')
        host = host or req_body.get('host')
        message = message or req_body.get('message')

    if date and host and message:
        # Concatenate the payload and write it to the blob output binding
        az_output = str(date) + str(host) + str(message)
        outputBlob.set(az_output)
        return func.HttpResponse(f"Hello, {host}. This HTTP triggered function executed successfully.")

    return func.HttpResponse(
        "This HTTP triggered function executed successfully. Pass date, host and message in the query string or in the request body.",
        status_code=200
    )

On the Integration panel we choose Azure Blob Storage, select our Storage account connection as defined previously, and choose the blob parameter name. This last parameter is used to address our storage binding from within the Python code. The Path configuration should be in the format container/blob; in our configuration the blob part is defined as {rand-guid}, which generates a random value for the blob name. Please make sure that the container is created in the Storage Account before running this function.
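For reference, the bindings configured in the Integration panel end up in the function's function.json. Below is a sketch of what it may look like; the container name alerts and the connection setting name AzureWebJobsStorage are assumptions and should match your own configuration:

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "blob",
      "direction": "out",
      "name": "outputBlob",
      "path": "alerts/{rand-guid}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

The name of the blob binding here must match the outputBlob parameter of the Python function.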

With the coding part over, we need to get the security code (function key) for accessing this function:

By adding this string in Postman, we can test the URL and parameters of the function:

2. Implementation of the Sender app

import requests
import subprocess as sp

threshold = 1  # alert threshold, in percent of total memory

# Read used and total memory (in MB) from the output of free -m
used_memory = sp.getoutput("free -m | awk '/^Mem/ {print $3}'")
available_memory = sp.getoutput("free -m | awk '/^Mem/ {print $2}'")

memory_consumption = float(used_memory) * 100 / float(available_memory)

if memory_consumption > threshold:
    print("Memory Usage Alert. Sending data to Storage")
    output = sp.getoutput('ps aux')
    data = {}
    data['date'] = "OvidiuBlabla"
    data['host'] = "myhost"
    data['message'] = output
    params = {'code': 'ze7CDK1Qk_PHQFEBaM6bdgMknhM7OvPnMYtwqywVyjqI3AzIFuW0XxMQ=='}
    # Replace <function-url> with your own Azure Function endpoint URL
    response = requests.post('<function-url>', params=params, json=data)

In params we add a dictionary with the key code and, as its value, the content of our secret function key.

We have also defined a dictionary structure consisting of three keys (date, host, message) whose values can be overwritten by our code. In our example, the first two keys are statically defined; the last one, message, carries the output of the ps aux command.

For the sake of simplicity, the current script calculates the percentage of memory consumed on the node; if the value is higher than our threshold, it triggers the HTTP POST call to the Azure Function with the defined data values, and those values are written to a blob with a randomly generated name on the storage account.

The drawback of this method is that every alert generated creates a new blob in the defined container. There is no possibility to append to an existing blob for now.

The Azure Function can also be called from the command line with curl as follows:

curl -X POST -H "Content-Type: application/json" -d '{"date": 123456, "host": "VirtualaMachine", "message": "FailureAlert"}' "<function-url>"

No looping mechanism has been implemented for this concept. You can use a simple crontab configuration, or implement an in-code while loop with the desired delay between runs.
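If you prefer the in-code loop, a minimal sketch could look like this; run_monitor and its parameters are our own names, and the max_iterations argument exists only to make the loop testable (leave it as None in production to loop forever):

```python
import time

def run_monitor(check, interval_seconds=60, max_iterations=None):
    """Repeatedly invoke `check` with a fixed delay between runs.

    max_iterations=None loops forever; a finite value makes testing easy.
    """
    count = 0
    while max_iterations is None or count < max_iterations:
        check()  # e.g. the memory check + HTTP POST from the sender script
        count += 1
        time.sleep(interval_seconds)
    return count

# Example: run three checks with no delay between them
calls = []
print(run_monitor(lambda: calls.append(1), interval_seconds=0, max_iterations=3))
```

Wrapping the sender logic in a check() callable keeps the scheduling concern separate from the alerting concern.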

You can define your own schema for data transfer as long as the same schema is defined in both the Azure Function and the sender application.
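One lightweight way to keep the two sides in sync is to share a required-field check between the Function and the sender. This is a sketch of our own, not part of the original code:

```python
# Fields the Azure Function expects; keep this set identical on both sides
REQUIRED_FIELDS = {"date", "host", "message"}

def validate_payload(payload: dict) -> bool:
    """Return True when the payload carries every field the Function expects."""
    return REQUIRED_FIELDS.issubset(payload)

print(validate_payload({"date": 123456, "host": "myhost", "message": "ps aux output"}))  # True
print(validate_payload({"date": 123456}))  # False
```

Calling validate_payload(data) in the sender before the HTTP POST catches schema drift early, instead of discovering it in the Function's fallback response.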