The developers at Mystique Unicorn process files as soon as they arrive and want to switch to an event-driven architecture. They are looking for a custom trigger, payload-based or time-based, to process files efficiently. They have heard about Azure's capabilities for event processing. Can you help them implement this event processing at Mystique Unicorn?
Our solution enables seamless event processing on Azure Blob Storage using Azure Functions and HTTP triggers. With a simple HTTP payload containing the blob_name, the function retrieves the corresponding blob through an input binding and processes it accordingly. The solution also includes an output binding to persist the processed event back to Blob Storage.
By leveraging the power of Bicep, all necessary resources can be easily provisioned and managed with minimal effort. Our solution uses Python for efficient event processing, allowing for quick and easy deployment of sophisticated event processing pipelines.
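As a rough sketch of how these pieces fit together, the snippet below uses the Azure Functions Python v2 programming model (decorator-based bindings). The route, blob paths, and the `WAREHOUSE_STORAGE` connection setting are illustrative assumptions rather than the exact values this stack deploys, and the actual function may declare its bindings in `function.json` instead.

```python
import json
import logging

import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

# NOTE: route, blob paths and the connection app-setting name below are
# illustrative placeholders, not necessarily what this stack deploys.
@app.route(route="store-events-consumer-fn", methods=[func.HttpMethod.POST])
@app.blob_input(arg_name="inputblob",
                path="store-events-blob/{blob_name}.json",
                connection="WAREHOUSE_STORAGE")
@app.blob_output(arg_name="outputblob",
                 path="store-events-blob/processed/{blob_name}.json",
                 connection="WAREHOUSE_STORAGE")
def store_events_consumer(req: func.HttpRequest,
                          inputblob: str,
                          outputblob: func.Out[str]) -> func.HttpResponse:
    # {blob_name} in the binding paths above is resolved from the JSON body
    # of the request, so inputblob already holds the blob's content.
    blob_name = req.get_json().get("blob_name")
    logging.info("Processing blob: %s.json", blob_name)

    # "Process" the event - here we simply tag it - and persist it back to
    # Blob Storage through the output binding.
    event = json.loads(inputblob)
    event["processed"] = True
    outputblob.set(json.dumps(event))

    return func.HttpResponse(f"Blob {blob_name}.json processed", status_code=200)
```

The `{blob_name}` token in the binding paths is resolved from the JSON body of the HTTP request, which is what lets one small payload drive both the input and the output binding.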
This demo, its instructions, scripts and Bicep template are designed to be run in westeurope. With few or no modifications you can try it out in other regions as well (not covered here).

- Azure CLI Installed & Configured - Get help here
- Bicep Installed & Configured - Get help here
- VS Code & Bicep Extensions - Get help here
Get the application code

    git clone https://github.com/miztiik/azure-blob-input-binding-to-function
    cd azure-blob-input-binding-to-function
Let us check that you have Azure CLI working:

    # You should have azure cli preinstalled
    az account show
You should see an output like this,
{ "environmentName": "AzureCloud", "homeTenantId": "16b30820b6d3", "id": "1ac6fdbff37cd9e3", "isDefault": true, "managedByTenants": [], "name": "YOUR-SUBS-NAME", "state": "Enabled", "tenantId": "16b30820b6d3", "user": { "name": "miztiik@", "type": "user" } }
- Stack: Main Bicep

  This will create the following resources:

  - General purpose Storage Account with blob container
    - This will be used by Azure Functions to store the function code
  - Storage Account with blob container
    - This will be used to store the events
  - Python Azure Function
    - Input, trigger, and output bindings to the blob container for events
Initiate the deployment with the following command:

    # make deploy
    sh deployment_scripts/deploy.sh
After successfully deploying the stack, check the Resource Groups/Deployments section for the resources.
Upload file(s) to blob
Get the storage account and container name from the output of the deployment. Upload a file to the container and check the logs of the function app to see the event processing in action.
Sample bash script to upload files to the blob container. You can also upload manually from the portal,
    FILE_NAME_PREFIX=$(openssl rand -hex 4)
    FILE_NAME="${RANDOM}_$(date +'%Y-%m-%d')_event.json"
    SA_NAME="warehouseg2hpj3003"
    CONTAINER_NAME="store-events-blob-003"

    echo -n "{\"message\": \"hello world on $(date +'%Y-%m-%d')\"}" > ${FILE_NAME}

    az storage blob upload \
      --account-name ${SA_NAME} \
      --container-name ${CONTAINER_NAME} \
      --name ${FILE_NAME} \
      --file ${FILE_NAME} \
      --auth-mode login
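If you prefer to script the upload in Python, a rough equivalent using the `azure-storage-blob` and `azure-identity` SDKs is sketched below; the storage account and container names are the sample values from above and will differ for your deployment.

```python
# pip install azure-storage-blob azure-identity
import json
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Sample values from this demo - replace with the values from your deployment output
SA_NAME = "warehouseg2hpj3003"
CONTAINER_NAME = "store-events-blob-003"

today = f"{datetime.now(timezone.utc):%Y-%m-%d}"
blob_name = f"{today}_event.json"
payload = json.dumps({"message": f"hello world on {today}"})

# DefaultAzureCredential picks up your `az login` session, mirroring --auth-mode login
svc = BlobServiceClient(
    account_url=f"https://{SA_NAME}.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
svc.get_blob_client(container=CONTAINER_NAME, blob=blob_name).upload_blob(payload)
print(f"Uploaded {blob_name} to {CONTAINER_NAME}")
```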
Trigger the function with the following payload. You can also trigger the function from the portal,
{ "blob_name": "29050_2023-05-05_event" }
    JSON_DATA='{"blob_name":"29050_2023-05-05_event"}'
    URL="https://store-backend-fnapp-003.azurewebsites.net/api/store-events-consumer-fn-003"

    curl -X POST \
      -H "Content-Type: application/json" \
      -d "${JSON_DATA}" \
      "${URL}"
You should see an output like this,
    Blob 29050_2023-05-05_event.json processed
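The same invocation can be made from Python with the `requests` library; the function URL and blob name below are the sample values from above, so substitute your own.

```python
# pip install requests
import requests

# Sample values from this demo - replace with your own function URL and blob name
url = "https://store-backend-fnapp-003.azurewebsites.net/api/store-events-consumer-fn-003"
payload = {"blob_name": "29050_2023-05-05_event"}

resp = requests.post(url, json=payload, timeout=30)
print(resp.status_code, resp.text)  # expect: 200, Blob 29050_2023-05-05_event.json processed
```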
This solution also bootstraps the function with Application Insights, which gives us a lot of insight into how the input/output bindings work. For example, the application map shows the calls made to Blob Storage along with the percentage of success/failure. If we drill down into the errors, we can see the error code and the error message.
We can observe that the output binding triggers a HEAD call to the storage account to check whether the blob exists. This results in a 404 error, as the blob does not exist yet; the binding then creates a new blob with the processed data.

    2023-05-04T19:12:38.0139679Z HEAD https://warehouseg2hpj3003.blob.core.windows.net/store-events-blob-003/processed/2023-05-04T19-12-37Z_e1c41b31-1867-4005-9041-4f031a8ba2b1_42ec4195_bulk.json
    2023-05-04T19:12:38.0417501Z HEAD https://warehouseg2hpj3003.blob.core.windows.net/store-events-blob-003/processed/2023-05-04T19-12-37Z_e1c41b31-1867-4005-9041-4f031a8ba2b1_42ec4195_bulk.json
    2023-05-04T19:12:38.0616047Z PUT https://warehouseg2hpj3003.blob.core.windows.net/store-events-blob-003/processed/2023-05-04T19-12-37Z_e1c41b31-1867-4005-9041-4f031a8ba2b1_42ec4195_bulk.json
Of course, the screenshots and the data will be different, as they were taken during a different file upload.
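If you want to confirm the output binding's writes without the portal, a small sketch like the one below lists everything under the processed/ prefix seen in the logs; the account and container names are again the sample values from this demo.

```python
# pip install azure-storage-blob azure-identity
from azure.identity import DefaultAzureCredential
from azure.storage.blob import ContainerClient

# Sample values from this demo - replace with your own storage account and container
container = ContainerClient(
    account_url="https://warehouseg2hpj3003.blob.core.windows.net",
    container_name="store-events-blob-003",
    credential=DefaultAzureCredential(),
)

# The output binding in this demo writes under the processed/ prefix
for blob in container.list_blobs(name_starts_with="processed/"):
    print(blob.name, blob.last_modified)
```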
Here we have demonstrated how to trigger Azure Functions with an HTTP trigger and process blob files. You can extend the solution and configure the function to send the events to other services like Event Hubs or Service Bus, or persist them to Cosmos DB, etc.
If you want to destroy all the resources created by the stack, execute the command below to delete the stack, or you can delete the stack from the console as well:

- Resources created during Deploying The Application
- Any other custom resources you have created for this demo
    # Delete from resource group
    az group delete --name Miztiik_Enterprises_xxx --yes

    # Follow any on-screen prompt
This is not an exhaustive list; please carry out any other steps applicable to your needs.
This repository aims to show how to use Bicep to new developers, Solution Architects & Ops Engineers in Azure.
Thank you for your interest in contributing to our project. Whether it is a bug report, new feature, correction, or additional documentation or solutions, we greatly value feedback and contributions from our community. Start here
Buy me a coffee ☕.
- Azure Functions HTTP Trigger
- Azure Blob Storage Input Binding
- Azure Blob Storage Output Binding
- Azure Functions Best Practices
- Miztiik Blog - Blob Storage Event Processing with Python Azure Functions
Level: 100