Azure Functions Queue and Blob Storage Triggers
Azure Storage provides three storage services that Azure Functions can interact with through triggers and bindings: Queue Storage, Blob Storage, and Table Storage. This topic focuses on building functions that react to messages arriving in queues and to files uploaded to Blob Storage, two of the most common patterns in cloud applications.
Azure Storage Overview
┌──────────────────────────────────────────────────────────────┐
│                   AZURE STORAGE SERVICES                     │
│                                                              │
│   ┌──────────────────┐        ┌──────────────────┐           │
│   │  Queue Storage   │        │   Blob Storage   │           │
│   │                  │        │                  │           │
│   │  Messages/Tasks  │        │  Files/Images/   │           │
│   │  (like a to-do   │        │  Videos/Docs     │           │
│   │  list)           │        │  (like a folder) │           │
│   └──────────────────┘        └──────────────────┘           │
│                                                              │
│   ┌──────────────────┐                                       │
│   │  Table Storage   │                                       │
│   │                  │                                       │
│   │  Structured data │                                       │
│   │  (like a simple  │                                       │
│   │  spreadsheet)    │                                       │
│   └──────────────────┘                                       │
└──────────────────────────────────────────────────────────────┘
Part 1 – Queue Storage Trigger
What Is a Storage Queue?
A storage queue is a simple message holding area in Azure. One service puts a message in the queue (the producer) and another service reads and processes that message (the consumer). Azure Functions acts as the consumer — it watches the queue and fires automatically when a new message arrives.
┌────────────────────────────────────────────────────────────────┐
│ QUEUE TRIGGER FLOW │
│ │
│ Producer (Web App) │
│ │ │
│ │ Puts message: {"orderId": "ORD-101", "qty": 3} │
│ ▼ │
│ Azure Storage Queue ("orders") │
│ │ │
│ │ Azure detects new message │
│ ▼ │
│ Queue Trigger fires → Your Function runs │
│ │ │
│ │ Processes order, saves to database │
│ ▼ │
│ Message deleted from queue (automatically by Azure) │
└────────────────────────────────────────────────────────────────┘
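On the wire, a queue message is just a string. The Functions runtime expects Base64-encoded message bodies by default (configurable in host.json), and JSON content is deserialized before your function runs. A minimal sketch of that round trip; a real producer would send the encoded body with QueueClient.sendMessage() from the @azure/storage-queue SDK:

```javascript
// Hypothetical sketch of the queue message round trip.
// Only the encoding is shown; the actual send would use the
// @azure/storage-queue SDK (QueueClient.sendMessage).
function buildQueueMessage(order) {
  // The Functions queue trigger expects Base64-encoded bodies by default
  return Buffer.from(JSON.stringify(order)).toString("base64");
}

// Roughly what the runtime does before invoking your function
function decodeQueueMessage(body) {
  return JSON.parse(Buffer.from(body, "base64").toString("utf-8"));
}

const body = buildQueueMessage({ orderId: "ORD-101", qty: 3 });
console.log(decodeQueueMessage(body).orderId); // ORD-101
```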
Queue Trigger – function.json
{
"bindings": [
{
"type": "queueTrigger",
"direction": "in",
"name": "orderMessage",
"queueName": "orders",
"connection": "AzureWebJobsStorage"
}
]
}
Queue Trigger – index.js
module.exports = async function (context, orderMessage) {
context.log("New order received from queue");
context.log("Message content:", orderMessage);
// orderMessage is automatically parsed if it was JSON
const orderId = orderMessage.orderId;
const quantity = orderMessage.qty;
context.log(`Processing order: ${orderId}, Quantity: ${quantity}`);
// Simulate saving to database
await saveOrderToDatabase(orderId, quantity);
context.log(`Order ${orderId} saved successfully`);
// Azure automatically removes the message from queue after function succeeds
};
async function saveOrderToDatabase(orderId, qty) {
// Database logic goes here
return true;
}
Queue Trigger Properties
| Property | Description | Example Value |
|---|---|---|
| queueName | Name of the queue to monitor | "orders" |
| connection | App setting name holding the storage connection string | "AzureWebJobsStorage" |
| name | Parameter name in function code | "orderMessage" |
Message Retry and Poison Messages
┌──────────────────────────────────────────────────────────────┐
│                   QUEUE RETRY MECHANISM                      │
│                                                              │
│   Message arrives in queue                                   │
│        │                                                     │
│        ▼                                                     │
│   Function runs → Fails (error thrown)                       │
│        │                                                     │
│        ▼                                                     │
│   Azure retries (up to 5 times by default)                   │
│        │                                                     │
│        ├── Succeeds on retry → Message deleted ✓             │
│        │                                                     │
│        └── Still fails after 5 tries → Message moved to      │
│            "orders-poison" queue for manual inspection ✗     │
└──────────────────────────────────────────────────────────────┘
Queue Output Binding – Writing Messages
// function.json: Add output binding for a different queue
{
"type": "queue",
"direction": "out",
"name": "notificationQueue",
"queueName": "notifications",
"connection": "AzureWebJobsStorage"
}
// index.js: Send a message to another queue
module.exports = async function (context, orderMessage) {
// Process the incoming order
const orderId = orderMessage.orderId;
// Send a notification message to a different queue
context.bindings.notificationQueue = JSON.stringify({
type: "order_confirmed",
orderId: orderId,
timestamp: new Date().toISOString()
});
context.log(`Order ${orderId} processed and notification sent`);
};
Part 2 – Blob Storage Trigger
What Is Blob Storage?
Blob storage is Azure's cloud file system. "Blob" stands for Binary Large Object. It stores any type of file — images, videos, PDFs, JSON files, CSV exports, and more. The Blob trigger fires a function whenever a new file is added or an existing file is updated in a blob container.
┌──────────────────────────────────────────────────────────────┐
│                     BLOB TRIGGER FLOW                        │
│                                                              │
│   User uploads photo.jpg to "uploads" container              │
│        │                                                     │
│        ▼                                                     │
│   Azure detects new blob in container                        │
│        │                                                     │
│        ▼                                                     │
│   Blob Trigger fires → Function receives the file            │
│        │                                                     │
│        │ Resizes image → Saves to "thumbnails" container     │
│        ▼                                                     │
│   Process complete                                           │
└──────────────────────────────────────────────────────────────┘
Blob Trigger – function.json
{
"bindings": [
{
"type": "blobTrigger",
"direction": "in",
"name": "myBlob",
"path": "uploads/{name}",
"connection": "AzureWebJobsStorage"
}
]
}
Path Pattern Explanation
| Pattern | Meaning |
|---|---|
| uploads/{name} | Any file in the "uploads" container. {name} captures the filename. |
| uploads/{name}.jpg | Only .jpg files in the "uploads" container. {name} then captures the filename without the extension. |
| uploads/2024/{name} | Files in the "2024" subfolder of "uploads" |
Blob Trigger – index.js
module.exports = async function (context, myBlob) {
// context.bindingData.name gives the filename
const fileName = context.bindingData.name;
const blobSize = myBlob.length;
context.log(`New file uploaded: ${fileName}`);
context.log(`File size: ${blobSize} bytes`);
// myBlob is a Buffer containing the file's binary content
// Here you could: resize image, parse CSV, extract text from PDF, etc.
  if (fileName.endsWith(".csv")) {
    context.log("CSV file detected - starting data import...");
    await processCsvData(context, myBlob);
  }
};

async function processCsvData(context, buffer) {
  const csvContent = buffer.toString("utf-8");
  const rows = csvContent.split("\n");
  // Pass context into helpers and use context.log (not console.log)
  // so the output is associated with this invocation in the logs
  context.log(`Found ${rows.length} rows in CSV`);
}
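Splitting on "\n" only yields raw lines; turning them into records needs a header pass. A naive sketch for illustration (no quoted-field handling; a real import would use a CSV library such as csv-parse):

```javascript
// Naive CSV-to-objects conversion: the first row is treated as the header.
// Does not handle quoted fields or embedded commas.
function csvToObjects(csvContent) {
  const lines = csvContent.trim().split("\n");
  const headers = lines[0].split(",");
  return lines.slice(1).map(line => {
    const values = line.split(",");
    return Object.fromEntries(
      headers.map((h, i) => [h.trim(), values[i] ? values[i].trim() : undefined])
    );
  });
}

const rows = csvToObjects("orderId,qty\nORD-101,3\nORD-102,1");
console.log(rows[0].orderId); // ORD-101
```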
Blob Input Binding – Read a Specific File
// function.json: Read a configuration file using input binding
{
"type": "blob",
"direction": "in",
"name": "configFile",
"path": "config/settings.json",
"connection": "AzureWebJobsStorage"
}
// index.js
module.exports = async function (context, req, configFile) {
const settings = JSON.parse(configFile.toString());
context.log("Config loaded:", settings);
// Use settings in function logic
};
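The req parameter above implies this function is HTTP-triggered; the blob input is an additional binding alongside the trigger. A sketch of what the complete bindings array might look like (positional parameters in index.js follow binding order):

```
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get"],
      "authLevel": "function"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "blob",
      "direction": "in",
      "name": "configFile",
      "path": "config/settings.json",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```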
Blob Output Binding – Write a File
// function.json: Write a file to blob storage
{
"type": "blob",
"direction": "out",
"name": "outputBlob",
"path": "reports/{DateTime}.json",
"connection": "AzureWebJobsStorage"
}
// index.js (a timer-triggered function, hence the myTimer parameter)
module.exports = async function (context, myTimer) {
const report = {
generatedAt: new Date().toISOString(),
totalOrders: 142,
revenue: 98500
};
// Azure saves this to blob storage automatically
context.bindings.outputBlob = JSON.stringify(report);
context.log("Report saved to blob storage");
};
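The {DateTime} token in the path resolves to the current UTC timestamp at run time, so every execution writes a distinct file. When a random unique name is preferred, there is also a {rand-guid} binding expression, sketched here:

```
{
  "type": "blob",
  "direction": "out",
  "name": "outputBlob",
  "path": "reports/{rand-guid}.json",
  "connection": "AzureWebJobsStorage"
}
```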
Queue vs Blob – When to Use Which
| Scenario | Use |
|---|---|
| Process tasks or jobs one at a time | Queue Trigger |
| Decouple two services (producer/consumer) | Queue Trigger |
| React when a file is uploaded | Blob Trigger |
| Process uploaded images, CSVs, or PDFs | Blob Trigger |
| Send a message to another service | Queue Output Binding |
| Save a file to cloud storage | Blob Output Binding |
Local Development with Azurite
Both Queue and Blob triggers require a storage account. During local development, Azurite emulates Azure Storage on the machine. The connection string in local.settings.json points to Azurite:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "node"
}
}
The value "UseDevelopmentStorage=true" tells the Azure SDK to connect to Azurite instead of the real Azure cloud.
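The shortcut expands to Azurite's well-known development account (devstoreaccount1, with a publicly documented account key) and its local endpoints. Writing it out explicitly is equivalent (shown on multiple lines for readability; in local.settings.json it is a single line):

```
DefaultEndpointsProtocol=http;
AccountName=devstoreaccount1;
AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;
BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;
QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;
TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;
```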
