100 Days of Cloud – Day 69: Azure Logic Apps

It’s Day 69 of my 100 Days of Cloud journey, and today I’m getting my head around Azure Logic Apps.

Azure Logic Apps is a cloud-based platform for creating and running automated workflows that integrate your apps, data, services, and systems.

Comparison with Azure Functions

If this sounds vaguely familiar, it should: all the way back on Day 55 we looked at Azure Functions, which also lets you create serverless applications in Azure.

So they both do the same thing, right? Well, yes and no. At this stage it’s important to show what the differences are between them.

Let’s start with Azure Functions:

  • Lets you run event-triggered code without having to explicitly provision or manage infrastructure.
  • Azure Functions offer a “Code-First” (imperative) user experience and are primarily authored in Visual Studio or another IDE.
  • Azure Functions have about a dozen built-in binding types (mainly for other Azure services). If there isn’t an existing binding, you will need to write custom code to create new bindings.
  • With Azure Functions, you have three pricing options. You can opt for an App Service Plan, which gives you dedicated resources. The second option is completely serverless: the Consumption plan, billed based on resources consumed and number of executions. The third option is Functions Premium, which is a hybrid of the App Service Plan and the Consumption Plan.
  • As Azure Functions are code-first, the only options for deployment are Visual Studio, Azure DevOps or FTP.

Now, let’s compare that with Azure Logic Apps:

  • Azure Logic Apps is a cloud service that helps you schedule, automate, and orchestrate tasks, business processes, and workflows when you need to integrate apps, data, systems, and services across enterprises or organizations.
  • Logic Apps offer a “Designer-First” (declarative) user experience, providing a visual workflow designer accessed via the Azure Portal.
  • Logic Apps have a large collection of connectors to Azure Services, SaaS applications (Microsoft and others), FTP and Enterprise Integration Pack for B2B scenarios; along with the ability to build custom connectors if one isn’t available. Examples of connectors are:
    • Azure services such as Blob Storage and Service Bus
    • Office 365 services such as Outlook, Excel, and SharePoint
    • Database servers such as SQL and Oracle
    • Enterprise systems such as SAP and IBM MQ
    • File shares such as FTP and SFTP
  • Logic Apps has a pure pay-per-usage billing model. You pay for each action that gets executed. However, different pricing tiers are available; more information is available here.
  • There are many ways to manage and deploy Logic Apps. They can be created and updated directly in the Azure Portal (which will automatically create a JSON template). The JSON template can also be deployed via Azure DevOps and Visual Studio.

The following list describes just a few example tasks, business processes, and workloads that you can automate using the Azure Logic Apps service:

  • Schedule and send email notifications using Office 365 when a specific event happens, for example, a new file is uploaded.
  • Route and process customer orders across on-premises systems and cloud services.
  • Move uploaded files from an SFTP or FTP server to Azure Storage.
  • Monitor tweets, analyze the sentiment, and create alerts or tasks for items that need review.


These are the key concepts to be aware of:

  • Logic app – A logic app is the Azure resource you create when you want to develop a workflow. There are multiple logic app resource types that run in different environments.
  • Workflow – A workflow is a series of steps that defines a task or process. Each workflow starts with a single trigger, after which you must add one or more actions.
  • Trigger – A trigger is always the first step in any workflow and specifies the condition for running any further steps in that workflow. For example, a trigger event might be getting an email in your inbox or detecting a new file in a storage account.
  • Action – An action is each step in a workflow after the trigger. Every action runs some operation in a workflow.
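To make those concepts concrete, here’s a minimal Python sketch of the shape of a workflow definition (the JSON the designer generates): one trigger, followed by one or more actions. The trigger and action names below are illustrative, not real connector identifiers, and this is a simplified subset of the actual schema.

```python
# Simplified sketch of a Logic App workflow definition: exactly one
# trigger, then one or more actions. Names are illustrative only.
workflow_definition = {
    "triggers": {
        "When_a_blob_is_added": {
            "type": "ApiConnection",
            "recurrence": {"frequency": "Minute", "interval": 5},
        }
    },
    "actions": {
        "Send_an_email": {
            "type": "ApiConnection",
            "runAfter": {},  # runs immediately after the trigger fires
        }
    },
}

def validate_workflow(definition: dict) -> bool:
    """A workflow needs exactly one trigger and at least one action."""
    return len(definition["triggers"]) == 1 and len(definition["actions"]) >= 1

print(validate_workflow(workflow_definition))  # True
```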

How Logic Apps work

The workflow begins with a trigger, which can have a pull or push pattern. Pull triggers are initiated when a regularly scheduled process finds new updates in the source data since its last pull, while push triggers are initiated each time new data is generated in the source itself.
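The pull pattern above can be sketched in a few lines of Python: a scheduled check asks the source for anything newer than what was seen on the last poll, and the workflow only runs if there is something new. The source and item values here are invented for illustration.

```python
def pull_trigger(fetch_updates, last_seen):
    """Pull pattern: a scheduled poll asks the source for updates newer
    than the last one seen; a non-empty result fires the workflow."""
    return [item for item in fetch_updates() if item > last_seen]

# Push pattern, by contrast, is source-initiated: the source calls the
# handler the moment new data exists, so there is nothing to poll.
def push_trigger(handler, new_item):
    handler(new_item)

# Hypothetical source that has produced items 1..5; we last saw item 3.
new_items = pull_trigger(lambda: [1, 2, 3, 4, 5], last_seen=3)
print(new_items)  # [4, 5]
```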

Next, users define a series of actions that run either consecutively or concurrently, based on the specified trigger and schedule. Users can export the workflow to JSON and use this to create and deploy Logic Apps using tools like Visual Studio and Azure DevOps, or they can save logic apps as Azure Resource Manager templates for reuse.


Connectors are the most powerful aspect of a Logic App’s structure. Connectors are blocks of pre-built operations that communicate with third-party services as steps in the workflow. Connectors can be nested within each other to provide complex solutions that meet exact use case needs.

Azure contains a catalog of hundreds of available connectors and users can leverage these connectors to accomplish tasks without requiring any coding experience. You can find the full list of connectors here.

Use Cases

The following are the common use cases for Logic Apps:

  • Send an email alert to users based on data being updated in an on-premises database.
  • Query a database and send email notifications based on result criteria.
  • Communication with external platforms and services.
  • Data transformation or ingestion.
  • Social media connectivity using built-in API connectors.
  • Timer- or content-based routing.
  • Create business-to-business (B2B) solutions.
  • Access Azure virtual network resources.

We saw above how an Event Grid resource event can be used as a trigger; the tutorial here gives an excellent run-through of creating a Logic App based on an Event Grid resource event and using the O365 Email connector.


So that’s an overview of Azure Logic Apps and how it compares to Azure Functions. You can find out more about Azure Logic Apps in the official documentation here.

Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 68: Azure Service Bus

It’s Day 68 of my 100 Days of Cloud journey, and today I’m looking at Azure Service Bus.

In the previous posts, we looked at the different services that Microsoft uses to handle events:

  • Azure Event Grid, which is an eventing backplane that enables event-driven, reactive programming. It uses the publish-subscribe model. Publishers emit events, but have no expectation about how the events are handled. Subscribers decide on which events they want to handle.
  • Azure Event Hubs, which is a big data streaming platform and event ingestion service. It can receive and process millions of events per second. It facilitates the capture, retention, and replay of telemetry and event stream data. 

Events v Messages

Both of the above services are based on events, and it’s important to understand the definition of an event:

  • An event is a lightweight notification of a condition or a state change. The publisher of the event has no expectation about how the event is handled. The consumer of the event decides what to do with the notification. Events can be discrete units or part of a series.
  • Discrete events report state change and are actionable. To take the next step, the consumer only needs to know that something happened. The event data has information about what happened but doesn’t have the data that triggered the event.

By contrast, a message is raw data produced by a service to be consumed or stored elsewhere. The message contains the data that triggered the message pipeline. The publisher of the message has an expectation about how the consumer handles the message. A contract exists between the two sides. For example, the publisher sends a message with the raw data, and expects the consumer to create a file from that data and send a response when the work is done.
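The event-versus-message distinction can be sketched as two small data shapes: an event says *that* something happened, while a message carries the data itself plus an implicit contract with the consumer. The field names and values below are illustrative, not any SDK’s actual types.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """Lightweight notification: reports that something happened, with
    no expectation about how the consumer handles it."""
    source: str
    subject: str        # e.g. path of the blob that was created
    event_type: str     # e.g. "BlobCreated"

@dataclass
class Message:
    """Raw data plus a contract: the publisher expects the consumer to
    act on the payload (and often to respond when the work is done)."""
    body: bytes         # the data that triggered the pipeline
    reply_to: str       # where the consumer sends its response

evt = Event(source="storage-account", subject="/container/file.txt",
            event_type="BlobCreated")
msg = Message(body=b"order #42: 3 widgets", reply_to="orders-reply-queue")
```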

Azure Service Bus

Azure Service Bus is a fully managed enterprise message broker with message queues and publish-subscribe topics (in a namespace). Service Bus is used to decouple applications and services from each other, whether hosted natively on Azure, on-premises, or with any other cloud vendor such as AWS or GCP. Messages are sent to and kept in queues or topics until requested by consumers in a “poll” mode (i.e. only delivered when requested).

Azure Service Bus provides the following benefits:

  • Load-balancing work across competing workers
  • Safely routing and transferring data and control across service and application boundaries
  • Coordinating transactional work that requires a high degree of reliability


  • Queues – Messages are sent to and received from queues. Queues store messages until the receiving application is available to receive and process them. Messages are kept in the queue until picked up by consumers, and are retrieved on a first-in, first-out (FIFO) basis. A queue can have one or many competing consumers, but a message is consumed only once.
Image Credit: Microsoft
  • Topics – A queue allows processing of a message by a single consumer. In contrast to queues, topics and subscriptions provide a one-to-many form of communication in a publish and subscribe pattern, which is useful for scaling to large numbers of recipients. Each published message is made available to each subscription registered with the topic. A publisher sends a message to a topic and one or more subscribers receive a copy of the message, depending on filter rules set on those subscriptions.
Image Credit: Microsoft

You can define rules on a subscription. A subscription rule has a filter to define a condition for the message to be copied into the subscription and an optional action that can modify message metadata.

  • Namespaces – A namespace is a container for all messaging components (queues and topics). Multiple queues and topics can be in a single namespace, and namespaces often serve as application containers.
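A tiny in-memory sketch makes the queue/topic distinction (and subscription filter rules) concrete: a queue delivers each message to exactly one consumer in FIFO order, while a topic copies each message into every subscription whose filter matches. This is an illustration of the semantics only, not the Service Bus SDK.

```python
from collections import deque

class Queue:
    """FIFO queue: competing consumers, each message consumed once."""
    def __init__(self):
        self._messages = deque()
    def send(self, msg):
        self._messages.append(msg)
    def receive(self):
        return self._messages.popleft() if self._messages else None

class Topic:
    """Pub/sub topic: every subscription whose filter rule matches
    receives its own copy of the published message."""
    def __init__(self):
        self._subscriptions = {}
    def subscribe(self, name, rule=lambda msg: True):
        self._subscriptions[name] = (rule, Queue())
    def publish(self, msg):
        for rule, queue in self._subscriptions.values():
            if rule(msg):
                queue.send(msg)
    def receive(self, name):
        return self._subscriptions[name][1].receive()

topic = Topic()
topic.subscribe("all-orders")
topic.subscribe("big-orders", rule=lambda m: m["amount"] > 100)
topic.publish({"id": 1, "amount": 250})
topic.publish({"id": 2, "amount": 50})
print(topic.receive("all-orders"))  # {'id': 1, 'amount': 250}
print(topic.receive("big-orders"))  # {'id': 1, 'amount': 250}
```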

There are also a number of advanced features available in Azure Service Bus:

  • Dead Letter Queue: this is a sub-queue to hold messages that could not be delivered or processed.
  • Consumption Mode: Azure Service Bus supports several consumption modes: pub/sub with a pull model, competing consumers, and partitioning can be achieved with the use of topics, subscriptions, and actions.
  • Duplicate Detection: Azure Service Bus supports duplicate detection natively.
  • Delivery Guarantee: Azure Service Bus supports three delivery guarantees: At-least-once, At-most-once, and Effectively once.
  • Message Ordering: Azure Service Bus can guarantee first-in-first-out using sessions.
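To illustrate the dead-letter idea from the list above, here’s a minimal Python sketch of the pattern: retry a message a few times and, if it still can’t be processed, move it to a dead-letter sub-queue instead of retrying forever. The max-delivery count and handler here are invented for illustration; in Service Bus the limit is a property you configure on the queue.

```python
MAX_DELIVERY_COUNT = 3  # illustrative; configurable per queue in Service Bus

def attempt_delivery(message, process, dead_letter_queue):
    """Retry processing up to MAX_DELIVERY_COUNT times; on repeated
    failure, park the message in the dead-letter queue for inspection."""
    for _ in range(MAX_DELIVERY_COUNT):
        try:
            process(message)
            return "completed"
        except Exception:
            continue  # delivery failed; try again
    dead_letter_queue.append(message)
    return "dead-lettered"

def broken_handler(message):
    raise ValueError("cannot parse message body")

dlq = []
result = attempt_delivery({"id": 7}, broken_handler, dlq)
print(result, dlq)  # dead-lettered [{'id': 7}]
```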


That’s a brief overview of Azure Service Bus. You can learn more about Azure Service Bus in the Microsoft documentation here. Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 67: Azure Event Hubs

It’s Day 67 of my 100 Days of Cloud journey, and today I’m looking at Azure Event Hubs.

In the last post, we looked at Azure Event Grid, which is a serverless offering that allows you to easily build applications with event-based architectures. Azure Event Grid contains a number of sources, and one of those is Azure Event Hub.

Whereas Azure Event Grid can take in events from sources and trigger actions based on those events, Azure Event Hubs is a big data streaming platform and event ingestion service. It can receive and process millions of events per second. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters.

One of the key differences between the two services is that while Event Grid can plug directly into Azure services and listen for events coming from their sources, Event Hubs can listen for events coming from sources outside of Azure, and can handle millions of events coming from multiple devices.

The following are some of the scenarios where you can use Event Hubs:

  • Anomaly detection (fraud/outliers)
  • Application logging
  • Analytics pipelines, such as clickstreams
  • Live dashboards
  • Archiving data
  • Transaction processing
  • User telemetry processing
  • Device telemetry streaming


Azure Event Hubs represents the “front door” (or Event Ingestor, to give it its correct name) for an event pipeline, and sits between event producers and event consumers. It decouples the process of producing data from the process of consuming data. You can publish events individually or in batches.

Azure Event Hubs is built around the concepts of partitions and consumer groups. Inside an Event Hub, events are sent to partitions by specifying a partition key or partition ID. The partition count of an Event Hub cannot be changed after creation, so be mindful of this limitation.
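The partition-key idea can be sketched as a stable hash: all events that share a key land on (and stay ordered within) the same partition. The hash below is an invented illustration, not the one Event Hubs actually uses.

```python
PARTITION_COUNT = 4  # fixed at creation time; cannot be changed later

def partition_for(partition_key: str) -> int:
    """Map a partition key to a partition id with a stable hash, so all
    events with the same key go to the same partition. Illustrative
    rolling hash; not the real Event Hubs algorithm."""
    digest = 0
    for ch in partition_key:
        digest = (digest * 31 + ord(ch)) % PARTITION_COUNT
    return digest

# Events from the same device always land on the same partition:
print(partition_for("device-42") == partition_for("device-42"))  # True
```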

Image Credit: Microsoft

Receivers are grouped into consumer groups. A consumer group represents a view (state, position, or offset) of an entire event hub. It can be thought of as a set of parallel applications that consume events at the same time.

Consumer groups enable receivers to each have a separate view of the event stream. They read the stream independently at their own pace and with their own offsets. Event Hub uses a partitioned consumer pattern; events are spread across partitions to allow horizontal scale. Events can be stored in either Blob Storage or Data Lake; this is configured when the initial event hub is created.
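Here’s a small sketch of that independent-offset behaviour: two consumer groups read the same event log, and each tracks its own position, so a fast reader never disturbs a slow one. The stream contents and group names are made up for illustration.

```python
class ConsumerGroup:
    """Each consumer group keeps its own offset into the shared stream,
    so groups read the same events independently, at their own pace."""
    def __init__(self):
        self.offset = 0
    def read(self, stream, count=1):
        events = stream[self.offset:self.offset + count]
        self.offset += len(events)
        return events

stream = ["evt-1", "evt-2", "evt-3"]   # one partition's event log
dashboards = ConsumerGroup()           # fast reader
archiver = ConsumerGroup()             # slow reader

dashboards.read(stream, count=3)       # dashboards fully caught up
print(archiver.read(stream, count=1))  # ['evt-1'] -- archiver unaffected
```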

Image Credit: Microsoft

Use Cases

Event Hubs is the component to use for real-time and/or streaming data use cases:

  • Real-time reporting
  • Capture streaming data into files for further processing and analysis – e.g. capturing data from micro-service applications or a mobile app
  • Make data available to stream-processing and analytics services – e.g. when scoring an AI algorithm
  • Telemetry streaming & processing
  • Application logging

Event Hubs is also available as a feature for Azure Stack Hub, which allows you to realize hybrid cloud scenarios. Streaming and event-based solutions are supported, for both on-premises and Azure cloud processing.


You can learn more about Event Hubs in the Microsoft documentation here. Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 66: Azure Event Grid

It’s Day 66 of my 100 Days of Cloud journey, and today I’m looking at Azure Event Grid, which I came across during my AZ-204 studies.

Azure Event Grid is a serverless offering that allows you to easily build applications with event-based architectures. First, select the Azure resource you would like to subscribe to, and then provide the event handler or WebHook endpoint to send the event to.

Image Credit: Microsoft


Azure Event Grid uses the following concepts which you will need to understand:

  • Events: An event is the smallest amount of information that fully describes something that happened in the system. Every event has common information such as the source of the event, the time the event took place, and a unique identifier. Examples of common events include a file being uploaded, a virtual machine being deleted, or a SKU being added.
  • Publishers: A publisher is the user or organization that decides to send events to Event Grid.
  • Event Sources: An event source is where the event happens. Each event source is related to one or more event types. For example, Azure Storage is the event source for blob created events. The following Azure services support sending events to Event Grid:
    • Azure API Management
    • Azure App Configuration
    • Azure App Service
    • Azure Blob Storage
    • Azure Cache for Redis
    • Azure Communication Services
    • Azure Container Registry
    • Azure Event Hubs
    • Azure FarmBeats
    • Azure IoT Hub
    • Azure Key Vault
    • Azure Kubernetes Service (preview)
    • Azure Machine Learning
    • Azure Maps
    • Azure Media Services
    • Azure Policy
    • Azure resource groups
    • Azure Service Bus
    • Azure SignalR
    • Azure subscriptions
  • Topics: The event grid topic provides an endpoint where the source sends events. A topic is used for a collection of related events.
  • Event Subscriptions: A subscription tells Event Grid which events on a topic you’re interested in receiving. When creating the subscription, you provide an endpoint for handling the event.
  • Event Handlers: From an Event Grid perspective, an event handler is the place where the event is sent. The handler takes some further action to process the event. The supported event handlers are:
    • Webhooks. Azure Automation runbooks and Logic Apps are supported via webhooks.
    • Azure functions
    • Event hubs
    • Service Bus queues and topics
    • Relay hybrid connections
    • Storage queues
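Pulling those concepts together, here’s a sketch of an event and the routing that a set of subscriptions implies: each subscription pairs an endpoint with an event-type filter, and a published event fans out to every subscription whose filter matches. The field names follow the concepts above; the endpoint names and values are illustrative.

```python
# An event with the common fields named in the text (id, type, source
# subject, time); the values are illustrative.
event = {
    "id": "a1b2c3",
    "eventType": "Microsoft.Storage.BlobCreated",
    "subject": "/blobServices/default/containers/images/blobs/photo.png",
    "eventTime": "2022-03-01T12:00:00Z",
    "data": {"url": "https://example.blob.core.windows.net/images/photo.png"},
}

# A subscription = endpoint + event-type filter. Publishing fans the
# event out to every subscription whose filter matches.
subscriptions = [
    ("thumbnail-function", {"Microsoft.Storage.BlobCreated"}),
    ("audit-queue", {"Microsoft.Storage.BlobCreated",
                     "Microsoft.Storage.BlobDeleted"}),
    ("vm-cleanup", {"Microsoft.Compute.VirtualMachineDeleted"}),
]

def route(event, subscriptions):
    return [endpoint for endpoint, types in subscriptions
            if event["eventType"] in types]

print(route(event, subscriptions))  # ['thumbnail-function', 'audit-queue']
```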


The key features of Azure Event Grid are:

  • Simplicity – You can direct events from any of the sources listed above to any event handler or endpoint.
  • Advanced filtering – Filter on event type to ensure event handlers only receive relevant events.
  • Fan-out – Subscribe several endpoints to the same event to send copies of the event to as many places as needed.
  • Reliability – 24-hour retry ensures that events are delivered.
  • Pay-per-event – Azure Event Grid uses a pay-per-event pricing model, so you only pay for what you use. The first 100,000 operations per month are free.
  • High throughput – Build high-volume workloads on Event Grid and scale up/down or in/out as required.
  • Built-in Events – Get up and running quickly with resource-defined built-in events.
  • Custom Events – Use Event Grid to route, filter, and reliably deliver custom events in your app.

Use Cases

Azure Event Grid provides several features that vastly improve serverless, ops automation, and integration work:

  • Serverless application architectures
Image Credit: Microsoft

Event Grid connects data sources and event handlers. For example, use Event Grid to trigger a serverless function that analyzes images when added to a blob storage container.

  • Ops Automation
Image Credit: Microsoft

Event Grid allows you to speed up automation and simplify policy enforcement. For example, use Event Grid to notify Azure Automation when a virtual machine or database in Azure SQL is created. Use the events to automatically check that service configurations are compliant, put metadata into operations tools, tag virtual machines, or file work items.

  • Application integration
Image Credit: Microsoft

Event Grid connects your app with other services. For example, create a custom topic to send your app’s event data to Event Grid, and take advantage of its reliable delivery, advanced routing, and direct integration with Azure. Or, you can use Event Grid with Logic Apps to process data anywhere, without writing code.


You can learn more about Event Grid in the Microsoft documentation here. Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 55: Azure Functions

It’s Day 55 of my 100 Days of Cloud journey, and today I’m going to attempt to understand and explain Azure Functions.

What are Azure Functions?

Azure Functions is one of the ways to create serverless applications in Azure. All you need to do is write the code for the problem or task you wish to perform, without having to worry about creating a whole application or the infrastructure to run the code for you.

Depending on what language you need your application to use, this link gives full details of the languages that are supported for Azure Functions. There are also Developer References for each of the languages, which give full details of how to develop your desired functions using the supported languages. Azure Functions uses a code-first (imperative) development model.

All functions contain two pieces – your code and a config file called function.json. The function.json file contains the function’s trigger, bindings and other configuration settings.
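As a sketch of what that config looks like, here’s the shape of a function.json for a queue-triggered function with a blob output binding, expressed as a Python dict so we can check it. The binding structure (type, direction, name) matches the real file; the queue name and blob path are made-up examples.

```python
# Sketch of a function.json: one trigger binding plus an output binding.
# The queue name and blob path are illustrative values.
function_json = {
    "bindings": [
        {
            "type": "queueTrigger",     # the function's (single) trigger
            "direction": "in",
            "name": "msg",              # parameter name seen by your code
            "queueName": "incoming-orders",
        },
        {
            "type": "blob",             # output binding
            "direction": "out",
            "name": "outputBlob",
            "path": "orders/{rand-guid}.json",
        },
    ]
}

triggers = [b for b in function_json["bindings"]
            if b["type"].endswith("Trigger")]
print(len(triggers))  # 1 -- every function has exactly one trigger
```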

Function App

A function app provides an execution context in Azure in which your functions run. As such, it is the unit of deployment and management for your functions. A function app is composed of one or more individual functions that are managed, deployed, and scaled together.

A function app requires a general Azure Storage account, which supports Azure Blob, Queue, Files, and Table storage.

Hosting Plans

When you create a function app in Azure, you must choose a hosting plan for your app. There are three basic hosting plans available for Azure Functions:

  • Consumption plan – This is the default hosting plan. It scales automatically and you only pay for compute resources when your functions are running. Instances of the Functions host are dynamically added and removed based on the number of incoming events.
  • Functions Premium plan – Automatically scales based on demand using pre-warmed workers which run applications with no delay after being idle, runs on more powerful instances, and connects to virtual networks.
  • App service plan – Run your functions within an App Service plan at regular App Service plan rates. Best for long-running scenarios where Durable Functions can’t be used.

Triggers and Bindings


Azure Functions are event-driven – this means that an event or trigger is required in order for the function to run and the underlying code to execute. Each function must have exactly one trigger.

The most common types of triggers are:

  • Timer – Execute a function at a set interval.
  • HTTP – Execute a function when an HTTP request is received.
  • Blob – Execute a function when a file is uploaded or updated in Azure Blob storage.
  • Queue – Execute a function when a message is added to an Azure Storage queue.
  • Azure Cosmos DB – Execute a function when a document changes in a collection.
  • Event Hub – Execute a function when an event hub receives a new event.

Bindings are a way to both declaratively connect resources to functions and also to pass parameters from resources into a function. Bindings can be created as Input bindings, Output bindings or both.

Triggers and bindings let you avoid hardcoding access to other services within your code, therefore making it re-usable. Your function receives data (for example, the content of a queue message) in function parameters. You send data (for example, to create a queue message) by using the return value of the function.
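That separation can be sketched in plain Python: the function body only touches its parameters and return value, while a stand-in for the Functions host moves data between the “bindings” and the function. The order data and queue names are invented for illustration; the real host and bindings are configured in function.json, not in code.

```python
def process_order(msg: dict) -> str:
    """The function only sees its parameter and return value; the
    bindings (not this code) decide where msg comes from and where the
    returned string goes, which keeps the logic reusable."""
    return f"confirmation for order {msg['id']}"

# A tiny stand-in for the Functions host: read from the input binding,
# invoke the function, write the result to the output binding.
def host_invoke(func, input_queue, output_queue):
    while input_queue:
        output_queue.append(func(input_queue.pop(0)))

inbox, outbox = [{"id": 1}, {"id": 2}], []
host_invoke(process_order, inbox, outbox)
print(outbox)  # ['confirmation for order 1', 'confirmation for order 2']
```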


We can see in our hosting plans above that the plan you choose will dictate how Azure Functions will scale and the maximum resources that are assigned to a function app.

Azure Functions uses a component called the scale controller to monitor the rate of events and determine whether to scale out or scale in. The scale controller uses heuristics for each trigger type. For example, when you’re using an Azure Queue storage trigger, it scales based on the queue length and the age of the oldest queue message.

Scaling can vary based on a number of factors, and functions scale differently based on the trigger and language selected. There are some scaling behaviors to be aware of:

  • Maximum instances: A single function app only scales out to a maximum of 200 instances. A single instance may process more than one message or request at a time though, so there isn’t a set limit on the number of concurrent executions.
  • New instance rate: For HTTP triggers, new instances are allocated, at most, once per second. For non-HTTP triggers, new instances are allocated, at most, once every 30 seconds. Scaling is faster when running in a Premium plan.

By default, Consumption plan functions scale out to as many as 200 instances, and Premium plan functions will scale out to as many as 100 instances. You can specify lower maximum instances for each app if required.
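An illustrative version of the scale controller’s queue-trigger heuristic might look like the sketch below: a deep or stale queue argues for scaling out, an empty queue for scaling in. The thresholds here are invented for the example, not Azure’s actual values.

```python
def scale_decision(queue_length, oldest_msg_age_s, current_instances,
                   max_instances=200):
    """Toy scale-controller heuristic: scale out when the queue is deep
    or its oldest message is stale, scale in when the queue is empty.
    Thresholds are illustrative, not Azure's real ones."""
    if current_instances < max_instances and (
            queue_length > 100 * current_instances
            or oldest_msg_age_s > 60):
        return current_instances + 1   # scale out
    if queue_length == 0 and current_instances > 0:
        return current_instances - 1   # scale in
    return current_instances           # hold steady

print(scale_decision(queue_length=500, oldest_msg_age_s=5,
                     current_instances=2))  # 3 (queue is deep)
print(scale_decision(queue_length=0, oldest_msg_age_s=0,
                     current_instances=2))  # 1 (queue is empty)
```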

Real-World Examples

So while all of the above theory is interesting, we still haven’t answered the key question: where would we need to use Azure Functions?

Let’s take a look at some real-world examples of where Azure Functions would be useful:

  • Take a snapshot of a Virtual Machine before updates are scheduled to be applied.
  • Monitor expiry dates of Certificates and trigger an email to be sent 30 days before they expire.
  • When a Virtual machine is deleted, remove it from Monitoring.
  • When a CPU spikes above 90%, send a message to a Teams Channel.


So that’s a whistle-stop overview of Azure Functions. There are tons of brilliant resources out there where you can dive in and learn about Azure Functions in greater depth, such as the Microsoft Learn module as part of the AZ-204 learning path, which gives a full lab on creating your own function using an HTTP trigger.

Hope you enjoyed this post, until next time!