100 Days of Cloud – Day 75: Create your Microsoft 365 tenant and configure Azure AD Connect

It’s Day 75 of my 100 Days of Cloud journey, and today I’m looking at how Azure AD Connect is configured and how it synchronizes your on-premises identities to the Azure AD tenant for use in Microsoft 365.

But first up, let’s take a look at how we can create our Microsoft 365 tenancy and get it configured for use with our domains so it’s ready for use.

Create your Microsoft 365 Tenant

To create your tenant, you need to browse to the Office 365 E3 product page and click on the “Free Trial” option. E3 is the default trial option as it gives you the best experience of all the tools available for 30 days:

Clicking on “Free Trial” brings you into the registration screen. You need to enter the first email address you want to use with the tenant. This won’t configure anything; it’s just checking that the domain isn’t already configured for Microsoft 365.

We can also see on the right of the screen what’s included with the E3 license, plus the benefits. We are allowed up to 25 users for the trial, which is a good number for testing.

Click on “Create New Account”:

This brings you into the “Tell us about yourself” screen where you need to answer some questions about your organisation:

When you click next, you are asked to verify your identity via SMS or Call:

Once you get verified, you are then prompted to add your domain name. Try to use the same domain name as the primary email domain that you want to use with the tenant.

Click “Next” and this will create your account. Once that completes, the screen below will appear and you can click on “Manage your subscription” to log in.

And now you are logged into the Microsoft 365 admin center and can manage your subscription! This is always available at https://admin.microsoft.com/ using the credentials you created above.

NOTE: The tenant I have set up above is a trial and I’m only going to use it for testing and for the purposes of the blog. So at some point in the next 30 days, I’m going to delete it, as I’ve done with Azure resources in previous blog posts, and I would advise you to do the same (unless you really want to pay for your own Microsoft 365 tenant). The article here shows how to do this from within the Microsoft 365 Services and subscriptions page.

Add your Domain to your tenant

So let’s fast-forward 30 days – the trial has ended, your users are happy, and you’ve decided as a business to migrate your existing workloads to Microsoft 365. The next step is to add your production domain and verify it.

So in the Microsoft 365 admin center, go to the “Settings” menu and select “Domains”. You have the option here to buy a domain, which will redirect you to a 3rd-party provider; you can only use this option once your trial period has ended. This is useful if you don’t already own a domain.

However, we’re going to add our existing domain, so click on “Add Domain”.

This brings us into the “Add Domain” screen. Enter the domain name you want to use and click on the “Use this Domain” button at the bottom of the screen:

The next screen provides a list of options for verifying the domain. Now, because this blog is hosted on WordPress, it’s giving me the option to sign in to WordPress to verify. Unless your website is hosted on WordPress, you’re not going to see this option, but you will see the three options below it.

The most common is the option to “Add a TXT record to the domain’s DNS records”, so we’ll select that and click “Continue”:

This detects who the hosting provider is and provides you with the TXT record you need to add to your public DNS records, so I’ll do that in the background and click “Verify” (verification may take up to 30 minutes after you add the TXT record):
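
If you want to confirm the TXT record has propagated before clicking “Verify”, a quick check from PowerShell works well. This is just a sketch – yourdomain.com is a placeholder, and the verification value Microsoft gives you is typically of the form “MS=msXXXXXXXX”:

```powershell
# Check that the Microsoft 365 verification TXT record is visible in public DNS
Resolve-DnsName -Name yourdomain.com -Type TXT |
    Select-Object -ExpandProperty Strings
```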

Once that’s verified, we get a screen asking us to connect our domain and set up DNS records. Again, I’m seeing the option to let Microsoft add the records to WordPress for me automatically (this may also work depending on who your hosting provider is); however, I’m going to choose the second option to add my own DNS records so we can take a look at what’s provided:

The next screen gives me the MX Records I need to get set up with email initially, and there are also options for Skype for Business and Intune MDM at the bottom of the screen if required.

I wanted to show you this page to make sure you understand the process and how it works. However, at this stage I’m going to go back to the previous screen and click “Skip and do this later”. The reason is that changing these records will impact mail flow, and our configuration doesn’t have a Hybrid configuration in place yet to support it.

Once we finish, we get a screen to say the setup is complete, and we can see our domain listed in the admin center.

Azure AD Connect Installation

Once your domain is registered in the portal, you should now be in a position to synchronize your user accounts, so it’s time to install and configure Azure AD Connect.

To do this, we go to the “Users” menu and select “Active Users”. Once that screen appears, we click on the ellipsis and select “Directory synchronization”:

This brings us to a screen with an external link to download the Azure AD Connect tool:

At the time of writing this post, the current Azure AD Connect version is 2.1.1.0 and is only supported on Windows Server 2016 and Windows Server 2019. There are a number of other prerequisites that need to be satisfied before installing Azure AD Connect:

  • Azure AD Tenant: this is created for you when you sign up for the Microsoft 365 Trial.
  • Domain needs to be verified: we’ve done this above.
  • The on-premises Active Directory forest and domain functional levels must be Windows Server 2003 or later. The domain controllers themselves can run any version as long as this condition is met. Note also that you don’t need to install Azure AD Connect on a Domain Controller.
  • The Domain Controller used by Azure AD during the setup must be writable and not a read-only domain controller (RODC). Even though you may have other writable domain controllers in your environment, Azure AD doesn’t support write redirects.
  • Enabling the Active Directory recycle bin is recommended.
  • The PowerShell execution policy needs to be set to “RemoteSigned” on the server that Azure AD Connect is installed on (a pre-flight check sketch follows this list).
  • Installing on Windows Server Core is not supported.
  • Finally, as discussed in the last post, this is a good time to ensure the UPN and proxyAddresses attributes are set correctly in your on-premises environment.
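
A few of these prerequisites can be checked (and fixed) from PowerShell before you run the installer. A minimal pre-flight sketch, assuming the ActiveDirectory RSAT module is available and using contoso.com as a placeholder forest name:

```powershell
# Run from an elevated PowerShell prompt on the server that will host Azure AD Connect
Import-Module ActiveDirectory

# Forest and domain functional levels must be Windows Server 2003 or later
(Get-ADForest).ForestMode
(Get-ADDomain).DomainMode

# Execution policy needs to be RemoteSigned
Get-ExecutionPolicy
Set-ExecutionPolicy RemoteSigned -Scope LocalMachine

# Enabling the Active Directory Recycle Bin is recommended (it cannot be disabled later)
Enable-ADOptionalFeature -Identity 'Recycle Bin Feature' `
    -Scope ForestOrConfigurationSet -Target 'contoso.com'
```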

So now you can go ahead and install Azure AD Connect. As per the previous post, there are different authentication methods to choose from and these are available as install options in the Azure AD Connect installation wizard:

  • Password Hash Synchronization (PHS) – this can be run as an express installation, which assumes the following:
    • You have a single Active Directory forest on-premises.
    • You have an enterprise administrator account you can use for the installation.
    • You have fewer than 100,000 objects in your on-premises Active Directory.

With an Express installation, you get:

  • Password hash synchronization from on-premises to Azure AD for single sign-on.
  • A configuration that synchronizes users, groups, contacts, and Windows 10 computers.
  • Synchronization of all eligible objects in all domains and all OUs. At the end of the installation, you can run the installation wizard again and choose to filter domains or OUs.
  • Automatic upgrade is enabled to make sure you always use the latest available version.

The other option is Pass-through Authentication (PTA). If you have already run an express installation, all you need to do is select the “Change user sign-in” task from the Azure AD Connect application, select Next and pick PTA as the sign-in method. Once successful, this installs the PTA agent on the same server where Azure AD Connect is installed.

You then need to ensure that Pass-through authentication shows as enabled in the Azure AD Connect blade of your Azure AD tenant.

NOTE: If you turn this feature on, it will affect all users in your managed domain – not just sign-in to Microsoft 365, but also sign-in to other services such as Azure or Dynamics that you may be using the tenant for. So you need to be very aware of the effects of making this change.

Your on-premises users and computers will now synchronize to your Microsoft 365 tenant.
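
If you don’t want to wait for the next scheduled cycle, you can kick off a sync yourself. A quick sketch using the ADSync module that ships with Azure AD Connect:

```powershell
# Run on the Azure AD Connect server
Import-Module ADSync

# View the built-in scheduler (sync interval, next run, etc.)
Get-ADSyncScheduler

# Trigger a delta sync now; use -PolicyType Initial for a full sync
Start-ADSyncSyncCycle -PolicyType Delta
```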

Conclusion

So that’s the quick tour of setting up your tenant, adding domains and confirming DNS settings, and installing and configuring Azure AD Connect.

In the next post, we’ll look at setting up the Hybrid Configuration to enable your on-premises Exchange environment to co-exist with your Microsoft 365 tenant during the migration process. Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 74: Preparing your Active Directory to Sync with Azure AD Connect

It’s Day 74 of my 100 Days of Cloud journey, and today I’m jumping back into the Microsoft 365 ecosystem and taking a look at the steps needed to prepare your on-premises Active Directory environment.

We touched on Microsoft 365 briefly on Day 72 when we looked at whether migrating your on-premises file server to Azure Files or SharePoint was the best option for your business. We also noted that a migration to Microsoft 365 hosted email has traditionally been the first step that the majority of companies take in their journey to public cloud environments.

I’ve decided to step back and look at Microsoft 365 as a whole over the next few posts, where we can see how each service can provide a benefit to your business. But before we do that, let’s take a look at the preparation needed to decide which identity model to use, and the preparation needed in your on-premises Active Directory environment if using the Hybrid identity model.

Authentication Methods

Before you decide on any migration strategy, you need to decide how your users will authenticate to those cloud services. For this we have 2 options:

  1. Cloud-only identity: A cloud-only identity uses user accounts that exist only in Azure AD. Cloud-only identity is typically used by small organizations that do not have on-premises servers or do not use AD DS to manage local identities. All management of these identities is performed using the Microsoft 365 admin center and Windows PowerShell with the Microsoft Azure Active Directory Module.
  2. Hybrid identity: Hybrid identity uses accounts that originate in an on-premises AD DS and have a copy in the Azure AD tenant of a Microsoft 365 subscription. Most changes, with the exception of specific account attributes, only flow one way. Changes that you make to AD DS user accounts are synchronized to their copy in Azure AD. Azure AD Connect runs on an on-premises server, provides ongoing account synchronization, checks for changes in the AD DS, and forwards those changes to Azure AD.

So that all makes sense! Now let’s introduce another layer of complexity and choice. If you choose a Cloud-only identity model, things are straightforward. However, if you choose a Hybrid model, you have 2 authentication options:

  1. Managed Authentication: this is where Azure AD handles the authentication process. Nested within this, you have 2 options:
    • Password hash synchronization (PHS): Azure AD performs the authentication using a hash of the password that has been synchronized from your on-premises Active Directory.
Image Credit: Microsoft
    • Pass-through authentication (PTA): Azure AD redirects the authentication request back to your on-premises Active Directory.
Image Credit: Microsoft

2. Federated authentication: this is primarily for large enterprise organizations with more complex authentication requirements. AD DS identities are synchronized with Microsoft 365 and user accounts are managed on-premises. With federated authentication, users have the same password on-premises and in the cloud, and they do not have to sign in again to use Microsoft 365.

Managing and Protecting Privileged Accounts and Administrator Roles

The general rule of thumb is that we should never assign administrator roles to everyday user accounts, especially accounts that have been synchronized from on-premises.

You should assign dedicated Cloud-only identities for administrator roles, and protect these accounts with Multi-Factor Authentication and/or Azure AD Privileged Identity Management for on-demand, just-in-time assignment of administrator roles. You should also consider using a privileged access workstation (PAW). A PAW is a dedicated computer that is only used for sensitive configuration tasks, such as Microsoft 365 configuration that requires a privileged account.

Managing and Protecting User Accounts

While administrator accounts are the first ones to get protected, sometimes we forget about protecting our user accounts. We’ve recommended MFA for administrator accounts, but we should be enabling and enforcing MFA for all users.

We can also enable other advanced features (depending on our license levels):

  1. Security Defaults: this feature requires all of your users to use MFA with the Microsoft Authenticator app. Users have 14 days to register for MFA with the Microsoft Authenticator app from their smart phones, which begins from the first time they sign in after security defaults has been enabled. After 14 days have passed, the user won’t be able to sign in until MFA registration is completed.
  2. Azure AD Password Protection: detects and blocks known weak passwords and their variants and can also block additional weak terms that are specific to your organization. Default global banned password lists are automatically applied to all users in an Azure AD tenant. You can define additional entries in a custom banned password list. When users change or reset their passwords, these banned password lists are checked to enforce the use of strong passwords.
  3. Conditional Access policies: a set of rules that specify the conditions under which sign-ins are evaluated and access is granted. We looked at this in detail on Day 57.

Keep the following in mind:

  • You cannot enable security defaults if you have any Conditional Access policies enabled.
  • You cannot enable any Conditional Access policies if you have security defaults enabled.

Active Directory Domain Services Preparation

Before you synchronize your AD DS to your Azure AD tenant, you need to clean up your AD DS. This is an important step: if it’s not performed correctly, it can lead to a significant negative impact on the deployment process. It might take days, or even weeks, to go through the cycle of directory synchronization, identifying errors, and re-synchronization.

While there are a number of attributes you need to prepare for synchronization, the most important ones are:

  1. userPrincipalName (UPN): this needs to be a valid and unique value for each user object, as the AD DS UPN matches the Azure AD UPN. This is what users will use to authenticate, and is required to be in the Internet-style sign-in format, for example “firstname.lastname@yourdomain.com”.
    • A note on this – Active Directory uses the sAMAccountName attribute for on-premises authentication. It’s recommended that this match the userPrincipalName prior to syncing your identities, to avoid confusion for both users and administrators; however, this is not required.
    • Another note – if you are using multiple mail domains, you can add multiple UPN suffixes. The article here shows how to add these and also how to change the UPN for one or multiple users (a PowerShell sketch follows this list).
  2. mail: This is the user’s primary email address, and needs to be unique for each user. This attribute can only contain a single value.
  3. proxyAddresses: These are the user’s email addresses; again, the values need to be unique across user objects and cannot contain any spaces or invalid characters. This attribute can have multiple entries if you have multiple mail domains in use.
    • A note here – the primary address will be the same as the mail attribute, and will be in the format “SMTP:name@domain1.com”. Additional addresses will be in the format “smtp:name@domain2.com” (note the uppercase and lowercase prefixes).
  4. displayName: This is how the user’s name will be displayed in the Global Address List, and is typically a combination of the givenName and surname attributes. It doesn’t need to be unique; however, unique values are recommended to avoid confusion.
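
To make the UPN clean-up concrete, here’s a minimal sketch using the ActiveDirectory module. It assumes a routable mail domain of yourdomain.com and a legacy suffix of corp.local – both placeholders – so adjust for your own environment:

```powershell
Import-Module ActiveDirectory

# Add the routable UPN suffix to the forest if it isn't there already
Get-ADForest | Set-ADForest -UPNSuffixes @{Add='yourdomain.com'}

# Find users still using the non-routable suffix and update their UPNs
Get-ADUser -Filter "UserPrincipalName -like '*@corp.local'" |
    ForEach-Object {
        Set-ADUser $_ -UserPrincipalName ($_.SamAccountName + '@yourdomain.com')
    }

# Spot-check mail and proxyAddresses before the first sync
Get-ADUser -Filter * -Properties mail, proxyAddresses |
    Select-Object SamAccountName, mail, proxyAddresses
```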

For optimal use of the Global Address List, it’s recommended that these attributes are populated and correct for each account:

  • givenName
  • surname
  • displayName
  • Job Title
  • Department
  • Office
  • Office Phone
  • Mobile Phone
  • Fax Number
  • Street Address
  • City
  • State or Province
  • Zip or Postal Code
  • Country or Region

Conclusion

In this post, we looked at the preparation required for synchronization: choosing an identity model, securing administrator and user accounts, and finally cleaning up user accounts and attributes.

In the next post, we’ll look at the steps to install Azure AD Connect and synchronize your identities. Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 73: The Value of User Groups

It’s Day 73 of my 100 Days of Cloud journey, and today it’s a quick post about the importance of attending and being a member of Azure and Cloud User Groups.

User Groups are a great way to meet new people and network in the community, but also to learn new skills from guest speakers who are experts.

Over the last few weeks, I’ve attended some excellent User Group sessions with some awesome people in the Cloud Community.

All of these User Groups and many more can be found on meetup.com, and you can also follow the speakers on Twitter or search for them on LinkedIn. Also, most of the sessions from these User Groups are available on their YouTube channels a few days after the events.

So log on to meetup and search for a User Group or Community near you, or you can attend these awesome ones above while they are still hosted as online events!

Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 72: Migrate On-Premise File Server to Azure Files or SharePoint?

It’s Day 72 of my 100 Days of Cloud journey, and today’s post attempts to answer a question that is now at the forefront for the majority of IT departments across the world – we know how to migrate the rest of our infrastructure and applications to the Cloud, but what’s the best solution for the file server?

Traditional Cloud Migration Steps

The first step that most companies make into the Cloud is the migration to Microsoft 365 from on-premises Exchange, because hosted email is appealing given how critical email communication is to businesses. However, although there are numerous services available in the Microsoft 365 stack (which I’ll get into more detail on in a future post), most companies will only use Email and Teams following the migration.

Once Exchange is migrated, that leaves the rest of the infrastructure. We looked at Azure Migrate back on Day 18 and how it can assist with discovery, assessment and migration of on-premises workloads to Azure. Companies will make the decision to migrate their workloads from on-premises infrastructure to Azure IaaS or PaaS services based on the following factors:

  • Legacy or Unsupported Hardware that is expensive to replace.
  • Legacy or Unsupported Virtualization systems (older versions of VMware or Hyper-V).
  • Savings on Data Centre or Comms Room overheads such as power and cooling.
  • The ability to re-architect business applications at speed and scale without the need for additional hardware/software and complicated backup/recovery procedures in the event of failure.
  • Backup and Disaster Recovery costs to meet Compliance and Regulatory requirements.

Once that’s done, the celebrations can begin. It’s done! We’ve migrated to the Cloud! Party time! But wait, what’s that sitting over in the rack in the corner? Covered in dust, humming and drawing power as if to mock you. You approach and see the lights flicker as the disks spin in protest at the read/write operations, struggling to perform the IOPS required by that bloody Accounts spreadsheet ….

Yes, the file server. Except it’s a long time since it was a simple file server. These days, file servers encompass managing storage at an enterprise level, with storage arrays, disk tiers and caching, redundancy and backup, not to mention the cost of the file server operating system’s upkeep and maintenance.

So we need to migrate the File Server as well, but what are our options?

SharePoint

SharePoint empowers your Departments and Project Teams with dynamic and productive team sites from which you can access and share files, data, news, and resources. Collaborate effortlessly and securely with team members inside and outside your organization, across PCs, Macs, and mobile devices.

All organizations with an Office 365 subscription have 1TB of storage available for use in SharePoint. Any additional storage is based on the number of licensed users you have; each user adds an additional 10GB to that SharePoint storage pool. So for example, if you have 50 users, you would have a total of 1.5TB of storage.

You also have the option to add additional storage using Office 365 Extra File Storage; however, this is limited to 25TB. It is only available as an option with the following plans:

  • Office 365 Enterprise E1
  • Office 365 Enterprise E2
  • Office 365 Enterprise E3
  • Office 365 Enterprise E4
  • Office 365 Enterprise E5
  • Office 365 A3 (faculty)
  • Office 365 A5 (faculty)
  • Office for the web with SharePoint Plan 1
  • Office for the web with SharePoint Plan 2
  • SharePoint Online Plan 1
  • SharePoint Online Plan 2
  • Microsoft 365 Business Basic
  • Microsoft 365 Business Standard
  • Microsoft 365 Business Premium
  • Microsoft 365 E3
  • Microsoft 365 E5
  • Microsoft 365 F1

If you move your files into SharePoint libraries, you can then use the OneDrive sync client to sync both the user’s individual files in OneDrive and any SharePoint Online libraries that the user needs frequent offline access to.

One important thing to remember – all licensed Office 365 users have 1TB of personal OneDrive storage available for use, but this storage does not contribute to the overall SharePoint storage pool. You can set sharing and storage limits on both OneDrive and SharePoint using the SharePoint admin center (see the sketch below).
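
If you prefer PowerShell to the admin center, the SharePoint Online Management Shell exposes the same limits. A hedged sketch – the contoso URLs and quota values below are placeholders:

```powershell
# Connect to your tenant's SharePoint admin endpoint
Connect-SPOService -Url https://contoso-admin.sharepoint.com

# Review tenant-wide storage settings
Get-SPOTenant | Select-Object StorageQuota, OneDriveStorageQuota

# Cap an individual site collection at 100GB (the value is in MB)
Set-SPOSite -Identity https://contoso.sharepoint.com/sites/Finance -StorageQuota 102400
```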

With Microsoft 365, you have a number of options to protect the data that you place into SharePoint Online and OneDrive for Business:

  • Restrict the ability to save, download, or print files on non-corporate owned devices.
  • Restrict the ability to offline sync files on non-corporate owned devices.
  • Control what users can do based on their geographic location or device class or platform.

We can also use features available in Azure AD Premium, Microsoft Intune, Office 365 ATP or Azure Information Protection to provide additional protection for the data stored in SharePoint.

You can find out more about SharePoint in the Microsoft documentation here.

Azure Files

Azure Files offers fully managed file shares in the cloud that are accessible via the industry standard Server Message Block (SMB) protocol or Network File System (NFS) protocol. Azure Files file shares can be mounted concurrently by cloud or on-premises deployments.

  • SMB Azure file shares are accessible from Windows, Linux, and macOS clients.
  • NFS Azure file shares are accessible from Linux or macOS clients.

Additionally, SMB Azure file shares can be cached on Windows Servers with Azure File Sync for fast access near where the data is being used. Azure Files is closer to traditional on-premises file shares in that you can use both Active Directory and Azure AD-based authentication to access shares, and you can use Group Policy to map drives just as you would have done with on-premises file shares.
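
As a quick illustration of how familiar this feels, here’s a minimal sketch of mounting an SMB Azure file share as a mapped drive on Windows. The storage account name, share name and key are all placeholders:

```powershell
# Store the storage account credentials so the mount survives reboots
cmdkey /add:mystorageacct.file.core.windows.net /user:localhost\mystorageacct /pass:<storage-account-key>

# Map the share to Z: (-Persist makes it behave like a standard mapped drive)
New-PSDrive -Name Z -PSProvider FileSystem `
    -Root "\\mystorageacct.file.core.windows.net\myshare" -Persist
```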

Azure Files is housed on Azure Storage and has 2 distinct billing options:

  • The provisioned model is only available for premium file shares, which are file shares deployed in the FileStorage storage account kind.
  • The pay-as-you-go model is only available for standard file shares, which are file shares deployed in the general purpose version 2 (GPv2) storage account kind.

Azure Files supports storage capacity reservations, which enable you to achieve a discount on storage by pre-committing to storage utilization. When you purchase reserved capacity, your reservation must specify the following dimensions:

  • Capacity: Can be either 10 TiB or 100 TiB, with more significant discounts for purchasing a higher capacity reservation.
  • Term: Reservations can be purchased for either a one year or three year term.
  • Tier: The tier of Azure Files for the capacity reservation, which can be premium, hot, or cool.
  • Location: The Azure region for the capacity reservation.
  • Redundancy: The storage redundancy for the capacity reservation. Reservations are supported for all redundancies Azure Files supports, including LRS, ZRS, GRS, and GZRS.

Finally, you have the option of Azure File Sync, which is a service that allows you to cache several Azure file shares on an on-premises Windows Server or cloud VM.

You can find out more about Azure Files here, and Azure File Sync here.

Conclusion and Final Thoughts

We’ve seen both options that are available in migrating File Servers to the Microsoft Cloud ecosystem.

From the options we’ve seen, and in my opinion, SharePoint is more suited to smaller businesses that are planning to or have already migrated to Microsoft 365, while Azure Files is more suited to larger enterprises with multiple sites or regions and higher storage requirements.

Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 71: Microsoft Sentinel

It’s Day 71 of my 100 Days of Cloud journey, and today’s post is all about Microsoft Sentinel. This is the new name for Azure Sentinel, following on from the rebranding of a number of Microsoft Azure services at Ignite 2021.

Image Credit: Microsoft

Microsoft Sentinel is a cloud-native Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) solution. It provides intelligent security analytics and threat intelligence across the enterprise, providing a single solution for attack detection, threat visibility, proactive hunting, and threat response.

SIEM and SOAR

We briefly touched on SIEM and SOAR in the previous post on Microsoft Defender for Cloud. Before we go further, let’s look at how Gartner defines SIEM and SOAR:

  • Security information and event management (SIEM) technology supports threat detection, compliance and security incident management through the collection and analysis (both near real time and historical) of security events, as well as a wide variety of other event and contextual data sources. The core capabilities are a broad scope of log event collection and management, the ability to analyze log events and other data across disparate sources, and operational capabilities (such as incident management, dashboards and reporting).
  • SOAR refers to technologies that enable organizations to collect inputs monitored by the security operations team. For example, alerts from the SIEM system and other security technologies — where incident analysis and triage can be performed by leveraging a combination of human and machine power — help define, prioritize and drive standardized incident response activities. SOAR tools allow an organization to define incident analysis and response procedures in a digital workflow format.

Overview of Sentinel Functionality

Microsoft Sentinel gives you a single view of your entire estate – devices, users, applications and infrastructure – across both on-premises and multiple cloud environments. The key features are:

  • Collect data at cloud scale across all users, devices, applications, and infrastructure, both on-premises and in multiple clouds.
  • Detect previously undetected threats, and minimize false positives using Microsoft’s analytics and unparalleled threat intelligence.
  • Investigate threats with artificial intelligence, and hunt for suspicious activities at scale, tapping into years of cyber security work at Microsoft.
  • Respond to incidents rapidly with built-in orchestration and automation of common tasks.

Sentinel can ingest alerts not just from Microsoft solutions such as Defender, Office 365 and Azure AD, but from a multitude of third-party and multi-cloud providers such as Akamai, Amazon, Barracuda, Cisco, Fortinet, Google, Qualys and Sophos (and that’s just to name a few – you can find a full list here). These are what’s known as Data Sources, and the data is ingested using the wide range of built-in connectors that are available:

Image Credit: Microsoft

Once your data sources are connected, the data is monitored using Sentinel integration with Azure Monitor Workbooks, which allows you to visualize your data:

Image Credit: Microsoft

Once the data and workbooks are in place, Sentinel uses analytics and machine learning rules to map your network behaviour and to combine multiple related alerts into incidents, which you can view as a group to investigate and resolve possible threats. The benefit here is that Sentinel lowers the noise created by multiple alerts and reduces the number of alerts that you need to react to:

Image Credit: Microsoft

Sentinel’s automation and orchestration playbooks are built on Azure Logic Apps, and there is a growing gallery of built-in playbooks to choose from. These are based on standard, repeatable events and, in the same way as standard Logic Apps, are triggered by a particular action or event:

Image Credit: Microsoft

Last but not least, Sentinel has investigation tools that dig deep to find the root cause and scope of a potential security threat, and hunting tools based on the MITRE ATT&CK framework which enable you to hunt for threats across your organization’s data sources before an alert is triggered.

Do I need both Defender for Cloud and Sentinel?

My advice on this is yes – because they are 2 different products that integrate with and complement each other.

Sentinel has the ability to detect, investigate and remediate threats. For Sentinel to do this, it needs a stream of data from Defender for Cloud or other third-party solutions.

Conclusion

We’ve seen how powerful Microsoft Sentinel can be as a tool to protect your entire infrastructure across multiple providers and platforms. You can find more in-depth details on Microsoft Sentinel here.

Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 70: Microsoft Defender for Cloud

It’s Day 70 of my 100 Days of Cloud journey, and today’s post is all about Azure Security Center! There’s one problem though: it’s not called that anymore ….

At Ignite 2021 Fall edition, Microsoft announced that the Azure Security Center and Azure Defender products were being rebranded and merged into Microsoft Defender for Cloud.

Overview

Defender for Cloud is a cloud-based tool for managing the security of your multi-vendor cloud and on-premises infrastructure. With Defender for Cloud, you can:

  • Assess: Understand your current security posture using Secure Score, which tells you your current security situation: the higher the score, the lower the identified risk level.
  • Secure: Harden all connected resources and services using either detailed remediation steps or an automated “Fix” button.
  • Defend: Detect and resolve threats to those resources and services, which can be sent as email alerts or streamed to SIEM (Security, Information and Event Management), SOAR (Security Orchestration, Automation, and Response) or IT Service Management solutions as required.
Image Credit: Microsoft

Pillars

Microsoft Defender for Cloud’s features cover the two broad pillars of cloud security:

  • Cloud security posture management

CSPM provides visibility to help you understand your current security situation, and hardening guidance to help improve your security.

Central to this is Secure Score, which continuously assesses your subscriptions and resources for security issues. It then presents the findings into a single score and provides recommended actions for improvement.

The guidance in Secure Score is provided by the Azure Security Benchmark, and you can also add other standards such as CIS, NIST or custom organization-specific requirements.

  • Cloud workload protection

Defender for Cloud offers security alerts that are powered by Microsoft Threat Intelligence. It also includes a range of advanced, intelligent protections for your workloads. The workload protections are provided through Microsoft Defender plans specific to the types of resources in your subscriptions.

The Defender plans page of Microsoft Defender for Cloud offers the following plans for comprehensive defenses for the compute, data, and service layers of your environment (a scripted example follows the list):

  • Microsoft Defender for servers
  • Microsoft Defender for Storage
  • Microsoft Defender for SQL
  • Microsoft Defender for Containers
  • Microsoft Defender for App Service
  • Microsoft Defender for Key Vault
  • Microsoft Defender for Resource Manager
  • Microsoft Defender for DNS
  • Microsoft Defender for open-source relational databases
  • Microsoft Defender for Azure Cosmos DB (Preview)
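
Plans are enabled per subscription. If you’d rather script this than click through the portal, here’s a small sketch with the Az.Security PowerShell module (the plan name shown is just one example):

```powershell
# List the Defender plans and their current tier on the selected subscription
Get-AzSecurityPricing | Select-Object Name, PricingTier

# Enable Microsoft Defender for servers ('Standard' tier = Defender enabled)
Set-AzSecurityPricing -Name 'VirtualMachines' -PricingTier 'Standard'
```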

Azure, Hybrid and Multi-Cloud Protection

Defender for Cloud is an Azure-native service, so many Azure services are monitored and protected without the need for agent deployment. Where agent deployment is needed, Defender for Cloud can deploy the Log Analytics agent to gather data. Azure-native protections include:

  • Azure PaaS: Detect threats targeting Azure services including Azure App Service, Azure SQL, Azure Storage Account, and more data services.
  • Azure Data Services: automatically classify your data in Azure SQL, and get assessments for potential vulnerabilities across Azure SQL and Storage services.
  • Networks: by reducing access to virtual machine ports with just-in-time VM access, you can harden your network by preventing unnecessary access.

For hybrid environments and to protect your on-premises machines, these devices are registered with Azure Arc (which we touched on back on Day 44) and use Defender for Cloud’s advanced security features.

For other cloud providers such as AWS and GCP:

  • Defender for Cloud’s CSPM features assess resources against AWS- or GCP-specific security requirements, and these are reflected in your secure score recommendations.
  • Microsoft Defender for servers brings threat detection and advanced defenses to your Windows and Linux EC2 instances. This plan includes the integrated license for Microsoft Defender for Endpoint amongst other features.
  • Microsoft Defender for Containers brings threat detection and advanced defenses to your Amazon EKS and Google Kubernetes Engine (GKE) clusters.

We can see in the screenshot below how the Defender for Cloud overview page in the Azure Portal gives a full view of resources across Azure and multi-cloud subscriptions, including combined Secure score, Workload protections, Regulatory compliance, Firewall manager and Inventory.

Image Credit: Microsoft

Conclusion

You can find more in-depth details on how Microsoft Defender for Cloud can protect your Azure, Hybrid and Multi-Cloud Workloads here.

Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 69: Azure Logic Apps

It’s Day 69 of my 100 Days of Cloud journey, and today I’m getting my head around Azure Logic Apps.

Azure Logic Apps is a cloud-based platform for creating and running automated workflows that integrate your apps, data, services, and systems.

Comparison with Azure Functions

If this sounds vaguely familiar, it should, because all the way back on Day 55 we looked at Azure Functions, which by definition allows you to create serverless applications in Azure.

So they both do the same thing, right? Well, yes and no. At this stage it’s important to show what the differences are between them.

Let’s start with Azure Functions:

  • Lets you run event-triggered code without having to explicitly provision or manage infrastructure.
  • Azure Functions have a “code-first” (imperative) user experience and are primarily authored via Visual Studio or another IDE.
  • Azure Functions have about a dozen built-in binding types (mainly for other Azure services). If there isn’t an existing binding, you will need to write custom code to create new bindings.
  • With Azure Functions, you have 3 pricing options. You can opt for an App Service Plan, which gives you dedicated resources. The second option is completely serverless, with the Consumption plan based on resources consumed and number of executions. The third option is Functions Premium, which is a hybrid of both the App Service Plan and Consumption Plan.
  • As Azure Functions are code-first, the only options for deployment are Visual Studio, Azure DevOps or FTP.

Now, let’s compare that with Azure Logic Apps:

  • Azure Logic Apps is a cloud service that helps you schedule, automate, and orchestrate tasks, business processes, and workflows when you need to integrate apps, data, systems, and services across enterprises or organizations.
  • Logic Apps have a “Designer-First” (declarative) experience for the user by providing a visual workflow designer accessed via the Azure Portal.
  • Logic Apps have a large collection of connectors to Azure Services, SaaS applications (Microsoft and others), FTP and Enterprise Integration Pack for B2B scenarios; along with the ability to build custom connectors if one isn’t available. Examples of connectors are:
    • Azure services such as Blob Storage and Service Bus
    • Office 365 services such as Outlook, Excel, and SharePoint
    • Database servers such as SQL and Oracle
    • Enterprise systems such as SAP and IBM MQ
    • File shares such as FTP and SFTP
  • Logic Apps has a pure pay-per-usage billing model: you pay for each action that gets executed. However, there are different pricing tiers available; more information is available here.
  • There are many ways to manage and deploy Logic Apps. They can be created and updated directly in the Azure Portal (which will automatically create a JSON template). The JSON template can also be deployed via Azure DevOps and Visual Studio.

The following list describes just a few example tasks, business processes, and workloads that you can automate using the Azure Logic Apps service:

  • Schedule and send email notifications using Office 365 when a specific event happens, for example, a new file is uploaded.
  • Route and process customer orders across on-premises systems and cloud services.
  • Move uploaded files from an SFTP or FTP server to Azure Storage.
  • Monitor tweets, analyze the sentiment, and create alerts or tasks for items that need review.

Concepts

These are the key concepts to be aware of:

  • Logic app – A logic app is the Azure resource you create when you want to develop a workflow. There are multiple logic app resource types that run in different environments.
  • Workflow – A workflow is a series of steps that defines a task or process. Each workflow starts with a single trigger, after which you must add one or more actions.
  • Trigger – A trigger is always the first step in any workflow and specifies the condition for running any further steps in that workflow. For example, a trigger event might be getting an email in your inbox or detecting a new file in a storage account.
  • Action – An action is each step in a workflow after the trigger. Every action runs some operation in a workflow.

How Logic Apps work

The workflow begins with a trigger, which can have a pull or push pattern. Pull triggers are initiated when a regularly scheduled process finds new updates in the source data since its last pull, while push triggers are initiated each time new data is generated in the source itself.

Next, users define a series of actions that run either consecutively or concurrently, based on the specified trigger and schedule. Users can export the workflow to JSON and use this to create and deploy Logic Apps using tools like Visual Studio and Azure DevOps, or they can save logic apps as Azure Resource Manager templates for reuse.
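
As a hedged sketch of that deployment path: once you have an exported workflow definition saved locally, the Az.LogicApp PowerShell module can create the Logic App from it (the resource group, name, location and file path below are all placeholders):

```powershell
# Create a Logic App from an exported workflow definition file
New-AzLogicApp -ResourceGroupName 'rg-demo' `
    -Name 'la-file-alerts' `
    -Location 'westeurope' `
    -DefinitionFilePath '.\workflow.json'
```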

Connectors

Connectors are the most powerful aspect of the structure of a Logic App. Connectors are blocks of pre-built operations that communicate with third-party services as steps in the workflow. Connectors can be nested within each other to provide complex solutions that meet exact use case needs.

Azure contains a catalog of hundreds of available connectors and users can leverage these connectors to accomplish tasks without requiring any coding experience. You can find the full list of connectors here.

Use Cases

The following are the common use cases for Logic Apps:

  • Send an email alert to users based on data being updated in an on-premises database.
  • Query a database and send email notifications based on result criteria.
  • Communication with external platforms and services.
  • Data transformation or ingestion.
  • Social media connectivity using built-in API connectors.
  • Timer- or content-based routing.
  • Create business-to-business (B2B) solutions.
  • Access Azure virtual network resources.

As mentioned above, an Event Grid resource event can be used as a Logic App trigger; the tutorial here gives an excellent run-through of creating a Logic App based on an Event Grid resource event and using the O365 Email connector.

Conclusion

So that’s an overview of Azure Logic Apps and how it compares to Azure Functions. You can find out more about Azure Logic Apps in the official documentation here.

Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 68: Azure Service Bus

It’s Day 68 of my 100 Days of Cloud journey, and today I’m looking at Azure Service Bus.

In the previous posts, we looked at the different services that Microsoft uses to handle events:

  • Azure Event Grid, which is an eventing backplane that enables event-driven, reactive programming. It uses the publish-subscribe model. Publishers emit events, but have no expectation about how the events are handled. Subscribers decide on which events they want to handle.
  • Azure Event Hubs, which is a big data streaming platform and event ingestion service. It can receive and process millions of events per second. It facilitates the capture, retention, and replay of telemetry and event stream data. 

Events v Messages

Both of the above services are based on events, and it’s important to understand the definition of an event:

  • An event is a lightweight notification of a condition or a state change. The publisher of the event has no expectation about how the event is handled. The consumer of the event decides what to do with the notification. Events can be discrete units or part of a series.
  • Discrete events report state change and are actionable. To take the next step, the consumer only needs to know that something happened. The event data has information about what happened but doesn’t have the data that triggered the event.

By contrast, a message is raw data produced by a service to be consumed or stored elsewhere. The message contains the data that triggered the message pipeline. The publisher of the message has an expectation about how the consumer handles the message. A contract exists between the two sides. For example, the publisher sends a message with the raw data, and expects the consumer to create a file from that data and send a response when the work is done.

Azure Service Bus

Azure Service Bus is a fully managed enterprise message broker with message queues and publish-subscribe topics (in a namespace). Service Bus is used to decouple applications and services from each other, whether hosted natively on Azure, on-premises, or with any other cloud vendor such as AWS or GCP. Messages are sent to and kept in queues or topics until requested by consumers in a “poll” mode (i.e. only delivered when requested).

Azure Service Bus provides the following benefits:

  • Load-balancing work across competing workers
  • Safely routing and transferring data and control across service and application boundaries
  • Coordinating transactional work that requires a high degree of reliability

Concepts

  • Queues – Messages are sent to and received from queues. Queues store messages until the receiving application is available to receive and process them. Messages are kept in the queue until picked up by consumers, and are retrieved on a first-in, first-out (FIFO) basis. A queue can have one or many competing consumers, but a message is consumed only once.
Image Credit: Microsoft
  • Topics – A queue allows processing of a message by a single consumer. In contrast to queues, topics and subscriptions provide a one-to-many form of communication in a publish and subscribe pattern, which is useful for scaling to large numbers of recipients. Each published message is made available to each subscription registered with the topic. A publisher sends a message to a topic and one or more subscribers receive a copy of the message, depending on filter rules set on those subscriptions.
Image Credit: Microsoft

You can define rules on a subscription. A subscription rule has a filter to define a condition for the message to be copied into the subscription and an optional action that can modify message metadata.

  • Namespaces – A namespace is a container for all messaging components (queues and topics). Multiple queues and topics can live in a single namespace, and namespaces often serve as application containers. A short provisioning sketch follows this list.
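
To make the queue/topic/namespace relationship concrete, here’s a minimal provisioning sketch using a recent Az.ServiceBus PowerShell module (all names are placeholders, and parameter names have varied slightly between module versions):

```powershell
# Namespace: the container for queues and topics
New-AzServiceBusNamespace -ResourceGroupName 'rg-demo' -Name 'sb-demo-ns' `
    -Location 'westeurope' -SkuName 'Standard'

# Queue: point-to-point messaging with competing consumers
New-AzServiceBusQueue -ResourceGroupName 'rg-demo' -NamespaceName 'sb-demo-ns' -Name 'orders'

# Topic plus subscription: one-to-many publish-subscribe
New-AzServiceBusTopic -ResourceGroupName 'rg-demo' -NamespaceName 'sb-demo-ns' -Name 'order-events'
New-AzServiceBusSubscription -ResourceGroupName 'rg-demo' -NamespaceName 'sb-demo-ns' `
    -TopicName 'order-events' -Name 'billing'
```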

There are also a number of advanced features available in Azure Service Bus:

  • Dead Letter Queue: this is a sub-queue to hold messages that could not be delivered or processed.
  • Consumption Mode: Azure Service Bus supports several consumption modes: pub/sub with a pull model, competing consumers, and partitioning can be achieved with the use of topics, subscriptions, and actions.
  • Duplicate Detection: Azure Service Bus supports duplicate detection natively.
  • Delivery Guarantee: Azure Service Bus supports three delivery guarantees: At-least-once, At-most-once, and Effectively once.
  • Message Ordering: Azure Service Bus can guarantee first-in-first-out using sessions.

Conclusion

That’s a brief overview of Azure Service Bus. You can learn more about Azure Service Bus in the Microsoft documentation here. Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 67: Azure Event Hubs

It’s Day 67 of my 100 Days of Cloud journey, and today I’m looking at Azure Event Hubs.

In the last post, we looked at Azure Event Grid, which is a serverless offering that allows you to easily build applications with event-based architectures. Azure Event Grid supports a number of sources, and one of those is Azure Event Hubs.

Whereas Azure Event Grid can take in events from sources and trigger actions based on those events, Azure Event Hubs is a big data streaming platform and event ingestion service. It can receive and process millions of events per second. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters.

One of the key differences between the 2 services is that while Event Grid can plug directly into Azure services and listen for events coming from their sources, Event Hubs can listen for events coming from sources outside of Azure, and can handle millions of events coming from multiple devices.

The following are some of the scenarios where you can use Event Hubs:

  • Anomaly detection (fraud/outliers)
  • Application logging
  • Analytics pipelines, such as clickstreams
  • Live dashboards
  • Archiving data
  • Transaction processing
  • User telemetry processing
  • Device telemetry streaming

Concepts

Azure Event Hubs represents the “front door” (or event ingestor, to give it its correct name) for an event pipeline, and sits between event producers and event consumers. It decouples the process of producing data from the process of consuming data. You can publish events individually or in batches.

Azure Event Hubs is built around the concepts of partitions and consumer groups. Inside an event hub, events are sent to partitions by specifying the partition key or partition ID. The partition count of an event hub cannot be changed after creation, so be mindful of this limitation.

Image Credit: Microsoft

Receivers are grouped into consumer groups. A consumer group represents a view (state, position, or offset) of an entire event hub. It can be thought of as a set of parallel applications that consume events at the same time.

Consumer groups enable receivers to each have a separate view of the event stream. They read the stream independently at their own pace and with their own offsets. Event Hubs uses a partitioned consumer pattern; events are spread across partitions to allow horizontal scale. Events can be stored in either Blob Storage or Data Lake Storage; this is configured when the event hub is created.

Image Credit: Microsoft
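
Partition count and consumer groups are both set at the resource level. A quick sketch with the Az.EventHub PowerShell module (the names are placeholders; remember the partition count is fixed once the hub is created):

```powershell
# Namespace, then an event hub with 4 partitions
New-AzEventHubNamespace -ResourceGroupName 'rg-demo' -Name 'eh-demo-ns' `
    -Location 'westeurope' -SkuName 'Standard'
New-AzEventHub -ResourceGroupName 'rg-demo' -NamespaceName 'eh-demo-ns' `
    -Name 'telemetry' -PartitionCount 4

# A dedicated consumer group so this reader keeps its own offsets
New-AzEventHubConsumerGroup -ResourceGroupName 'rg-demo' -NamespaceName 'eh-demo-ns' `
    -EventHubName 'telemetry' -Name 'dashboard'
```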

Use Cases

Event Hubs is the component to use for real-time and/or streaming data use cases:

  • Real-time reporting
  • Capture streaming data into files for further processing and analysis – e.g. capturing data from micro-service applications or a mobile app
  • Make data available to stream-processing and analytics services – e.g. when scoring an AI algorithm
  • Telemetry streaming & processing
  • Application logging

Event Hubs is also available as a feature for Azure Stack Hub, which allows you to realize hybrid cloud scenarios. Streaming and event-based solutions are supported, for both on-premises and Azure cloud processing.

Conclusion

You can learn more about Event Hubs in the Microsoft documentation here. Hope you enjoyed this post, until next time!

100 Days of Cloud – Day 66: Azure Event Grid

It’s Day 66 of my 100 Days of Cloud journey, and today I’m looking at Azure Event Grid, which I came across during my AZ-204 studies.

Azure Event Grid is a serverless offering that allows you to easily build applications with event-based architectures. First, select the Azure resource you would like to subscribe to, and then give the event handler or WebHook endpoint to send the event to.

Image Credit: Microsoft

Concepts

Azure Event Grid uses the following concepts which you will need to understand:

  • Events: An event is the smallest amount of information that fully describes something that happened in the system. Every event has common information like the source of the event, the time the event took place, and a unique identifier. Examples of common events would be a file being uploaded, a virtual machine being deleted, a SKU being added, etc.
  • Publishers: A publisher is the user or organization that decides to send events to Event Grid.
  • Event Sources: An event source is where the event happens. Each event source is related to one or more event types. For example, Azure Storage is the event source for blob created events. The following Azure services support sending events to Event Grid:
    • Azure API Management
    • Azure App Configuration
    • Azure App Service
    • Azure Blob Storage
    • Azure Cache for Redis
    • Azure Communication Services
    • Azure Container Registry
    • Azure Event Hubs
    • Azure FarmBeats
    • Azure IoT Hub
    • Azure Key Vault
    • Azure Kubernetes Service (preview)
    • Azure Machine Learning
    • Azure Maps
    • Azure Media Services
    • Azure Policy
    • Azure resource groups
    • Azure Service Bus
    • Azure SignalR
    • Azure subscriptions
  • Topics: The event grid topic provides an endpoint where the source sends events. A topic is used for a collection of related events.
  • Event Subscriptions: A subscription tells Event Grid which events on a topic you’re interested in receiving. When creating the subscription, you provide an endpoint for handling the event.
  • Event Handlers: From an Event Grid perspective, an event handler is the place where the event is sent. The handler takes some further action to process the event. The supported event handlers are listed below (a subscription sketch follows the list):
    • Webhooks. Azure Automation runbooks and Logic Apps are supported via webhooks.
    • Azure functions
    • Event hubs
    • Service Bus queues and topics
    • Relay hybrid connections
    • Storage queues
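
Wiring these concepts together is a one-liner once you know the source resource ID. A hedged sketch with the Az.EventGrid PowerShell module, subscribing a webhook endpoint to blob-created events on a storage account (all names and the endpoint URL are placeholders):

```powershell
# The event source: a storage account's resource ID
$storageId = (Get-AzStorageAccount -ResourceGroupName 'rg-demo' -Name 'stdemo001').Id

# Subscribe a webhook handler to BlobCreated events only
New-AzEventGridSubscription -ResourceId $storageId `
    -EventSubscriptionName 'blob-created-hook' `
    -Endpoint 'https://example.com/api/events' `
    -IncludedEventType 'Microsoft.Storage.BlobCreated'
```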

Capabilities

The key features of Azure Event Grid are:

  • Simplicity – You can direct events from any of the sources listed above to any event handler or endpoint.
  • Advanced filtering – Filter on event type to ensure event handlers only receive relevant events.
  • Fan-out – Subscribe several endpoints to the same event to send copies of the event to as many places as needed.
  • Reliability – 24-hour retry ensures that events are delivered.
  • Pay-per-event – Azure Event Grid uses a pay-per-event pricing model, so you only pay for what you use. The first 100,000 operations per month are free.
  • High throughput – Build high-volume workloads on Event Grid and scale up/down or in/out as required.
  • Built-in Events – Get up and running quickly with resource-defined built-in events.
  • Custom Events – Use Event Grid to route, filter, and reliably deliver custom events in your app.

Use Cases

Azure Event Grid provides several features that vastly improve serverless, ops automation, and integration work:

  • Serverless application architectures
Image Credit: Microsoft

Event Grid connects data sources and event handlers. For example, use Event Grid to trigger a serverless function that analyzes images when added to a blob storage container.

  • Ops Automation
Image Credit: Microsoft

Event Grid allows you to speed automation and simplify policy enforcement. For example, use Event Grid to notify Azure Automation when a virtual machine or database in Azure SQL is created. Use the events to automatically check that service configurations are compliant, put metadata into operations tools, tag virtual machines, or file work items.

  • Application integration
Image Credit: Microsoft

Event Grid connects your app with other services. For example, create a custom topic to send your app’s event data to Event Grid, and take advantage of its reliable delivery, advanced routing, and direct integration with Azure. Or, you can use Event Grid with Logic Apps to process data anywhere, without writing code.

Conclusion

You can learn more about Event Grid in the Microsoft documentation here. Hope you enjoyed this post, until next time!