Creating Subscriptions with ARM Templates

This article is contributed. See the original author and article here.

 


As more and more enterprises embrace Azure, complete end-to-end automation for standing up workloads in the cloud becomes one of the most important steps to running at scale.  The latest release (2020-09-01) of the Microsoft.Subscription resource provider enables subscription creation via templates.  To get started, you first need to ensure billing agreements are in place; you can find details on that process here.  Once this is done, a new subscription can be created for the proper workload and billing account.  Once created, the subscription can be referred to by an alias throughout your code or templates.


 


Here’s a look at a subscription resource in a template:


 

"resources": [
    {
        "scope": "/",
        "type": "Microsoft.Subscription/aliases",
        "apiVersion": "2020-09-01",
        "name": "[parameters('subscriptionAlias')]",
        "properties": {
            "workload": "[parameters('subscriptionWorkload')]",
            "displayName": "[parameters('subscriptionDisplayName')]",
            "billingScope": "[tenantResourceId('Microsoft.Billing/billingAccounts/enrollmentAccounts', parameters('billingAccount'), parameters('enrollmentAccount'))]"
        }
    }
]

 


 


Of particular note is the “scope” property.  Subscriptions are a tenant-level resource in Azure and must be PUT at the tenant scope, which is what this property indicates.


 


The prerequisite for creating a subscription is to identify the billing scope for it.  You can find more information about billing scopes in this doc.  A sample script for looking up billing information can be found here.
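If you have the Az.Billing module available, the lookup can be as simple as the following sketch (cmdlet output depends on your billing agreement type, so treat the property names as illustrative):

```powershell
# Sketch: enumerate billing accounts and enrollment accounts to find the
# values needed for the billingScope of the subscription alias.
# Requires the Az.Billing module and a signed-in session (Connect-AzAccount).
Get-AzBillingAccount | Select-Object Name, DisplayName

Get-AzEnrollmentAccount | Select-Object ObjectId, PrincipalName
```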


 


The next step, when using templates for subscription creation, is to determine the scope for the template deployment itself.  All templates are deployed to a specific scope; most commonly this is a resourceGroup, but template deployments can also be done at the subscription, managementGroup, or tenant scope.  The scope of the deployment does not need to match the scope of the resources being deployed, though it often does.  For our template sample, we’ll describe a scenario, or rather a tenant, where no subscriptions (or resourceGroups) exist yet, and we’ll deploy to a managementGroup.  It doesn’t matter which managementGroup we choose for the deployment, because the subscription itself will be created at the tenant scope and placed in the default managementGroup unless a different one is specified.  If this is still a little confusing, just focus on the subscription resource itself: it must be deployed to the tenant scope, and the examples show how to use the “scope” property to indicate that.
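Putting this together, a minimal management-group-scope template that creates only the tenant-scoped subscription alias might look like the following sketch (parameter names are illustrative):

```json
{
    "$schema": "https://schema.management.azure.com/schemas/2019-08-01/managementGroupDeploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "subscriptionAlias": { "type": "string" },
        "subscriptionDisplayName": { "type": "string" },
        "subscriptionWorkload": { "type": "string", "allowedValues": [ "Production", "DevTest" ] },
        "billingAccount": { "type": "string" },
        "enrollmentAccount": { "type": "string" }
    },
    "resources": [
        {
            "scope": "/",
            "type": "Microsoft.Subscription/aliases",
            "apiVersion": "2020-09-01",
            "name": "[parameters('subscriptionAlias')]",
            "properties": {
                "workload": "[parameters('subscriptionWorkload')]",
                "displayName": "[parameters('subscriptionDisplayName')]",
                "billingScope": "[tenantResourceId('Microsoft.Billing/billingAccounts/enrollmentAccounts', parameters('billingAccount'), parameters('enrollmentAccount'))]"
            }
        }
    ]
}
```

Note that even though the template is deployed at the managementGroup scope, the "scope": "/" on the alias resource directs that single resource to the tenant scope.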


 


Note that you must have permission to create template deployments at the scope you target.  Also, permission to deploy a template at a scope does not automatically grant permission to create any other resource, so you must also ensure that you have the necessary permissions to create the resources in your template, if any.  That should cover everything about permissions.


 


A QuickStart sample for deploying a subscription can be found here.  The command for deploying this template is just like any other template deployment; following our example, it would be:


 

New-AzManagementGroupDeployment -ManagementGroupId (Get-AzContext).Tenant.id -Location westeurope -TemplateFile azuredeploy.json -TemplateParameterFile myParameters.json

 


 


This will deploy the template to the “root” managementGroup of the tenant.  Again, remember that you must have permission to deploy to that scope, in this case the root managementGroup.  If you don’t have that permission, you can deploy the template to any other managementGroup, subscription, or resourceGroup.  Also, a reminder that even though the subscription is created at the tenant scope, the template deployment does not need to match that scope.


 


So far, this is a very simple example, but you can also create a subscription and deploy resources to that subscription in the same template.  A little more orchestration is required here because you’re actually targeting multiple scopes within the same template.  And, in order to target the subscription, you need the subscriptionId (the GUID assigned to the subscription when it was created).  This next sample performs each of these steps:


 



  1. Create the subscription (this is shown in the previous sample)

  2. Retrieve the subscriptionId from the newly created alias


 

"outputs": {
    "subscriptionId": {
        "type": "string",
        "value": "[reference(parameters('subscriptionAlias')).subscriptionId]"
    }
}

 


 


3. Pass that subscriptionId to the next deployment in the template


 

"type": "Microsoft.Resources/deployments",
"apiVersion": "2020-10-01",
"name": "[concat('nested-createResourceGroup-', parameters('resourceGroupName'))]",
"location": "[parameters('location')]",
"properties": {
    "expressionEvaluationOptions": {
        "scope": "inner"
    },
    "mode": "Incremental",
    "parameters": {
        "subscriptionId": {
            // this cannot be referenced directly on the subscriptionId property of the deployment so needs to be nested one level
            "value": "[reference(resourceId('Microsoft.Resources/deployments', concat('createSubscription-', parameters('subscriptionAlias')))).outputs.subscriptionId.value]"
        },
...

 


 


4. Create a new deployment in the new subscription that creates the resourceGroup


 

{
    "type": "Microsoft.Resources/deployments",
    "apiVersion": "2020-10-01",
    "name": "[concat('createResourceGroup-', parameters('resourceGroupName'))]",
    "subscriptionId": "[parameters('subscriptionId')]",
    "location": "[parameters('location')]",
    "properties": { ... }
...

 


 


And then finally deploy the resources to that resourceGroup.
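The inner template for step 4 is a subscription-scope template; a minimal sketch that just creates the resourceGroup could look like this (parameter names are illustrative):

```json
{
    "$schema": "https://schema.management.azure.com/schemas/2018-05-01/subscriptionDeploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "resourceGroupName": { "type": "string" },
        "location": { "type": "string" }
    },
    "resources": [
        {
            "type": "Microsoft.Resources/resourceGroups",
            "apiVersion": "2020-10-01",
            "name": "[parameters('resourceGroupName')]",
            "location": "[parameters('location')]"
        }
    ]
}
```

A further nested deployment scoped to that resourceGroup can then deploy the actual workload resources into it.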


 


That’s more complex than just creating a subscription because all of the orchestration is handled within a single template. 


 


If your scenario requires a different deployment scope or more steps, you may not want to include everything in a single template.  Breaking this down into multiple steps for orchestration in a pipeline can be as simple as two steps.


 


Step 1 – Create the Subscription


Performing this step separately can be useful if you do not want to grant a user or service principal permission to create template deployments at a given scope.  Once the subscription is created, the principal that created it is an owner of that subscription and can deploy templates to it.  This means the only permission the principal needs outside of the subscription is the permission to create one.


 


At this writing, Azure PowerShell does not have a built-in command to create a subscription, but you can always invoke any Azure REST API using Invoke-AzRestMethod.  This script shows how to do that to create a subscription through an alias resource, using the following command:


 

.\Create-SubscriptionAlias.ps1 -aliasName "newSub" -DisplayName "demo subscription" -billingAccount "1234567" -enrollmentAccount "654321" -workLoad DevTest

 


 


You need to set the correct parameter values for billingAccount and enrollmentAccount, which you can discover using the script mentioned at the top of this article.
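Under the covers, the script issues a PUT against the alias resource.  A minimal sketch using Invoke-AzRestMethod looks like the following (the billing account and enrollment account IDs are placeholders you must replace with values from your tenant):

```powershell
# Sketch: create a subscription alias via the REST API with Invoke-AzRestMethod.
# The billingScope below is illustrative; substitute your own account IDs.
$body = @{
    properties = @{
        workload     = "DevTest"
        displayName  = "demo subscription"
        billingScope = "/providers/Microsoft.Billing/billingAccounts/1234567/enrollmentAccounts/654321"
    }
} | ConvertTo-Json -Depth 5

Invoke-AzRestMethod -Method PUT `
    -Path "/providers/Microsoft.Subscription/aliases/newSub?api-version=2020-09-01" `
    -Payload $body
```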


 


Step 2 – Deploy the template.


Next, keeping with our greenfield scenario, where the subscription is created in the same workflow or pipeline that deploys this next template, we’ll create a subscription-level deployment.  If you’re running an automated pipeline, this sample is a good example of the next step.  The sample creates a new resourceGroup in the subscription, locks it, and assigns a principal access to that resourceGroup.  From here you can deploy resources to the subscription (or resourceGroup) or simply make it available for the principal to use.
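In a pipeline, this step could be sketched as follows, assuming the subscriptionId captured in step 1 is available in a variable:

```powershell
# Sketch: switch to the newly created subscription, then run a
# subscription-scope deployment of the sample template.
Set-AzContext -Subscription $subscriptionId

New-AzSubscriptionDeployment -Location westeurope `
    -TemplateFile azuredeploy.json `
    -TemplateParameterFile myParameters.json
```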


 


That’s a quick overview of how to leverage this new capability in a few scenarios you can use to automate new workloads in Azure.  Let me know how it goes, or if you have any questions about automating subscription creation in your environments.


 


 

Experiencing Data Access Issue in Azure portal for Log Analytics – 04/28 – Investigating


Initial Update: Wednesday, 28 April 2021 11:33 UTC

We are aware of issues within Log Analytics and are actively investigating. Some customers may experience data access issues and delayed or missed Log Search Alerts in the West US region.
  • Work Around: None
  • Next Update: Before 04/28 16:00 UTC
We are working hard to resolve this issue and apologize for any inconvenience.
-Soumyajeet

How to query data located in Azure Blob Storage, Azure Data Lake Store Gen2/1 with ADX



An external table is a schema entity that references data stored outside the Azure Data Explorer database. Azure Data Explorer Web UI can create external tables by taking sample files from a storage container and creating schema based on these samples. You can then analyze and query data in external tables without ingestion into Azure Data Explorer. For information about different ways to create external tables, see create and alter external tables in Azure Storage or Azure Data Lake.


One of the most common scenarios for external tables is historical data (e.g., data that must be stored due to legal requirements, or log records kept for a longer retention period) that needs to be queried only rarely.


[Image: external table]


 


Please read the create an external table document for a detailed explanation; here are some highlighted points.


 


1. At the Source page, in Link to source, enter the SAS URL of your source container. You can add up to 10 sources (you can remove the 10-source limitation by using the create external table command in the Query page). The first source container will display files below the File filters. In a later step, you will use one of these files to generate the table schema.


2. At the Schema page, on the right-hand side of the tab, you can preview your data. On the left-hand side, you can add partitions to your table definitions to access the source data more quickly and achieve better performance.


3. At the Summary page, you can query this table using the query buttons or with the external_table() function. For more information on how to query external tables, see Querying an external table.
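For reference, the same external table could be created from the query pane with a control command and then queried with external_table(). This is a sketch; the table name, schema, storage URL, and SAS token are all illustrative placeholders:

```kusto
// Sketch: create an external table over CSV files in a blob container
// (the container URL and SAS token below are placeholders).
.create external table ArchivedLogs (Timestamp: datetime, Message: string)
kind=storage
dataformat=csv
(
    h@'https://mystorageaccount.blob.core.windows.net/archive;<SAS-token>'
)

// Query the data in place, without ingesting it into Azure Data Explorer.
external_table('ArchivedLogs')
| where Timestamp > ago(365d)
| count
```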


[Image: external table query]

Announcing General Availability of the new Exchange admin center


The new Exchange admin center (EAC) is a modern, accessible, web-based management portal for managing Exchange Online that is based on the Microsoft 365 admin center experience. The new EAC is simple and accessible, and it enables you to perform tasks like restoring mailboxes, migrating data, and much more.



Since entering Public Preview in June 2020, over half a million admins around the world have used it. We have steadily improved the new EAC with the help of the feedback from a great community of early adopters.



Today, we are excited to announce that the new EAC is now generally available for customers (including GCC customers) in 10 languages. With this announcement, we are also releasing a new dashboard, new usability features, and several intelligent reports to help admins be more productive in their work. The new EAC is expected to be available to customers in GCC High at the end of May 2021, and to customers in DoD at the end of June 2021.



Here are some highlights:


 



  1. Personalized Dashboard, Reports, Insights – The new EAC offers actionable insights and includes reports for mail flow, migration, and priority monitoring.



  2. Azure Cloud Shell – Cloud Shell is a browser-accessible shell that provides a command-line experience built with Azure management tasks in mind. It enables admins to choose a shell experience that best suits their workstyle.



  3. Mailbox management and deleted item recovery – Recipient management is one of the most crucial tasks that admins perform. The new EAC now makes mailbox management easier, including recovering deleted items.



  4. Modern, simplified management of Groups – The new EAC also enables you to create and manage 4 types of groups: Microsoft 365 Groups, distribution lists, mail-enabled security groups, and dynamic distribution lists.



  5. Migration – The new EAC supports various kinds of migrations, including cross-tenant migrations for M&A scenarios and automated Google Workspace/G Suite migrations.



  6. Left navigation panel – The new EAC also includes a new left navigation panel to make it easier to find features.




You can access EAC today at https://admin.exchange.microsoft.com.


 


To learn more, check out https://docs.microsoft.com/en-us/exchange/exchange-admin-center.


 


Take a tour of the new EAC at https://www.microsoft.com/en-us/videoplayer/embed/RE4FqDa.

Experiencing Data Access Issue in Azure portal for Log Analytics – 04/28 – Resolved


Final Update: Wednesday, 28 April 2021 02:39 UTC

We’ve confirmed that all systems are back to normal with no customer impact as of 04/27, 02:10 UTC. Our logs show the incident started on 04/27, 01:46 UTC, and that during the 24 minutes it took to resolve the issue, 5% of customers experienced data access issues and delayed or missed alerts in the Australia Southeast region.
  • Root Cause: Engineers determined that a backend storage device became unhealthy.
  • Mitigation: The service was automatically recovered by the Azure platform.
  • Incident Timeline: 24 minutes – 04/27, 01:46 UTC through 04/27, 02:10 UTC
We understand that customers rely on Azure Log Analytics as a critical service and apologize for any impact this incident caused.

-Vincent