Az Update: Microsoft365 to retire IE11 support, Azure IoT Central updates and more

This article is contributed. See the original author and article here.

Quite a bit of Azure news to cover this week. Items covered include: Microsoft 365 apps retiring support for Internet Explorer 11; Azure IoT Central updates, including an Azure command-line interface (CLI) IoT extension update and a mobile app gateway sample; assigning groups to Azure AD roles, now in public preview; and Video Analytics, now available as a new Azure IoT Central app template.

 

 

Microsoft 365 apps to retire support for Internet Explorer 11, and Windows 10 sunsets Microsoft Edge Legacy

M365_Edge_ProductTeams_IE11_retire_support_end_of_life.png

The Microsoft Teams web app will no longer support IE 11 as of November 30, 2020, and the remaining Microsoft 365 apps and services will no longer support IE 11 as of August 17, 2021. After those dates, customers will have a degraded experience or will be unable to connect to Microsoft 365 apps and services on IE 11. In degraded experiences, new Microsoft 365 features will not be available, or certain features may cease to work when accessing the app or service via IE 11.
 
After March 9, 2021, the Microsoft Edge Legacy desktop app within Windows 10 will not receive new security updates. The new Microsoft Edge includes Internet Explorer mode, which uses the Trident (MSHTML) engine from Internet Explorer 11 (IE11) for legacy sites and can be specifically configured via policy.
 

Azure IoT Central UI new and updated features

A plethora of new Azure IoT Central feature updates landed this week, including:

 

  • Azure command-line interface (CLI) IoT extension update
    • Enables you to troubleshoot and diagnose common issues when connecting a device to IoT Central. For example, you can use the Azure CLI to compare and validate the device property payload against the device model, and to run a device command and view the response.
       
  • Mobile app gateway sample
    • Most medical wearable devices are Bluetooth Low Energy devices that need a gateway to send data to IoT Central. This phone app acts as that gateway and can be used by a patient who has no access to, or knowledge of, IoT Central. You can customize the IoT Central Continuous Patient Monitoring sample mobile app for other use cases and industries.
       

  • Device builder documentation improvements
    • A new article for device developers describes message payloads. It describes the JSON that devices send and receive for telemetry, properties, and commands defined in a device template. The article includes snippets from the device capability model and provides examples that show how the device should interact with the application.
       

  • User management support with the IoT Central APIs
    • You can now use the IoT Central APIs to manage your application users, making it easier to add, remove, and modify users and roles programmatically. You can also add and manage service principals (SPNs) as part of your application to make it easier to deploy IoT Central in your existing release pipeline.
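As an illustration of what the device builder documentation improvements above cover, a device whose template defines temperature and humidity telemetry might send a JSON body like the following (the capability names and values here are invented examples, not taken from a real device template):

```json
{
  "temperature": 21.5,
  "humidity": 60
}
```

Property updates and command request/response payloads follow similar JSON shapes, keyed by the capability names declared in the device template.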
       

Assigning groups to Azure AD roles is now in public preview

This capability is currently available for Azure AD groups and Azure AD built-in roles; Microsoft will extend it in the future to on-premises groups as well as Azure AD custom roles. To use this feature, you'll need to create an Azure AD group and enable it to have roles assigned, which can be done by anyone who is either a Privileged Role Administrator or a Global Administrator. After that, any of the Azure AD built-in roles, such as Teams Administrator or SharePoint Administrator, can have groups assigned to them.
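As a sketch of that flow using the AzureAD preview PowerShell module (the group and role names below are examples, and cmdlet availability depends on the module version you have installed):

```powershell
# Requires the AzureADPreview module and a Privileged Role Administrator
# or Global Administrator account. Names below are examples.
Connect-AzureAD

# Create a group that is enabled for role assignment
$group = New-AzureADMSGroup -DisplayName "Helpdesk admins" `
    -MailEnabled $false -SecurityEnabled $true `
    -MailNickname "helpdeskadmins" -IsAssignableToRole $true

# Assign the Teams Administrator built-in role to the group
$role = Get-AzureADMSRoleDefinition -Filter "displayName eq 'Teams Administrator'"
New-AzureADMSRoleAssignment -RoleDefinitionId $role.Id `
    -PrincipalId $group.Id -DirectoryScopeId '/'
```

Members added to the group afterwards inherit the role through their group membership.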

 

New Azure IoT Central Video Analytics App Template now available

Azure_IoT_central_video_analytics_app_template.png

The new IoT Central video analytics template simplifies the setup of an Azure IoT Edge device to act as the gateway between cameras and Azure cloud services. The template installs IoT Edge modules such as an IoT Central gateway, Live Video Analytics on IoT Edge, an OpenVINO Model Server, and an ONVIF module on the edge device. These modules help the IoT Central application configure and manage the devices, ingest the live video streams from the cameras, and easily apply AI models such as vehicle or person detection. Simultaneously, in the cloud, Azure Media Services and Azure Storage record and stream relevant portions of the live video feed.

 

MS Learn Module of the Week

Microsoft_Learn_Banner.png
 

Develop IoT solutions with Azure IoT Central

Interested in rapidly building enterprise-grade IoT applications on a secure, reliable, and scalable infrastructure? This path is the place to start to learn how to build IoT solutions with Azure IoT Central. IoT Central is an IoT application platform that reduces the burden and cost of developing, managing, and maintaining enterprise-grade IoT solutions.

 

Let us know in the comments below if there are any news items you would like to see covered in next week's show. Az Update streams live every Friday, so be sure to catch the next episode and join us in the live chat.

Azure Policy Remediation with Deployment Scripts

How many times have you wanted to remediate a non-compliant object using Azure Policy, but found you can't because the policy language or the type of object can't be manipulated in that way? Or maybe you've had to write a policy with an audit effect instead of being able to create a deployment to remediate the issue. Deployment Scripts are currently in preview and allow you to execute PowerShell or Azure CLI scripts using Azure Container Instances as part of an Azure Resource Manager template. To put it simply: you can now run a script as part of a template deployment.

 

So my thought was: if I can deploy a template using an Azure Policy DeployIfNotExists effect, why can't I deploy a Deployment Script object which then runs the code to remediate my non-compliant Azure resource?

 

Well, as it turns out, you can! This enables several interesting use cases which are not possible with the default policy language, such as:

  • Deleting orphaned objects.
  • Changing the license type to Azure Hybrid Benefit for existing Azure virtual machines.
  • Detailed tag application: running a script to build a tag value based on many other resources or conditions.
  • Performing data plane operations on objects like Azure Key Vault and Azure Storage.

The rest of this post takes you through how I set this functionality up to ensure that all Windows virtual machines are running with Azure Hybrid Benefit enabled.

 

Azure Policy

I won't go into the details of creating the basic Azure Policy rules; however, you want to ensure that the effect for your policy is DeployIfNotExists. In my case I started off with a very simple rule which helps filter out resources I'm not interested in: as part of the rule I'm looking for resource types which are virtual machines and that have Microsoft Windows Server as the publisher.

 

 

{
    "if": {
        "allOf": [
            {
                "field": "type",
                "equals": "Microsoft.Compute/virtualMachines"
            },
            {
                "field": "Microsoft.Compute/virtualMachines/storageProfile.imageReference.publisher",
                "equals": "MicrosoftWindowsServer"
            }
        ]
    }
}

 

 

 

For the policy effect I specify DeployIfNotExists and then retrieve the same object and apply some more checks to it. This time, as part of the existence condition, I check the license type field to see whether it is correct.

 

 

"existenceCondition": {
    "allOf": [
        {
            "field": "Microsoft.Compute/virtualMachines/licenseType",
            "exists": true
        },
        {
            "field": "Microsoft.Compute/virtualMachines/licenseType",
            "equals": "Windows_Server"
        },
        {
            "field": "Microsoft.Compute/virtualMachines/licenseType",
            "notEquals": "[parameters('StorageAccountId')]"
        }
    ]
}

 

 

 

In the JSON above there is a check against the StorageAccountId parameter, which has nothing to do with the object we're running the policy against, but I need to include it because I've used it as a parameter for my policy. It will always return true, so it isn't really part of the evaluation and by itself won't trigger the deployment. (If you try to add a policy without consuming all the parameters in the policy rules, you will get an error.)
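To make the evaluation concrete, here is a plain-PowerShell sketch (illustrative only; Azure Policy evaluates this server-side, and the values are examples) of what the three existence checks amount to for a given machine:

```powershell
# Illustrative only - not policy syntax
$licenseType      = 'Windows_Server'   # value read from the VM
$storageAccountId = 'dummy-parameter'  # policy parameter, unrelated to the VM

$checks = @(
    ($null -ne $licenseType),               # "exists": true
    ($licenseType -eq 'Windows_Server'),    # "equals": Windows_Server
    ($licenseType -ne $storageAccountId)    # always true; just consumes the parameter
)

if ($checks -notcontains $false) { 'Compliant - no deployment triggered' }
else { 'Non-compliant - DeployIfNotExists fires' }
```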

 

The rest of a DeployIfNotExists policy contains the object I want to deploy, and it does get a bit complicated. If I were to just deploy the deployment script object, it would deploy into the same resource group as the resource to be remediated, which isn't a desirable outcome as it would leave a mess of orphaned objects. The deployment script also requires a storage account to work, and I don't want my subscription littered with random storage accounts. To get around this I create a subscription-level deployment, which deploys a deployment resource, which in turn contains a nested deployment to deploy the deployment script. Confused? Here it is in a diagram, and you can follow the previous links or look at the policy itself.
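The nesting can be sketched as the following skeleton (resource names, locations, and API versions here are illustrative; see the linked policy for the real template):

```json
{
  "type": "Microsoft.Resources/deployments",
  "apiVersion": "2019-10-01",
  "name": "policySubscriptionDeployment",
  "location": "australiaeast",
  "properties": {
    "mode": "Incremental",
    "template": {
      "resources": [
        {
          "type": "Microsoft.Resources/deployments",
          "apiVersion": "2019-10-01",
          "name": "nestedDeploymentScript",
          "resourceGroup": "ACI",
          "properties": {
            "mode": "Incremental",
            "template": {
              "resources": [
                {
                  "type": "Microsoft.Resources/deploymentScripts",
                  "apiVersion": "2019-10-01-preview",
                  "name": "remediationScript",
                  "location": "australiaeast",
                  "kind": "AzurePowerShell"
                }
              ]
            }
          }
        }
      ]
    }
  }
}
```

The outer deployment runs at subscription scope, while the inner deployment is pinned to a dedicated resource group (ACI in my case) so the deployment script objects and their storage account stay in one place.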

 

policy01.png

The best part is we don't have to manage the Azure Container Instance, as the deployment script object does that for us.

 

What I do have to worry about is the script that runs: it uses a user-assigned managed identity which must have permission to manage the resources. In this case I need to give it Reader and Virtual Machine Contributor rights on the subscription so it can change the license type and update the virtual machine.

 

The PowerShell script that runs is simple:

 

 

# Parameter values are supplied via the deployment script's 'arguments' property
Param($ResourceGroupName, $VMName)

# Fetch the VM, enable Azure Hybrid Benefit, and save the change
$vm = Get-AzVM -ResourceGroupName $ResourceGroupName -Name $VMName
$vm.LicenseType = "Windows_Server"
Update-AzVM -VM $vm -ResourceGroupName $ResourceGroupName

 

 

 

The container instance comes with the Az modules already installed, or if you prefer to use the Azure CLI you can specify that in the deployment script object in the template. The script can be provided either inline or as a link to an external URL. If you are linking externally and don't want the script to be in a public location, you may have to provide a SAS URL. I also specify arguments to pass to the script as a concatenated string; the documentation on the deployment script provides more information on these arguments, but you can incorporate parameters from the policy, which means the inputs can come from the non-compliant objects. You can also choose to use an existing storage account, or let the deployment script create one for you.

 

 

"forceUpdateTag": "[utcNow()]",
"azPowerShellVersion": "4.1",
"storageAccountSettings": {
    "storageAccountName": "[parameters('StorageAccountName')]",
    "storageAccountKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('StorageAccountName')), '2019-06-01').keys[0].value]"
},
"arguments": "[concat('-ResourceGroupName ',parameters('VMResourceGroup'),' -VMName ',parameters('VMName'))]",
"retentionInterval": "P1D",
"cleanupPreference": "OnSuccess",
"primaryScriptUri": "https://raw.githubusercontent.com/anwather/My-Scripts/master/license.ps1"
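For clarity, the concat() expression in the arguments property just builds a plain parameter string; an equivalent expansion in PowerShell, with example values, looks like this:

```powershell
# Equivalent of the ARM concat() expression above (values are examples)
$VMResourceGroup = 'myVmRg'
$VMName = 'myVm01'
'-ResourceGroupName ' + $VMResourceGroup + ' -VMName ' + $VMName
# -> "-ResourceGroupName myVmRg -VMName myVm01"
```

The resulting string is handed to the script as ordinary command-line parameters, which is why the script's Param() block can pick them up by name.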

 

 

 

I've linked the policy rule here for you to review. Be careful to observe the flow of the parameters: they are provided in the policy assignment and then passed down through each deployment object as a value. The 'StorageAccountName' parameter is a good example of this.

 

Deploying the Solution

Scripts and policies are located in my GitHub repository – you can clone or download them.

The steps to deploy the required resources and policy are as follows:

  1. Ensure that you have the latest version of the Az PowerShell modules available.
  2. Connect to Azure using Connect-AzAccount.
  3. Modify the deploy.ps1 script and change the values where indicated.

 

 

$resourceGroupName = "ACI" # <- Replace with your value
$location = "australiaeast" # <- This must be a location that can host Azure Container Instances
$storageAccountName = "deploymentscript474694" # <- Unique storage account name
$userManagedIdentity = "scriptRunner" # <- Change this if you don’t like the name

 

 

  4. Run the deploy.ps1 script. The output should look like the example below.

policy02.png

 

The script will create a resource group and storage account, and deploy the policy definition.

 

Create a Policy Assignment

In the Azure portal Policy section, we can now create the assignment and deploy the policy. Click on “Assign Policy”.

 

pol1.png

 

Select the scope you want to assign the policy to and ensure that the correct policy definition is selected.

pol2.png

 

Click Next and fill in the parameters; the values for these are output by the deployment script.

pol3.png

Click Next; you can leave the options as-is on this screen and simply click Review + create. On the final screen, just click Create.

 

The policy will be assigned, and a new managed identity will also be created which allows us to remediate any non-compliant resources.

pol4.png

 

Testing It Out

To test the policy and remediation task I have built a new Windows Server, making sure not to select Azure Hybrid Benefit.

pp1.png

Once the policy evaluation cycle is complete (use Start-AzPolicyComplianceScan to trigger one manually), I can see that my new resource is now showing as non-compliant.
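The compliance scan can be scoped to keep it quick; a hedged example using the Az.PolicyInsights module (the resource group name below is an example):

```powershell
# Trigger an on-demand policy compliance evaluation rather than waiting
# for the next scheduled cycle (Az.PolicyInsights module). Scoping to the
# resource group containing the test VM is faster than a full-subscription scan.
Start-AzPolicyComplianceScan -ResourceGroupName 'myVmRg'
```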

pp2.png

 

I can now go in and create a remediation task for this machine by clicking Create Remediation Task. The task will launch and begin the deployment of my Deployment Script object.
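If you prefer not to click through the portal, the same remediation task can be created with the Az.PolicyInsights module; a sketch (the assignment and task names below are examples):

```powershell
# Create a remediation task for a non-compliant policy assignment
# (Az.PolicyInsights module). Names below are examples.
$assignment = Get-AzPolicyAssignment -Name 'hybridBenefitAssignment'
Start-AzPolicyRemediation -Name 'ahb-remediation' `
    -PolicyAssignmentId $assignment.PolicyAssignmentId
```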

pp3.png

 

I can check the resource group I specified (ACI), where the deployment script objects are created, and see the object in there.

pp4.png

 

Selecting this resource shows the details of the container instance that was launched (it has already been deleted, since the container has run) and the logs. You can also see that during the script deployment it was able to bring the parameters I specified in the template into the script.

pp5.png

 

And finally, we can check the virtual machine itself, where I find that Azure Hybrid Benefit has been applied successfully.

pp6.png

 

When I look at the resource in the Azure Policy blade, it now shows as compliant.

pp7.png

 

So there you have it: what started as a theory for remediating objects has been proven to work nicely, and now I have the task of looking over all my other policies to see what else I can remediate using this method.

 

Known Issues:

  • In the example given, Azure Spot instances can't be remediated using this process.
  • My test cases are small and should in no way substitute for your own testing.
  • The solution is hosted on GitHub; if there are issues or you make changes, please submit a PR for review.

 

Disclaimer:

The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

Experiencing Alerting failure issue in Azure Portal for Many Data Types – 08/20 – Mitigated

Final Update: Thursday, 20 August 2020 23:46 UTC

We've confirmed that all systems are back to normal, with no customer impact, as of 8/14/20, 22:12 UTC. Our logs show the incident started on 9/23/19, 00:00 UTC, and that during the 326 days it took to resolve the issue, 138 customers using metric alerts based on custom metrics in the Brazil South region may have seen incorrect alert activations and/or failures.

  • Root Cause: The failure was due to a backend storage configuration issue.
  • Incident Timeline: 326 days, 22 hours, 12 minutes (9/23/2019, 00:00 UTC through 8/14/2020, 22:12 UTC)
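The timeline arithmetic can be checked quickly in PowerShell (pure .NET date math, no Azure connection needed):

```powershell
# Verify the incident duration from the UTC timestamps above
$start = [datetimeoffset]'2019-09-23T00:00:00+00:00'
$end   = [datetimeoffset]'2020-08-14T22:12:00+00:00'
$end - $start   # -> a TimeSpan of 326 days, 22 hours, 12 minutes
```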

We understand that customers rely on Application Insights as a critical service and apologize for any impact this incident caused.

-Jeff


Azure portal August 2020 update

This month's updates cover the following areas:

  • Databases > SQL Database
  • All categories
  • Intune

Let's look at each of these updates in greater detail.

           

Databases > SQL Database

Open your SQL database in Azure Data Studio

The Azure portal now includes a one-click connection to Azure Data Studio from any Azure SQL Database. Azure Data Studio offers a modern editor experience with IntelliSense, code snippets, source control integration, SQL Notebooks, and an integrated terminal. Simply open the Azure portal, navigate to any SQL database, and click "Connect with Azure Data Studio". The experience includes a link to download Azure Data Studio if you do not already have it. The connection info is passed to Azure Data Studio, and the connection will either complete automatically with Azure AD or require just a password.

 

  1. Sign in to Azure portal
  2. Click “All Services”
  3. Search and select “Azure SQL”
  4. Find and select any SQL database
  5. Click “Connect with…”, then click “Azure Data Studio”

sql.png

  6. Click to download or launch Azure Data Studio

sql2.png

  7. Connect to your database and try out Azure Data Studio!

sql3.png

 

All Categories

Work with a freelancer

It’s now easier than ever to connect and start working with a freelancer who can help you complete your short-term, on-demand Azure projects. Microsoft and Upwork are partnering to provide you with easy access to freelancers with current Azure certifications.

In the upper right corner of the Azure portal, click on the question mark.  You’ll see the freelancer information pop up below.

 

 

All Categories

View a summary of your resources on a map and in other charts

On certain resource list views, you can now see a summary count of your resources. This feature lets you visualize your resources in a chart, summarized by location, resource group, subscription, or resource type. A particularly powerful use case is representing your resources on a map.

 

a. Go to a list of resources and check whether it has a "List view" dropdown at the top right of the list. If so, you can proceed with the demo; otherwise, use the "All resources" list as an example.

resource2.png

b. In the dropdown, select "Summary view" to see a summary count of your resources. You can choose to summarize by location, resource group, subscription, and type, if applicable, from the menu on the left.

resourceb.png

c. You can use the filters to scope your results.

resourcec.png

d. In this view, you can change the visualization to a map, bar chart, donut chart, or list. (Note that maps are available for location only).

resourced.png

e.  If you have more than 10 items in a bar or donut chart, there will be a dropdown to choose your summary preference (see screenshot). If you want to see more than 10 items, change the visualization to “list” as described in step d.

resourcee.png

f.  If you like this view, you can save it using the “Manage view” dropdown for easy access.

resourcef.png

g.  If you want to drill down into one of the summary items, click into the item and you will see a list of the resources in that category.

resourceg.png

 

All Categories

Portal search improvements

We have made several improvements to the portal’s search capabilities.

  • You can now use the portal’s global search bar to search for resources by IP address. The search will find all resources that have the specified IP address anywhere within their resource properties. 
  • You can now search for Azure invoices by typing in an invoice id.
  • We have improved the search functionality on all pages that have a menu on the left. Previously, a term had to be spelled correctly to produce results; now, slight misspellings are accepted and matching results will still be shown.

 

Step 1 – Log into the portal

Step 2 – Paste an IP address or an invoice ID into the global search bar.

 

Sample search by IP address:

portal search 1.png

Sample search by Invoice ID:

portal search 2.png

 

Intune

Updates to Microsoft Intune

 

The Microsoft Intune team has been hard at work on updates as well. You can find the full list of updates to Intune on the What’s new in Microsoft Intune page, including changes that affect your experience using Intune.

 

 

Azure portal “how to” video series

Have you checked out our Azure portal “how to” video series yet? The videos highlight specific aspects of the portal so you can be more efficient and productive while deploying your cloud workloads from the portal.  Check out our most recently published videos:

 

 

Next steps

The Azure portal has a large team of engineers that wants to hear from you, so please keep providing us your feedback in the comments section below or on Twitter @AzurePortal.

 

Sign in to the Azure portal now and see for yourself everything that’s new. Download the Azure mobile app to stay connected to your Azure resources anytime, anywhere.  See you next month!

 

 

Azure Sphere OS version 20.08 is now available

The Azure Sphere 20.08 OS quality release is now available in the Retail feed. This update includes enhancements and bug fixes in the Azure Sphere OS including a security update that represents a critical update to the OS. Because of this, the retail evaluation period has been shortened.

 

In addition, 20.08 includes new sample applications; it does not include an updated SDK.

 

The following changes are included in the 20.08 OS release:

  • Upgrades to incorporate a critical security update. This update addresses a CVE that has not yet been assigned an ID. We will update this post to provide more information when available. Please check back for updates.
  • Resolution of a problem that caused Ethernet-enabled devices to receive a non-random MAC address after OS recovery to 20.07 (via the azsphere device recover command).
  • Resolution of an issue with the system time not being maintained with RTC and battery.
  • Changes to WifiConfig_GetNetworkDiagnostics to return AuthenticationFailed in a manner consistent with 20.06 and earlier. This change fixes the issue mentioned for the 20.07 release.
  • Improvements to Networking_GetInterfaceConnectionStatus to more accurately reflect the ConnectedToInternet state.
  • Updated the Linux kernel to 5.4.54.

 

For hardware manufacturers:

The 20.08 release also contains important improvements for hardware manufacturers. These changes support greater flexibility in the manufacturing process and improved stability for customers.

 

We've updated the EEPROM configuration file, which is used with the command-line FT_PROG tool, to:

  • Disable Virtual COM ports to improve the stability of device-to-PC communications and reduce the number of PC crashes.
  • Supply unique serial numbers that start with ‘AS’ to distinguish Azure Sphere devices from other USB Serial Converters that may be present on the same PC.

We strongly recommend that you use the updated configuration file to program your devices.

The documentation now also shows how to program the FTDI EEPROM on multiple boards in parallel. See the MCU programming and debugging interface topic for details.

 

Sample Applications

We are also releasing three new sample applications on Friday, 8/21. We will update this post to provide a direct link when they are available. Please check back on Friday.

  • wolfSSL Sample for client-side TLS APIs – demonstrates using wolfSSL for SSL handshake in a high-level application.
  • Low power MCU-to-cloud reference solution – demonstrates how you might connect an external MCU to an Azure Sphere device and send data to IoT Central. This reference solution is optimized for low-power scenarios.
  • Error reporting tutorial – demonstrates how to use error reporting features on the Azure Sphere platform.

 

For more information

For more information on Azure Sphere OS feeds and setting up an evaluation device group, see Azure Sphere OS feeds. 

 

If you encounter problems

For self-help technical inquiries, please visit Microsoft Q&A or Stack Overflow. If you require technical support and have a support plan, please submit a support ticket in Microsoft Azure Support or work with your Microsoft Technical Account Manager. If you would like to purchase a support plan, please explore the Azure support plans.

 

 

Azure SQL Capacity Planning: DTU or vCore? | Data Exposed

 

The capacity planning process plays a critical role both in migrating an existing application and in designing a brand-new one. In the second part of this three-part series with Silvano Coriani, learn about the differences between DTU and vCore when planning for your Azure SQL Database. For an overview of Azure SQL capacity planning, watch part one.

 

Watch on Data Exposed

 

Additional Resources:
Choose between the vCore and DTU purchasing models
vCore model overview
Service tiers in the DTU-based purchase model
Migrate Azure SQL Database from the DTU-based model to the vCore-based model
Query Performance Insight for Azure SQL Database
Troubleshoot with Intelligent Insights

 

View/share our latest episodes on Channel 9 and YouTube!