Azure AD required for Update Compliance after October 15, 2022

This article is contributed. See the original author and article here.

Update Compliance enables organizations to monitor security, quality, and feature updates for Windows 10 or 11 Professional, Education, and Enterprise editions. It’s also one of many services powered by the Windows diagnostic data processor configuration, which allows IT administrators to authorize data to be collected from devices under their management. This blog prepares you for an upcoming set of changes in the requirements for Update Compliance.


The Windows diagnostic data processor configuration was announced in 2021. IT administrators leveraging this configuration are considered the data controllers for Windows diagnostic data collected from their enrolled devices. As defined by the European Union General Data Protection Regulation (GDPR), the data controller role allows you to determine the purposes and means of the processing of personal data.


To use the Windows diagnostic data processor configuration, targeted devices must be Azure Active Directory (Azure AD) joined or hybrid Azure AD joined. As a result, beginning October 15, 2022, devices that are neither joined nor hybrid joined to Azure AD will no longer appear in Update Compliance. All Windows diagnostic data processor prerequisites must be met to continue using the service after that date. The timeline for this change is as follows:


[Image: timeline of the Azure AD requirement for Update Compliance]


How to prepare for this change


Whether you are a current or new Update Compliance user, meet the Azure AD requirement before October 15, 2022 to ensure continuity of your reporting. If your organization has not yet moved to Azure AD, we recommend that you begin your deployment now in preparation for this change. Additionally, if you do not yet have your CommercialID configured, you can do so now. Joining Azure AD and configuring your CommercialID are independent steps that can be taken in any order, but both must be completed by October 15th to use or continue using Update Compliance. Further guidance will be released in the coming months.


What is the difference between Active Directory and Azure AD?


Azure AD is suitable for both cloud-only and hybrid organizations of any size or industry and can reduce the cost of managing Windows devices (except Home editions). Key capabilities include single sign-on (SSO) for both cloud and on-premises resources, Conditional Access through mobile device management (MDM) enrollment and MDM compliance evaluation, and self-service password reset and Windows Hello PIN reset on the lock screen. To learn more, see What is an Azure AD joined device?


Next steps


For a step-by-step guide on how to enroll your devices into Azure AD, see How to: Plan your Azure AD join implementation. This guide provides prescriptive guidance on how to:



  • Review your scenarios

  • Review your identity infrastructure

  • Assess your device management

  • Understand considerations for applications and resources

  • Understand your provisioning options

  • Configure enterprise state roaming

  • Configure Conditional Access


Alternatively, if you have an on-premises Active Directory environment, you may opt for hybrid Azure AD join. In that case, follow the steps outlined in Plan your hybrid Azure Active Directory join deployment. You can learn more about co-management of your cloud and on-premises devices with hybrid Azure AD at Plan your Azure Active Directory device deployment.

Note: Workplace Join does not meet the requirements for Update Compliance after October 15, 2022.



Whether or not your devices are already Azure AD joined (or hybrid joined), you can enroll in and configure Update Compliance by following these instructions: Get started with Update Compliance.


To summarize, if your devices are still using on-premises Active Directory, we recommend that you plan for this upcoming change to Update Compliance. In early 2023, we will replace the use of CommercialID in Update Compliance with the Azure AD tenant ID. In the near future, we will provide additional steps to help you register your Azure AD tenant ID so your targeted devices are properly configured for Update Compliance. Follow the Windows IT Pro Blog, or @MSWindowsITPro on Twitter, to be informed when these steps are available.


For the latest information on the types of Windows diagnostic data and the ways you can manage it within your organization, see Enable Windows diagnostic data processor configuration.




Continue the conversation. Find best practices. Visit the Windows Tech Community.


 

Using research to unlock the potential of hybrid work

This article is contributed. See the original author and article here.

Just last month, we released our 2022 Annual Work Trend Index to better understand how work has changed over the past two years. The biggest takeaway is clear: we’re not the same people that went home to work in early 2020.

The post Using research to unlock the potential of hybrid work appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Integrating Terraform and Azure DevOps to manage Azure Databricks

This article is contributed. See the original author and article here.

As continuous integration and continuous delivery (CI/CD) culture has grown in popularity, it has brought the challenge of automating everything, with the aim of making processes easier and more maintainable for everyone.


 


One of the most valuable aspects of CI/CD is the integration of the Infrastructure as Code (IaC) concept. With IaC we can version our infrastructure, save money, and create new environments in minutes, among many other benefits. I won’t go deeper into IaC here, but if you want to learn more, visit: The benefits of Infrastructure as Code 


 


IaC can also bring some challenges when creating the resources a project needs. Writing all the infrastructure scripts is a task usually assigned to infrastructure engineers, and sometimes their help simply isn’t available.


 


As a Data Engineer, I would like to help you understand the CI/CD process with a hands-on exercise. You’ll learn how to create Azure Databricks resources with Terraform and Azure DevOps, whether you are building projects by yourself or supporting your infrastructure team.


 


In this article, you’ll learn how to integrate Azure Databricks with Terraform and Azure DevOps. The main reason I’m writing it is that I had difficulty finding information that brings these three technologies together.


 


First of all, you’ll need some prerequisites:


 



  • Azure Subscription

  • Azure Resource Group (you can use an existing one)

  • Azure DevOps account

  • Azure Storage Account with a container named “tfstate” (a Terraform sketch of these resources follows this list)

  • Visual Studio Code (it’s up to you)
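
For the storage account prerequisite, here is a minimal Terraform sketch of what those resources look like, assuming the names used later in this demo (demodb-rg and demodbtfstate). In practice you would usually create them once through the portal or CLI, since they hold the Terraform state itself:

  resource "azurerm_resource_group" "demo" {
    name     = "demodb-rg"
    location = "eastus" # example region
  }

  resource "azurerm_storage_account" "tfstate" {
    name                     = "demodbtfstate" # storage account names must be globally unique
    resource_group_name      = azurerm_resource_group.demo.name
    location                 = azurerm_resource_group.demo.location
    account_tier             = "Standard"
    account_replication_type = "LRS"
  }

  resource "azurerm_storage_container" "tfstate" {
    name                  = "tfstate" # the container name the backend expects
    storage_account_name  = azurerm_storage_account.tfstate.name
    container_access_type = "private"
  }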


So, let’s start and have some fun


 


Please go ahead and download or clone this GitHub repository, databrick-tf-ado, and check out the demo-start branch.


In the root folder you’ll see a file named main.tf, and two more files in the folder modules/databricks-workspace.


 


[Screenshot: repository file structure]


 


It should be noted that this is a basic example; you can find more information on all the features of the Databricks provider at this link: https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs 
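
For reference, the registry page above maps to a provider declaration along these lines. This is a minimal sketch; the demo repository already contains its own version of it:

  terraform {
    required_providers {
      azurerm = {
        source = "hashicorp/azurerm"
      }
      databricks = {
        source = "databrickslabs/databricks"
      }
    }
  }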


 


Now, go to the main.tf file in the root folder and find line 8, where the declaration of the azurerm backend starts:


 


 

  backend "azurerm" {
    resource_group_name  = "demodb-rg"
    storage_account_name = "demodbtfstate"
    container_name       = "tfstate"
    key                  = "dev.terraform.tfstate"
  }

 


 


Here you need to change the values of resource_group_name and storage_account_name to the values for your subscription. You can find them in the Azure portal; they need to already exist.


 


[Screenshot: resource group and storage account in the Azure portal]


 


 


In the main.tf file inside the root folder there’s a reference to a module called “databricks-workspace”; in that folder you can see two more files, main.tf and variables.tf.


 


main.tf contains the definitions to create a Databricks workspace, a cluster, a scope, a secret, and a notebook, in the format that Terraform requires, and variables.tf contains the values that could change depending on the environment.
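
To give you an idea of what lives inside the module, here is a minimal sketch of the two core resources, assuming variables.tf declares resource_group_name and location. The names, node type, and Spark version are illustrative, and the scope, secret, and notebook definitions are omitted for brevity (the repository’s module is the reference):

  # The workspace is an Azure resource, managed through the azurerm provider.
  resource "azurerm_databricks_workspace" "this" {
    name                = "demodb-workspace"
    resource_group_name = var.resource_group_name
    location            = var.location
    sku                 = "standard"
  }

  # The databricks provider then talks to the workspace created above.
  provider "databricks" {
    host = azurerm_databricks_workspace.this.workspace_url
  }

  # A small auto-terminating cluster; all values here are examples.
  resource "databricks_cluster" "this" {
    cluster_name            = "demo-cluster"
    spark_version           = "10.4.x-scala2.12"
    node_type_id            = "Standard_DS3_v2"
    autotermination_minutes = 20
    num_workers             = 1
  }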


 


Now that you’ve changed the values mentioned above, push the code to a GitHub or Azure DevOps repository. If you need assistance with that, visit these pages: GitHub or DevOps.


 


At this moment we have our GitHub or DevOps repository configured with the names we require, so let’s create our pipeline to deploy our Databricks environment into our Azure subscription.


 


First, go to your Azure subscription and check that you don’t have a Databricks workspace called demodb-workspace.


 


[Screenshot: Azure portal with no demodb-workspace yet]


 


 


You’ll need to install an extension so Azure DevOps can run Terraform commands, so go to Terraform Extension and install it.


 


Once it’s installed, in your Azure DevOps project click Pipelines > Releases and create a new pipeline. You’ll be given the option of creating the pipeline with YAML or with the editor; I’ll choose the editor so we can see things more clearly.


 


[Screenshot: creating a new release pipeline in the editor]


 


 


In the Artifacts section of the pipeline, click Add an artifact, select your source type (the provider where you uploaded your repository), fill in all the required information as in the image below, and click “Add”.


 


[Screenshot: Add an artifact settings]


 


 


Then click Add stage in the Stages section, choose Empty job, and name the stage “DEV”.


 


[Screenshot: adding the DEV stage]


 


After that, click on Jobs below the name of the stage.


[Screenshot: the Jobs link below the stage name]


 


In the Agent job, press the “+” button, search for “terraform”, and select “Terraform tool installer”.


 


[Screenshot: adding the Terraform tool installer task]


Leave the default settings.


 


Then add three more tasks of the “Terraform” task type.


 


[Screenshot: adding the Terraform tasks]


 


Name the second task (the one after the installer) “Init” and fill in the required information as shown in the image:


 


[Screenshot: Init task configuration]


 


 


For all three of these tasks, set the information for your subscription, resource group, storage account, and container. There’s also a value labeled key; set it to “dev.terraform.tfstate”. This key names the state file Terraform uses to keep track of your infrastructure changes.


 


[Screenshot: subscription, resource group, storage account, container, and key settings]
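
Because the key is just the name of the state blob inside the tfstate container, one pattern you could adopt (my assumption, not something this demo requires) is a separate key per stage, so each environment tracks its own state:

  backend "azurerm" {
    resource_group_name  = "demodb-rg"
    storage_account_name = "demodbtfstate"
    container_name       = "tfstate"
    key                  = "prod.terraform.tfstate" # a hypothetical PROD stage; DEV keeps dev.terraform.tfstate
  }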


 


Name the next task “Plan”:


 


[Screenshot: Plan task configuration]


 


Name the next task “Apply”:


 


[Screenshot: Apply task configuration]


 


Now change the name of your pipeline and save it.


 


[Screenshot: renaming and saving the pipeline]


 


Now we only need to create a release to test it.


 


You can monitor the progress:


 


[Screenshot: release in progress]


 


 


When it finishes, if everything went well, you’ll see your pipeline marked as successful.


 


[Screenshot: successful release]


 


Lastly, let’s confirm in the Azure portal that everything was created correctly:


 


[Screenshot: demodb-workspace created in the Azure portal]


 


Then log in to your workspace and run the notebook, so you can verify that the cluster, the scope, the secret, and the notebook are all working correctly.


 


[Screenshot: running the notebook in the Databricks workspace]


 


 


With that, you can easily keep your environments safe from ad-hoc changes by contributors: the pipeline becomes the single way to accept modifications into your infrastructure.


 


Let us know if you have any comments or questions.

Custom role to restrict Azure Data Factory pipeline developers from creating/deleting linked services

This article is contributed. See the original author and article here.

 


Restrict ADF pipeline developers from creating connections using linked services


 


Azure Data Factory has some built-in roles, such as Data Factory Contributor. Once this role is granted to developers, they can create and run pipelines in Azure Data Factory. The role can be granted at the resource group level or above, depending on the assignable scope you want the users or group to have access to.
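
If you manage Azure RBAC as code, granting that built-in role might look like the following Terraform sketch; the resource group name and the principal variable are hypothetical:

  variable "developer_group_object_id" {
    type = string # object ID of the developers' Azure AD group (hypothetical)
  }

  data "azurerm_resource_group" "adf" {
    name = "my-adf-rg" # hypothetical resource group holding the data factory
  }

  # Grant the built-in role at resource group scope.
  resource "azurerm_role_assignment" "pipeline_devs" {
    scope                = data.azurerm_resource_group.adf.id
    role_definition_name = "Data Factory Contributor"
    principal_id         = var.developer_group_object_id
  }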


 


When there is a requirement that Azure Data Factory pipeline developers should not create or delete linked services to connect to the data sources they have access to, the built-in role (Data Factory Contributor) will not restrict them. This calls for the creation of custom roles. However, you need to be cognizant of the number of role assignments you can have in your subscription. This can be verified by choosing your resource group and selecting Role assignments under Access Control (IAM).


 


How do we create a custom role that allows Data Factory pipeline developers to create pipelines and connect through existing linked services, but not create or delete linked services?


 


The following steps will help to restrict them:



  1. In the Azure portal, select the resource group where you have the data factory created.

  2. Select Access Control (IAM)

  3. Click + Add

  4. Select Add custom role

  5. Under Basics provide a Custom role name. For example: Pipeline Developers

  6. Provide a description

  7. Select Clone a role for Baseline permissions

  8. Select Data Factory Contributor for Role to clone


[Screenshot: cloning the Data Factory Contributor role under Basics]


 



  9. Click Next

  10. Under Permissions select + Exclude permissions


[Screenshot: + Exclude permissions under Permissions]


 



  11. Under Exclude Permissions, type Microsoft Data Factory and select it.


[Screenshot: selecting Microsoft Data Factory under Exclude permissions]


 



  12. Under Microsoft.DataFactory permissions, type Linked service

  13. Select Not Actions

  14. Select Delete: Delete Linked Service and Write: Create or Update any Linked service


[Screenshot: excluding the linked service Delete and Write permissions as Not Actions]


 



  15. Click Add

  16. Click Next

  17. Under Assignable scopes, confirm whether you want the assignable scope to be the resource group or the subscription; delete and add assignable scopes accordingly

  18. Review the JSON tab (a Terraform sketch of the equivalent role definition follows these steps)

  19. Click Review + create

  20. Once validated, click Create
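
If you prefer to keep the custom role in code, here is a minimal Terraform sketch of the same idea. The role name and scope are placeholders, the actions list is simplified (in practice you would clone the full Data Factory Contributor action list), and the NotActions mirror the two permissions excluded above:

  data "azurerm_subscription" "current" {}

  resource "azurerm_role_definition" "pipeline_developers" {
    name        = "Pipeline Developers"
    scope       = data.azurerm_subscription.current.id
    description = "Data Factory Contributor, minus linked service create/delete"

    permissions {
      actions = ["Microsoft.DataFactory/*"] # simplified stand-in for the cloned role's actions
      not_actions = [
        "Microsoft.DataFactory/factories/linkedservices/delete",
        "Microsoft.DataFactory/factories/linkedservices/write",
      ]
    }

    assignable_scopes = [data.azurerm_subscription.current.id]
  }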


 


Note: Once the custom role is created, you can assign a user or group to it. If you log in to Azure Data Factory as that user, you will still be able to create a linked service in the authoring UI, but you will not be able to save/publish it.


 


[Screenshot: publish fails for the linked service with the custom role]


 


 

Organize Automatic Deployment rules in Configuration Manager TP 2204

This article is contributed. See the original author and article here.

Update 2204 for the Technical Preview Branch of Microsoft Endpoint Configuration Manager has been released. In this release, administrators can now organize automatic deployment rules (ADRs) using folders. This feature enables better categorization and management of ADRs. Folder management is also supported with PowerShell cmdlets.


 


[Screenshot: Automatic Deployment Rules folders in the Configuration Manager console]


 


This preview release also includes:


Administration Service Management option


When configuring Azure Services, a new option called Administration Service Management has been added for enhanced security. Selecting this option allows administrators to segment their admin privileges between the cloud management gateway (CMG) and the administration service, restricting access to administration service endpoints only. Configuration Manager clients will authenticate to the site using Azure Active Directory.


 


Note:


Currently, the administration service management option can’t be used with CMG.


 


For more details and to view the full list of new features in this update, check out our Features in Configuration Manager technical preview version 2204 documentation. 


 


Update 2204 for Technical Preview Branch is available in the Microsoft Endpoint Configuration Manager Technical Preview console. For new installations, the 2202 baseline version of Microsoft Endpoint Configuration Manager Technical Preview Branch is available on the Microsoft Evaluation Center. Technical Preview Branch releases give you an opportunity to try out new Configuration Manager features in a test environment before they are made generally available.


 


We would love to hear your thoughts about the latest Technical Preview! Send us feedback directly from the console.


 


Thanks,


The Configuration Manager team


 


Configuration Manager Resources:


Documentation for Configuration Manager Technical Previews


Try the Configuration Manager Technical Preview Branch


Documentation for Configuration Manager


Configuration Manager Forums


Configuration Manager Support