Microsoft Defender for Endpoint RedHat 7.9


This article is contributed. See the original author and article here.

 


Red Hat Linux Manual Deployment 


Note: This document is in support of Microsoft Defender for Endpoint (MDE, formerly MDATP) on Red Hat Enterprise Linux (RHEL) 


Disclaimer:  This may not work on all versions of Linux.


System requirements: 


 



  • Linux server distributions and versions: Red Hat Enterprise Linux 7.2 or higher. 



  • The fanotify kernel option must be enabled. 
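The fanotify requirement can be checked by looking for the option in the kernel build configuration. A minimal sketch, assuming the RHEL convention of shipping the config under /boot (the helper name is mine):

```shell
# Return success when the given kernel config file has fanotify enabled
# (CONFIG_FANOTIFY=y), which MDE real-time protection requires.
fanotify_enabled() {
    [ -r "$1" ] && grep -q '^CONFIG_FANOTIFY=y' "$1"
}

# Usage on the server (RHEL keeps the running kernel's config in /boot):
# fanotify_enabled "/boot/config-$(uname -r)" && echo "fanotify enabled"
```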


 


Instructions to Prepare for MDE/MDATP Installation: 


 


1. Connect to the Red Hat server over SSH (for example, using PuTTY). 


2. Install yum-utils if it isn’t already installed: 


sudo yum install yum-utils 


 


[azureuser@redhat ~]$ sudo yum install yum-utils 


 


3. Locate the MDATP repository channel for Red Hat.  


From a web browser go to https://packages.microsoft.com/config/ to select your OS, version, and channel. 


 


pbracher_0-1615303062604.png


 


4. I have Red Hat version 7.9 and chose the production channel for 7.4, which is the highest version available without going to the next major version. Copy the link ending in prod.repo for use in the next step. For example: https://packages.microsoft.com/config/rhel/7.4/prod.repo 
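The channel URL follows a predictable pattern — distribution, version, then channel — so the pick above only changes two path segments. As a small illustrative sketch (the helper name is mine; prod and the insiders channels are the ones published on packages.microsoft.com):

```shell
# Build the packages.microsoft.com repo URL for a given RHEL version
# and channel name.
repo_url() {
    printf 'https://packages.microsoft.com/config/rhel/%s/%s.repo\n' "$1" "$2"
}

# Example: a RHEL 7.9 host uses the closest published version, 7.4:
# repo_url 7.4 prod
```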


 


5. Install the Package. 


sudo yum-config-manager --add-repo=https://packages.microsoft.com/config/rhel/7.4/prod.repo 


 


[azureuser@redhat ~]$ sudo yum-config-manager --add-repo=https://packages.microsoft.com/config/rhel/7.4/prod.repo 


Loaded plugins: langpacks, product-id 


adding repo from: https://packages.microsoft.com/config/rhel/7.4/prod.repo 


grabbing file https://packages.microsoft.com/config/rhel/7.4/prod.repo to /etc/yum.repos.d/prod.repo 


repo saved to /etc/yum.repos.d/prod.repo 


[azureuser@redhat ~]$ 


 


 6. Install the Microsoft GPG public key: 


 


sudo rpm --import http://packages.microsoft.com/keys/microsoft.asc 


[azureuser@redhat ~]$ sudo rpm --import http://packages.microsoft.com/keys/microsoft.asc 


[azureuser@redhat ~]$  


 


 7. Make all the metadata usable for the currently enabled yum repositories:  


yum makecache 


 


[azureuser@redhat ~]$ yum makecache 


Loaded plugins: langpacks, product-id, search-disabled-repos 


(1/5): packages-microsoft-com-prod/primary_db        118 kB  00:00:00 


(2/5): packages-microsoft-com-prod/other_db         7.2 kB  00:00:00 


(3/5): packages-microsoft-com-prod/filelists_db       341 kB  00:00:00 


(4/5): rhui-microsoft-azure-rhel7/filelists                    372 B  00:00:00 


(5/5): rhui-microsoft-azure-rhel7/other                      254 B  00:00:00 


rhui-microsoft-azure-rhel7      1/1 


rhui-microsoft-azure-rhel7      1/1 


rhui-microsoft-azure-rhel7      1/1 


Metadata Cache Created 


[azureuser@redhat ~]$ 


 


Install MDE/MDATP Application: 


 



  1. Run install command 


sudo yum install mdatp 


 


[azureuser@redhat ~]$ sudo yum install mdatp 


 


Loaded plugins: langpacks, product-id, search-disabled-repos 


packages-microsoft-com-prod     | 3.0 kB  00:00:00 


packages-microsoft-com-prod/primary_db       118 kB  00:00:00 


 


Resolving Dependencies 


--> Running transaction check 


---> Package mdatp.x86_64 0:101.18.53-1 will be installed 


--> Processing Dependency: libatomic for package: mdatp-101.18.53-1.x86_64 


--> Running transaction check 


---> Package libatomic.x86_64 0:4.8.5-44.el7 will be installed 


--> Finished Dependency Resolution 


 


Dependencies Resolved 


 


======================================================================== 


 Package            Arch         Version           Repository                      Size 


Installing: 


 mdatp              x86_64       101.18.53-1       packages-microsoft-com-prod     42 M 


Installing for dependencies: 


 libatomic          x86_64       4.8.5-44.el7      rhui-rhel-7-server-rhui-rpms    51 k 


 


Transaction Summary 


 


Install  1 Package (+1 Dependent package) 


 


Total download size: 42 M 


Installed size: 145 M 


Is this ok [y/d/N]: y 


Downloading packages: 


(1/2): libatomic-4.8.5-44.el7.x86_64.rpm    |  51 kB  00:00:00 


(2/2): mdatp_101.18.53.x86_64.rpm            |  42 MB  00:00:01 


-------------------------------------------------------------------------------- 


Total                                                                                                                                                                                                         32 MB/s |  42 MB  00:00:01 


Running transaction check 


Running transaction test 


Transaction test succeeded 


Running transaction 


  Installing : libatomic-4.8.5-44.el7.x86_64    1/2 


  Installing : mdatp-101.18.53-1.x86_64         2/2 


  Verifying  : libatomic-4.8.5-44.el7.x86_64     1/2 


  Verifying  : mdatp-101.18.53-1.x86_64          2/2 


rhui-rhel-7-server-dotnet-rhui-rpms/x86_64/productid  | 2.1 kB  00:00:00 


rhui-rhel-7-server-rhui-extras-rpms/x86_64/productid    | 2.1 kB  00:00:00 


rhui-rhel-7-server-rhui-rpms/7Server/x86_64/productid | 2.1 kB  00:00:00 


rhui-rhel-7-server-rhui-supplementary-rpms/7Server/x86_64/productid   | 2.1 kB  00:00:00 


rhui-rhel-server-rhui-rhscl-7-rpms/7Server/x86_64/productid        | 2.1 kB  00:00:00 


 


Installed: 


  mdatp.x86_64 0:101.18.53-1 


 


Dependency Installed: 


  libatomic.x86_64 0:4.8.5-44.el7 


 


Complete! 


[azureuser@redhat ~]$ 


 


 2. List all repositories. Make sure packages-microsoft-com-prod appears in the list if you chose prod.repo (production).  


yum repolist 


 


[azureuser@redhat ~]$ yum repolist 


 


Loaded plugins: langpacks, product-id, search-disabled-repos 


repo id                          repo name                          status 


packages-microsoft-com-prod      packages-microsoft-com-prod            89 


[azureuser@redhat ~]$ 


 


 3. Install the package from the production repository: 


 


sudo yum --enablerepo=packages-microsoft-com-prod install mdatp 


 


[azureuser@redhat ~]$ sudo yum --enablerepo=packages-microsoft-com-prod install mdatp 


 


Loaded plugins: langpacks, product-id, search-disabled-repos 


Package mdatp-101.18.53-1.x86_64 already installed and latest version 


Nothing to do 


[azureuser@redhat ~]$ 


 


 


Download the onboarding package & onboard


 



Download the onboarding package from Microsoft Defender Security Center from your Workstation: 



  1. In Microsoft Defender Security Center, go to Settings > Device Management > Onboarding. 

  2. In the first drop-down menu, select Linux Server as the operating system. In the second drop-down menu, select Local Script (for up to 10 devices) as the deployment method. 

  3. Select Download onboarding package. Save the file as WindowsDefenderATPOnboardingPackage.zip to your workstation. 


 


Copy WindowsDefenderATPOnboardingPackage.zip from the workstation to the RHEL server. PuTTY (which includes pscp.exe) must be installed on the workstation. Here we are using a private key to log in and copy the file. 


 


C:\>pscp.exe -P 22 -i C:\Users\azureuser\Downloads\redhat_key.ppk C:\Users\azureuser\WindowsDefenderATPOnboardingPackage.zip azureuser@ipaddressoflinuxserver:/home/azureuser 


 


WindowsDefenderATPOnboard | 5 kB |   5.6 kB/s | ETA: 00:00:00 | 100% 


 


Connect back to the Linux server (PuTTY). 


 


[azureuser@redhat ~]$ cd .. 


[azureuser@redhat home]$ cd azureuser/ 


[azureuser@redhat ~]$ ls 


 


WindowsDefenderATPOnboardingPackage.zip 


 


   4. Unzip WindowsDefenderATPOnboardingPackage.zip 


 


[azureuser@redhat ~]$ unzip WindowsDefenderATPOnboardingPackage.zip 


Archive:  WindowsDefenderATPOnboardingPackage.zip 


inflating: MicrosoftDefenderATPOnboardingLinuxServer.py 


[azureuser@redhat ~]$ 


 


   5. Check the health of MDATP, which should report that no license was found:    


mdatp health --field org_id 


 


[azureuser@redhat ~]$ mdatp health --field org_id 


ATTENTION: No license found. Contact your administrator for help. 


unavailable 


[azureuser@redhat ~]$ 


 


  6. Run Onboarding script: 


     sudo python MicrosoftDefenderATPOnboardingLinuxServer.py 


 


[azureuser@redhat ~]$ sudo python MicrosoftDefenderATPOnboardingLinuxServer.py 


Generating /etc/opt/microsoft/mdatp/mdatp_onboard.json … 


[azureuser@redhat ~]$ 


 


   7. Check the health of MDATP again:  mdatp health --field org_id 


 


[azureuser@redhat ~]$ mdatp health --field org_id 


5447sdf90-2220-4161-82f7-0dgs2f39h8329-125fd412 
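The two health checks above differ only in their output: before onboarding the org_id field prints "unavailable", afterwards it prints the organization ID. That decision can be sketched as a small helper for scripted deployments (the function name is mine, not part of mdatp):

```shell
# Decide from the output of `mdatp health --field org_id` whether the
# device has been onboarded: empty or "unavailable" means not onboarded.
onboarded() {
    case "$1" in
        ""|unavailable*) return 1 ;;  # no license / not yet onboarded
        *)               return 0 ;;  # an org ID was returned
    esac
}

# Usage on the server (assumes mdatp is installed):
# onboarded "$(mdatp health --field org_id 2>/dev/null)" && echo "onboarded"
```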


 


   8. Check the MDATP Azure console: 


 


pbracher_1-1615303062613.png


 


 

SharePoint monthly community call – March 9, 2021


This article is contributed. See the original author and article here.

The SharePoint PnP Community monthly call is our general monthly review of the latest SharePoint and Microsoft 365 PnP topics (news, tools, extensions, features, capabilities, content and training), engineering priorities and community recognition for Developers, IT Pros and Makers. This monthly community call happens on the second Tuesday of each month. You can download the recurring invite from https://aka.ms/sp-call.


 



 


Call Summary:


If you’re looking at this blog post, then you are at the new Microsoft 365 PnP Community hub at Microsoft Tech Communities!  Please take a moment to look around. The Microsoft 365 Update – Community (PnP) | March 2021 is available. In this call, the Top 10 developer and non-developer entries in UserVoice are reviewed and top engineering priorities identified.  


 


Your votes do influence engineering priorities.  You are invited to attend the growing list of Sharing is Caring events.  Register today.  In Episode 117 of PnP Weekly, tools and approaches for simplifying the move from Classic to Modern and from on-prem to cloud were discussed.  Why Modern?  Well, one reason – Viva capabilities like the one that will be demonstrated today are available only in Modern.  Testing of SPFx v1.12 is on the final stretch.  Release expected any day.  


 


Thank you to the 200 + active contributors and organizations actively participating in this PnP Community during February. You are truly amazing.  The host of this call was Vesa Juvonen (Microsoft) @vesajuvonen.  Q&A took place in the chat throughout the call.


 


march-sp-monthly-together-mode.gif


 


Demo: Getting started with Microsoft Viva Topics – system and tools to help customers manage knowledge within their organizations through a conscious AI assisted strategy of connecting people and actionable knowledge.  Content is ultimately rendered through the Topic web part.  Topics along with aligned content and SMEs are initially discovered through AI algorithms, then confirmed and curated by humans.   Topics draws on capabilities from across Microsoft and can be extended by you.  


 


Actions: 


 



  • Register for Sharing is Caring Events

    • First Time Contributor Session – March 22nd  (EMEA, APAC & US friendly times available)

    • Community Docs Session – March

    • PnP – SPFx Developer Workstation Setup – March 10th

    • PnP SPFx Samples – Solving SPFx version differences using Node Version Manager – March 11th

    • PnP – AMA (Ask Me Anything) – SPFx Samples Edition – March 9th

    • First Time Presenter – March 24th

    • More than Code with VSCode – March 23rd

    • Maturity Model Practitioners – March 16th

    • PnP Office Hours – 1:1 session – Register



  • Download the recurrent invite for this call – https://aka.ms/sp-call.


 


You can check the latest updates in the monthly summary and at aka.ms/spdev-blog.


This call was delivered on Tuesday, March 9, 2021. The call agenda is reflected below with direct links to specific sections.  You can jump directly to a specific topic by clicking on the topic’s timestamp which will redirect your browser to that topic in the recording published on the Microsoft 365 Community YouTube Channel.


 


Call Agenda:


 



  • UserVoice status for non-dev focused SharePoint entries – 4:10

  • UserVoice status for dev focused SharePoint Framework entries – 5:08

  • SharePoint community update with latest news and roadmap – 8:57

  • Community contributors and companies which have been involved in the past month – 10:40

  • Demo:  Getting started with Microsoft Viva Topics – Naomi Moneypenny (Microsoft) | @nmoneypenny – 14:26


 


The full recording of this session is available from Microsoft 365 & SharePoint Community YouTube channel – http://aka.ms/m365pnp-videos.


 



  • Presentation slides used in this community call are found at OneDrive.


 


Resources: 


Additional resources on covered topics and discussions.


 



 


Additional Resources: 


 



 


Upcoming calls | Recurrent invites:


 



 


“Too many links, can’t remember” – not a problem… just one URL is enough for all Microsoft 365 community topics – http://aka.ms/m365pnp.


 


“Sharing is caring”




SharePoint Team, Microsoft – 10th of March 2021


  


 

Announcing Az Predictor preview 2


This article is contributed. See the original author and article here.

Last November, we announced the first preview of Az Predictor, a PowerShell module for Azure that brings to your fingertips the entire knowledge of the Azure documentation customized to your current session.
Today we are announcing the second preview of Az Predictor and we want to share some clarity on our plans for the next few months.


 


AzPredictor-preview2-dynamichelp.png


 


 


What did we learn from the first preview?


Since the release of the first preview, we listened to customer feedback and identified some challenges.



  1. Customers believed that the predictor was not functional. Since the service that delivers the predictions did not experience any outages, we believe this feedback was caused by the following reasons:

    1. The module had to be imported manually and several customers either forgot or did not know that they had to import the module.

    2. The default configuration of PSReadline had to be changed to show the predictions from Az Predictor.



  2. After accepting a suggestion, navigating through the parameter values can be complicated, especially when the list of parameters is long.

  3. We were not making any suggestions for several modules (for example Az.MySql).


 


What has changed?


The module now exposes two cmdlets, ‘Enable-AzPredictor’ and ‘Disable-AzPredictor’, to automatically import the module and configure PSReadline. The cmdlets also allow users to enable the settings for future sessions by updating the user’s PowerShell profile (Microsoft.PowerShell_profile.ps1).


Az.Tools.Predictor required API changes to the PowerShell engine to improve suggestions (requiring PowerShell 7.2 preview 3).


You can now use dynamic completers to easily navigate through the parameter value with the ‘Alt+A’ combination.


We are continuously improving the model that is serving the predictions displayed on your screen. This is the most important and invisible piece of software that makes the magic! The most recent update of the model now comprises the missing modules.


 


Getting started with preview 2


If you have installed the first preview:



  • Close all your PowerShell sessions

  • Remove the Az.Tools.Predictor module


To install the second preview of Az.Tools.Predictor follow these steps:



  1. Install PowerShell 7.2-preview 3
    Go to: https://github.com/PowerShell/PowerShell/releases/tag/v7.2.0-preview.3
    Select the binary that corresponds to your platform in the assets list.


  2. Launch PowerShell 7.2-preview 3 and Install PSReadline 2.2 beta 2 with the following:

    Install-Module -Name PSReadLine -AllowPrerelease

    More details about PSReadline: https://www.powershellgallery.com/packages/PSReadLine/2.2.0-beta2


  3. Install Az.Tools.Predictor preview 2
    Install-module -name Az.Tools.Predictor -RequiredVersion 0.2.0

    More details about Az.Tools.Predictor: https://www.powershellgallery.com/packages/Az.Tools.Predictor/0.2.0


  4. Enable Az Predictor

    Enable-AzPredictor -AllSession

    This command will enable Az Predictor in all further sessions of the current user.


Inline view mode (default)


Once enabled, the default view is the “inline view” as shown in the following screen capture: 


 


AzPredictor-preview2-inlineview.png


 


This mode shows only one suggestion at a time. The suggestion can be accepted by pressing the right arrow, or you can continue to type. The suggestion will dynamically adjust based on the text that you have typed. 


You can accept the suggestion at any time then come back and edit the command that is on your prompt. 


 


List view mode


This is definitely my favorite mode!


Switch to this view either by using the “F2” function key on your keyboard or run the following command: 


Set-PSReadLineOption -PredictionViewStyle ListView

This mode shows, in a list below your current prompt, the possible matches for the command that you are typing. It combines suggestions from the history as well as suggestions from Az Predictor. 


 


Select a suggestion and then navigate through the parameter values with “Alt + A” to quickly replace the proposed values with yours. 


 


AzPredictor-preview2-dynamichelp.png


 


 


What’s next?


We are looking for feedback on this second preview.



We will continue to improve our predictor in the coming months. Stay tuned for our next update of the module.



Tell us about your experience. What do you like or dislike about Az Predictor?


 


Further Reading



 


 

Advance Resource Access Governance for AML


This article is contributed. See the original author and article here.



 






Access control is a fundamental building block for enterprise customers: protecting assets at various levels is necessary to ensure that only the relevant people, in certain positions of authority, are given access with different privileges. This is especially prevalent in machine learning, where data is essential to building ML models and companies are highly cautious about how data is accessed and managed, particularly since the introduction of GDPR. We are seeing an increasing number of customers seeking explicit control of not only the data, but various stages of the machine learning lifecycle, from experimentation all the way to operationalization. Assets such as generated models, cluster creation and model deployment need to be governed to ensure that controls are in line with the company’s policy.


 


Azure traditionally provides Role-based Access Control [1], which helps manage access to resources: who can access them and what they can do. This is primarily achieved via the concept of roles. A role defines a collection of permissions.


 


Existing Roles in AML


 


Azure Machine Learning provides three roles [3] for enterprise customers as a coarse-grained access control, designed with simplicity in mind. The first role, Owner, has the highest level of privileges and grants full control of the workspace. This is followed by Contributor, a more restricted role that prevents users from changing role assignments. Reader has the most restrictive permissions and is typically read or view only (see figure 1 below).  


 


roles-1.png

 


 Figure 1 – Existing AML roles


 


What we have found with customers is that Coarse-grained Access Control immensely simplifies the management of roles and works quite well for a small team working primarily in the experimentation environment. However, when a company decides to operationalize its ML work, especially in the enterprise space, these roles become far too broad and too simplistic. Enterprise deployments tend to have several stages (such as dev, test, pre-prod, prod, etc.) and require various skillsets (data scientist, data engineer, etc.) with greater control in each stage. For example, a Data Scientist may not operate in the production environment. A Data Engineer can only provision resources and should not have the ability to commission and decommission training clusters. Such governance policies are crucial for companies to enforce and monitor to maintain the integrity of their business and IT processes.


 


Unfortunately, such requirements cannot be captured with the existing roles. Enterprise needs a better mechanism to define policies for various assets in AML to satisfy their business specific requirements.


 


This is where the exciting new feature of advanced Role-based Access Control really shines. It is based on Fine-grained Access Control at the component level (see figure 2), with a number of pre-built out-of-the-box roles, plus the ability to create custom roles that can capture more complex governance processes and enforce them.  


 


Advance Fine-grained Role-based Access Control


 


The new advanced Role-based Access Control feature of AML solves many of the enterprise problems around the ability to restrict or grant user permissions for various components. AML currently defines 16 components with varying permissions.




aml-components.png


 


Figure 2 – Components Level RBAC


 


Each component defines a list of actions such as read, write, delete, etc. These actions can then be combined to create a custom role. To illustrate, Figure 3 below shows the list of actions currently available for the Datastore component.


 


datastore-1.png

 


Figure 3 – Datastore Actions


 


A Datastore, along with a Dataset, is an important concept in Azure Machine Learning, since they provide access to various data sources, with lineage and tracking abilities. Many enterprises have built global data lakes holding terabytes of data, some of it highly sensitive. Companies are quite protective of who can access these data, along with the business justifications for how the data are accessed and used. It is therefore imperative that tighter access control is mandated for a specific role, such as a Data Engineer, to accomplish such a task.


 


Fortunately, for companies whose access-control requirements are a hybrid of the built-in roles, Azure caters for custom roles to capture company-specific access control.


 




Custom Role


 


Custom roles [4] allow the creation of Fine-grained Access Control on various components, such as the workspace, datastore, etc. 



  • Can be any combination of data or control plane actions that AzureML+AISC support.

  • Useful for creating scoped roles to a specific action like an MLOps Engineer


These controls are defined in a JSON policy definition. For example:


 


{
  "Name": "Data Scientist",
  "IsCustom": true,
  "Description": "Can run experiments but can't create or delete datastores.",
  "Actions": ["*"],
  "NotActions": [
    "Microsoft.MachineLearningServices/workspaces/*/delete",
    "Microsoft.MachineLearningServices/workspaces/datastores/write",
    "Microsoft.MachineLearningServices/workspaces/datastores/delete",
    "Microsoft.Authorization/*/write"
  ],
  "AssignableScopes": [
    "/subscriptions/<subscription_id>/resourceGroups/<resource_group_name>/providers/Microsoft.MachineLearningServices/workspaces/<workspace_name>"
  ]
}

 


The above definition describes a Data Scientist who can run experiments but cannot create or delete a Datastore. This role can be created using the Azure CLI (az role definition create --role-definition <filename>); however, the CLI ML extension needs to be installed first.  
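Azure computes the effective permission set as Actions minus NotActions: a request is allowed only when the action matches an allow pattern and no deny pattern. As a rough illustration only (shell glob patterns standing in for Azure's actual wildcard matcher, and the helper name is mine):

```shell
# Sketch of Actions/NotActions evaluation: an action is permitted when it
# matches the allow pattern and does not match the deny pattern.
allowed() {
    action="$1"; allow="$2"; deny="$3"
    case "$action" in $deny) return 1 ;; esac   # NotActions win
    case "$action" in $allow) return 0 ;; esac  # then Actions grant
    return 1
}

# With the "Data Scientist" role above, datastore writes are denied even
# though Actions is "*":
# allowed ".../workspaces/datastores/write" "*" "*/datastores/write"  # denied
```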


 


Role Operation Workflow


 


In an organization, the following activities are to be undertaken by various role owners. 



  • A subscription admin requests AmlCompute quota for the enterprise

  • They create a resource group and a workspace for a specific team, and also set a workspace-level quota

  • The team lead (aka workspace admin) creates compute within the quota that the subscription admin defined for that workspace

  • A Data Scientist uses the compute (clusters or instances) that the workspace admin created for them.


 


Roles for Enterprise


 


AML provides a single environment for doing end-to-end experimentation to operationalization.  For a start-up this is really useful as they tend to operate in a very agile manner, where many iterations can happen in a short period of time and having the ability to quickly move from ideation to production really reduces their cycle time.  Unfortunately, this may not be the case for the enterprise customers, where they would typically be using either two or three environments to carry out their production workload such as: Dev, QA and Prod. 


 


Dev is used to do the experimentation, while QA is catered for satisfying various functional and non-functional requirements, followed by Prod for deployment into the production for consumer usage.


 


The environments would also have various roles to carry out different activities, such as Data Scientist, Data Engineer and MLOps Engineer (see figure 8 below).


 


 


role-3.png

 


 


Figure 8 – Enterprise Roles


 


A Data Scientist normally operates in the Dev environment and has full access to all the permissions related to carrying out experiments, such as provisioning training clusters, building models, etc. Some permissions are granted in the QA environment, primarily related to model testing and performance, and very minimal access is granted in the Prod environment, mainly telemetry (see Table 1 below). 


 


A Data Engineer, on the other hand, primarily operates in the Build and QA environments. Their main focus is data handling, such as data loading and data wrangling. They have restricted access in the Prod environment.


 


Mufajjul_Ali_10-1614737951507.png

 


 


Table 1 – Role/environment Matrix


 


An MLOps Engineer has some permissions in the Dev environment, but full permissions in QA and Prod. This is because an MLOps Engineer is tasked with building the pipelines, gluing things together, and ultimately deploying models in production.


 


The interesting part is how all these roles, environments and other components fit together in Azure to provide the much-needed access governance for enterprise customers. 


 


Enterprise AML Roles Deployment


 


Enterprises need to be able to model the complex role/environment mapping shown in Table 1. Fortunately, this can be achieved in Azure using a combination of AD groups, roles and resource groups.


 


Mufajjul_Ali_11-1614737951524.png

 


 


Figure 9 – Enterprise AML Roles Deployment


 


Fundamentally, Azure Active Directory groups play a major part in gluing all these components together to make it functional. 


 


The first step is to group users by role in a “Role AD group” for a given persona (DS, DE, etc.). Then assign roles with various RBAC actions (Data Writer, MLContributor, etc.) to this AD group. All users in the group then inherit the permissions specific to those roles. Multiple AD groups are created for the different persona roles.


 


Separate AD groups (‘AD group for Environment’) are created for each environment (i.e. Dev, QA and Prod), and the Role AD groups are added to these Environment AD groups. This creates a mapping of users belonging to a specific role persona, with given permissions, to an environment.


 


The ‘AD group for Environment’ is then assigned to a resource group, which contains a specific AML Workspace.  This ensures that the role permissions assigned to users will be enforced at the workspace level. 
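Assuming the Azure CLI, the wiring described above might look like the following sketch. This is illustrative only: all group, role, and resource group names are placeholders, and the commands require an authenticated session with sufficient directory and RBAC privileges.

```shell
# 1. Persona ("Role") AD group for Data Scientists
az ad group create --display-name "aml-ds-role" --mail-nickname "aml-ds-role"

# 2. Environment AD group for Dev, with the persona group nested inside it
az ad group create --display-name "aml-dev-env" --mail-nickname "aml-dev-env"
az ad group member add --group "aml-dev-env" \
    --member-id "$(az ad group show --group "aml-ds-role" --query id -o tsv)"

# 3. Assign the custom role to the environment group at the scope of the
#    resource group that contains the Dev workspace
az role assignment create \
    --assignee-object-id "$(az ad group show --group "aml-dev-env" --query id -o tsv)" \
    --assignee-principal-type Group \
    --role "Data Scientist" \
    --resource-group "rg-aml-dev"
```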


 


Summary


 


In this blog, we have discussed the new advanced Role-based Access Control, and how it can be applied in a complex enterprise with various environments and different user personas.


 


The important point to note is the flexibility of this new feature: it can operate on any of the 16 AML components and define Fine-grained Access Control for each through custom roles, plus four out-of-the-box roles that should be sufficient for the majority of customers.  


 


References


 


[1] https://docs.microsoft.com/en-us/azure/role-based-access-control/overview


[2] https://azure.microsoft.com/en-gb/services/machine-learning/


[3] https://docs.microsoft.com/en-us/azure/machine-learning/concept-enterprise-security


[4] https://docs.microsoft.com/en-us/azure/role-based-access-control/custom-roles


 


Additional Links:


 



 

 


co-author: @Nishank Gupta 




Support Tip: Connecting Adobe and OneDrive for Business


This article is contributed. See the original author and article here.

Adobe Acrobat recently updated their application to include deeper integration with Microsoft including access to OneDrive for Business files. This integration allows users to access their OneDrive for Business files from the Acrobat app. The improvements have a few configuration changes which will require that Intune admins approve the Adobe Acrobat app to connect to the Intune service. This is a one-time approval that you may not have had to do historically when connecting Adobe Acrobat and OneDrive for Business.


 


There are two options for this one-time approval:



  1. Use the latest Adobe Acrobat iOS and Android app and enable the OneDrive feature:

    Adobe Acrobat Reader for PDF approval prompt


  2. Use the link below to associate the two for your organization:


    Permissions requested – Review for your organization | Adobe Acrobat Reader

    Admin consent – Permissions requested for review and approval process




Enjoy the integration!


 


More info and feedback


Let us know if you have any additional questions by replying to this post or reaching out to @IntuneSuppTeam on Twitter.