Extracting SAP data using OData – Part 4 – Handling large volumes of data


This article is contributed. See the original author and article here.







Before implementing data extraction from SAP systems please always verify your licensing agreement.

 


I hope you all enjoyed the summer with OData-based extraction and Synapse Pipelines. Time flies quickly, and we’re now in the fourth episode of this mini-blog series. You’ve learnt quite a lot! Initially, we built a simple pipeline with a single activity to copy data from the SAP system to data lake storage. But that solution evolved quickly, and now it supports storing metadata in an external store that decouples the management of OData services and pipelines. To extract data from a new service, you can just enter its name into Azure Table Storage. You don’t have to make any changes in Synapse!


 


Today we continue our journey, and we’ll focus on optimizing the extraction performance of large datasets. When I first started working with OData-based extraction, it posed quite a few challenges – have you ever tried to generate an OData payload for more than a million records? I can assure you – most of the time, it doesn’t end well!


 


image001.png


 


But even if we set aside string size limitations, working with huge HTTP payloads causes problems in data transmission and processing. How, then, should we approach data extraction from large OData sources?


 


CLIENT-SIDE PAGING


 


There is a GitHub repository with source code for each episode. Learn more:


https://github.com/BJarkowski/synapse-pipelines-sap-odata-public




The solution to the above problem is pretty straightforward. If working with a large amount of data causes issues, let’s split the dataset into smaller chunks, then extract and process each part separately. You can even run the requests in parallel, improving the overall performance of the job.


 


There are a couple of ways to split the dataset. The most obvious is to use business logic to define the partitioning rules. Using sales orders as an example, we could chunk the data into smaller pieces using the following keys:



  1. SalesOrganization

  2. SalesOrderType

  3. SoldToParty


Those are not the only business-related keys you can use to create partitions in your data. It’s a reliable approach, but it requires a lot of preparation and data discovery work. Fortunately, there is another solution.


 


When working with OData services, you can control the amount of data fetched in a single request using the $top query parameter. When you pass $top=25, you receive only 25 records in the response. But that’s not the end! We can also use the $skip parameter, which tells the service how many records to skip before it starts returning results. So, to process 15 records, you can send a single request, or you can chunk the work into smaller pieces using a combination of the $skip and $top parameters.
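For example, assuming a hypothetical sales order service (the host and entity names below are placeholders, not taken from this series), those 15 records could be fetched in three requests of five records each:

 

https://sap.example.com/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder?$top=5&$skip=0
https://sap.example.com/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder?$top=5&$skip=5
https://sap.example.com/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder?$top=5&$skip=10

 

The first request returns records 1–5, the second 6–10, and the third 11–15.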


 


image003.png


 


Just as sending a single request for a large amount of data is not the best approach, a similar danger comes from flooding the SAP system with a large number of tiny calls. Finding the right balance is key!


 


The above approach is called Client-Side Paging. We will use logic inside the Synapse pipeline to split the dataset into manageable pieces and then extract each of them. To implement it in the pipeline, we need three numbers:



  • The number of records in the OData service

  • Amount of data to fetch in a single request (batch size)

  • Number of requests


Getting the number of records in the OData service is simple. You can use the $count option passed at the end of the URL. By dividing it by the batch size, which we define for each OData service and store in the metadata table, we can calculate the number of requests required to fetch the complete dataset.
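For example, with the placeholder service shown earlier, the request

 

https://sap.example.com/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder/$count

 

returns a bare number, such as 15. With a batch size of 5 stored in the metadata table, 15 / 5 = 3 requests are needed to fetch the complete dataset.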


 


Open the Storage Explorer to alter the metadata table and add a new Batch property:


image005.png


 


Now go to Synapse Studio and open the child pipeline. Add a new parameter to store the Batch size:


 


image007.png


 


In the metadata pipeline, open the Execute Pipeline activity. The following expression will pass the batch size value from the metadata table. You don’t have to make any changes to the Lookup.


 


 


 

@item().Batch

 


 


 


image009.png


 


There are a couple of ways to read the record count. Initially, I wanted to use the Lookup activity against the dataset we already have. But as $count returns just a bare number without any surrounding data structure, the OData connector fails to interpret the value. Instead, we have to create another linked service and a dataset of type HTTP, pointing to the same OData service as the Copy Data activity.


 


Create the new Linked Service of type HTTP. It should accept the same parameters as the OData one. Refer to the second episode of the blog series if you’d like to refresh your memory on how to add parameters to linked services.


 


 


 


 

{
    "name": "ls_http_sap",
    "properties": {
        "parameters": {
            "ODataURL": {
                "type": "String"
            }
        },
        "annotations": [],
        "type": "HttpServer",
        "typeProperties": {
            "url": "@{linkedService().ODataURL}",
            "enableServerCertificateValidation": true,
            "authenticationType": "Basic",
            "userName": "bjarkowski",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "ls_keyvault",
                    "type": "LinkedServiceReference"
                },
                "secretName": "s4hana"
            }
        },
        "connectVia": {
            "referenceName": "SH-IR",
            "type": "IntegrationRuntimeReference"
        }
    }
}

 


 


 


 


image011.png


Now, let’s create the dataset. Choose HTTP as the type and DelimitedText as the file format. Add ODataURL and Entity parameters as we did for the OData dataset. On the Settings tab, you’ll find the field Relative URL, which is the equivalent of the Path from the OData-based dataset. To get the number of records, we have to concatenate the entity name with /$count. The expression should look as follows:


 


 


 

@concat(dataset().Entity, '/$count')

 


 


 


 


image013.png


 


Perfect! We can now update the child pipeline that processes each OData service. Add the Lookup activity, but don’t create any connection. Both parameters on the Settings tab should have the same expression as the Copy Data activity, with one difference: there seems to be a small bug, and the URL in the Lookup activity has to end with a slash (‘/’). Otherwise, the last part of the address may be trimmed and the request may fail.


 


 


 


 

ODataURL: @concat(pipeline().parameters.URL, pipeline().parameters.ODataService, '/') 
Entity: @pipeline().parameters.Entity

 


 


 


image015.png


 


Difficult moment ahead of us! I’ll try to explain all the details as best I can. When the Lookup activity checks the number of records in the OData service, the response contains just a single value. We will use the $skip and $top query parameters to chunk the request into smaller pieces. The tricky part is how to model it in the pipeline. As always, there is no single solution. The easiest approach is to use the Until loop, which could check the number of processed rows at every iteration. But it only allows sequential processing, and I want to show you a more robust way of extracting data.


 


The ForEach loop offers parallel execution, but it only accepts an array as the input. We have to find a way to create one. The @range() expression can build an array of consecutive numbers. It accepts two parameters – the starting position and the length, which in our case translates to the number of requests. Knowing the number of records and the batch size, we can easily calculate the array length. Assuming the OData service contains 15 elements and the batch size equals 5, we could pass the following parameters to the @range() function:


 


 


 

@range(0,3)

 


 


 


As the outcome we receive:


 


 


 

[0,1,2]

 


 


 


Each value in the array represents the request number. Using it, we can easily calculate the $skip parameter.


 


But there is one extra challenge. What if the number of records cannot be divided by the batch size without a remainder? As there is no ceiling function, the div() expression discards the fractional part of the result, which means we’d lose the last chunk of data. To avoid that, I implemented a simple workaround – I always add 1 to the number of requests. Of course, you could think of a fancier solution using the mod() function, but I’m a big fan of simplicity. And asking for more data than exists won’t hurt.
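A quick sanity check with made-up numbers shows why the extra request is harmless:

 

1 000 003 records, batch size 100 000:
div(1000003, 100000) = 10, so requests 0–9 cover the first 1 000 000 records
add(10, 1) = 11, so request 10 picks up the remaining 3 records

1 000 000 records, batch size 100 000:
add(div(1000000, 100000), 1) = 11, and the eleventh request simply returns an empty response

 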


 


Add the ForEach loop as the successor of the Lookup activity. In the Items field, provide the following expression to create an array of requests. I’m using the int() function to cast the string values to integers that I can then use in div().


 


 


 


 

@range(0, add(div(int(activity('l_count').output.firstRow.Prop_0), int(pipeline().parameters.Batch)),1))

 


 


 


image017.png


 


To send multiple requests to the data source, move the Copy Data activity to the ForEach loop. Every iteration will trigger a copy job – but we have to correctly maintain query parameters to receive just one chunk of data. To achieve it, we will use the $top and $skip parameters, as I mentioned earlier in the post.


 


The $top parameter is static and always equals the batch size. To calculate the $skip parameter, we will use the request number from the array passed to the loop multiplied by the batch size.
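For example, with a batch size of 5 (an illustrative value, not a recommendation), the loop iterations translate into the following query strings:

 

iteration 0: $top=5&$skip=0
iteration 1: $top=5&$skip=5
iteration 2: $top=5&$skip=10

 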


 


Open the Copy Data activity and go to the Settings tab. Change the field Use Query to Query and provide the following expression:


 


 


 


 

@concat('$top=',pipeline().parameters.Batch, '&$skip=',string(mul(int(item().value), int(pipeline().parameters.Batch))))

 


 


 


 


image019.png


That was the last change to make. Let’s start the pipeline!


 


EXECUTION AND MONITORING


 


Once the pipeline processing finishes, we can see the successfully completed jobs in the monitoring view. Let’s drill down to see the details.


 


image021.png


 


Compared with the previous extraction, you can see the difference. Instead of just one, there are now multiple entries for the Copy Data activity. You may be slightly disappointed with the duration of each copy job. It takes much longer to extract every chunk of data – in the previous episode, it took only 36 seconds to extract all sales orders. This time, every activity took at least a minute.


 


There are a couple of reasons why this happens. Let’s take a closer look at the components of the extraction job to understand why the duration increased so much.


 


image023.png


Look at the time analysis. Before the request was processed, it was in the queue for 1 minute and 29 seconds. Extracting data took only 9 seconds. Why is there such a long wait time?


 


In the first episode of the blog series, I briefly explained the role of the integration runtime. It provides the computing resources for pipeline execution. To save cost, I host my integration runtime on a very tiny B2ms virtual machine. It provides enough power to process two or three activities at the same time, which means that extracting many chunks is more sequential than parallel. To fix that, I upgraded my virtual machine to a bigger one. The total extraction time decreased significantly, as more chunks could be processed at the same time.


 


image025.png 


Here is the duration of each Copy Data activity.


 


image00X.png


 


The request is in the queue only for a couple of seconds instead of over a minute.


 


image023.png


Before we finish, I want to show you some results of processing a large dataset. In another system, I have over 1 million sales orders with more than 5 million line items. It wasn’t possible to extract it all in a single request, as every attempt resulted in a short dump on the SAP side. I adjusted the batch size to 100 000, which is a reasonable amount to process at a time. To further optimize the extraction, I changed the processing of OData services to Sequential, which means the job first extracts sales order headers before moving to line items. You can set this in the ForEach loop in the metadata pipeline. To limit the impact of the extraction on the SAP system, I also set a concurrency limit for the Copy Data activity (the ForEach loop in the child pipeline).


 


In total, it took 25 minutes to extract over 6 million records. Not bad!


 


image027.png


 


The extraction generated quite a lot of files on the data lake. Let’s count all records inside them and compare the number with what I have in my SAP system:


image029.png


image031.png


 


Both numbers match, which means we have the full dataset in the data lake. We could probably further optimize the extraction duration by finding the right number of parallel processes and identifying the best batch size.


 


Finally, before you even start the extraction, I recommend checking whether you need all the data. Trimming the dataset by setting filters on column values or extracting only a subset of columns can significantly reduce the extraction time. That’s something we’ll cover in the next episode!

Tackling Recursion in FHIR Using Blazor Components


This article is contributed. See the original author and article here.

MicrosoftTeams-image.png


Overview


Many FHIR resources include complex nesting. For example, the Item component of a Questionnaire can either be a single item or an array of more items, which can themselves contain arrays of more items, with even more arrays below. Arrays all the way down! A previous post (FHIR + Blazor + Recursion = Rendering a Questionnaire) showed one method to render objects with complex nesting: using Dynamic Fragments. This article shows an alternative method: using self-referential components.
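To make the nesting concrete, here is an abbreviated, hand-written Questionnaire (not taken from the original post) showing item arrays nested inside item arrays:

 

{
  "resourceType": "Questionnaire",
  "status": "draft",
  "item": [
    {
      "linkId": "1",
      "type": "group",
      "text": "Demographics",
      "item": [
        {
          "linkId": "1.1",
          "type": "group",
          "item": [
            { "linkId": "1.1.1", "type": "string", "text": "Family name" }
          ]
        }
      ]
    }
  ]
}

 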


Why use components?


Components allow easier reuse than pages rendered with Dynamic Fragments. Imagine a case where we want to both create new complex nested children and allow updates to existing ones. If we use components, we can easily adapt a single component for both EDIT and CREATE. More reuse means less code!


How it’ll work


We’ll create a parent component.  The parent component will render the child component. 


ParentChildComponents.png


Parent Component:


 


 

<span>@Parent.Title</span>
<p>@Parent.Description</p>

@{
    int childNum = 1;
}

@foreach(var child in Parent.children){
    <span>Child #@childNum</span>
    <ChildComponent Child=child />
    childNum++;
}

 


 


AND HERE’s THE MAGIC:


Child Component:


 


 

<span>@Child.Title</span>

@foreach(var child in Child.children){
    <ChildComponent Child=child />
}

 


 



The child component will render additional child components.
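For the markup above to compile, each component also declares a parameter for the object it renders. Here is a minimal sketch of that declaration (the ChildItem type is a placeholder for whatever model class holds the Title and children members, not something from the original post):

 

@code {
    // ChildComponent.razor – receives the node it renders.
    // The parent component declares an analogous Parent parameter.
    [Parameter]
    public ChildItem Child { get; set; }
}

 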


FHIR Specific Example


Check out the code below from FHIRBlaze:


Parent Component (QuestionnaireComponent)


 


 

<EditForm Model=Questionnaire OnValidSubmit=Submit>
    <label class="col-sm-2 form-label">Title:</label>
    <InputText @bind-Value=Questionnaire.Title />

    @foreach(var item in Questionnaire.Item)
    {
        <div class="border border-primary rounded-left  rounded-right p-2">
            <div class="row">
                <div class="col-sm-12">@GetHeader(item)</div>
            </div>
            <div class="row">
                <div class="col-sm-11">
                    <ItemDisplay ItemComponent=item/>
                </div>
                <div class="col-sm-1">
                    <button type="button" class="btn btn-primary" @onclick="()=>RemoveItem(item)">
                        <span class="oi oi-trash" />
                    </button>
                </div>
            </div>
        </div>
    }

    <div>
        <ItemTypeComponent ItemSelected="AddItem" />
    </div>   
    <br/>
    <button type="submit">Submit</button>
</EditForm>

 


 


 


Note the ItemDisplay component.


Child Component (ItemDisplay)


 


 

<div class="card-body">
        <label class="sr-only" >@GetTitleText(ItemComponent)</label>
        <input type='text' required class='form-control' id='question' placeholder=@GetTitleText(ItemComponent) @bind-value='ItemComponent.Text' >
        <label  class="sr-only">LinkId:</label>
        <input type='text' required class='form-control' placeholder='linkId' @bind-value='ItemComponent.LinkId'>

        @switch (ItemComponent.Type)
        {
            case Questionnaire.QuestionnaireItemType.Group:
                foreach(var groupitem in ItemComponent.Item)
                {
                     <div class="border border-primary rounded-left  rounded-right p-2">
                        <div class="row">
                            <div class="col-sm">@GetHeader(groupitem)</div>
                        </div>
                        <div class="row">
                            <div class="col-sm-11">
                                <ItemDisplay ItemComponent=groupitem/>
                            </div>
                            <div class="col-sm">
                                <button type="button" class="btn btn-primary"  @onclick="()=>ItemComponent.Item.Remove(groupitem)">
                                    <span class="oi oi-trash" />
                                </button>
                            </div>
                        </div>
                    </div>                 
                 }
                 break;
             case Questionnaire.QuestionnaireItemType.Choice:
                 int ansnum= 1;           
                 @foreach (var opt in ItemComponent.AnswerOption)
                 {
                    <div class="row">
                        <form class="form-inline">
                        <div class="col-sm-1">#@ansnum</div>
                        <div class="col-sm-10"><AnswerCoding Coding=(Coding)opt.Value /></div>
                        <div class="col-sm-1"><button type="button" class="btn btn-primary" @onclick="()=>ItemComponent.AnswerOption.Remove(opt)"><span class="oi oi-trash" /></button></div>
                        </form>
                   </div>
                   ansnum++;
                 }
                 <button type="button" @onclick=AddAnswer >Add Choice</button>
                 break;
             default:
                 break;
        }

        @if (ItemComponent.Type.Equals(Questionnaire.QuestionnaireItemType.Group))
        {
            <div>
                <ItemTypeComponent ItemSelected="AddItem" />
            </div>
        }
</div>

 


 


The <ItemDisplay ItemComponent=groupitem/> line inside the Group case is the key line. This component renders itself!


 


Caveats


With this you should be able to quickly render nested Item after nested Item.  But there are some caveats you should know.


 


#1: Memory Consumption


As you render each level of nesting, all those objects are loaded into memory. If a FHIR resource has an unknown number of children, each of which could have its own children, you could potentially consume large amounts of memory. This is a particular problem because you could be rendering on a phone.


Suggested Mitigation: Consider rendering to a max depth and a max number of children. Render “see more” type links to allow the user to see the remaining children.


#2: Display Clipping


The standard approach is to indent children. But if you have 5 levels of nesting and the app is rendered on mobile, then your 5th-level children may show up in a single-character column. If you render 100 levels of nested children, your final level may not render at all.


UIClipping.png


Suggested Mitigation: Consider an alternative to displaying all children at once. For example, consider using collapsible sections to show children.


#3: Labeling Problems


If you’re allowing editing of a component with complex nesting, it may be difficult for a user to remember which level of children they are in. For example, imagine the following are rendered as nested cards: Parent 1: Child 1: Child 2: Child 3: Child 4: Child 5.


Suggested Mitigation: Consider using borders and labeling to help users determine which child is currently selected.

Azure Marketplace new offers – Volume 177


This article is contributed. See the original author and article here.




We continue to expand the Azure Marketplace ecosystem. For this volume, 115 new offers successfully met the onboarding criteria and went live. See details of the new offers below:


Get it now in our marketplace


Accelario DataOps Platform for MySQL databases.png

Accelario DataOps Platform for MySQL Databases: Accelario DataOps Platform for MySQL Databases accelerates and automates self-service provisioning and data refreshes from on-premises to the cloud and vice versa with minimal downtime. Speed up your go-to-market and application delivery without sacrificing performance.


Accelario DataOps Platform for PostgreSQL database.png

Accelario DataOps Platform for PostgreSQL Databases: Accelerate your application development and cloud migration with Accelario DataOps Platform for PostgreSQL Databases. This self-service platform streamlines database copy and speeds up DevOps pipelines. Reduce wait time for any type of test data to minutes. 


ARGOS CSPM - Contextual Cloud Security Monitoring.png

ARGOS – Contextual Cloud Security Monitoring: Get a precise picture of your cloud security posture with ARGOS’ end-to-end security service. Tools like resource graphs and exploitability checks enable real-time detection and remediation so you can focus on securely deploying applications with speed and efficiency.


Bugzilla Issue Tracker.png

Bugzilla Issue Tracker: Bugzilla Issue Tracker on Ubuntu Server 20.04 is an open-source bug tracking solution that enables users to stay connected with their clients or employees while keeping track of outstanding bugs and issues throughout the software development life cycle.


Cloud Membership Management System.png

Cloud Membership Management System: Caloudi’s AI-powered cloud membership management platform helps consolidate and manage all information related to your retail customer on a single platform. Personalize your client’s shopping experience with targeted messaging and promotions.


DesktopReady for MSP.png

DesktopReady for MSP: DesktopReady for MSP is a Microsoft Azure Virtual Desktop automation platform that enables managed service providers to deliver Windows 10 desktops on Azure. Set up your modern workspace for improved agility and ongoing cost savings.


EcoVadis Sustainability Ratings.png

EcoVadis Sustainability Ratings: Integrate sustainability in your procurement policy, processes, and tools with EcoVadis Sustainability Ratings. This scalable SaaS solution helps screen and assess suppliers’ sustainability performance. This offer is for existing EcoVadis customers only.


MediaValet Digital Asset Management.png

MediaValet Digital Asset Management: This secure, cost-effective digital asset management platform seamlessly integrates with your existing tools and allows marketing teams to create, organize, and distribute high-value digital assets across teams, departments, and partners. 


Metabase on Debian.png

Metabase on Debian: Powered by Niles partners, Metabase is a business intelligence and data visualization tool with SQL capabilities. It offers a simple graphical interface to power in-application analytics without writing any SQL.


Prescript EMR.png

Prescript EMR: Prescript EMR is an integrated healthcare platform for optimizing patient care in both urban and rural settings. Powered by a clinical management system it offers a single portal for administering appointments, electronic medical records (EMR), billing, and more.


Procore Sharepoint Integration.png

Procore SharePoint Integration: SyncEzy’s offering allows you to access Procore photos and documents within Microsoft SharePoint and discuss site documents with your construction crew from a central cloud location. Access work files within Microsoft Teams and more.


QuerySurge.png

QuerySurge: Continuously detect data issues in your delivery pipeline with this automated data testing solution from RTTS. QuerySurge optimizes your critical data by integrating Microsoft Power BI analytics in your DataOps pipeline and improves ROI.


Revenue Cycle Management.png

Revenue Cycle Management: Inforich’s solution leverages AI, automation, and analytics to streamline the insurance and clinical aspects of healthcare by linking administrative, insurance, and other financial information with the patient’s treatment plan.


Scheduling as a Service.png

Scheduling as a Service: This employee management scheduling software creates custom scheduling templates using AI to optimize resources and workforce requirements. Improve your field services and control costs by assigning employees based on total labor expense.


Seascape for Notes.png

Seascape for Notes: Seascape for Notes is an archiving solution for Lotus Notes (HCL Notes and Domino). It enables administrators to archive entire Notes applications, mail files, and other custom databases using a streamlined archiving process.


ShiftLeft CORE.png

ShiftLeft CORE: ShiftLeft CORE uses rapid, repeatable static application security testing to help developers fix 91% of new vulnerabilities within the code they are working on in two discovery sprints. Release secure code at scale with this easy-to-use SaaS platform.


Symend.png

Symend: Symend’s relationship-based approach uses behavioral science and analytics to empower customers to resolve past due bills before they reach collections. Determine which strategies will empathetically help customers while lowering your operating costs.


Vitals KPI Management for Healthcare.png

Vitals KPI Management for Healthcare: Vitals KPIM uses artificial intelligence and analytics to improve healthcare services by creating success metrics that align patient satisfaction, business processes and team collaboration. This solution is only available in Chinese.


WhiteSource Open Source Security Management.png

WhiteSource Open Source Security Management: WhiteSource Open Source Security Management offers an agile open source security and license compliance management solution that makes it easy to develop secure software without compromising speed or agility.



Go further with workshops, proofs of concept, and implementations


AKS Container Platform Build-16-Week Implementation.png

AKS Container Platform Build: 16-Week Implementation: In this engagement, BlakYaks will implement a secure and scalable Microsoft Azure Kubernetes Service (AKS) platform built with code for hosting container workloads at scale on Microsoft Azure. 


AKS Container Platform Design- 8-Week Implementation.png

AKS Container Platform Design: 8-Week Implementation: Utilizing enterprise-grade designs, patterns, and operational frameworks, BlakYaks will provide a comprehensive design engagement for a secure and compliant Microsoft Azure Kubernetes Service platform.


Azure IoT Jumpstart Kit- 1-Day Implementation Workshop .png

Azure IoT Jumpstart Kit: 1-Day Implementation Workshop: ACP IT Solutions will help connect your industrial sensors, machines, and production processes with Azure IoT using cost-effective, ready-made retrofitting product bundles. This offer is only available in German.


Azure Managed Services-12-Month Implementation.png

Azure Managed Services: 12-Month Implementation: CoreBTS’ custom Microsoft Azure managed service will help cost-optimize your business processes by enabling your teams to focus on strategic tasks rather than day-to-day operations.


Azure Migration- 4-Week Implementation.png

Azure Migration: 4-Week Implementation: In this collaborative engagement, MNP Digital will seamlessly migrate your servers to Microsoft Azure to optimize your cloud usage and ensure a sustainable foundation, structure, governance, and security for your digital transformation.


Azure Purview Foundations- 3-Week Implementation.png

Azure Purview Foundations: 3-Week Implementation: Coretek will help you create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage with Azure Purview Foundations.


Azure Real-Time IoT Data Analytics- 20-Day Proof of Concept.png

Azure Real-Time IoT Data Analytics: 20-Day Proof of Concept: ScienceSoft’s proof of concept is designed to help companies get real-time visibility into operational processes and enable intelligent automation using Azure IoT Hub, Azure Stream Analytics and Microsoft Power BI.


Azure Sentinel Onboarding- 2-Week Proof of Concept.png

Azure Sentinel Onboarding: 2-Week Proof of Concept: Enhance your organization’s threat detection and response capabilities in this proof of concept. The experts from Stripe OLT Consulting will help your organization modernize its security operation by onboarding Microsoft Azure Sentinel into your own tenant.


ECF Data Azure Sentinel- 2-Day Workshop.png

Azure Sentinel: 2-Day Workshop: In this workshop you will partner with ECF Data to modernize your security operation and capture threat intelligence using Microsoft Azure Sentinel and move your organization’s defenses from a reactive state to a proactive one.


Azure Services‎- 1-Week Implementation.png

Azure Services‎: 1-Week Implementation: Manapro Consultants will demonstrate how Microsoft Azure services can transform your applications and lower operational costs as you lay the foundations for your cloud migration journey. This offer is available only in Spanish.


Azure Site Recovery and Backup- 3-Week Proof of Concept.png

Azure Site Recovery and Backup: 3-Week Proof of Concept: Using your existing on-premises and/or cloud servers, Insight will guide you through the concepts of cloud backup and disaster recovery and configure a working prototype. Learn how Azure Site Recovery can simplify and reduce the cost of your disaster recovery solution.


Azure Site Reliability Engineering (Managed Service)- 12-Month Implementation.png

Azure Site Reliability Engineering (Managed Service): 12-Month Implementation: BlakYaks will create and implement a custom managed service for all your Microsoft Azure hosted platforms to keep them up-to-date and aligned to your strategic requirements. Cost-optimize your business processes and site reliability engineering operations.


Windows Virtual Desktop on Azure- 2-Week Proof of Concept.png

Azure Virtual Desktop: 2-Week Proof of Concept: Insight’s proof of concept will give you the foundational knowledge to configure a secure, scalable, virtual desktop infrastructure using Microsoft Azure Virtual Desktop. Empower your employees with a flexible work environment.


Azure Virtual Desktop- 5-Week Implementation.png

Azure Virtual Desktop: 5-Week Implementation: Is your organization struggling with transitioning to remote work? 3Cloud will deliver an Azure Virtual Desktop deployment tailored to meet your operational and security needs. You’ll learn about various deployment scenarios and how to enable remote work for your organization.


Azure Virtual Desktop & Windows 365 Managed Services.png

Azure Virtual Desktop & Windows 365 Managed Services: The experts from Cubesys will develop a virtualized desktop strategy that includes a roadmap and cost-benefit analysis for an enterprise-wide implementation of Microsoft Azure Virtual Desktop and Windows 365. Learn how you can access your desktop and apps from anywhere.


Backup as a Service- 12-Month Implementation.png

Backup as a Service: 12-Month Implementation: Using Microsoft Azure and Commvault, Databarracks’ implementation will proactively resolve any security issues and help your organization monitor, manage, and restore backups so your critical data is always protected.


Cloud-Native Consulting- 2-Week Implementation.png

Cloud-Native Consulting: 2-Week Implementation: Alerant engineers will develop a holistic understanding of your business needs before creating a roadmap to discover, plan, and develop cloud-native solutions to speed up your enterprise’s digital transformation.


Data Innovation Studio- 2-Week Workshop.png

Data Innovation Studio: 2-Week Workshop: In this innovation workshop, the experts from Data#3 will help your organization further its analytics and AI capabilities by delivering a tailored roadmap using a modern data platform reference architecture for Azure services.


Data Lake for Mortgage Servicing- 4-Week Implementation.png

Data Lake for Mortgage Servicing: 4-Week Implementation: Invati will simplify and reduce loan servicing costs and improve business insights by mapping your team’s mortgage data sources to a single point of access using Azure Data Lake and Azure Synapse Analytics.


Data Quality & MDM - 4-Week Proof of Concept.png

Data Quality & MDM: 4-Week Proof of Concept: Using machine learning models built on Microsoft Azure components, the experts at Tredence will improve your data by removing duplicates and inconsistent records. Access reliable data for better business insights.


Data Security Protection- 4-Week Workshop.png

Data Security Protection: 4-Week Workshop: Freedom Systems will help you understand your business’s security requirements and leverage Microsoft Enterprise Mobility and Security platform to protect and secure your organization. 


DevOps Assessment-1-Day Workshop.png

DevOps Assessment: 1-Day Workshop: Xpirit will leverage their expertise in DevOps and help roll out tooling and methodology as they facilitate your organization’s transition to the cloud. Companies in highly regulated sectors such as defense and finance will benefit from this offer.

Digital Twin Smart Spaces- 3-Month Proof of Concept.png

Digital Twin Smart Spaces: 3-Month Proof of Concept: T-Systems MMS will identify, optimize, or sublet unused space by tracking the digital version of available physical workspaces in real-time via its Smart Spaces platform using battery-less sensors and Azure IoT services.


Disaster Recovery as a Service with Azure-12-Month Implementation.png

Disaster Recovery as a Service with Azure Site Recovery: 12-Month Implementation: Databarracks’ 24/7/365 service is compatible with both Windows and Linux Operating Systems and uses cloud-native solutions like Azure Backup and Azure Site Recovery (ASR) to implement a simple, secure, and cost-effective disaster recovery solution.


Disaster Recovery as a Service- 12-Month Implementation.png

Disaster Recovery as a Service with Zerto: 12-Month Implementation: Databarracks will replicate your on-premises or cloud servers using Zerto Virtual Manager into a Zerto Cloud Appliance hosted in Microsoft Azure. At the point of recovery, Zerto uses Azure queues and Azure virtual machine scale sets to accelerate recovery.


Education diagnostic system- 5-Day Implementation.png

Education Diagnostic System: 5-Day Implementation: In this implementation SiES IT will help set up an appraisal platform to assess the value and quality of educational institutions using Microsoft Azure services. This offer is only available in Russian.


Identity Cleanse- 5-Day Workshop.png

Identity Cleanse: 5-Day Workshop: The experts from ITC Secure will consolidate and reconcile all sources of user and account information to assess your environment and improve the security of your ecosystem using Microsoft Azure Active Directory.


Intelligent Hybrid Cloud Platform Hosting Solution- 3-Week Implementation.png

Intelligent Hybrid Cloud Platform Hosting Solution: 3-Week Implementation: In this offer, Acer AEB will provide a consistent multi-cloud and on-premises management platform with the successful implementation of Azure Stack HCI (hyperconverged infrastructure) architecture and its integration with Azure Arc. This offer is only available in Chinese.


Linux to Azure Migration- 16-Day Workshop.png

Linux to Azure Migration: 16-Day Workshop: SVA consultants will help your organization analyze its existing Linux infrastructure and develop a roadmap and business case to move your servers and applications to Microsoft Azure using Microsoft’s Cloud Adoption Framework (CAF). This offer is only available in German.


Microsoft Azure Migration & Deployment- 3-Month Implementation.png

Microsoft Azure Migration & Deployment: 3-Month Implementation: In this offer, Insight’s specialists will guide you through the adoption and migration of Microsoft Azure and ensure your deployment process is tailored to your organization’s exact business goals and needs.


Migrate to Azure- 15-Day Deployment.png

Migrate to Azure: 15-Day Deployment: Learn how Nebulan’s iterative approach using the Microsoft Cloud Adoption Framework can cost-effectively migrate your top 10 workloads, including Windows and SQL Server, to Microsoft Azure. This offer is only available in Spanish.


Migrate to Azure- 5-Week Implementation.png

Migrate to Azure: 5-Week Implementation: In this offer, Xavor will implement a low-risk, data-driven Microsoft Azure cloud migration solution tailored to your organization’s unique needs. Ensure business continuity and improve performance with governance, automation, and control of multi-cloud environments.


ML Ops Framework Setup- 12-Week Implementation.png

ML Ops Framework Setup: 12-Week Implementation: Tredence’s automated industrialized machine learning operations platform will help you generate higher ROI on your data science investments and offer clear and robust analytical insights with minimal manual effort using Microsoft Azure DevOps.


Modern Data Warehouse- 4-Week Proof of Concept.png

Modern Data Warehouse: 4-Week Proof of Concept: In this proof of concept, devoteam will demonstrate how its solution built on Azure Data Lake, Azure Analytics, and Azure Synapse can transform and modernize your legacy data landscape. 


Modernize with Azure Kubernetes Service- 5-Day Workshop.png

Modernize with Azure Kubernetes Service: 5-Day Workshop: The experts at SVA will lead a hands-on workshop to demonstrate how Azure Kubernetes Service (AKS) can provide an agile developer environment in Microsoft Azure while reducing costs and administrative overhead. This offer is only available in German.


Secure OnMesh- 8-Week Proof of Concept.png

Secure OnMesh: 8-Week Proof of Concept: Make security an intrinsic part of your digital fabric with Logicalis’ Secure OnMesh solution. In this engagement you will learn how Secure OnMesh leverages Microsoft Azure Sentinel to protect your entire digital ecosystem with AI-enabled threat hunting capabilities.


Truveta Data Migration- 3-Month Service.png

Truveta Data Migration: 3-Month Service: Tegria’s service helps healthcare organizations build scalable data pipelines from the cloud to the Truveta healthcare data platform using Microsoft Azure tools like Data Factory and Databricks. Truveta anonymizes and maps patient data and automates file creation.


Windows to Azure Migration- 16-Day Workshop.png

Windows Server to Azure Migration: 16-Day Workshop: SVA will identify and prioritize all the assets in your on-premises Windows Server environment and help move them to Microsoft Azure using a Cloud Adoption Framework-aligned approach. This offer is only available in German.



Contact our partners



AccessGov



AI and Data Projects: 2-Hour Briefing



Azure Analytics: 5-Day Assessment



Azure Cloud Adoption: 2-Week Assessment



Azure Cloud: 4-Week Assessment



Azure Container Platform Security: 6-Week Assessment



Azure Migration: 3-Day Assessment



Azure Migration Readiness: 2-Week Assessment



Azure Session: 2-Hour Initial Assessment



Azure Virtual Desktop Services



C-Track Comprehensive Court Case Management Solution



Cisco Integrated System for Microsoft Azure Stack



Cloudera Data Platform 7.2.x Runtimes



CloudXR Introductory Offer – Windows Server 2019



Cryptographic Risk Assessment



Cyber Care – Managed SAP Connector for Azure Sentinel



Cybersecurity: 1-Week Assessment



Data Estate Assessment: 2-Week Assessment



DataNeuron: Automated Learning Platform


Data Science & AI: 1-Week Assessment

Digia Cloud Cost Management Reporting



DLO Starter



e4Integrate



eMission Cloud View: 10-Week Assessment



eZuite Cloud ERP



HPC Cluster – CPU Based Cluster on SUSE Enterprise Linux 15.3 HPC



iEduERP



iFIX Intelligent Service Automation



impress.ai



Intelligent Document Processor



iPILOT Teams Direct Routing



IQ3 Cloud – Azure Managed Cloud



Journey to Cloud: 1-Hour Briefing



KIVU Expense



KPN Outbound Email Security Solution



Logicworks Managed Services for Azure



Loopr Data Labelling



Mammography Intelligent Assessment



MIND’s SAP on Azure: 1-Day Briefing



Minimum Viable Cloud: 6-Week Assessment



MT Cloud Control Volumes



Networking Services for Cloud: 1-Hour Briefing



OneDrop



Oracle Database



Orbital Insight Defense and Intelligence



PwC Promo and Assortment Management Tool (RGS)



Qlik Forts



Quality Management Software System



SailPoint Sentinel Integration



Sarus Private Learning



Scale by indigo.ai



Sentiment Analysis (Call Centre) by BiTQ



SINVAD



Thunder Threat Protection System Virtual Appliance for DDoS Protection



Urbana IoT Platform



Versa SASE in vWAN



Wipro Smart Asset Twin



Introducing Azure AD custom security attributes


This article is contributed. See the original author and article here.

This public preview of Microsoft Azure Active Directory (Azure AD) custom security attributes and user attributes in ABAC (Attribute Based Access Control) conditions builds on the previous public preview of ABAC conditions for Azure Storage. Azure AD custom security attributes (custom attributes, here after) are key-value pairs that can be defined in Azure AD and assigned to Azure AD objects, such as users, service principals (Enterprise Applications) and Azure managed identities. Using custom attributes, you can add business-specific information, such as the user’s cost center or the business unit that owns an enterprise application, and allow specific users to manage those attributes. User attributes can be used in ABAC conditions in Azure Role Assignments to achieve even more fine-grained access control than resource attributes alone. Azure AD custom security attributes require Azure AD Premium licenses.


 


We created the custom attributes feature based on the feedback we received for managing attributes in Azure AD and ABAC conditions in Azure Role Assignments:



  • In some scenarios, you need to store sensitive information about users in Azure AD, and make sure only authorized users can read or manage this information. For example, store each employee’s job level and allow only specific users in human resources to read and manage the attribute.

  • You need to categorize and report on enterprise applications with attributes such as the business unit or sensitivity level. For example, track each enterprise application based on the business unit that owns the application.

  • You need to improve your security posture by migrating from API access keys and SAS tokens to a centralized and consistent access control (Azure RBAC + ABAC) for your Azure storage resources. API access keys and SAS tokens are not tied to an identity; meaning, anyone who possesses them can access your resources.  To enhance your security posture in a scalable manner, you need user attributes along with resource attributes to manage access to millions of Azure storage blobs with few role assignments.


Let’s take a quick look at how you can manage attributes, use them to filter Azure AD objects, and scale access control in Azure.


 


Step 1: Define attributes in Azure AD


The first step is to create an attribute set, which is a collection of related attributes. For example, you can create an attribute set called “marketing” to refer to the attributes related to the marketing department. The second step is to define the attributes inside the attribute set, along with their characteristics: for example, whether only pre-defined values are allowed for an attribute, and whether an attribute can be assigned a single value or multiple values. In this example, there are three values for the project attribute—Cascade, Baker, and Skagit—and a user can be assigned only one of the three values. The picture below illustrates the above example.


 


Step 1.png
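If you prefer to script this step, the same definitions can be sketched with the Microsoft Graph custom security attribute APIs (in preview at the time of writing; the request shapes below are an approximation, so check the current Graph reference before relying on them):

 

POST https://graph.microsoft.com/beta/directory/attributeSets
{
  "id": "marketing",
  "description": "Attributes for the marketing department",
  "maxAttributesPerSet": 25
}

POST https://graph.microsoft.com/beta/directory/customSecurityAttributeDefinitions
{
  "attributeSet": "marketing",
  "name": "project",
  "type": "String",
  "status": "Available",
  "isCollection": false,
  "isSearchable": true,
  "usePreDefinedValuesOnly": true
}

 

The predefined values (Cascade, Baker, and Skagit) are then added as allowedValues of the definition.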


 


Step 2: Assign attributes to users or enterprise applications


Once attributes are defined, they can be assigned to users, enterprise applications, and Azure managed identities.


 


Step 2.png


 


Once you assign attributes, users or applications can be filtered using attributes. For example, you can query all enterprise applications with a sensitivity level equal to high.


 


Enterprise applications.png


 


Step 3: Delegate attribute management


There are four Azure AD built-in roles that are available to manage attributes.


 


ABAC.png


 


By default, Global Administrators and Global Readers are not able to create, read, or update the attributes. Global Administrators or Privileged Role Administrators need to assign the attribute management roles to other users, or to themselves, to manage attributes. You can assign these four roles at the tenant or attribute set scope. Assigning the roles at tenant scope allows you to delegate the management of all attribute sets. Assigning the roles at the attribute set scope allows you to delegate the management of the specific attribute set. Let me explain with an example.


 


Xia.png


 



  1. Xia is a privileged role administrator; so, Xia assigns herself Attribute Definition Administrator role at the tenant level. This allows her to create attribute sets.

  2. In the engineering department, Alice is responsible for defining attributes and Chandra is responsible for assigning attributes. Xia creates the engineering attribute set, assigns Alice the Attribute Definition Administrator role and Chandra the Attribute Assignment Administrator role for the engineering attribute set; so that Alice and Chandra have the least privilege needed.

  3. In the marketing department, Bob is responsible for defining and assigning attributes. Xia creates the marketing attribute set and assigns the Attribute Definition Administrator and Attribute Assignment Administrator roles to Bob.


 


Step 4: Achieve fine-grained access control with fewer Azure role assignments


Let’s build on our fictional example from the previous blog post on ABAC conditions in Azure Role Assignments. Bob is an Azure subscription owner for the sales team at Contoso Corporation, a home improvement chain that sells items across lighting, appliances, and thousands of other categories. Daily sales reports across these categories are stored in an Azure storage container for that day (2021-03-24, for example); so, the central finance team members can more easily access the reports. Charlie is the sales manager for the lighting category and needs to be able to read the sales reports for the lighting category in any storage container, but not other categories.


 


With resource attributes (for example, blob index tags) alone, Bob needs to create one role assignment for Charlie and add a condition to restrict read access to blobs with a blob index tag “category = lighting”. Bob needs to create as many role assignments as there are users like Charlie. With user attributes along with resource attributes, Bob can create one role assignment, with all users in an Azure AD group, and add an ABAC condition that requires a user’s category attribute value to match the blob’s category tag value. Xia, Azure AD Admin, creates an attribute set “contosocentralfinance” and assigns Bob the Azure AD Attribute Definition Administrator and Attribute Assignment Administrator roles for the attribute set; giving Bob the least privilege he needs to do his job. The picture below illustrates the scenario.


 


RBAC.png


 


 


Bob writes the following condition in ABAC condition builder using user and resource attributes:


 


Role assignment condition.png
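In text form, the condition compares the blob’s index tag with the user’s custom security attribute. The snippet below is a rough reconstruction based on the general ABAC condition syntax, not a copy of the screenshot, so verify the exact attribute paths in the condition builder:

 

(
  (
    !(ActionMatches{'Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read'})
  )
  OR
  (
    @Resource[Microsoft.Storage/storageAccounts/blobServices/containers/blobs/tags:category<$key_case_sensitive$>]
    StringEquals
    @Principal[Microsoft.Directory/CustomSecurityAttributes/Id:contosocentralfinance_category]
  )
)

 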


 


To summarize, user attributes, resource attributes, and ABAC conditions allow you to manage access to millions of Azure storage blobs with as few as one role assignment!


 


Auditing and tools


Since attributes can contain sensitive information and allow or deny access, activity related to defining, assigning, and unassigning attributes is recorded in Azure AD Audit logs. You can use PowerShell or Microsoft Graph APIs in addition to the portal to manage and automate tasks related to attributes. You can use Azure CLI, PowerShell, or Azure Resource Manager templates and Azure REST APIs to manage ABAC conditions in Azure Role Assignments.


 


Resources


We have several examples with sample conditions to help you get started. The Contoso corporation example demonstrates how ABAC conditions can scale access control for scenarios related to Azure storage blobs. You can read the Azure AD docs, how-to’s, and troubleshooting guides to get started.


 


We look forward to hearing your feedback on Azure AD custom security attributes and ABAC conditions for Azure storage. Stay tuned to this blog to learn about how you can use custom security attributes in Azure AD Conditional Access. We welcome your input and ideas for future scenarios.


 


 


 


Learn more about Microsoft identity:


New Microsoft Teams Essentials is built for small businesses


This article is contributed. See the original author and article here.

Perhaps no one has been hit harder over the past 20 months than small businesses. To adapt and thrive in this new normal, small businesses need comprehensive solutions that are designed specifically for them and their unique needs.

The post New Microsoft Teams Essentials is built for small businesses appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.