by Priyesh Wagh | Dec 24, 2020 | Dynamics 365, Microsoft, Technology
Retrieving OptionSet Labels from CDS data in a Cloud Flow / Power Automate takes an extra step beyond simply picking from the Dynamic Values. Check out this post!
Scenario
While working with data that comes from a Dataverse Trigger (on Create/Update) or an Action (Get a record, List records), the OptionSet fields from CDS/Dataverse return the Values of the OptionSet instead of the text Labels –
- Let’s say this is the OptionSet in Dynamics

- And when you pick the OptionSet field from CDS either from an Action or a Trigger like this
Let’s say I’m capturing this in a variable for this example to show you

- I’ll capture the Value of the OptionSet, i.e. the numeric Value part
Result –

- What needs to be displayed is the Label of the OptionSet! Let’s see how we can do this –
- Triggers
- Actions – List records (Inside Loops)
- Actions – Get record (Single record)
triggerOutputs() / triggerBody() to read the OptionSet values from Triggers
Let’s see how to read the OptionSet values from CDS Triggers, Create or Update –
- When a Flow is triggered using CDS/Dataverse Triggers (I’m using the Common Data Service (Current Environment) trigger), you can read the OptionSet value by using the triggerOutputs() / triggerBody() function –
Here’s a triggerBody() example

Now, notice that the logical name of the field is cf_accounttype. So, in Triggers, the label can be retrieved as _cf_accounttype_label and NOT cf_accounttype.
In the same way, you can use the triggerBody() function instead to yield the same result

Here’s a post on triggerBody() / triggerOutputs() for your reference – Using triggerBody() / triggerOutput() to read CDS trigger metadata attributes in a Flow | Power Automate
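The trigger-body pattern above can be simulated in a small Python sketch; the field name cf_accounttype and the sample payload are illustrative, not taken from a real environment:

```python
# Minimal sketch: reading an OptionSet label from a CDS trigger payload.
# The trigger exposes the label under a key named _<field>_label.

def get_optionset_label(trigger_body: dict, field: str):
    """Return the text Label for an OptionSet field, or None if absent."""
    return trigger_body.get(f"_{field}_label")

# Example trigger body, truncated to the relevant fields (values are made up)
body = {
    "cf_accounttype": 771670000,         # the numeric Value
    "_cf_accounttype_label": "Partner",  # the text Label
}

print(get_optionset_label(body, "cf_accounttype"))  # -> Partner
```

The key point is the underscore-prefixed `_label` suffix: asking for the raw field name only ever returns the numeric value.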
items() to read the OptionSet values from Action Step – List Records (Multiple Records)
Let’s look at this example where you retrieve multiple records using List records and then apply a loop on each of those to read individual records.
- Assuming you are using List records and looping through the outputs for each of these records
- Now, the value is represented by the items() function within the loop you are in, i.e. the Apply to each step.
Hence, the function is
items('Apply_to_each')?['cf_accounttype@OData.Community.Display.V1.FormattedValue']

This will show the label of the OptionSet in the results

outputs() to read the OptionSet values from Action Step – Get a record (Single Record)
Similar to the items() function, we can use the outputs() function to read from the outputs of the Get a record Action from Common Data Service (Current Environment)
- Get an Account is a ‘Get a record’ Action in CDS/Dataverse Current Environment connector. It’ll only fetch a single record.

- Now, let’s look at how the outputs() function is used, similar to the items() function used in the case of Loops.
Formula is –
outputs('Get_an_Account')?['body/cf_accounttype@OData.Community.Display.V1.FormattedValue']

In the above scenario, I’m reading under the body hierarchy for cf_accounttype@OData.Community.Display.V1.FormattedValue
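Both the items() and outputs() cases rely on the same OData annotation; here is a hedged Python sketch of that lookup, with an illustrative record (the field name and values are assumptions for the example):

```python
# Sketch: reading the OData "FormattedValue" annotation that carries the
# OptionSet label in List records / Get a record responses.

FORMATTED = "@OData.Community.Display.V1.FormattedValue"

def formatted_value(record: dict, field: str):
    """Return the label annotation for a field, falling back to the raw value."""
    return record.get(field + FORMATTED, record.get(field))

# Example record, truncated to the relevant fields (values are made up)
record = {
    "cf_accounttype": 771670000,
    "cf_accounttype" + FORMATTED: "Partner",
}

print(formatted_value(record, "cf_accounttype"))  # -> Partner
```

The fallback to the raw value mirrors what you would do in a Flow with a coalesce(), so records without the annotation still return something usable.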
Hope this helps!
Here are some Power Automate / Cloud Flow posts you might find helpful –
- InvalidWorkflowTriggerName or InvalidWorkflowRunActionName error in saving Cloud Flows | Power Automate Quick Tip
- Create a Team, add Members in Microsoft Teams upon Project and Team Members creation in PSA / Project Operations | Power Automate
- Setting Lookup in a Flow CDS Connector: Classic vs. Current Environment connector | Power Automate Quick Tip
- Adaptive Cards for Outlook Actionable Messages using Power Automate | Power Platform
- ChildFlowUnsupportedForInvokerConnections error while using Child Flows [SOLVED] | Power Automate
- Run As context in CDS (Current Environment) Flow Trigger | Power Automate
- Using outputs() function and JSON Parse to read data from missing dynamic value in a Flow | Power Automate
- Setting Retry Policy for an HTTP request in a Flow | Power Automate
- Make On-Demand Flow to show up in Dynamics 365 | Power Automate
- Task Completion reminder using Flow Bot in Microsoft Teams | Power Automate
- BPF Flow Step as a Trigger in CDS (Current Environment) connector | Power Automate
- Pause a Flow using Delay and Delay Until | Power Automate
Thank you!!
Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.
by Priyesh Wagh | Dec 22, 2020 | Dynamics 365, Microsoft, Technology
If you have lately started creating Flows, you’ll come across some errors that don’t point precisely to the issue, but instead throw a message like the ones below –
For Triggers (InvalidWorkflowTriggerName) –

Run Action (InvalidWorkflowRunActionName) –

Reasoning
The reason you see these errors is Invalid Characters in your Flow Trigger Step Names or Flow Action Step Names.
Invalid Characters in Trigger Names

Invalid Characters in Action Step Names

Hence, the Step Names shouldn’t contain the below special characters –
< > % & ? /
Note: Trigger/Step Names can’t end with a Full Stop (.). However, if there’s a Full Stop in between, it’ll work
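The naming rules above can be sketched as a small validator; this is an illustrative check based only on the characters and trailing-period rule listed in this post, not an official Power Automate routine:

```python
# Sketch of the step-name rules above: no < > % & ? / characters,
# and the name must not end with a Full Stop.

INVALID_CHARS = set("<>%&?/")

def is_valid_step_name(name: str) -> bool:
    """Return True if a Trigger/Action step name passes the rules in this post."""
    if any(ch in INVALID_CHARS for ch in name):
        return False
    if name.endswith("."):
        return False
    return True

print(is_valid_step_name("Get Account?"))      # -> False (contains ?)
print(is_valid_step_name("Get Account."))      # -> False (trailing Full Stop)
print(is_valid_step_name("Get v9.2 Account"))  # -> True (Full Stop in between is fine)
```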
Hope this was helpful!
Here are some more Power Automate posts you might want to check out –
- Create a Team, add Members in Microsoft Teams upon Project and Team Members creation in PSA / Project Operations | Power Automate
- ChildFlowUnsupportedForInvokerConnections error while using Child Flows [SOLVED] | Power Automate
- Using outputs() function and JSON Parse to read data from missing dynamic value in a Flow | Power Automate
- Setting Lookup in a Flow CDS Connector: Classic vs. Current Environment connector | Power Automate Quick Tip
- Make On-Demand Flow to show up in Dynamics 365 | Power Automate
- Setting Correct Default Mode for Forms in a Canvas App | [Quick Tip]
- Pause a Flow using Delay and Delay Until | Power Automate
- Call HTTP Request from a Canvas Power App using Flow and get back Response | Power Automate
Thank you!
by Priyesh Wagh | Dec 20, 2020 | Dynamics 365, Microsoft, Technology
If you are new to creating a Canvas App, you may find that you’ve just added a Form, but when you run and test the app in the browser itself, the Form doesn’t show up.
So let’s say you’ve put a form like this

And all you see is this –

So this happens based on the form’s Default Mode and whether the context that mode expects is passed to the form or not.
Default Mode
There are 3 Default Modes for a form, and it’s critical to know which one should be used when –

1. Edit – To be used if you are editing an existing record. While using this mode, make sure an Item from a data source is passed to the Form itself. Else, you’ll see the same error message as above, and won’t even see a blank form in that case.
The Item property should point to the selected record, i.e. typically <GalleryName>.Selected.

For Edit mode, the context of a record, i.e. an existing row, should be passed on to this Form from the preceding Gallery/Table control (or whichever data source your implementation uses). For example – the context I send to the form looks like this.
The last parameter, i.e. <GalleryName>.Selected, is placed on the select button so that once clicked, it sends the record context over to the form, which then displays the selected record (picked up from the Item property in the above screenshot)

2. New – To be used if the Form isn’t populated from a selected record; instead, you enter new values and Submit them for the first time.
Once New is selected, the form will render correctly when you test or run the App (i.e. you don’t intend to pass any context to the Form)

3. View – In this mode, fields won’t be editable; instead, the data shown is the result of the selected item from the List/Gallery view it comes from, i.e. a Data Row that populates this form. This is a Read-Only view.
In another form example where View is selected, the fields won’t be editable, but the form still needs Context passed to it, just like in Edit mode.

Hope this quick tip was useful.
Here are some more Dynamics 365 related posts you might want to check
- Find deprecated JS code used in your Dynamics 365 environment | Dynamics 365 v9 JS Validator tool | XrmToolBox
- Make On-Demand Flow to show up in Dynamics 365 | Power Automate
- Track and Set Regarding are disabled for Appointments in Dynamics 365 App For Outlook message | Demystified
- Cancelled Bookings Imported in Time Entries in Dynamics 365 PSA issue | [Quick Tip]
- Remove ‘This Email has been blocked due to potentially harmful content.’ message in Dynamics 365 Emails | OrgDbSettings utility
- Get GUID of the current View in Dynamics 365 CRM JS from ribbon button | Ribbon Workbench
- Get Dynamics 365 field metadata in a Canvas App using DataSourceInfo function | Common Data Service
- Dynamics 365 App For Outlook missing on SiteMap in CRM? Use shortcut link [Quick Tip]
Thank you!!
by Contributed | Dec 18, 2020 | Alerts, Microsoft, Security, Technology
This article is contributed. See the original author and article here.
Microsoft security researchers have been investigating and responding to the recent nation-state cyber-attack involving a supply-chain compromise followed by cloud assets compromise.
Microsoft 365 Defender can help you track and respond to emerging threats with threat analytics. Our Threat Intelligence team has published a new Threat analytics report, shortly following the discovery of this new cyber attack. This report is being constantly updated as the investigations and analysis unfold.
The threat analytics report includes deep-dive analysis, MITRE techniques, detection details, recommended mitigations, updated list of indicators of compromise (IOCs), and advanced hunting queries that expand detection coverage.
Given the high profile of this threat, we have made sure that all our customers, E5 and E3 alike, can access and use this important information.
If you’re an E5 customer, you can use threat analytics to view your organization’s state relevant to this attack and help with the following security operation tasks:
- Monitor related incidents and alerts
- Handle impacted assets
- Track mitigations and their status, with options to investigate further and remediate weaknesses using threat and vulnerability management.
For guidance on how to read the report, see Understand the analyst report section in threat analytics.

Read the Solorigate supply chain attack threat analytics report:
For our E3 customers, you can read similar relevant Microsoft threat intelligence data, including the updated list of IOCs, through the MSRC blog. Monitor the blog, Customer Guidance on Recent Nation-State Cyber Attacks, where we share the latest details as the situation unfolds.
by Contributed | Dec 17, 2020 | Microsoft, Security, Technology
This article is contributed. See the original author and article here.
Microsoft Information Protection (MIP) is a built-in, intelligent, unified, and extensible solution to protect sensitive data in documents and emails across your organization. MIP provides a unified set of capabilities to know and protect your data and prevent data loss across Microsoft 365 apps (e.g., Word, PowerPoint, Excel, Outlook), services (e.g., Microsoft Teams, SharePoint, Exchange, Power BI), on-premises locations (e.g., SharePoint Server, on-premises files shares), devices, and third-party apps and services (e.g., Box and Dropbox).
We are excited to announce availability for new MIP capabilities:
- General availability of Exact Data Match user interface in Microsoft 365 compliance center and configurable match
- External sharing policies for Teams and SharePoint sites, in public preview
- Customer key support for Teams, in public preview
- Expansion of MIP sensitivity labels support to Power BI desktop application (PBIX), in public preview
Exact Data Match user interface in Microsoft 365 compliance center
The first step to effectively protect your data and prevent data loss is to understand what sensitive data resides in your organization. Foundational to Microsoft Information Protection are its classification capabilities—from out-of-the-box sensitive information types (SITs) to Exact Data Match (EDM). Out-of-the-box SITs use pattern matching to find the data that needs to be protected. Credit card numbers, account numbers, and Social Security Numbers are examples of data that can be detected using patterns. MIP offers 150+ out-of-the-box sensitive information types mapped to various regulations worldwide.
EDM is a different approach. It is a classification method that enables you to create custom sensitive information types that use exact data values. Instead of matching on generic patterns, EDM finds exact matches of data to protect the most sensitive data in your organization. You start by configuring the EDM custom SIT and uploading a CSV table of the specific data to be protected, which might include employee, patient, or other customer-specific information. You can then use the EDM custom SIT with policies, such as Data Loss Prevention (DLP), to protect your sensitive data. EDM nearly eliminates false positives, as the service compares the data being copied or shared with the data uploaded for protection.
We continue to invest in and enhance our EDM service, increasing its service scale by a factor of 10 to support data files containing up to 100 M rows, while decreasing by 50% the time it takes for your data to be uploaded and indexed in our EDM cloud service. To better protect sensitive data uploaded into our EDM service, we added salting to the hashing process, which adds additional protection for the data while in transit and within the cloud repository. You can learn more about these EDM enhancements and details on how to implement in this three-part blog series.
Today we are announcing general availability of a user interface in the Microsoft 365 compliance center to configure and manage EDM in the portal, in addition to the option of using PowerShell. This allows customers who are unable to use PowerShell or prefer to use the UI to manage EDM. Learn more here.

Figure 1: Details of an Exact Data Match schema
We are also announcing general availability of configurable match (aka normalization). This feature adds flexibility in defining the matches, allowing you to protect your confidential and sensitive data more broadly. For example, you can elect to ignore case so a customer email address will match whether or not it is capitalized. Similarly, you can choose to ignore punctuation such as spaces or dashes in data such as a social security number. Learn more here.
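Conceptually, this normalization works along the lines of the sketch below; this is an illustration of the idea (ignore case, ignore punctuation) and not the actual EDM implementation:

```python
# Illustrative sketch of "configurable match" normalization: comparing a
# candidate value against uploaded sensitive data while ignoring case and
# punctuation such as spaces or dashes. Not Microsoft's actual EDM code.

def normalize(value: str, ignore_case: bool = True, ignore_punct: bool = True) -> str:
    """Normalize a value before exact-match comparison."""
    if ignore_punct:
        value = value.replace(" ", "").replace("-", "")
    if ignore_case:
        value = value.lower()
    return value

# An SSN matches regardless of dash formatting
print(normalize("078-05-1120"))       # -> 078051120
# An email address matches regardless of capitalization
print(normalize("User@Contoso.com"))  # -> user@contoso.com
```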
External sharing policies for Teams and SharePoint sites
Core to Microsoft Information Protection are sensitivity labels. You can apply your sensitivity labels not only to protect documents and emails but also to protect entire Teams and sites. In spring, we enabled you to apply a sensitivity label to a Team or site and associate that label with policies related to privacy and device access. This allows for holistically securing sensitive content, whether it is in a file or in a chat, by managing access to a specific team or site. Along with manual and auto-labeling of documents on SharePoint and Teams, this capability helps you scale your data protection program to manage the proliferation of data and the challenge of secure collaboration while working remotely.
We are pleased to announce that you can now also associate external sharing policies with labels to achieve secure external collaboration. This capability is in public preview. Administrators can tailor the external sharing settings according to the sensitivity of the data and business needs. For example, for ‘Confidential’ label you may choose to block external sharing whereas for ‘General’ label you may allow it. Users then simply select the appropriate sensitivity label while creating a SharePoint site or Team and the appropriate external sharing policy for SharePoint content is automatically applied. It is common for projects at an organization to involve collaboration across employees, vendors, and partners. This capability further helps ensure only authorized users can get access to sensitive data in Teams and SharePoint sites.

Figure 2: External sharing policies available alongside policy for unmanaged device access
Customer Key support for Teams
Microsoft 365 provides customer data protection at multiple layers, starting with volume-level encryption enabled through BitLocker, and then there is protection at the application layer. We offer Customer Key, so you can control a layer of encryption for your data in Microsoft’s data centers, with your own keys. This also enables you to meet requirements of compliance regulations for controlling your own keys.
Customer Key was already available for SharePoint, OneDrive, and Exchange. Today, we are pleased to announce that Customer Key is available in Public Preview for Microsoft Teams. You can now assign a single data encryption policy at the tenant level to encrypt your data-at-rest in Teams and Exchange. Click here to learn more.
Sensitivity labels in Power BI desktop
In June we announced general availability of MIP sensitivity labels in Power BI service, helping organizations classify and protect sensitive data even as it is exported from Power BI to Excel, PowerPoint and PDF files, all this without compromising user productivity or collaboration.
We’re now expanding MIP sensitivity labels support to Power BI desktop application (PBIX), in public preview, to enable content creators to classify and protect sensitive PBIX files while authoring datasets and reports in Power BI desktop. The label applied on PBIX files persist when uploaded to Power BI service. Learn more here.

Figure 3: Built-in sensitivity label experience in Power BI Desktop
We are also announcing the availability of a new API that enables administrators to get information on sensitivity labels applied to content in Power BI service. With this information, Power BI and Compliance admins can answer questions like which workspaces in Power BI service have reports with a specific label. Learn more here.
Data is the currency of today’s economy. Data is being created faster than ever in more locations than organizations can track. To secure your data and meet compliance requirements like the General Data Protection Regulation (GDPR), you need to know what data you have, where it resides, and have capabilities to protect it. The above new capabilities are part of the built-in, intelligent, unified, and extensible solution that Microsoft Information Protection offers to enable both administrators and users to protect organization data while staying productive.
Getting Started
Here’s information on licensing and on how to get started with the capabilities announced today:
Maithili Dandige, Principal Group Program Manager
by Contributed | Dec 16, 2020 | Azure, Microsoft, Technology
This article is contributed. See the original author and article here.
Overview
The Microsoft Cloud App Security (MCAS) connector lets you stream alerts and Cloud Discovery logs from MCAS into Azure Sentinel. This will enable you to gain visibility into your cloud apps, get sophisticated analytics to identify and combat cyberthreats, and control how your data travels. More details on enabling and configuring the out-of-the-box MCAS connector are available at Connect data from Microsoft Cloud App Security.

Cloud App Security REST API (URL Structure, Token & Supported Actions)
The Microsoft Cloud App Security API provides programmatic access to Cloud App Security through REST API endpoints. Applications can use the API to perform read and update operations on Cloud App Security data and objects.
To use the Cloud App Security API, you must first obtain the API URL from your tenant. The API URL uses the following format:
https://<portal_url>/api/<endpoint>
To obtain the Cloud App Security portal URL for your tenant, do the following steps:
– In the Cloud App Security portal, click the question mark icon in the menu bar, then select About.
– On the Cloud App Security About screen, you can see the portal URL.

Once you have the portal URL, add the /api suffix to it to obtain your API URL. For example, if your portal’s URL is https://m365x933168.eu2.portal.cloudappsecurity.com, then your API URL is https://m365x933168.eu2.portal.cloudappsecurity.com/api.
Cloud App Security requires an API token in the header of all API requests to the server, such as the following:
Authorization: Token <your_token_key>
Where <your_token_key> is your personal API token. For more information about API tokens, see Managing API tokens. Here’s an example of cURLing the MCAS Activity log:
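The URL structure and token header described above can be sketched as follows; the portal URL is the example from this post, the endpoint path (v1/activities/) reflects the MCAS Activities API, and the token is a placeholder:

```python
# Sketch: composing an MCAS API request URL and Authorization header
# from a tenant's portal URL. No network call is made here.

def build_request(portal_url: str, endpoint: str, token: str):
    """Return the full API URL and the header dict MCAS expects."""
    url = f"https://{portal_url}/api/{endpoint}"
    headers = {"Authorization": f"Token {token}"}
    return url, headers

url, headers = build_request(
    "m365x933168.eu2.portal.cloudappsecurity.com",
    "v1/activities/",
    "<your_token_key>",  # placeholder - generate a real token in the portal
)
print(url)  # -> https://m365x933168.eu2.portal.cloudappsecurity.com/api/v1/activities/
```

You would pass the same URL and header to whichever HTTP client (cURL, a Logic App HTTP action, etc.) actually issues the request.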

The following table describes the actions supported:

Where Resource represents a group of related entities. For more details, please visit the MCAS Activities API
Implementation (MCAS Activity Connector)
- Log in to the Azure tenant, https://portal.azure.com
- Go to Azure Sentinel > Playbooks
- Create a new Playbook and follow the below gif / step-by-step guide; the code is uploaded to the GitHub repo as well:
- Add a “Recurrence” step and set the following fields; below is an example to trigger the Playbook once a day:
- Interval: 1
- Frequency: Day
- Initialize a variable for the MCAS API Token value; make sure to generate the MCAS API Token following this guide
- Name: MCAS_API_Token
- Type: String
- Value: Token QhXXXXBSlodAV9AXXXXXXQlMcFhweXXXXXRXXh1OGkXXkXXkeX
- Set an HTTP endpoint to Get MCAS Activity data:
- HTTP – MCAS Activities API:
- Parse MCAS Activities data via Json:
- Parse JSON – MCAS Activities:
- Content: @{body(‘HTTP_-_MCAS_Activities_API’)}
- Schema: uploaded to github
- Initialize an Array Variable:
- Name: TempArrayVar
- Type: Array
- Append to array variable:
- Name: TempArrayVar
- Value: @{body(‘Parse_JSON_-_MCAS_Activities’)}
- Add For each control to iterate MCAS Activities parsed items:
- Select an output from previous steps: @variables(‘TempArrayVar’)
- Send the data (MCAS Activity Log) to Azure Sentinel Log analytics workspace via a custom log tables:
- JSON Request body: @{items(‘For_each’)}
- Custom Log Name: MCAS_Activity_Log
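Under the hood, the final step posts each record to the Log Analytics HTTP Data Collector API, which authenticates with a SharedKey signature built from the workspace key. Here is a hedged sketch of that signature scheme (the workspace ID and key below are placeholders, not real credentials):

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id: str, shared_key: str, rfc1123_date: str,
                    content_length: int, method: str = "POST",
                    content_type: str = "application/json",
                    resource: str = "/api/logs") -> str:
    """Build the SharedKey Authorization header for the Data Collector API."""
    string_to_hash = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{rfc1123_date}\n{resource}")
    decoded_key = base64.b64decode(shared_key)  # the workspace key is base64
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                 digestmod=hashlib.sha256).digest()
    ).decode()
    return f"SharedKey {workspace_id}:{encoded_hash}"

# Placeholder workspace ID and key, purely for illustration
auth = build_signature("00000000-0000-0000-0000-000000000000",
                       base64.b64encode(b"secret").decode(),
                       "Mon, 21 Dec 2020 10:00:00 GMT", 128)
print(auth.split(":")[0])  # -> SharedKey 00000000-0000-0000-0000-000000000000
```

The built-in “Send Data” connector action handles this signing for you; the sketch is only to show what the custom-log ingestion relies on.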

Notes & Consideration
- You can customize the parser in the connector’s flow with the required attributes / fields based on your schema / payload before the ingestion process; you can also create custom Azure Functions once the data is ingested into Azure Sentinel
- You can customize the for-each step to iterate the MCAS Activity log and send entries to the Log Analytics workspace, so that each activity log eventually lands in a separate table record / row
- You can build your own detection and analytics rules / use cases; a couple of MCAS Activities analytics rules will be ready to use on GitHub, stay tuned
- Couple of points to be considered while using Logic Apps:
Get started today!
We encourage you to try it now!
You can also contribute new connectors, workbooks, analytics and more in Azure Sentinel. Get started now by joining the Azure Sentinel Threat Hunters GitHub community.