by Priyesh Wagh | Nov 22, 2020 | Dynamics 365, Microsoft, Technology
Let’s take a look at how you can design Adaptive Cards for Outlook and capture responses from Outlook users to process using Power Automate.
Scenario
Let’s say I want to send an Adaptive Card to an Outlook user in their email, ask for a comment (for example, descriptive feedback), read their response back, and send them a confirmation Adaptive Card.
In this post, I’ll simply read the response in the Flow and send a confirmation-style Adaptive Card, so that you can then decide what action to take based on your use case.

Adaptive Cards Designer – Initial Card Layout
You can log on to https://adaptivecards.io/designer/, i.e. the Adaptive Cards Designer, and start building your card. Below are the high-level steps:
- Once you are in the Adaptive Cards Designer, make sure you select the host app as “Outlook Actionable Messages”

You’ll see a sample already created for you, which you can start working from –

- Since I want to start from the beginning, I’ll select the New Card option as shown below

- Now, I’ll start to design my card using a TextBlock and an Input.Text to capture a response. In your case, you can choose whichever type of input(s) you want. (I’m trying to keep it simple for now.)
These controls work with drag-and-drop behavior; just drop in whatever you need.
So I did the following: I added a TextBlock for the title I want to show on the card, and a multi-line Input.Text to capture a response from the user I’ll send the card to in their email.

- For the Input.Text, I’ve added an id of ‘answer’. We’ll need this when we capture responses back.
Also, to make the Input.Text control multi-line, the Multi-line option should be selected, as I did below.
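For reference, a minimal sketch of the card payload at this point; the title text and placeholder are illustrative, while the id should match what we use later:

```json
{
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "type": "AdaptiveCard",
    "version": "1.0",
    "body": [
        {
            "type": "TextBlock",
            "text": "Please share your feedback",
            "weight": "Bolder",
            "size": "Medium"
        },
        {
            "type": "Input.Text",
            "id": "answer",
            "isMultiline": true,
            "placeholder": "Type your feedback here"
        }
    ]
}
```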

- At this point, our Adaptive Card’s layout is ready. Before we proceed, we must first create a Flow that will capture the HTTP response; then we’ll come back here.
Flow to Capture Response
Adaptive Cards for Outlook Actionable Messages work over HTTP, which is why you need a URL where you can capture the responses and send back a confirmation response.
Let’s call this Flow “Accept Feedback Response”.
- To do so, first create a Flow that accepts an HTTP request. You can refer to my other post to understand how you can capture HTTP requests – Accept HTTP Requests in a Flow and send Response back | Power Automate
Once you save a Flow that has the HTTP request trigger, a URL will be generated; copy it.

- Next, I’ll generate the schema by clicking the Use sample payload to generate schema button on the trigger step and entering a sample payload, since I know what to expect from the user when they fill in the text and submit.
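Given the Input.Text id of “answer”, a sample payload to generate the schema from would look something like this (the text value is illustrative):

```json
{
    "answer": "Some descriptive feedback from the user"
}
```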

- Once you click OK, your schema will look like this.
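For that sample payload, the generated schema should resemble:

```json
{
    "type": "object",
    "properties": {
        "answer": {
            "type": "string"
        }
    }
}
```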

- Collect the output of the trigger in a Compose action so that you can save the Flow with at least one action. (We’ll come back to it later.)

- Next up, we will also need to set up a response Adaptive Card. This is what users will see when they submit their responses.

- It is mandatory to send a response back to the caller, i.e., Outlook in this case. Hence, we’ll send a response back using the Response action, with status code 200 and the header CARD-UPDATE-IN-BODY set to true.
The Body will have the Adaptive Card we created in step #5 above, i.e., the Adaptive Card we send as confirmation.
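Conceptually, the Response action returns an HTTP response shaped like the sketch below, where the body is the confirmation card JSON from the previous step:

```http
HTTP/1.1 200 OK
CARD-UPDATE-IN-BODY: true
Content-Type: application/json

{ "type": "AdaptiveCard", "version": "1.0", "body": [ "...confirmation card elements..." ] }
```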

Here are the details on refreshing cards when you send back a response to the client (the Outlook user in this case): https://docs.microsoft.com/en-us/outlook/actionable-messages/adaptive-card#refresh-cards?WT.mc_id=DX-MVP-5003911
Complete your Adaptive Card – Add Action
Once your Flow to capture the response is ready, let’s complete the Adaptive Card by adding an Action to it.
- Select the card body and you’ll see a button to Add an action

You’ll need to select Action.Http

- Once you select the Action.Http type of Action, look at the Element Properties; you’ll need to enter the URL we got from the Flow we created above to capture responses.

Here’s why we chose Action.Http – https://docs.microsoft.com/en-us/outlook/actionable-messages/adaptive-card#outlook-specific-adaptive-card-properties-and-features?WT.mc_id=DX-MVP-5003911
- Next, set the title of the Action (it appears as a button); I’ll name it Submit.
Also, select the HTTP method to be used for the request.

- Now, since you’ve selected the POST method, you will need to pass a Body. This will be the response the user sends, which the Flow above will read and process further.
Once you select POST, a Body element property will be added, where you need to enter the schema along with the id of the element the information belongs to.

- Remember, the id for the Input.Text was set as “answer”; we’ll put that here along with its value property. These values should be enclosed in double curly braces {{ }}.
- You’ll also need to pass along the header for authorization. Read this post by Microsoft – https://docs.microsoft.com/en-us/outlook/actionable-messages/security-requirements#action-authorization-header?WT.mc_id=DX-MVP-5003911
Add the headers as shown below

Now, as mentioned in the post above, we need to pass the Authorization header with a blank value, and Content-Type as application/json

- Once all this is done, the Action.Http part of the card payload will look like below.
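Put together, the Action.Http element should look roughly like this; the URL placeholder stands in for the HTTP POST URL copied from the Accept Feedback Response Flow:

```json
{
    "type": "Action.Http",
    "title": "Submit",
    "method": "POST",
    "url": "<HTTP POST URL from the Accept Feedback Response Flow>",
    "body": "{\"answer\": \"{{answer.value}}\"}",
    "headers": [
        { "name": "Authorization", "value": "" },
        { "name": "Content-Type", "value": "application/json" }
    ]
}
```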

- Now copy the entire payload from the editor; we’ll create a new Flow that will be used to send the Adaptive Card to the user.
Flow to Send the Adaptive Card in Email
I’m creating a Flow here that will be used to send the Adaptive Card. To keep it simple, I’m just triggering it on demand using a button; your use case will vary.
Let’s call this Flow “Send Feedback Request”.
- Wherever you need to create the Adaptive Card, paste the entire payload copied from the step above into a Compose step

- I’ll add a step to send an email directly using Send an email; make sure you enable the </> (HTML) view for the body.
Make sure the Outputs of the Compose step we pasted above in Step #2 are enclosed in
<script type="application/adaptivecard+json">
</script>
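For reference, a sketch of how the email body might look in the </> (HTML) view; here I assume the Compose action is named “Compose”, so adjust the expression to your action’s name:

```html
<html>
  <body>
    <script type="application/adaptivecard+json">
      @{outputs('Compose')}
    </script>
  </body>
</html>
```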

Working
Let’s look at how this will turn out!!
- Let me just run the Flow as-is so that the Adaptive Card is generated and sent over email to the intended user.
- They’ll receive an email like this

- The user will then type their answer in the text box and click Submit.

- Once they submit, the Accept Feedback Response Flow we created will be triggered

- And the response is available to read and process further in the Compose step we created to collect it, matching the defined schema.
- The Adaptive Card we created to send back as confirmation (shown below) will be sent

- Once the user submits, they’ll get the below Adaptive Card as confirmation that their feedback has been recorded.

And that’s how you can make your Adaptive Cards work with Outlook!! Hope this was helpful.
Here are some superb posts on Adaptive Cards –
- Microsoft Message Cards – The Ultimate Guide by Thomas Poszytek – https://poszytek.eu/en/microsoft-en/microsoft-message-cards-the-ultimate-guide/
- Multi line Approvals with Adaptive Cards, Outlook and Power Automate by Yash Agarwal – https://www.bythedevs.com/post/multi-line-approvals-with-adaptive-cards-outlook-and-power-automate
- Custom Actionable Messages with Microsoft Flow series by Rik de Koning (3 posts) – https://www.about365.nl/category/blog-series/custom-actionable-messages-with-microsoft-flow/
Here are some more Power Automate / Adaptive Cards posts you might want to look at –
- Adaptive Cards for Teams to collect data from users using Power Automate | SharePoint Lists
- Task Completion reminder using Flow Bot in Microsoft Teams | Power Automate
- Make On-Demand Flow to show up in Dynamics 365 | Power Automate
- Run As context in CDS (Current Environment) Flow Trigger | Power Automate
- Using triggerBody() / triggerOutput() to read CDS trigger metadata attributes in a Flow | Power Automate
- Terminate a Flow with Failed/Cancelled status | Power Automate
- Call HTTP Request from a Canvas Power App using Flow and get back Response | Power Automate
- Setting Retry Policy for an HTTP request in a Flow | Power Automate
- Send a Power App Push Notification using Flow to open a record in Canvas App | Power Automate
- ChildFlowUnsupportedForInvokerConnections error while using Child Flows [SOLVED] | Power Automate
- BPF Flow Step as a Trigger in CDS (Current Environment) connector | Power Automate
- Pause a Flow using Delay and Delay Until | Power Automate
Thank you!! 
by Contributed | Nov 22, 2020 | Azure, Microsoft, Technology
Logic Apps connectors provide quick access from Logic Apps to events, data, and actions across other apps, services, systems, protocols, and platforms. By using connectors in your logic apps, you expand the capabilities of your cloud and on-premises apps to perform tasks with the data that you create and already have. Azure Logic Apps connectors are powered by the connector infrastructure that runs in Azure. A workflow running on the new runtime can use these connectors by creating a connection, an Azure resource that provides access to these connectors.
A key capability in the redesigned Logic Apps runtime is the extensibility to add built-in connectors. These built-in connectors are hosted in the same process as the Logic Apps runtime, which provides higher throughput, lower latency, and local connectivity. The connection definition file also contains the required configuration information for connecting through these built-in connectors. The preview release comes with built-in connectors for Azure Service Bus, Azure Event Hubs, and SQL Server. The extensibility framework that these connectors are built on can be used to build custom built-in connectors to any other service that you need.
In this blog post, I show how we can leverage this extensibility framework to create a built-in Cosmos DB connector with a trigger and no actions. In this example, whenever a new document is added in the lease collection or container of Cosmos DB, the Logic Apps trigger will fire and execute the Logic App with the Cosmos document as the input payload. This built-in connector leverages the Azure Functions trigger binding for Cosmos DB. In general, you can add any action or Functions trigger as part of your own built-in connectors. Currently, trigger capabilities are limited to Azure Functions triggers only; in the future, Logic Apps will support non-Functions triggers as well.
Built-in connector plugin model
The Logic Apps built-in connector extensibility model leverages the Azure Functions extensibility model to enable adding built-in connector implementations as Azure Functions extensions. This allows developers to write their connectors as Azure Functions extensions and build and package them as a NuGet package for anyone to consume.
There are two main parts that a developer needs to implement:
Operation descriptions are metadata about the operations that the custom built-in connector implements. These are primarily used by the Logic Apps designer to drive the authoring and monitoring experience for these connectors’ operations. For example, the designer uses operation descriptions to understand the input parameters required for a given operation, as well as to facilitate the generation of output property tokens based on the schema of an operation’s output.
Operation invocations are the implementations the Logic Apps runtime uses at run time to invoke the specified operation in the workflow definition.
In order to hook into the Functions runtime, the new built-in connector extension needs to be registered with the Azure Functions runtime extensions. The details are discussed later in this post.
CosmosDB Built-in Connector
Here in this sample, I am developing a Cosmos DB built-in custom connector that has one trigger and no actions. The details of the operations are described below:
| Logic App Operation | Operation Details | Description |
| --- | --- | --- |
| Trigger | Receive Document | The trigger is invoked when there are inserts or updates in the specified database and collection of Cosmos DB. |
| Action | – | No action operations are defined for this connector. |
To develop your own built-in connector, you need to add the workflow WebJobs extension package. I created a .NET Core 3.1 class library project in Visual Studio and added the Microsoft.Azure.Workflows.Webjobs.Extension package as a NuGet reference to the project. The service provider interface is implemented to provide the operations of the Cosmos DB connector.
Service Provider interface implementation
The WebJobs extension NuGet package added to the class library project provides the service provider interface IServiceOperationsTriggerProvider, which needs to be implemented.
As part of the operation description, the IServiceOperationsTriggerProvider interface provides the methods GetService() and GetOperations(), which are required to be implemented by the custom built-in connector. These are used by the Logic Apps designer to describe the actions/triggers of the custom built-in connector on the designer surface. Please note that the GetService() method also specifies the connection parameters needed by the Logic Apps designer.
For action operations, you need to implement the InvokeActionOperation() method, which is invoked during action execution. If you would like to use the Azure Functions binding for triggers, you need to provide the connection information and trigger bindings needed by Azure Functions. Two methods need to be implemented for the Functions binding: GetBindingConnectionInformation(), which provides the connection information to the Functions trigger binding, and GetFunctionTriggerType(), which returns the same value as the “type” binding parameter of the Azure Functions trigger.
The picture below shows the implementation of the methods required by the Logic Apps designer and the Logic Apps runtime.

The details of the methods that are required to be implemented are given below.

GetService()

Needed by the Logic Apps designer. This is the high-level description of your service, which includes the service description, brand color, icon URL, connection parameters, capabilities, etc.

```csharp
public ServiceOperationApi GetService()
{
    return this.CosmosDBApis.ServiceOperationServiceApi;
}
```

GetOperations()

Needed by the Logic Apps designer to get the list of operations that your service implements. This is based upon a Swagger schema.

```csharp
public IEnumerable<ServiceOperation> GetOperations(bool expandManifest)
{
    return expandManifest ? serviceOperationsList : GetApiOperations();
}
```

InvokeActionOperation()

Invoked for every action operation at runtime. Here you can use any client (FtpClient, HttpClient, etc.) needed by your custom built-in connector’s actions. If you are only implementing a trigger, as in this case, you do not need to implement this method.

```csharp
using (var client = new HttpClient())
{
    // Call the downstream service and convert the response to a JObject.
    response = (await client.SendAsync(httpRequestMessage).ConfigureAwait(false)).ToJObject();
}
return new ServiceOperationResponse(body: response);
```

GetBindingConnectionInformation()

Returns the connection parameters required by the trigger binding when you are using an Azure Functions trigger type.

```csharp
return ServiceOperationsProviderUtilities
    .GetRequiredParameterValue(
        serviceId: ServiceId,
        operationId: operationId,
        parameterName: "connectionString",
        parameters: connectionParameters)?
    .ToValue<string>();
```

GetFunctionTriggerType()

If you are using an Azure Functions built-in trigger as the Logic App trigger, return the string that matches the “type” parameter in the Azure Functions trigger binding, e.g. "type": "cosmosDBTrigger".

```csharp
public string GetFunctionTriggerType()
{
    return "CosmosDBTrigger";
}
```
Function Extensions and registration:
The function extension needs to be registered as a startup job, and the service provider needs to be registered as part of the service provider list, so that the built-in connector extension can be loaded when the Functions runtime starts.
Adding a converter is optional, depending upon the type of data you need as input to the built-in trigger. In this example, I am converting the Cosmos DB Document data type to a JObject array.
- Create the startup job: To register the custom built-in connector as a Functions extension, you need to create a startup class using the [assembly: WebJobsStartup] assembly attribute and implement the IWebJobsStartup interface; refer to the Functions extension registration documentation for more details. In the Configure method, register the extension and inject the service provider as shown below:
```csharp
[assembly: WebJobsStartup(typeof(CosmosDbTriggerStartup))]

public class CosmosDbTriggerStartup : IWebJobsStartup
{
    public void Configure(IWebJobsBuilder builder)
    {
        // Register the extension.
        builder.AddExtension<CosmosDbServiceProvider>();

        // Register the trigger service operation provider for dependency injection.
        builder.Services.TryAddSingleton<CosmosDbTriggerServiceOperationProvider>();
    }
}
```
- Register the service provider: We need to register the service provider implementation as a Functions extension. We are using the built-in Azure Functions Cosmos DB trigger as the new trigger. In this example, we register the new Cosmos DB service provider with the existing list of service providers that are already part of the Logic Apps extension.
```csharp
[Extension("CosmosDbServiceProvider", configurationSection: "CosmosDbServiceProvider")]
public class CosmosDbServiceProvider : IExtensionConfigProvider
{
    public CosmosDbServiceProvider(ServiceOperationsProvider serviceOperationsProvider, CosmosDbTriggerServiceOperationProvider operationsProvider)
    {
        // Add the Cosmos DB provider to the list of known service providers.
        serviceOperationsProvider.RegisterService(ServiceName, ServiceId, operationsProvider);
    }

    public void Initialize(ExtensionConfigContext context)
    {
        // Converts the Cosmos DB Document list to a JObject array.
        context.AddConverter<IReadOnlyList<Document>, JObject[]>(ConvertDocumentToJObject);
    }
}
```
- Add the converter: Logic Apps implements a generic way to handle any Functions built-in trigger using a JObject array, so we may optionally need to add a converter to convert the read-only list of Azure Cosmos DB Documents into a JObject array. Once the converter is ready, we register it as part of the ExtensionConfigContext, as shown in the example above; a sketch of the converter itself follows the snippet.
```csharp
// Converts the Cosmos DB Document list to a JObject array.
context.AddConverter<IReadOnlyList<Document>, JObject[]>(ConvertDocumentToJObject);
```
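The ConvertDocumentToJObject method referenced above is not part of the snippet; a minimal sketch of what it might look like, assuming the Microsoft.Azure.Documents Document type and Newtonsoft.Json.Linq (names and placement are illustrative):

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.Azure.Documents;
using Newtonsoft.Json.Linq;

public static JObject[] ConvertDocumentToJObject(IReadOnlyList<Document> documents)
{
    // Serialize each Cosmos DB document into a JObject so the Logic Apps
    // runtime can hand the batch to the workflow as a JSON array.
    return documents.Select(document => JObject.FromObject(document)).ToArray();
}
```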
The complete implementation of all three classes mentioned above is given in the following code map diagram.

Testing the built-in connector:
You need to update extensions.json in the extension bundle to add the NuGet reference above. You can refer to the deploy.ps1 script.
Update the extension bundle to include the custom built-in connector.
Create the Logic App project and install the extension package given below:
```
dotnet add package ServiceProviders.CosmosDb.Extensions --version 1.0.0 --source <ServiceProviders.CosmosDb.Extensions package path>
```
Once you open the workflow in the designer (make sure you close any running func.exe process before opening the designer), you should be able to see the newly added connector.

You can now create a simple Logic App using the Cosmos DB trigger: add the connection string of your Azure Cosmos DB account, then add a new document to the database under the collection or lease collection to test the trigger.
Specify the connection string.

Specify the database name and collection name for the trigger.

Press F5 in VS Code and invoke the trigger by adding a new item using the Data Explorer of your Cosmos DB account, as shown below, to trigger the workflow.

The sample code can be downloaded from the repository: https://github.com/praveensri/LogicAppCustomConnector/tree/main/ServiceProviders.CosmosDb.Extensions
by Contributed | Nov 21, 2020 | Technology
When writing data to Azure SQL DB as part of your Data Factory ETL job using data flows, there are a number of features available to you that can handle common constraints found in target tables, including identity inserts (use sink scripts), handling known constraints in your data flow logic, and the latest feature to trap, log, and continue on row-level errors in SQL DB.
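For example, identity inserts are commonly handled with pre- and post-processing scripts on the sink; a minimal sketch, assuming a hypothetical target table dbo.TargetTable with an identity column:

```sql
-- Pre SQL script on the sink: allow explicit values in the identity column.
SET IDENTITY_INSERT dbo.TargetTable ON;

-- (The data flow writes its rows between the pre and post scripts.)

-- Post SQL script on the sink: restore the default identity behavior.
SET IDENTITY_INSERT dbo.TargetTable OFF;
```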
In your ADF Data Flow SQL DB sink, you will see an option at the bottom for “Error row handling”. The default is the current behavior in ADF, which is to fail fast as soon as a constraint violation is encountered on the target table.

You can now optionally tell ADF to “Continue on error” so that the ETL process keeps writing rows to the SQL DB sink even after error rows have been encountered. ADF does this through a two-stage process, which means there is a small performance penalty incurred by choosing this option.
However, once you’ve decided to pass over error rows and continue writing using the sink setting, you can also tell ADF to automatically log those errors along with the error conditions and the original data. This allows you to view the error details and gives you the opportunity to re-process the original rows, processing only the rows that errored.

Once you’ve chosen to “Continue on error”, you can then choose the return status of the activity by setting “Report success on error”. When true, ADF will return a success code for your data flow even when rows have errored; optionally, set it to false to return a failed status. You will then see the counts of successful vs. failed rows in the sink details of the data flow activity monitoring view.
