MidDay Cafe Episode 36 – Human AI Partnership


This article is contributed. See the original author and article here.

In this episode of MidDay Café, hosts Tyrelle Barnes and Michael Gannotti discuss human/AI partnership. Many organizations trying to figure out an AI strategy take a tech- and product-first approach; Tyrelle and Michael discuss how to anchor on people and employees first, with AI and technology in a supporting role.


Listen to the Audio podcast version:



Resources:



Thanks for Visiting!


Tyrelle Barnes: LinkedIn | Michael Gannotti: LinkedIn | Twitter




 

Bring your own data to Azure OpenAI chat models


This article is contributed. See the original author and article here.

Introduction


Azure OpenAI models provide a secure and robust solution for tasks like creating content, summarizing information, and other applications that involve working with human language. Now you can operate these models in the context of your own data. Try Azure OpenAI Studio today to interact naturally with your data and publish it as an app from within the studio.


 


Getting Started


Follow this quickstart tutorial for prerequisites and for setting up your Azure OpenAI environment.
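Once your resource and a chat model deployment are in place, you can also sanity-check the environment from code. Below is a minimal sketch using the REST chat completions endpoint; the endpoint, key, deployment name, and API version are placeholders and assumptions, so substitute the values from your own Azure OpenAI resource.

# Minimal sketch: verify that an Azure OpenAI chat deployment answers a prompt.
# The endpoint, key, deployment name, and API version are placeholders/assumptions.
import os
import requests

endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]   # e.g. https://<resource>.openai.azure.com
api_key = os.environ["AZURE_OPENAI_API_KEY"]
deployment = "gpt-35-turbo"                      # your chat model deployment name

url = (f"{endpoint}/openai/deployments/{deployment}"
       f"/chat/completions?api-version=2023-05-15")
body = {"messages": [{"role": "user",
                      "content": "What are common serverless design patterns?"}]}

response = requests.post(url, headers={"api-key": api_key}, json=body, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])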


 


To try the capabilities of the Azure OpenAI model on private data, I am uploading an e-book to the Azure OpenAI chat model. This e-book, “Serverless Apps: Architecture, patterns and Azure Implementation”, was written by Jeremy Likness and Cecil Phillip. You can download the e-book here.


 


Before uploading your own data


Before uploading this particular e-book, I asked the model a question about serverless design patterns; its response is shown below. While the response is relevant, let’s examine whether the model picks up the e-book’s content in the next iteration.


 


[Screenshot: model response before the e-book was added]


 


After uploading your own data


The e-book has a dedicated section that discusses design patterns such as scheduling, CQRS, and event-based processing in detail.


 


[Screenshot: the e-book’s section on serverless design patterns]


After adding this PDF as a data source, I asked a few questions, and the responses were nearly accurate. I also limited the model to supplying only information from the uploaded content. Here’s what I found.


 


[Screenshot: model responses after the e-book was added]


 


When I asked about the contributors to the e-book, the model listed all of them correctly.


 


[Screenshot: model response listing the e-book’s contributors]


 


Read more


Enterprise data often runs to volumes far too large to supply in the context of a prompt. Instead, this setup uses Azure services to build a repository from your knowledge base and lets Azure OpenAI models interact naturally with it.


 


Azure OpenAI Service on your data uses Azure Cognitive Search in the background to index and rank your custom data, and uses a storage account to host your content (.txt, .md, .html, .pdf, .docx, .pptx). Your data source is used to ground the model with specific data. You can select an existing Azure Cognitive Search index or Azure Storage container, or upload local files, as the source from which the grounding data is built. Your data is stored securely in your Azure subscription.
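If you want to call the grounded deployment from code instead of the studio, the same data source can be passed on the chat completions request. The sketch below is a rough example against the preview “extensions” endpoint available at the time of writing; the API version, index name, and environment variable names are assumptions, and the Cognitive Search index is the one created when you upload your data.

# Minimal sketch: chat with an Azure OpenAI deployment grounded on your own data.
# Assumes the 2023-06-01-preview extensions endpoint and an existing Azure
# Cognitive Search index built from the uploaded content; all names are placeholders.
import os
import requests

aoai_endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]      # e.g. https://<resource>.openai.azure.com
aoai_key = os.environ["AZURE_OPENAI_API_KEY"]
deployment = "gpt-35-turbo"                               # your chat model deployment name

url = (f"{aoai_endpoint}/openai/deployments/{deployment}"
       f"/extensions/chat/completions?api-version=2023-06-01-preview")
body = {
    "dataSources": [{
        "type": "AzureCognitiveSearch",
        "parameters": {
            "endpoint": os.environ["SEARCH_ENDPOINT"],    # e.g. https://<search>.search.windows.net
            "key": os.environ["SEARCH_KEY"],
            "indexName": "serverless-ebook-index",        # placeholder index name
            "inScope": True,                              # answer only from the uploaded content
        },
    }],
    "messages": [{"role": "user",
                  "content": "Which serverless design patterns does the e-book describe?"}],
}

response = requests.post(url, headers={"api-key": aoai_key}, json=body, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])

Setting inScope to true mirrors the studio option used above to limit responses to the uploaded content.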


 


We also have another Enterprise GPT demo that allows you to piece together all the Azure building blocks yourself. An in-depth blog post by Pablo Castro walks through the detailed steps here.


 


Getting started directly from Azure OpenAI Studio allows you to iterate on your ideas quickly. At the time of writing, the completions playground offers 23 different use cases that take advantage of the different models available in Azure OpenAI:


 



  1. Summarize issue resolution from conversation

  2. Summarize key points from financial report (extractive)

  3. Summarize an article (abstractive)

  4. Generate product name ideas

  5. Generate an email

  6. Generate a product description (bullet points)

  7. Generate a listicle-style blog

  8. Generate a job description

  9. Generate a quiz

  10. Classify Text

  11. Classify and detect intent

  12. Cluster into undefined categories

  13. Analyze sentiment with aspects

  14. Extract entities from text

  15. Parse unstructured data

  16. Translate text

  17. Natural Language to SQL

  18. Natural language to Python

  19. Explain a SQL query

  20. Question answering

  21. Generate insights

  22. Chain of thought reasoning

  23. Chatbot


Resources


There are several resources to get you started with Azure OpenAI. Here are a few:



 

Account-based seller insights improve sales account manager effectiveness 


This article is contributed. See the original author and article here.

Busy sales account managers prioritize their activities by mining information from their accounts. But manually making sense of all that unstructured data takes time and can lead to inaccurate assumptions. They can end up focusing on the wrong activities, which results in a lower impact on business outcomes. The most productive and successful account managers are the ones who focus on the right customers with the right priority. Dynamics 365 Sales account-based seller insights can help.

Account-based seller insights help drive priorities

Account-based seller insights help you set priorities and formulate the best engagement plan for your customers. These are automated, actionable insights that are derived from multiple sources of unstructured data and presented to you in the right context. For instance, you might be shown an upsell insight for an account based on past won opportunities for similar accounts, along with guidance on the next best action to take. Seller insights help you proactively manage the customer journey, from the first engagement to the final sale. 

Behind the scenes with seller insights

Account-based seller insights can be generated in three ways:

  • Bring your own model. Use your own AI model, trained on your data, to generate insights, and work with them in the Dynamics 365 sales accelerator.
  • Use out-of-the-box models. The account-based seller insights solution comes with its own models, which mine the data in Dynamics 365 Sales to generate insights.
  • Build a back-end rule framework. You can build your own rule framework that uses Power Automate flows to generate insights when certain conditions are met. 
[Screenshot: the sales accelerator in Dynamics 365 Sales, with seller insights and next actions highlighted]

How seller insights boost productivity

How can seller insights help you be a more effective sales account manager? Let’s look.

Insight list and actions

First, you get curated insights for all your accounts:  

  • You only see insights that are relevant to you, not your team members. 
  • The insights have expiration dates so that you know the information is fresh and relevant. 
  • You can see the reasons an insight appears in the list. 

And after you acknowledge an insight, you’re guided through the next best steps to act on it, optimizing the sales workflow for better results. You can also collaborate with team members while you’re working on your insights. 

[Screenshot: the sales accelerator in Dynamics 365 Sales, with a seller insight and sequence highlighted]

Insight assignment and distribution

Second, although your insights are curated, that doesn’t mean they’re siloed. Insights are assigned to the account owner. If the owner of an entity is a team, an insight can be automatically assigned to the appropriate salesperson on the team, based on role, through the flexible rule framework. Ownership can be transferred from one seller to another, and multiple sellers can work on a single insight. 

Insight action history

Finally, you can find all the insights that have been generated for an account on the account’s Insights tab. The list includes status, type, due date, and other helpful information. Filter and sort it to focus on what’s most important. You can easily identify all seller activities for the insights on the timeline view of the account.

By helping you identify your most important and profitable accounts, understand their needs and preferences, tailor your messages and offers, and nurture long-term relationships with them, account-based seller insights can lead to higher revenues, shorter sales cycles, and better customer satisfaction.

Next steps

To get started with seller insights:

Not a Dynamics 365 Sales customer yet? Take a guided tour and sign up for a free trial.

The post Account-based seller insights improve sales account manager effectiveness  appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Copy data to Azure Data Services at scale with Microsoft Fabric


This article is contributed. See the original author and article here.

Introduction


Did you know that you can use Microsoft Fabric to copy data at scale from on-premises SQL Server to Azure SQL Database or Azure SQL Managed Instance within minutes?


 


It is often necessary to copy data from on-premises sources to Azure SQL Database, Azure SQL Managed Instance, or another data store for analytics purposes, or simply to migrate on-premises data to Azure database services. In most cases you want to move that data at scale, with minimal coding and complexity, using a simple, automated approach.


 


In the following example, I copy two tables from an on-premises SQL Server 2019 database to Azure SQL Database using Microsoft Fabric. The entire migration is driven by a metadata table that holds information about the tables to copy from the source, so the copy pipeline is simple and easy to deploy. We have used this approach to copy hundreds of tables from one database to another efficiently, and the monitoring UI makes it convenient to track progress and rerun the migration if anything fails.


 


Architecture diagram


This architectural diagram shows the components of the solution from SQL Server on-premises to Microsoft Fabric.


[Architecture diagram: on-premises SQL Server to Azure SQL Database via Microsoft Fabric]


 


Steps


Install data gateway:


To connect to an on-premises data source from Microsoft Fabric, a data gateway must be installed. See Install an on-premises data gateway | Microsoft Learn.


 


Create a table to hold metadata information:


First, let us create this table in the target Azure SQL Database.

CREATE TABLE [dbo].[Metadata](
      [Id] [int] IDENTITY(1,1) NOT NULL,
      [DataSet] [nvarchar](255) NULL,
      [SourceSchemaName] [nvarchar](255) NULL,
      [SourceTableName] [nvarchar](255) NULL,
      [TargetSchemaName] [nvarchar](255) NULL,
      [TargetTableName] [nvarchar](255) NULL,
      [IsEnabled] [bit] NULL 
)

I intend to copy two tables – Customer and Sales – from the source to the target. Let us insert these entries into the metadata table. Insert one row per table.

INSERT [dbo].[Metadata] ([DataSet], [SourceSchemaName], [SourceTableName], [TargetSchemaName], [TargetTableName], [IsEnabled]) VALUES (N'Customer', N'dbo', N'Customer', N'dbo', N'Customer', 1);

INSERT [dbo].[Metadata] ([DataSet], [SourceSchemaName], [SourceTableName], [TargetSchemaName], [TargetTableName], [IsEnabled]) VALUES (N'Sales', N'dbo', N'Sales', N'dbo', N'Sales', 1);

Ensure that the table is populated. The data pipelines will use this table to drive the migration.


[Screenshot: populated Metadata table]


 


Create Data Pipelines:


Open Microsoft Fabric and click the Create button to see the items you can create.


[Screenshot: the Create menu in Microsoft Fabric]


 


Click on “Data pipeline” to start creating a new data pipeline.


[Screenshot: creating a new Data pipeline]


 


Let us name the pipeline “Copy_Multiple_Tables”.


[Screenshot: naming the pipeline Copy_Multiple_Tables]


 


Click on “Add pipeline activity” to add a new activity.


[Screenshot: adding a pipeline activity]


 


Choose Azure SQL Database from the list. This is the target database where we created the metadata table.


[Screenshot: choosing Azure SQL Database]


 


Ensure that the settings are as shown in the screenshot.


[Screenshot: activity settings for reading the Metadata table]


 


Click the preview data button and check if you can view the data from the table.


[Screenshot: previewing data from the Metadata table]


 


Let us now create a new connection to the source. From the list of available connections, choose SQL Server, as we intend to copy data from SQL Server 2019 on-premises. Ensure that the gateway cluster and connection are already configured and available. 


[Screenshot: creating a SQL Server connection through the gateway]


 


Add a ForEach activity and set the batch count to copy tables in parallel.


[Screenshot: batch count setting for parallel copies]


 


We now need to set the Items property of the ForEach activity, which is populated dynamically at runtime. To set it, click the button shown in the screenshot and enter the value:

@activity('Get_Table_List').output.value

[Screenshots: setting the Items property via Add dynamic content]


 


Add a Copy activity inside the ForEach container.


[Screenshot: Copy activity inside the ForEach container]


 


Set the source table attributes in the copy activity as shown in the screenshot. Click the edit button, then click the “Add dynamic content” button. Make sure you paste the expression only after clicking “Add dynamic content”; otherwise it is treated as literal text instead of being evaluated at runtime.


 


Set the Table schema name to:

@item().SourceSchemaName

Set the Table name to:

@item().SourceTableName

[Screenshot: source settings of the Copy activity]


 


Click on the destination tab and set the destination attributes as in the screenshot.


Set the Table schema name to:

@item().TargetSchemaName

Set the Table name to:

@item().TargetTableName

[Screenshot: destination settings of the Copy activity]


 


The pipeline is now configured. Click Save to publish it.


[Screenshot: saving the pipeline]


 


Run pipeline:


Click the Run button from the top menu to execute the pipeline. Ensure the pipeline runs successfully. This will copy both tables from source to target.


[Screenshot: successful pipeline run]


 


Summary:


In the above example, we used Microsoft Fabric pipelines to copy data from an on-premises SQL Server 2019 database to Azure SQL Database. You can modify the sink/destination in this pipeline to copy to other destinations such as Azure SQL Managed Instance or Azure Database for PostgreSQL. If you are interested in copying data from a mainframe z/OS database, this blog post from our team will also be helpful.


 


Feedback and suggestions 


If you have feedback or suggestions for improving this data migration asset, please contact the Azure Databases SQL Customer Success Engineering Team. Thanks for your support!


 


Note: For additional information about migrating various source databases to Azure, see the Azure Database Migration Guide.


 

Customize data models to view your organization’s unique metrics 


This article is contributed. See the original author and article here.

Dynamics 365 Customer Service is a powerful tool for managing your contact center. Its built-in analytic dashboards, such as the recently launched Omnichannel real-time analytics dashboard, provide a wealth of industry standard KPIs and metrics to help you monitor and improve performance. These dashboards are built on Power BI with two components: a data model (or data set) that houses the KPIs, and reports that visualize the data for viewers. Dynamics 365 Customer Service reads the data from Dataverse, performs transformation logic for each of the KPIs, and makes these KPIs available for you within the data model. You can customize data models using data within Dynamics or external data to view metrics tailored to your needs.

Every Dynamics organization that has analytics enabled gets its own copy of this solution, deployed and available only to it. While the data model is not editable, the reports are fully customizable through visual customization, so you can see and use the data in ways that make sense for your organization. You can view metrics beyond what’s in the out-of-box reports, and you can create additional pivots and dimensions to slice the data as needed.

We have received a lot of feedback from you about the need for customization. You want to modify the data or logic used to calculate metrics in the data set, create your own metrics in addition to the out-of-box metrics available in the data model, and create variants of existing metrics or calculate metrics differently based on your organization’s unique processes. Another frequent request has been guidance on building custom dashboards that combine KPIs from Dynamics 365 Customer Service with other applications.

To address these scenarios, Dynamics 365 Customer Service launched model customization. This feature deploys a copy of the data set used by the out-of-box reports into your organization’s Power BI workspace. Therefore, you can build composite models that connect to the Dynamics data model. 

By leveraging the out-of-box model and only creating the metrics that are unique to your organization, you can reduce the risk of metric definitions going stale as Dynamics updates its capabilities. This also saves you valuable time and development effort. Furthermore, by using model customization, you can build custom reports and dashboards that combine data from multiple applications. This gives you a more complete picture of your contact center’s performance. 

Overall, Dynamics 365 Customer Service provides a powerful set of tools for managing your contact center. Its built-in analytic dashboards offer the specific insights you need to improve contact center performance. And with model customization, you can tailor these to your specific needs.  

Learn more 

Watch the following videos:

To customize data models, follow the instructions outlined here: Model customization of historical and real-time analytics reports in Customer Service | Microsoft Learn.

The post Customize data models to view your organization’s unique metrics  appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.