Spend more time selling—new sales capabilities in 2023 release wave 1


This article is contributed. See the original author and article here.

The role of the seller is evolving. Buyers expect a blend of digital and personalized experiences throughout their journey. To achieve this, sellers must be efficient and effective: prioritizing who to engage, identifying how and when to connect, and spending more time becoming trusted advisors to their customers. Sellers can’t be overwhelmed trying to make sense of too much data and information; rather, they need the data to work for them by providing value in every customer interaction.

With Microsoft Dynamics 365 Sales, sellers can improve their sales by prioritizing their best bets, collaborating with their sales team in the moment with Microsoft Teams built-in, knowing when they should engage with prospective customers, and then seeing how it went after ending the call. Sellers are given the gift of time, plus intelligence, so they can close more deals faster than before.

For years, customer relationship management (CRM) systems have asked sellers to enter data so sales managers could forecast revenue and assess seller performance. With Dynamics 365 Sales and Microsoft Viva Sales, we have put the applications to work for you, using AI to simplify data capture and recommend in-the-moment interactions, whether selling from Dynamics 365 Sales or in Microsoft 365 productivity tools. Starting today, we will begin to roll out new capabilities for Dynamics 365 Sales and Viva Sales that use the power of AI to help sellers:

  1. Prioritize your work to land more deals faster.
  2. Stay productive and collaborate in the flow of work with Viva Sales.

Let’s take a closer look at what’s in store for sellers in the weeks and months ahead.

Woman using a Surface Pro inside a library.

2023 release wave 1 for Dynamics 365 Sales and Viva Sales

Learn about the new sales capabilities helping sellers with the power of AI.

Prioritize your work to land more deals faster

The sales accelerator in Dynamics 365 Sales helps sellers to sell with intent by building a prioritized worklist and surfacing automated activity recommendations to speed the sales process. Sequences enable sales organizations to automate these processes, tailoring them to their unique sales approach and best practices. Sequences are powered by our common customer journey orchestration engine shared across Dynamics 365 applications. We are enhancing the sequence capabilities to support account-centric selling with multiple sequences to a record, improve effectiveness with actionable AI-powered suggestions, and analyze performance using sequence insights.

Time with customers is precious, so every sales interaction matters. Conversation intelligence helps sellers make the most of their sales calls by transcribing the dialog and using AI to detect sentiment, questions, and actions to ensure no follow-up is missed. In this release, we have enabled text message as an additional channel for sellers to engage with customers and added additional AI capabilities to redact sensitive personally identifiable information (PII) data from phone calls and provide in-the-moment suggestions to guide sales conversations.


Sellers are routinely managing many deals at the same time. As sales engagements progress and sellers learn more about their customers, they need to regularly adjust and review this data while, at the same time, keeping an eye on how they are performing. Sellers can easily maintain the various stakeholders for an account with the new org chart capability, identify and analyze the activity of key decision makers, and ensure they stay on top of their performance and update their pipeline with the new opportunity management experience. The new opportunity experience eliminates many processes that sellers would normally need to do and streamlines everything into a single workspace.

New pipeline view to manage opportunities in Dynamics 365 Sales

Stay productive and collaborate in the flow of work

Not all sellers spend all their time in a CRM system. Many spend much of their time in productivity tools, emailing, calling, and collaborating with colleagues and customers. In October 2022, we launched Microsoft Viva Sales. We are empowering salespeople with AI-driven insights and data automation right in the flow of work, in the productivity and collaboration tools millions are already using every day: Microsoft 365 and Teams.

Selling is a team sport. Enabling sales team members to collaborate effectively with the right tools is key to their success. Collaboration spaces bring together the right users, contextual insights, and productivity apps to boost seller collaboration in Teams. Collaboration spaces make internal and external collaboration take center stage. Sellers can use sales templates to create a collaboration space. Sales templates speed up structured team and channel creation with predefined channels, pre-pinned apps, and integrated access to CRM data.

For sales teams, the adage “time is money” is more relevant than ever before. Sellers are busy people who find it challenging to balance the time and effort required to respond to customer emails with their other responsibilities. Pain points include:

  • Pulling data from a CRM system is time-consuming with complicated navigation and menus.
  • Keeping track of customer opportunities and their history is difficult, and the difficulty grows for sellers managing many accounts.
  • Responding to a high volume of emails can be overwhelming and might cause the seller to miss important details.

With the help of Copilot in Viva Sales, which draws on the context of the email or meeting alongside CRM data, we will now generate suggested email content for a variety of scenarios, such as replying to an inquiry, creating a proposal, or summarizing the action items from meetings. Viva Sales brings together Microsoft 365 data and CRM data to help sellers quickly generate responses using the power of Microsoft Azure OpenAI Service.


Organizations have taken pride in the customization of their CRM systems, considering it to be critical to their business success. These customized experiences allow sellers to engage and capture customer data effectively within their CRM system. In October, we introduced Viva Sales, which lets sellers use Microsoft 365 and Teams to automatically capture data into any CRM system, eliminating manual data entry and giving more time to focus on selling.

In February, we released the ability for CRM administrators to customize CRM forms, fields, and behavior in Viva Sales for accounts, contacts, and opportunities. With this release, we will add the ability to configure additional out-of-the-box entities as well as custom entities using queries defined in the CRM system. CRM administrators will be able to add or remove relevant custom and out-of-the-box entities to Viva Sales forms and control filtering and sorting behavior of lists using CRM-defined queries. Sellers will be able to see custom and out-of-the-box entities in the Outlook side pane in Viva Sales, share custom entities with colleagues in Teams, search for custom entities in the Teams messaging extension, and connect Outlook email and meeting activities to custom or out-of-the-box entities.

Learn more about 2023 release wave 1 for Dynamics 365 Sales and Viva Sales

These are just a few of the new capabilities that we are rolling out for sellers in 2023 release wave 1. To learn more about these new capabilities in Dynamics 365 Sales and Viva Sales, click on the links below.

If you are not yet a Dynamics 365 Sales customer, check out our Dynamics 365 Sales webpage where you can take a guided tour or get a free 30-day trial.

The post Spend more time selling—new sales capabilities in 2023 release wave 1 appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Enrich your advanced hunting experience using network layer signals from Zeek


This article is contributed. See the original author and article here.

In our previous blog about hunting for network signatures in Microsoft 365 Defender, we described how we used device discovery capabilities to capture network event information in deeper detail and expose it in advanced hunting with the NetworkSignatureInspected action type. Since then, we have made several developments, the most significant being the integration with Zeek. This release has expanded what is possible for generating network detections across Microsoft Defender for Endpoint. That announcement shared examples of detections created for PrintNightmare and NTLM password spraying attempts.


 


Today, we would like to share a variety of Zeek-based events in advanced hunting that will help you expand your investigation, hunting, and detection capabilities for identifying and addressing network-layer anomalies across HTTP, SSH and ICMP protocols. Using the new Zeek events, we will demonstrate how to perform network threat hunting while also covering some of the MITRE ATT&CK Matrix.


 


Note: As the integration with Zeek continues to mature, more action types will gradually be released over time. Because the Zeek integration is only supported on Windows devices, these action types will surface for connections to and from Windows devices.


 


To identify these action types in your tenant, look for the value ConnectionInspected in the ActionType field of the DeviceNetworkEvents table of advanced hunting. The extra information is stored in the AdditionalFields column as a JSON data structure that contains the commonly known Zeek fields per event, which can be parsed. These field names are identical to those that Zeek uses and are documented on Zeek’s site. You can also check the Schema Reference flyout page in advanced hunting for any new action types that were recently released.


 


Link to query


DeviceNetworkEvents
| where ActionType contains 'ConnectionInspected'
| distinct ActionType


 


The result of this query looks something like this:


 


cventour_0-1681377541830.png


Figure 1 – Sample result upon checking for ConnectionInspected in the ActionType field


 


The format of the action type will follow the [Protocol_Name]ConnectionInspected standard.
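Since each of these action types stores its protocol-specific Zeek fields in AdditionalFields as JSON, a quick way to peek at the parsed content for any of them is a generic query along these lines (a minimal sketch, not from the original post; the available fields vary by protocol):

// Sketch: expand the Zeek fields stored in AdditionalFields for any inspected connection
DeviceNetworkEvents
| where ActionType endswith "ConnectionInspected"
| extend ZeekFields = todynamic(AdditionalFields)
| project Timestamp, DeviceName, ActionType, RemoteIP, RemotePort, ZeekFields
| take 10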


 


Inspecting HTTP connections


 


The HttpConnectionInspected action type contains extra information about HTTP connections, inbound or outbound. When you click an event with the HttpConnectionInspected action type, the page flyout parses the additional fields and presents them in a format like the example below:


 


cventour_1-1681378349897.png


 


Figure 2 – Sample result of an HttpConnectionInspected action type


 


Below, you will find a complete list of fields that this action type can expose and the respective descriptions:

  • direction – The direction of the conversation relevant to the Microsoft Defender for Endpoint-onboarded device, where the values are either 'In' or 'Out'
  • host – The host header content
  • method – The HTTP method requested
  • request_body_len – Length of the HTTP message body in bytes
  • response_body_len – Length of the HTTP response body in bytes
  • status_code – The HTTP response code
  • status_msg – The full text message of the response
  • tags – A set of indicators of various attributes discovered and related to a particular request/response pair
  • trans_depth – Represents the pipelined depth into the connection of the request/response transaction
  • uri – The complete URI that was requested
  • user_agent – The user_agent header of the request
  • version – The HTTP version used



 


Let’s look at a few examples of using the HttpConnectionInspected action type. In the first example, you want to look for rare user agents in the environment to identify potentially suspicious outbound web requests and cover the “T1071.001: (Application Layer Protocol) Web Protocols” technique.


 


Link to query


// Identify rare User Agent strings used in http conversations
DeviceNetworkEvents
| where ActionType == 'HttpConnectionInspected'
| extend json = todynamic(AdditionalFields)
| extend direction = tostring(json.direction), user_agent = tostring(json.user_agent)
| where direction == 'Out'
| summarize Devices = dcount(DeviceId) by user_agent
| sort by Devices asc


 


Suppose you have identified a suspicious-looking user agent named “TrickXYZ 1.0” and need to determine which user/process/command line combination initiated that connection. Currently, HttpConnectionInspected events, as with all Zeek-related action types, do not contain that information, so you must execute a follow-up query that joins with events from the ConnectionSuccess action type. Here’s an example of a follow-up query:


 


Link to query


// Identify usage of a suspicious user agent
DeviceNetworkEvents
| where Timestamp > ago(1h) and ActionType == "HttpConnectionInspected"
| extend json = todynamic(AdditionalFields)
| extend user_agent = tostring(json.user_agent)
| where user_agent == "TrickXYZ"
| project ActionType, AdditionalFields, LocalIP, LocalPort, RemoteIP, RemotePort, TimeKey = bin(Timestamp, 5m)
| join kind = inner (
    DeviceNetworkEvents
    | where Timestamp > ago(1h) and ActionType == "ConnectionSuccess"
    | extend TimeKey = bin(Timestamp, 5m)) on LocalIP, RemoteIP, LocalPort, TimeKey
| project DeviceId, ActionType, AdditionalFields, LocalIP, LocalPort, RemoteIP, RemotePort, InitiatingProcessId, InitiatingProcessFileName, TimeKey


 


In another example, let’s look for file downloads over HTTP, particularly files with executable and compressed file extensions, to cover the “T1105: Ingress tool transfer” technique:


 


Link to query


// Detect file downloads
DeviceNetworkEvents
| where ActionType == 'HttpConnectionInspected'
| extend json = todynamic(AdditionalFields)
| extend direction = tostring(json.direction), user_agent = tostring(json.user_agent), uri = tostring(json.uri)
| where uri matches regex @"\.(?:dll|exe|zip|7z|ps1|ps|bat|sh)$"


 


The new HTTP action type will unlock a variety of possibilities for detection on this protocol. We look forward to seeing the queries you come up with; please share your contributions with the community.


 


Looking at SSH connections


 


The SshConnectionInspected action type will display information on SSH connections. While decrypting the entire SSH traffic is not possible, the cleartext part of the SSH session initiation can provide valuable insights. Let’s look at the data found in the AdditionalFields section.


 


cventour_0-1681379880041.png


Figure 3 – Screenshot of additional fields that SshConnectionInspected generates.


 


The fields depend on the activity that was observed, so some of these fields might not appear for every connection. For example, if the client disconnected before completing the authentication, the auth_success field will not be populated for that event.


 


Below, you will find a complete list of fields that this action type can expose and the respective descriptions:

  • direction – The direction of the conversation relevant to the Defender for Endpoint-onboarded device, where the values are either 'In' or 'Out'
  • auth_attempts – The number of authentication attempts until the success or failure of the attempted session
  • auth_success – The success or failure in authentication, where 'true' means successful user authentication and 'false' means the user-provided credentials are incorrect
  • client – The version and type of client used to authenticate to the SSH session
  • host_key – Host public key value
  • server – SSH server information
  • version – SSH protocol major version used
  • uid – The unique ID of the SSH session attempt



 


Let’s look at a few advanced hunting examples using this action type. In the first example, you want to look for potentially infected devices trying to perform “T1110: Brute-Force” against remote servers using SSH as an initial step to “T1021.004: Lateral Movement – Remote Services: SSH”.


 


The query below will give you a list of Local/Remote IP combinations with repeated failed SSH authentication attempts (more than three failed attempts per session, across more than four sessions). Feel free to use this example and adapt it to your needs.


 


Link to query


// Detect potential bruteforce/dictionary attacks against SSH
DeviceNetworkEvents
| where ActionType == 'SshConnectionInspected'
| extend json = todynamic(AdditionalFields)
| extend direction = tostring(json.direction), auth_attempts = toint(json.auth_attempts), auth_success = tostring(json.auth_success)
| where auth_success == 'false'
| where auth_attempts > 3
| summarize count() by LocalIP, RemoteIP
| where count_ > 4
| sort by count_ desc


 


In the next example, let’s suppose you want to identify potentially vulnerable SSH versions, detect potentially unauthorized client software being used to initiate SSH connections, and find the operating systems hosting SSH server services in your environment:


 


Link to query


// Identify Server/Client pairs being used for SSH connections
DeviceNetworkEvents
| where ActionType == "SshConnectionInspected"
| extend json = todynamic(AdditionalFields)
| project Server = tostring(json.server), Client = tostring(json.client)
| distinct Server, Client


 


cventour_1-1681380056116.png


Figure 4 – An example result with a short description of the different components


 


The results above show how to break down the SSH banners to identify the different components. A short analysis of the banners shows that the server is Ubuntu 22.04, running OpenSSH version 8.9, and the client software is WinSCP version 5.21.3. Now, you can search these versions online to verify whether they are vulnerable.


 


Note: The query above can be used to surface potential “T1046: Network Service Discovery” attempts, as attackers may try to search for unpatched or vulnerable SSH services to compromise.


 


Reviewing ICMP connections


 


The IcmpConnectionInspected action type will provide details about ICMP-related activity. The breadth of fields generated creates opportunities for some interesting detections. Here’s an example of the human-readable view of the event as shown on the event flyout page:


 


cventour_2-1681380100285.png


 


Below, you will find a complete list of fields that this action type can expose and the respective descriptions:

  • direction – The direction of the conversation relevant to the Defender for Endpoint-onboarded device, where the values are either 'In' or 'Out'
  • conn_state – The state of the connection. In the screenshot example, OTH means that no SYN packet was seen. Read the Zeek documentation for more information on conn_state.
  • duration – The length of the connection, measured in seconds
  • missed_bytes – Indicates the number of bytes missed in content gaps, representing packet loss
  • orig_bytes – The number of payload bytes the originator sent. For example, in ICMP this designates the payload size of the ICMP packet.
  • orig_ip_bytes – The number of IP level bytes that the originator sent as seen on the wire and taken from the IP total_length header field
  • orig_pkts – The number of packets that the originator sent
  • resp_bytes – The number of payload bytes the responder sent
  • resp_ip_bytes – The number of IP level bytes that the responder sent as seen on the wire
  • resp_pkts – The number of packets that the responder sent
  • uid – Unique Zeek ID of the transaction



 


Let’s explore a few examples of hunting queries that you can use to leverage the ICMP connection information collected by Defender for Endpoint.


 


In the first example, you wish to look for potential data leakage via ICMP to cover the “T1048: Exfiltration Over Alternative Protocol” or “T1041: Exfiltration Over C2 Channel” techniques. The idea is to look for outbound connections and check the payload bytes a device sends in a given timeframe. We will parse the direction, orig_bytes, and duration fields and look for conversations longer than 100 seconds where more than 500,000 bytes were sent. The numbers are used as an example and do not necessarily indicate malicious activity. Usually, you will see that upload and download are almost equal for ICMP traffic, because most devices generate an “ICMP reply” with the same payload that was observed in the “ICMP echo” request.


 


Link to query


// Search for high upload over ICMP
DeviceNetworkEvents
| where ActionType == "IcmpConnectionInspected"
| extend json = todynamic(AdditionalFields)
| extend Upload = tolong(json['orig_bytes']), Download = tolong(json['resp_bytes']), Direction = tostring(json.direction), Duration = tolong(json.duration)
| where Direction == "Out" and Duration > 100 and Upload > 500000
| top 10 by Upload
| project RemoteIP, LocalIP, Upload = format_bytes(Upload, 2, "MB"), Download = format_bytes(Download, 2, "MB"), Direction, Duration, Timestamp, DeviceId, DeviceName


 


Below is an example result after exfiltrating a large file over ICMP to another device on the network:


 


cventour_3-1681380100287.png


 


In the last example, you wish to create another hunting query that helps you detect potential ping sweep activities in your environment to cover the “T1018: Remote System Discovery” and “T1595: Active Scanning” techniques. The query will look for outbound ICMP traffic to internal IP addresses, build an array of the targeted IPs reached from the same device, and display it if the same device has pinged more than 5 IP addresses within a 10-minute time window.


 


Link to query


// Search for ping scans
DeviceNetworkEvents
| where ActionType == "IcmpConnectionInspected"
| extend json = todynamic(AdditionalFields)
| extend Direction = json.direction
| where Direction == "Out" and ipv4_is_private(RemoteIP)
| summarize IpsList = make_set(RemoteIP) by DeviceId, bin(Timestamp, 10m)
| where array_length(IpsList) > 5


 


Identifying the origin process of ICMP traffic can be challenging because ICMP is an IP-layer protocol. Still, we can use some OS-level indications to narrow down our search. We can use the following query to identify which processes loaded network-related, or even ICMP-specific, binaries:


 


Link to query


DeviceImageLoadEvents
| where FileName =~ "icmp.dll" or FileName =~ "Iphlpapi.dll"


 


More information


 


Understand which versions of the Microsoft Defender for Endpoint agent support the new integration here:



Find out more details about the integration in our ZeekWeek 2022 presentations:



View the open-source contribution in Zeek’s GitHub repository:



Previous announcements:


Trigger SQL Views with Logic App Standard with the built-in SQL Connector


This article is contributed. See the original author and article here.

At the time of writing this article, the Logic App Standard SQL Connector does not have the functionality to monitor the row version of SQL Views, so it can’t be triggered by a change in a View’s data, which would have allowed us to configure a trigger on a View in SQL. Until that capability gets rolled out, this article explores a way to imitate this functionality.


The SQL built-in trigger (SQL Server – Connectors | Microsoft Learn) is based on tracking updates to a SQL table, and tracking cannot be enabled for SQL views. The Azure SQL trigger uses SQL change tracking functionality to monitor a SQL table for changes and fire when a row is created, updated, or deleted.


Omar_Abu_Arisheh_8-1681609753782.png


 


 


Assume that we have a SQL Server with three tables and a View that joins the three tables. If any of the tables has an update, it will be reflected in this View. This is what we tested in this POC; you can change it based on your requirements and on how your View gets updated. For example, if the View is updated by only two tables and the third contains static data, then you will only need two Parent workflows to trigger the child one. The idea here is to pass the triggered value and use it as a where condition in the child workflow. The child workflow will execute a Get rows action on the SQL View using that where condition; it will then do the select on the View instead of a table, because we use the View name instead of a Table name.


 


SQL side:


 


To begin, you might need to whitelist your client IP if you are connecting to your SQL Server from your machine.


Omar_Abu_Arisheh_0-1681605413525.png


If that doesn’t work, you can whitelist your IP from the Networking section under the SQL Server (browse to the SQL Server from the Database Overview page then go to Networking).


Omar_Abu_Arisheh_1-1681605590498.png


 


We create the tables in SQL Server.


Omar_Abu_Arisheh_2-1681605961074.png


 


We create the SQL View.


Omar_Abu_Arisheh_3-1681606459364.png
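As an illustration only, a view joining three tables might look something like the sketch below; the table and column names are hypothetical placeholders, not the schema used in this POC.

CREATE VIEW dbo.vw_CombinedData
AS
SELECT t1.Id,
       t1.Name,
       t2.OrderDate,
       t3.Amount
FROM dbo.Table1 AS t1
JOIN dbo.Table2 AS t2 ON t2.Table1Id = t1.Id
JOIN dbo.Table3 AS t3 ON t3.Table1Id = t1.Id;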


 


We enable Change Tracking on the Database and on the Tables (right-click > Properties). You can also do this using code, as shown in the sketch below.


Omar_Abu_Arisheh_4-1681606552665.png


Omar_Abu_Arisheh_5-1681606664302.png
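If you prefer to script it rather than use the Properties dialogs, change tracking can be enabled with T-SQL along these lines (a sketch; the database and table names are placeholders):

-- Enable change tracking at the database level
ALTER DATABASE [YourDatabase]
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- Enable change tracking on each table that should trigger a Parent workflow
ALTER TABLE dbo.Table1 ENABLE CHANGE_TRACKING;
ALTER TABLE dbo.Table2 ENABLE CHANGE_TRACKING;
ALTER TABLE dbo.Table3 ENABLE CHANGE_TRACKING;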


 


 


Create the Logic App and Workflows:


 


We create a Logic App Standard.


We create four workflows (for the triggering workflows, you can have only one or several, based on your requirements):


tst_Workflow_SQL_Trigger_Tbl1


tst_Workflow_SQL_Trigger_Tbl2


tst_Workflow_SQL_Trigger_Tbl3


tst_Workflow_SQL_Get_View_Updated


 


Design for Child workflow that will get the updated rows of the SQL View:


 


Add a Request trigger.


Omar_Abu_Arisheh_0-1681608009980.png


 


Add the below schema to the Request Body so we can easily pass the values when calling this workflow from the Parent workflows.


Omar_Abu_Arisheh_1-1681608136600.png


 



{
    "properties": {
        "Id": {
            "type": "string"
        },
        "Value": {
            "type": "integer"
        }
    },
    "type": "object"
}
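For example, based on this schema, a Parent workflow might call the Child workflow with a request body like the following (illustrative values; Id carries the name of the key column in the View, and Value carries the changed row's key):

{
  "Id": "Id",
  "Value": 20004
}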


 


Add an action to Get Rows for a table. Select the built-in SQL Connector and select the Get Rows Action.


Omar_Abu_Arisheh_3-1681608279171.png


 


In the Table Name click on Custom value and enter the name of the SQL View manually.


In the Parameters, add Where condition.


Select the Outputs of the Request Trigger to populate the where condition (will translate to: id=value)


Omar_Abu_Arisheh_2-1681608248684.png


 


Add a Response Action so the Child workflow can respond back to the Parent workflow.


Here you can precede this Action with a condition to check the output of the Get Rows Action and respond accordingly.


You can respond with the Output of the Get rows Action, but to steer away from repeating the work in the Parent workflows it is better to do all the work in the Child workflow. So you can act upon the result of the triggered SQL View in the Child workflow.


Omar_Abu_Arisheh_4-1681608499283.png


 


 


Design for the triggering workflow for Table 1 (Parent workflow):


 


Add a trigger. From the built-in tab, select the SQL Connector, and then select the trigger When a row is modified (note the difference between this trigger and When a row is updated or When a row is inserted; select the one that matches your requirements).


After creating the connection, select the Table that you want to trigger this workflow. Table 1 in our scenario.


Omar_Abu_Arisheh_9-1681607057450.png


 


Add a Parse JSON Action. Use a sample of a single row of the table’s data to create the schema.


Omar_Abu_Arisheh_8-1681606940880.png


Sample:


{
  "Id": 20004,
  "Auhthor_id": 7346,
  "Price": 57,
  "Edition": 6,
  "RowVer": "AAAAAAAAtA8="
}


 


Omar_Abu_Arisheh_6-1681606829259.png


 


Finally for this workflow, add an Action to call another workflow, the child workflow.


As we have created the child workflow earlier, the parameters for the workflow should be accessible.


For the Id, use the name that is used in the View, so you can easily select that exact column.


For the value, pass the value from the parsed JSON for that column. In our case it is called Id.


Omar_Abu_Arisheh_7-1681606881472.png


 


Create the other two workflows in the same manner. Point the trigger for each workflow to the correct Table.


In the Parse JSON use the schema for the relevant table.


In the Invoke Action, use the correct name of the column, and select the correct value from the Parse JSON output parameters.


 


 


Testing:


 


Add or update a row in one of the tables in SQL; you will notice that the corresponding Parent workflow is triggered and calls the Child workflow.


The Child workflow would get the updated row in the SQL View based on the passed where condition.


You can alter the where condition and the passed parameter based on your requirements.


This article is only a proof of concept.


Omar_Abu_Arisheh_5-1681608929364.png


 


 


Thank you :)


 


 

Microsoft Purview in the Real World (April 14, 2023)


This article is contributed. See the original author and article here.

James_Havens_1-1681515058315.png


 


Disclaimer


This document is not meant to replace any official documentation, including those found at docs.microsoft.com.  Those documents are continually updated and maintained by Microsoft Corporation.  If there is a discrepancy between this document and what you find in the Compliance User Interface (UI) or inside of a reference in docs.microsoft.com, you should always defer to that official documentation and contact your Microsoft Account team as needed.  Links to the docs.microsoft.com data will be referenced both in the document steps as well as in the appendix.


All the following steps should be done with test data, and where possible, testing should be performed in a test environment.  Testing should never be performed against production data.


 


Target Audience


Microsoft customers who want to better understand Microsoft Purview.


 


 


Document Scope


The purpose of this document (and series) is to provide insights into various user cases, announcements, customer driven questions, etc.


 


 


Topics for this blog entry


Here are the topics covered in this issue of the blog:



  • Applying Retention Policies to a Teams Channels


 


Out-of-Scope


This blog series and entry is only meant to provide information, but for your specific use cases or needs, it is recommended that you contact your Microsoft Account Team to find other possible solutions to your needs.


 


Applying a Retention Label Policy Teams Channels


 


Overview


By default, you can set up Retention Policies for Teams Channels which is applied at the Team level for ALL channels under a single team, NOT a single channel under a team. 


 


The Note below is from the following Microsoft documentation:


 


Information Point #1


Learn about retention for Teams – Microsoft Purview (compliance) | Microsoft Learn


 


 


 


 


This Microsoft Link explains how the storage on the backend works for Teams Chats.


 


Learn about retention for Teams – Microsoft Purview (compliance) | Microsoft Learn


 


James_Havens_2-1681517987279.png


 


 


Below are some excerpts that I find to be of value in understanding how this retention operates.


 


Information Point #2


“You can use a retention policy to retain data from chats and channel messages in Teams, and delete these chats and messages. Behind the scenes, Exchange mailboxes are used to store data copied from these messages. Data from Teams chats is stored in a hidden folder in the mailbox of each user included in the chat, and a similar hidden folder in a group mailbox is used for Teams channel messages. These hidden folders aren’t designed to be directly accessible to users or administrators, but instead, store data that compliance administrators can search with eDiscovery tools.


These mailboxes are, listed by their RecipientTypeDetails attribute:



  • UserMailbox: These mailboxes store message data for Teams private channels and cloud-based Teams users.

  • MailUser: These mailboxes store message data for on-premises Teams users.

  • GroupMailbox: These mailboxes store message data for Teams standard channels.

  • SubstrateGroup: These mailboxes store message data for Teams shared channels.”


 


Information Point #3


“Although this data from Teams chats and channel messages are stored in mailboxes, you must configure a retention policy for the Teams channel messages and Teams chats locations. Teams chats and channel messages aren’t included in retention policies that are configured for Exchange user or group mailboxes. Similarly, retention policies for Teams don’t affect other email items stored in mailboxes.”


 


Information Point #4


“After a retention policy is configured for chat and channel messages, a timer job from the Exchange service periodically evaluates items in the hidden mailbox folder where these Teams messages are stored. The timer job typically takes 1-7 days to run. When these items have expired their retention period, they are moved to the SubstrateHolds folder—another hidden folder that’s in every user or group mailbox to store “soft-deleted” items before they’re permanently deleted.


 


Messages remain in the SubstrateHolds folder for at least 1 day, and then if they’re eligible for deletion, the timer job permanently deletes them the next time it runs.”


 


Information Point #5


 


Overview of security and compliance – Microsoft Teams | Microsoft Learn


 


James_Havens_1-1681517928283.png


 


 


Questions and Answers


 


Question #1 – What if I have an existing Team (or Teams) and, for each Channel under that Team, I want to apply a DIFFERENT retention policy?  Or in other words, I do not want to reconfigure my Team(s) to have 1 Channel mapped to 1 Team and therefore be able to map 1 Retention policy to that Channel.


 


Answer #1 – At the writing of this blog entry, because of the underlying architecture of how Teams Channel message are stored (See Information Points #1 and #2 above) there is currently NO method to apply a Retention Policy to an individual Channel under a Team. 


 


Question #2 – As a follow-up, can I not even do this with Adaptive Scopes?


 


Answer #2 – The answer is still currently NO.  Adaptive scopes do not have attributes that apply to Teams Channels specifically.  Here is a summary of attributes and properties used in Adaptive scopes.


 


James_Havens_0-1681517883757.png


 


 


 


Question #3 – Do I have any other way to delete data from Teams Channels?


 


Answer #3 – Yes and No.  Through the Adaptive Scopes mentioned above, you can apply retention policies to users’ mailboxes and thus the data held within those mailboxes.  However, this approach would limit those retention policies to the users specified AND to all their email data, not just one specific Teams Channel’s data held in the Substrate.  Refer to Information Point #2 above to see how Teams Channel data is organized and stored in M365 tenants.


 


Appendix and Links


Learn about retention policies & labels to retain or delete – Microsoft Purview (compliance) | Microsoft Learn


 


Flowchart to determine when an item is retained or deleted – Microsoft Purview (compliance) | Microsoft Learn


 


Learn about retention for Teams – Microsoft Purview (compliance) | Microsoft Learn


 


Configure Microsoft 365 retention settings to automatically retain or delete content – Microsoft Purview (compliance) | Microsoft Learn


 


Limits for Microsoft 365 retention policies and retention label policies – Microsoft Purview (compliance) | Microsoft Learn


 


Learn about Microsoft Purview Data Lifecycle Management – Microsoft Purview (compliance) | Microsoft Learn


 


Get started with data lifecycle management – Microsoft Purview (compliance) | Microsoft Learn


 


Automatically retain or delete content by using retention policies – Microsoft Purview (compliance) | Microsoft Learn


 


Create retention labels for exceptions – Microsoft Purview (compliance) | Microsoft Learn


 


Records management for documents and emails in Microsoft 365 – Microsoft Purview (compliance) | Microsoft Learn


 


Resources to help you meet regulatory requirements for data lifecycle management and records management – Microsoft Purview (compliance) | Microsoft Learn


 


Declare records by using retention labels – Microsoft Purview (compliance) | Microsoft Learn


 


Publish and apply retention labels – Microsoft Purview (compliance) | Microsoft Learn


 


Learn about retention for Teams – Microsoft Purview (compliance) | Microsoft Learn


 


Overview of security and compliance – Microsoft Teams | Microsoft Learn

Mixed reality experience now more intuitive in Dynamics 365 Guides


This article is contributed. See the original author and article here.

Microsoft aims to make mixed reality accessible and intuitive for frontline workers everywhere. With Dynamics 365 Guides, deskless workers use step-by-step holographic instructions to ensure process compliance, improve efficiency, and learn on the job. In 2022, new Microsoft Teams capabilities in Guides combined anyone, anywhere, seamless collaboration with the “see what I see” magic of HoloLens 2. In real-time, participants on a call could see what the HoloLens user saw, annotate their colleague’s three-dimensional space, and share files easily. We’ve just released another set of features in Guides to make this new experience even more intuitive and reliable.

Draw anywhere with digital 3D annotations

Imagine being able to draw on any object, any surface, or in thin air. With our recent annotation improvements, you can. Previously, a HoloLens user could only draw on a flat or semi-flat surface some way off in the distance. Now HoloLens users can draw 3D images anywhere using digital ink, and in Dynamics 365 Guides and Remote Assist, an expert working on a PC or mobile device on the other side of the world can draw in your world in 3D.

These drawings stick where they’re placed in the space and remain still. Users can walk around them and view them from different angles. On surfaces, the digital ink stays where intended, regardless of whether the user changes location or position. With HoloLens, the entire world is inkable, allowing you to annotate and share in real time.

Join mixed reality Teams calls more securely

In Dynamics 365 Guides, HoloLens users now have more options when joining a Teams call. Before entering the call, you can turn video on or off and join muted or not. As before, you can also change these settings once you’re in the meeting. In spaces where confidentiality is core, this allows frontline workers to use HoloLens as their main calling device without compromising on security.

Further driving our efforts to help you keep your company and your information secure, we also recently added restricted mode features that enable your admin to restrict who can log on to the device and make calls.

Link to Guides from inside a Guide

Navigating through the steps of a guide is as intuitive as paging through a document or scrolling through a file. What about jumping from one file to another? You can do that now, too. We’ve added the ability to navigate directly from one guide to another by linking the second guide in an action step. Navigating between different sets of training materials or guides is as easy as jumping to a new web page from a hyperlink.

What’s next?

Learn the details of all our recent additions in the Dynamics 365 Guides release notes.

Stay tuned for more updates coming soon as we continue to build on the intuitive and frontline worker-focused features in Dynamics 365 Guides!

Not yet a Dynamics 365 customer? Take a tour and get a free trial.

The post Mixed reality experience now more intuitive in Dynamics 365 Guides appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Enable hassle-free migration between SQL Server and Azure SQL Managed Instance


This article is contributed. See the original author and article here.

NikoNeugebauer_1-1681489904287.png


Migrating databases. When your databases anchor applications, employees, and customers, migration plunks a big rock in the pond that is your business, creating lasting ripples. Ripples that force thoughtful, coordinated action across business units to prevent outages. Add moving to the cloud into the equation and the potential tide of complications rises.  


 


In the past, undertaking migration often required you to refactor your databases to make them compatible with cloud database servers. Security policies needed thorough analysis and updating for cloud models. Factor in potential downtimes, and migration once loomed as a labor-intensive hit to business productivity. This made migrating an on-premises SQL Server to the cloud an ROI evaluation against the once heavy-lift tradeoffs required by moving to the cloud.  


 


Azure SQL Managed Instance eliminates these tradeoffs with the broadest SQL Server engine compatibility offering available. Because of this compatibility level, Azure SQL Managed Instance doesn’t require you to compromise much, if anything, to deploy SQL Server within a fully managed Platform as a Service (PaaS) environment. The SQL Server 2022 Enterprise Edition and Azure SQL Managed Instance engines are almost one hundred percent identical, making it possible to migrate both ways. You don’t need to refactor databases or risk negative performance impacts to migrate to Azure SQL Managed Instance. Azure SQL Managed Instance is also compatible with earlier SQL Server editions.


 


Additionally, the storage engine format alignment between Azure SQL Managed Instance and SQL Server 2022 provides an easy way to copy or move databases from Azure SQL Managed Instance to a SQL Server 2022 instance: a backup taken from Azure SQL Managed Instance can be restored on SQL Server 2022. The standalone SQL Server can be hosted on-premises, on virtual machines in Azure, or in other clouds. This puts you in control of your data, ensuring data mobility regardless of SQL Server location. A rough sketch of that path follows.
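As a sketch only (the database, storage account, and container names are placeholders, and a credential for the container URL is assumed to already exist on both instances):

-- On Azure SQL Managed Instance: take a copy-only backup to Azure Blob Storage
BACKUP DATABASE [SalesDb]
TO URL = N'https://yourstorageaccount.blob.core.windows.net/backups/SalesDb.bak'
WITH COPY_ONLY;

-- On SQL Server 2022: restore that backup from the same URL
RESTORE DATABASE [SalesDb]
FROM URL = N'https://yourstorageaccount.blob.core.windows.net/backups/SalesDb.bak';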


 


Compatibility between the Azure SQL Managed Instance and SQL Server 2022 engines extends to the database engine settings and the database settings. As with the standalone version of SQL Server, with Azure SQL Managed Instance, you decide what server configuration best serves your business needs. To get started with Azure SQL Managed Instance, you choose the performance tier, service tier, and the reserved compute, along with data and backup storage options, that make the most sense for your applications and your data.  


 


Since Azure SQL Managed Instance is built on the latest SQL Server engine, it’s always up to date with the latest features and functionality, including online operations, automatic plan corrections, and other enterprise performance enhancements. A comparison of the available features is explained in Feature comparison: Azure SQL Managed Instance versus SQL Server. 


 


Azure SQL Managed Instance offers two performance tiers: 



  • General purpose - for applications with typical performance, I/O latency requirements, and built-in High Availability (HA). 

  • Business Critical - for applications requiring low I/O latency and higher HA requirements. It provides two non-readable secondaries and one readable secondary along with the readable/writable primary replica. The readable secondary allows you to distribute reporting workloads off your primary. 


Once you’re running on Azure SQL Managed Instance, changing your service tier (CPU vCores or reserved storage changes) occurs online and incurs little to no downtime. To optimize performance of transaction processing, data ingestion, data load, and transient data, leverage In-Memory OLTP, available in the Business Critical tier.  


 


Migrate from your on-premises SQL Server to Azure SQL Managed Instance with ease. Leverage the fully automated Azure Data Migration Service or set up an Azure SQL Managed Instance link. The link feature uses an expanded version of distributed availability groups to extend your on-prem SQL Server availability group to Azure SQL Managed Instance safely, replicating data in near real-time. With Azure SQL Managed Instance link feature, you can migrate, test, and then perform a simple, controlled fail-over.  


 


While Azure SQL Managed Instance provides nearly one hundred percent compatibility with SQL Server, you will notice some changes in the transition from SQL Server standalone editions. These differences are based on architectural dissimilarities. Certain SQL features (audits, for instance) operate in a fashion that optimizes cloud architecture. Cloud architecture is designed to maximize resource utilization and minimize costs while ensuring high levels of availability, reliability, and security. Azure architecture leverages resource sharing while guaranteeing security and isolation. This resource sharing provides you with a flexible environment that can scale rapidly in response to customer needs.  Because high availability is built into Azure SQL Managed Instance, it cannot be configured or controlled by users as it can be in SQL Server 2022. 


 


Explore the differences between SQL Server 2022 Enterprise Edition and Azure SQL Managed Instance. 


What’s more, Azure SQL Managed Instance is backed by Intel® Xeon® Scalable processors, ensuring that your environment is performant and secure from the silicon layer. With 8 to 40 powerful cores and a wide range of frequency, feature, and power levels, Intel® Xeon® Scalable processors are part of an end-to-end solution for your data.


 


Come delve into Azure SQL Managed Instance. The depth and breadth of SQL engine compatibility provides you with a safe, simple, full-featured, and flexible migration path. Azure SQL Managed Instance puts you in complete control of your data, your databases, your performance, and your business. Microsoft’s continued commitment to improvement means you can take advantage of the benefits of the cloud with Azure SQL Managed Instance and modernize your on-premises SQL Server databases.


 


Dive deeper into the benefits of migrating to Azure SQL Managed Instance. Check out the on-demand recording: Modernize Your Apps with Azure SQL Managed Instance (https://www.mssqltips.com/sql-server-video/932/modernize-your-apps-with-azure-sql-managed-instance/).

Streaming data in real time from Azure Database for MySQL – Flexible Server to Power BI


This article is contributed. See the original author and article here.

 


Modern applications require the capability to retrieve modified data from a database in real time to operate effectively. Usually, developers need to create a customized tracking mechanism in their applications, utilizing triggers, timestamp columns, and supplementary tables, to identify changes in data. Developing such mechanisms typically requires significant effort and can require schema updates that result in considerable performance overhead.


 


Real-time data processing is a crucial aspect of nearly every modern data warehouse project. However, one of the biggest hurdles to overcome in real-time processing solutions is the ability to efficiently and effectively ingest, process, and store messages in real time, particularly when dealing with high volumes of data. To ensure optimal performance, processing must be conducted in a manner that does not interfere with the ingestion pipeline. In addition to non-blocking processing, the data store must be capable of handling high-volume writes. Further challenges include the need to act on the data quickly, for example by generating real-time alerts, or business needs such as dashboards that must be updated in real time or near real time. In many cases, the source systems use traditional relational database engines, such as MySQL, that do not offer event-based interfaces.


 


In this series of blog posts, we will introduce an alternative solution that uses the open-source tool Debezium to perform Change Data Capture (CDC) from Azure Database for MySQL – Flexible Server with Apache Kafka, writes these changes to Azure Event Hubs, performs real-time analytics on the data stream with Azure Stream Analytics, and then writes to Azure Data Lake Storage Gen2 for long-term storage and further analysis using Azure Synapse serverless SQL pools, providing insights through Power BI.


 


Azure Database for MySQL – Flexible Server is a cloud-based solution that provides a fully managed MySQL database service. This service is built on top of Azure’s infrastructure and offers greater flexibility. MySQL uses binary log (binlog) to record all the transactions in the order in which they are committed on the database. This includes changes to table schemas as well as changes to the rows in the tables. MySQL uses binlog mainly for purposes of replication and recovery. 


 


Debezium is a powerful CDC (Change Data Capture) tool that is built on top of Kafka Connect. It is designed to stream the binlog and produce change events for row-level INSERT, UPDATE, and DELETE operations in real time from MySQL into Kafka topics, leveraging the capabilities of Kafka Connect. This allows users to efficiently query only the changes since the last synchronization and upload those changes to the cloud. After this data is stored in Azure Data Lake Storage, it can be processed using Azure Synapse serverless SQL pools. Business users can then monitor, analyse, and visualize the data using Power BI.


  


Solution overview 


This solution entails ingesting MySQL data changes from the binary logs and converting the changed rows into JSON messages, which are subsequently sent to Azure Event Hub. After the messages are received by the Event Hub, an Azure Stream Analytics (ASA) Job distributes the changes into multiple outputs, as shown in the following diagram.
 


architecture mysql.jpg


End-to-end serverless streaming platform with Azure Event Hubs for data ingestion 


 


Components and Services involved


In this blog post, following are the services used for streaming the changes from Azure Database for MySQL to Power BI. 



  • A Microsoft Azure account 

  • An Azure Database for MySQL Flexible server  

  • A Virtual Machine running Linux version 20.04 

  • Kafka release (version 1.1.1, Scala version 2.11), available from kafka.apache.org 

  • Debezium 1.6.2 

  • An Event Hubs namespace 

  • Azure Stream Analytics 

  • Azure Data Lake Storage Gen2 

  • Azure Synapse Serverless SQL pools

  • A Power BI workspace 


 


Dataflow 


The following steps outline the process to set up the components involved in this architecture to stream data in real time from the source Azure Database for MySQL flexible Server.



  1. Provisioning and configuring Azure Database for MYSQL- Flexible Server & a Virtual Machine 

  2. Configure and run Kafka Connect with a Debezium MySQL connector 

  3. Reading CDC Messages Downstream from Azure Event Hub and capture data in an Azure Data Lake Storage Gen2 account in Parquet format 

  4. Create External Table with Azure Synapse Serverless SQL Pool 

  5. Use Serverless SQL pool with Power BI Desktop & create a report. 

  6. Build real-time dashboard with Power BI dataset produced from Stream Analytics 


Each of the above steps is outlined in detail in the upcoming sections.  


 


Prerequisites 



 


Provisioning and configuring Azure Database for MYSQL- Flexible Server & a Virtual Machine 


It is important to create an Azure Database for MySQL Flexible Server instance and a Virtual Machine as outlined below before proceeding to the next step. To do so, perform the following steps: 



  1. Create an instance of Azure Database for MySQL – Flexible Server 

  2. Under the server parameters blade, configure the binlog_expire_logs_seconds parameter as per your requirements (for example, 86400 seconds for 24 hours) to make sure that binlogs are not purged too quickly; a quick verification query follows this list. For more information, see How to Configure server parameters. 

  3. Under the same server parameter blade, also configure and set binlog_row_image parameter to a value of FULL. 

  4. Use a command line client or download and install MySQL Workbench or another third-party MySQL client tool to connect to the Azure Database for MySQL Flexible Server. 

  5. Create an Azure VM in the same resource group running Linux version 20.04. 

  6. Maintain enough disk space on the Azure VM to copy binary logs remotely. 

  7. For this example, the “orders_info” table has been created in Azure Database for MySQL Flexible Server.
     saikondapalli_0-1681405946971.png
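To confirm the server parameters configured in steps 2 and 3, you can run a quick check from your MySQL client (a simple verification query added here for convenience; it is not part of the original walkthrough):

SHOW VARIABLES
WHERE Variable_name IN ('binlog_expire_logs_seconds', 'binlog_row_image');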

     




Configure and run Kafka Connect with a Debezium MySQL connector  


Change Data Capture (CDC) is a technique used to track row-level changes in database tables in response to insert, update, and delete operations. Debezium is a distributed platform that provides a set of Kafka Connect connectors that can convert these changes into event streams and send those events to Apache Kafka.


 


To set up Debezium & Kafka on a Linux Virtual Machine follow the steps outlined in: CDC in Azure Database for MySQL – Flexible Server using Kafka, Debezium, and Azure Event Hubs – Microsoft Community Hub 
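For orientation, a Debezium 1.x MySQL connector registration typically looks something like the sketch below. Every hostname, credential, database, and topic name here is a placeholder, and the Event Hubs-specific Kafka security settings are omitted; the linked guide above remains the authoritative walkthrough.

{
  "name": "mysql-orders-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "<your-server>.mysql.database.azure.com",
    "database.port": "3306",
    "database.user": "<admin-user>",
    "database.password": "<password>",
    "database.server.id": "184054",
    "database.server.name": "orders-server",
    "database.include.list": "<your-database>",
    "table.include.list": "<your-database>.orders_info",
    "database.history.kafka.bootstrap.servers": "<your-namespace>.servicebus.windows.net:9093",
    "database.history.kafka.topic": "dbhistory.orders"
  }
}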
 


Reading CDC Messages Downstream from Event Hub and capture data in an Azure Data Lake Storage Gen2 account in Parquet format


Azure Event Hubs is a fully managed Platform-as-a-Service (PaaS) data streaming and event ingestion platform, capable of processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters. Azure Event Hubs provides an Apache Kafka endpoint on an event hub, which enables users to connect to the event hub using the Kafka protocol.


Configure a job to capture data 


Use the following steps to configure a Stream Analytics job to capture data in Azure Data Lake Storage Gen2. 



  1. In the Azure portal, navigate to your event hub. 

  2. Select the event hub created for the “orders_info” table.

  3. Select Features > Process Data, and then select Start on the Capture data to ADLS Gen2 in Parquet format card. 
     


 


Ganapathivarma_0-1681400256863.png


 


 


3. Enter a name to identify your Stream Analytics job. Select Create. 
 


Ganapathivarma_0-1681401444305.png


Ganapathivarma_3-1681401516363.png


 


4. Specify the Serialization type of your data in the Event Hubs and the Authentication method that the job will use to connect to Event Hubs. Then select Connect. 


Ganapathivarma_3-1681399159550.png


 


 


 


5. When the connection is established successfully, you’ll see: 



  • Fields that are present in the input data. You can choose Add field, or you can select the three dots symbol next to a field to optionally remove or rename it. 



  • A live sample of incoming data in the Data preview table under the diagram view. It refreshes periodically. You can select Pause streaming preview to view a static view of the sample input. 


 


Ganapathivarma_4-1681399159558.png


 


 


 



  1. Select the Azure Data Lake Storage Gen2 tile to edit the configuration. 

  2. On the Azure Data Lake Storage Gen2 configuration page, follow these steps: 

    a. Select the subscription, storage account name and container from the drop-down menu.
     
    b. After the subscription is selected, the authentication method and storage account key should be automatically filled in. 
    c. For streaming blobs, the directory path pattern is expected to be a dynamic value. It’s required for the date to be a part of the file path for the blob – referenced as {date}. To learn about custom path patterns, see to Azure Stream Analytics custom blob output partitioning. 
     
    Ganapathivarma_5-1681399159563.png

     


     
     d. Select Connect 



  9. When the connection is established, you’ll see fields that are present in the output data. 

  10. Select Save on the command bar to save your configuration. 

  11. On the Stream Analytics job page, under the Job Topology heading, select Query to open the Query editor window.  

  12. To test your query with incoming data, select Test query. 

  13. After the events are sampled for the selected time range, they appear in the Input preview tab. 


Ganapathivarma_6-1681399159571.png


 


 



  14. Stop the job before you make any changes to the query. In many cases, your analysis doesn’t need all the columns from the input stream, so you can use a query to project a smaller set of fields than the pass-through query returns. A sample projection query is shown after this list.

  15. When you make changes to your query, select Save query to test the new query logic. This allows you to iteratively modify your query and test it to see how the output changes. 
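For example, a minimal projection query might look like the following sketch. The input and output aliases and the column names are placeholders; substitute the names shown in your own job’s Query editor:

-- Project only the columns needed downstream instead of passing every field through
SELECT
    order_id,
    order_status,
    order_purchase_timestamp
INTO
    [adls-gen2-output]
FROM
    [orders-info-eventhub]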


Ganapathivarma_7-1681399159578.png


 


 



  16. After you verify the results, you’re ready to start the job. 

  17. Select Start on the command bar to start the streaming flow to capture data. Then, in the Start Stream Analytics job window: 



  • Choose the output start time. 

  • Select the number of Streaming Units (SU) that the job runs with. SU represents the computing resources that are allocated to execute a Stream Analytics job. For more information, see Streaming Units in Azure Stream Analytics. 



  • In the Choose Output data error handling list, select the behavior you want when the output of the job fails due to a data error. Select Retry to have the job retry until it writes successfully, or select another option. 


 


Ganapathivarma_8-1681399159581.png


 


 


 



  18. Verify that the Parquet files are generated in the Azure Data Lake Storage Gen2 container. Optionally, you can also confirm the captured data from the Synapse serverless SQL pool, as shown in the query below. 
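As an optional check, you can query the captured Parquet files directly with OPENROWSET from the serverless SQL pool before creating an external table. The storage account, container, and folder names below are placeholders for your own values:

-- Ad hoc query over the captured Parquet files (no external table required yet)
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://<storage-account>.dfs.core.windows.net/<container>/<folder-path>/*.parquet',
    FORMAT = 'PARQUET'
) AS captured_rows;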


Ganapathivarma_9-1681399159587.png


 


 


Create External Table with Azure Synapse Serverless SQL Pool 


  



  1. Navigate to your Azure Synapse Analytics workspace. Select Data -> Linked and navigate to the ADLS Gen2 folder path. 

  2. Select the file that you would like to create the external table from, right-click it, and then select New SQL script -> Create external table.  


 


Ganapathivarma_10-1681399159593.png


 


 


 


3. In the New external table dialog, change Max string length to 250 and continue. 
 
Ganapathivarma_11-1681399159597.png


 


 
 
 


4. A dialog window opens. Select or create a new database, provide a database table name, and then select Open script. 


Ganapathivarma_12-1681399159600.png


 


 


5. A new SQL script opens. Run the script against the database to create the external table. A sketch of what this generated script typically looks like appears after these steps. 


6. The location is a pointer to a folder rather than to a specific file; you can only point to a folder, not to individual files. 


7. Point to the enriched folder in Data Lake Storage. 


Ganapathivarma_13-1681399159604.png


 


8. Save all your work by clicking Publish all. 


9. Verify that the external table was created under Data -> Workspace -> SQL database. 
 
Ganapathivarma_14-1681399159610.png


 


 


External tables encapsulate access to files, making the querying experience almost identical to querying local relational data stored in user tables. Once the external table is created, you can query it just like any other table:


 


 


 


 


 

SELECT TOP 100 * FROM dbo.orders_info 
GO 

SELECT COUNT(*) FROM dbo.orders_info 
GO 

 


 


 


 


 


 


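For reference, the script generated in step 5 typically resembles the following sketch. The data source location, credential handling, and column definitions depend on your storage account and on the schema inferred from the Parquet files, so treat the names, types, and paths below as placeholders rather than the exact generated script:

-- Reusable Parquet file format (created once per database)
IF NOT EXISTS (SELECT * FROM sys.external_file_formats WHERE name = 'SynapseParquetFormat')
    CREATE EXTERNAL FILE FORMAT SynapseParquetFormat
    WITH (FORMAT_TYPE = PARQUET);

-- External data source pointing at the Data Lake Storage Gen2 container
IF NOT EXISTS (SELECT * FROM sys.external_data_sources WHERE name = 'orders_datalake')
    CREATE EXTERNAL DATA SOURCE orders_datalake
    WITH (LOCATION = 'https://<storage-account>.dfs.core.windows.net/<container>');

-- External table over the folder that holds the captured Parquet files
CREATE EXTERNAL TABLE dbo.orders_info (
    order_id                 INT,
    customer_id              INT,
    order_status             VARCHAR(250),
    order_purchase_timestamp VARCHAR(250)
)
WITH (
    LOCATION = '<folder-path>/**',
    DATA_SOURCE = orders_datalake,
    FILE_FORMAT = SynapseParquetFormat
);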


 


Use serverless SQL pool with Power BI Desktop & create a report
 



  1. Navigate to your Azure Synapse Analytics workspace. In Synapse Studio, click Manage. 


 


Ganapathivarma_15-1681399159613.png


 


 


 2. Under External connections, click Linked services, and then click + New. Select Power BI and click Continue. 


 


Ganapathivarma_16-1681399159616.png


 


 


 


3. Enter a name for the linked service in the Name field and select the existing Power BI workspace that you want to publish to. The Power BI linked connection then appears under that name. 


4. Click Create. 


Ganapathivarma_17-1681399159619.png


 


 


 


5. View Power BI workspace in Synapse Studio 



  • After your workspaces are linked, you can browse your Power BI datasets and edit or create Power BI reports from Synapse Studio. 

  • Navigate to the Develop hub. The Power BI linked service you created appears here. 



  • Expand Power BI and the workspace you wish to use. 


Ganapathivarma_0-1681405185559.png


 


6. New reports can be created by clicking + at the top of the Develop tab, and existing reports can be edited by clicking the report name. Any saved changes are written back to the Power BI workspace. The report can be built on a simple query against the external table, such as the hypothetical query shown below. 


Ganapathivarma_19-1681399159625.png
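As a starting point for the report, a dataset query like the following hypothetical aggregation works against the serverless SQL pool database (the column names are assumptions carried over from the earlier sketches):

-- Hypothetical report query: order counts by status from the external table
SELECT
    order_status,
    COUNT(*) AS order_count
FROM dbo.orders_info
GROUP BY order_status;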


 


 


 


Summary 


Overall, Debezium, Kafka Connect, Azure Event Hubs, Azure Data Lake Storage, Azure Stream Analytics, Synapse SQL Serverless, and Power BI work together to create a comprehensive, end-to-end data integration, analysis, and visualization solution that can handle real-time data streams from databases, store them in a scalable and cost-effective manner, and provide insights through a powerful BI tool.  


To learn more about the services used in this post, check out the following resources: 



 


 


 

Microsoft Teams innovations for manufacturing at Hannover Messe 2023

Microsoft Teams innovations for manufacturing at Hannover Messe 2023

This article is contributed. See the original author and article here.

Join Microsoft at Hannover Messe 2023 to learn how the latest enhancements to Microsoft Teams help frontline workers streamline communication and increase productivity.

The post Microsoft Teams innovations for manufacturing at Hannover Messe 2023 appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Insert templates faster with email template views

Insert templates faster with email template views

This article is contributed. See the original author and article here.

If you are a customer service agent, you know how important it is to respond to your customers quickly and effectively. You also know how frustrating it can be to waste time searching for the right email template to use for each situation. That is why you need to use email template views while selecting an email template. 

We have upgraded enhanced email template selection to help agents efficiently use email templates. With integrated record selection and email template views, finding the right email template is easier than before. 

Image of email template search bar with an email template in the right panel

With email template views, just create a view with your choice of filters and then have it available when selecting the templates. Creating a view with your favorite or most used templates will help save a lot of time when working with emails. 

In Customer Service workspace and Customer Service Hub, the enhanced template dialog is enabled by default. Administrators can disable the enhanced email template selection option if they want to display the default email selection dialog. When using enhanced email template selection, admins can determine whether to show the record selection within the email template selection window. This saves the agent extra clicks when selecting a template. Regarding is selected by default and can be easily changed by navigating to the record tab. The list of templates will automatically get refreshed when Record is changed.

In short, when agents use email template views, they can use preconfigured views to find the right templates quickly, switch between views of templates that have persisting filters, and save time with the integrated record selection in the template selection dialog box.

Learn more

Watch a quick video introduction.

To find out more about email template views, read the documentation:

The post Insert templates faster with email template views appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Egypt 2023 time zone update now available

This article is contributed. See the original author and article here.

As a result of the March 1st order from the Government of Egypt, Daylight Saving Time (DST) in the Arab Republic of Egypt will resume from the last Friday of April. Hence, clocks will be set forward by an hour at 12:00 AM on April 28, 2023. 


 


The April 2023 Monthly quality update for Windows includes the following time zone update for Egypt: 



  1. Clocks will be set forward by an hour at 12:00 a.m. on April 28, 2023, for the Egypt time zone. 


Here are the KB article numbers that contain the Egypt DST fix for different versions of Windows:

  • Windows 11, version 21H2: KB5025224 
  • Windows 11, version 22H2: KB5023778 
  • Windows Server 2022: KB5025230 
  • Windows Server 2012: KB5025287 
  • Windows Server 2008 SP2: KB5025271 
  • Windows 10 Enterprise LTSC 2015: KB5025234 
  • Windows 10 Enterprise LTSC 2016: KB5025228 
  • Windows 10 Enterprise LTSC 2019: KB5025229 
  • Windows 8.1: KB5025285 
  • Windows 7.0 SP1: KB5025279 


For Microsoft’s official policy on DST and time zone changes, please see Daylight saving time help and support. For information on how to update Windows to use the latest global time zone rules, see How to configure daylight saving time for Microsoft Windows operating systems.