Explore Azure AI Services: Prebuilt Models and Demos
Azure AI services provide a comprehensive suite of prebuilt models and demos designed to address a wide range of use cases. These models are readily accessible and allow you to implement AI-powered solutions seamlessly. We have curated and catalogued prebuilt demos available across Azure AI services. We hope this helps you infuse AI seamlessly into your products and services.
Speech Recognition
Speech to Text Scenarios
| Scenario | Description |
| --- | --- |
| Real-time speech to text | Quickly test your audio on a speech recognition endpoint without writing any code. |
| Captioning | Use our sample application to learn how to use Azure Speech to automatically caption your content in real time and offline by transcribing the audio of films, videos, live events, and more. Display the resulting text on a screen to provide an accessible experience. In this example, we leverage features like speech to text and phrase list. |
| Call center transcription and analytics | Batch transcribe call center recordings and extract valuable information such as personally identifiable information (PII), sentiment, and call summaries. This demonstrates how to use the Speech and Language services to analyze call center conversations. |
| Video translation | Seamlessly translate and generate videos in multiple languages automatically. With its powerful capabilities, you can efficiently localize your video content to cater to diverse audiences around the globe. |
Document Intelligence Scenarios

| Scenario | Description |
| --- | --- |
| Invoices | Extract invoice details including customer and vendor details, totals, and line items. |
| Receipts | Extract transaction details from receipts including date, merchant information, and totals. |
| Identity Documents | Extract details from passports and ID cards. |
| US Health Insurance Cards | Extract details from US health insurance cards. |
| US personal tax | Classify, then extract information from documents containing any number of W-2s, 1040s, 1098s, and 1099s. |
| US mortgage | Extract information from a variety of mortgage documents. |
| US pay stubs | Extract employee information and payment information including earnings, deductions, net pay, and more. |
| US bank statements | Extract details from US bank statements. |
| US checks | Extract amount, date, pay-to-order information, MICR numbers, name and address of the payer, and more. |
| Marriage Certificates | Extract details from marriage certificates. |
| Credit Cards | Extract details from credit cards including card number and cardholder name. |
| Contracts | Extract title and signatory parties’ information from contracts. |
| Business Cards | Extract contact details from business cards. |
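Each of these scenarios maps to a prebuilt Document Intelligence model that can also be called programmatically. Below is a minimal PowerShell sketch of calling the prebuilt invoice model over REST; the resource endpoint, key, and sample document URL are placeholders, and the API version shown is one of the v3 GA versions, so verify it against your resource before relying on it:

```powershell
# Placeholders for your Azure AI Document Intelligence resource.
$endpoint = "https://<your-resource>.cognitiveservices.azure.com"
$key      = "<your-key>"

# Submit a publicly reachable invoice URL to the prebuilt-invoice model.
$resp = Invoke-WebRequest -Method Post `
    -Uri "$endpoint/formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2023-07-31" `
    -Headers @{ "Ocp-Apim-Subscription-Key" = $key } `
    -ContentType "application/json" `
    -Body (@{ urlSource = "https://example.com/sample-invoice.pdf" } | ConvertTo-Json)

# The service responds 202 Accepted; poll the Operation-Location URL for results.
$opUrl = $resp.Headers["Operation-Location"] | Select-Object -First 1
do {
    Start-Sleep -Seconds 2
    $result = Invoke-RestMethod -Uri $opUrl -Headers @{ "Ocp-Apim-Subscription-Key" = $key }
} while ($result.status -in @("notStarted", "running"))

# Extracted fields such as VendorName and InvoiceTotal come back as typed values.
$result.analyzeResult.documents | ForEach-Object {
    "{0}: {1}" -f $_.fields.VendorName.content, $_.fields.InvoiceTotal.content
}
```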
Gen-AI Safety Solutions
Safeguard your image content
| Scenario | Description |
| --- | --- |
| Moderate image content | A tool for evaluating different content moderation scenarios, taking into account factors such as the type of content, the platform’s policies, and the potential impact on users. Run moderation tests on sample content, use Configure filters to rerun and further fine-tune the test results, and add specific terms to the blocklist that you want to detect and act on. |
| Monitor online activity | Displays your API usage, moderation results, and their distributions per category. You can customize the severity threshold for each category to view the updated results and deploy the new threshold to your endpoint. You can also edit the blocklist on this page to respond to any incidents. |
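For the moderation scenarios above, the same capability is exposed as a REST API. Here is a hedged PowerShell sketch of an image-moderation call; the endpoint, key, and file path are placeholders, and the API version shown is the Content Safety GA version at the time of writing:

```powershell
# Placeholders for your Azure AI Content Safety resource.
$endpoint = "https://<your-resource>.cognitiveservices.azure.com"
$key      = "<your-key>"

# Base64-encode a local image and submit it for moderation.
$imageB64 = [Convert]::ToBase64String([IO.File]::ReadAllBytes("C:\samples\photo.jpg"))
$body     = @{ image = @{ content = $imageB64 } } | ConvertTo-Json

$result = Invoke-RestMethod -Method Post `
    -Uri "$endpoint/contentsafety/image:analyze?api-version=2023-10-01" `
    -Headers @{ "Ocp-Apim-Subscription-Key" = $key } `
    -ContentType "application/json" `
    -Body $body

# Each harm category (Hate, SelfHarm, Sexual, Violence) returns a severity score
# you can compare against your own thresholds or blocklist rules.
$result.categoriesAnalysis | Format-Table category, severity
```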
Hey! Rob Greene again. Been on a roll with all things crypto as of late, and you are not going to be disappointed with this one either!
Background
Many know that Remote Desktop Services uses a self-signed certificate by default for its TLS connection from the RDS client to the RDS server over TCP port 3389. However, Remote Desktop Services can be configured to enroll for a certificate against an Enterprise CA instead of continuing to use those annoying self-signed certificates everywhere.
I know there are other blogs out there that cover setting up the certificate template, and the group policy, but what if I told you most of the blogs that I have seen on this setup are incomplete, inaccurate, and do not explain what is happening with the enrollment and subsequent renewals of the RDS certificate!? I know… Shocker!!!
If you are a pretty regular consumer of the AskDS blog content, you know how we love to recommend using one certificate on the server for a specific Enhanced Key Usage (EKU), and making sure that the certificate carries all the information required so that it works with all applications that need to use it.
This certificate is no different. I would recommend that the certificate used ONLY has the EKU for Remote Desktop Authentication and DOES NOT have an EKU of Server Authentication at all. The reason is that this certificate should not be controlled or maintained via autoenrollment/renewal behaviors. It needs to be maintained by the Remote Desktop Configuration service, and you do not want certificates used by other applications to be replaced by a service like this, as that will cause issues in the long run.
There is a group policy setting that can be enabled to configure Remote Desktop Services to enroll for the specified certificate; it also gives the NT AUTHORITY\NetworkService account permission to the certificate's private key, which is a requirement for this to work.
The interesting thing is that you would think the Remote Desktop Services service would be responsible for enrolling for this certificate; however, it is the Remote Desktop Configuration (SessionEnv) service that is responsible for initial certificate requests as well as certificate renewals.
It is common to see the RDS Authentication Certificate template configured for autoenrollment; however, this is one of the worst things you can do, and it WILL cause issues with Remote Desktop Services once the certificate renewal timeframe arrives. Autoenrollment will archive the existing certificate, causing RDS to no longer be able to find it; then, when you require TLS on the RDS listener, users will fail to connect to the server. At some point, the Remote Desktop Configuration service will replace the newly issued certificate with a new one, because it maintains the thumbprint of the certificate that RDS should be using within WMI: when it tries to locate the original thumbprint and cannot find it, it will enroll for a new certificate at the next service start. This is generally when the cases start rolling in to the Windows Directory Services team, because it appears to be a certificate issue even though it is a Remote Desktop Services configuration issue.
What we want to do is first make sure that all the steps are taken to properly configure the environment so that the Remote Desktop Configuration service is able to properly issue certificates.
The Steps
Like everything in IT, there is a list of steps that need to be completed to get this set up properly:
1. Configure the certificate template and add it to a Certification Authority to issue the template.
2. Configure the Group Policy setting.
Configuring the Certificate Template
The first step in the process is to create and configure the certificate template that we want to use:
1. Log on to a computer that has the Active Directory Certificate Services Tools from Remote Server Administration Tools (RSAT) installed, or log on to a Certification Authority in the environment.
2. Launch CertTmpl.msc (the Certificate Templates MMC).
3. Find the template named Computer, right-click it, and select Duplicate Template.
4. On the Compatibility tab, select up to Windows Server 2012 R2 for both Certification Authority and Certificate recipient. Going above this might cause issues with CEP/CES environments.
5. On the General tab, we need to give the template a name and validity period.
a. Type a good descriptive name in the Template display name field.
b. If you would like to change the Validity period, you can do that as well.
c. You should NOT check the box Publish certificate in Active Directory.
NOTE: Make sure to copy the value in the Template name field, as this is the name that you will need to type in the group policy setting. Normally it will be the display name without any spaces in the name, but do not rely on this. Use the value you see during template creation or when looking back at the template later.
6. On the Extensions tab, the Enhanced Key Usage / Application Policies need to be modified.
a. Select Application Policies, and then click on the Edit button.
b. Select Client Authentication and Server Authentication (individually or via multi-select) and click the Remove button.
c. Click the Add button, and then click on the New button if you need to create the Application Policy for Remote Desktop Authentication. Otherwise find the Remote Desktop Authentication policy in the list and click the OK button.
d. If you need to create the Remote Desktop Authentication application policy, click the Add button, and then for the Name type in Remote Desktop Authentication, and type in 1.3.6.1.4.1.311.54.1.2 for the Object identifier value, and click the OK button.
e. Verify the newly created Remote Desktop Authentication application policy, and then click the OK button twice.
7. The Remote Desktop service can use a Key Storage Provider (KSP), so if you would like to change over from a legacy Cryptographic Service Provider (CSP) to a KSP, this can be done on the Cryptography tab.
8. Get the permissions set properly. To do this, click on the Security tab.
a. Click the Add button and add any specific computer or computer groups you want to enroll for a certificate.
b. Then make sure to ONLY select the Allow Enroll permission. DO NOT select Autoenroll.
NOTE: Please keep in mind that Domain Controllers DO NOT belong to the Domain Computers group, so if you want all workstations, member servers, and Domain Controllers to enroll for this certificate, you will need the Domain Computers and Enterprise Domain Controllers or Domain Controllers groups added with the security permission of Allow – Enroll.
9. When done making other changes to the template as needed, click the OK button to save the template.
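Step 1 of the list above also calls for adding the new template to a Certification Authority so it can be issued. You can do this from the Certification Authority console (Certificate Templates > New > Certificate Template to Issue), or from an elevated prompt on the CA. In the sketch below, the template short name RDSAuthCert is a hypothetical example; substitute the Template name value you copied earlier:

```powershell
# Publish the duplicated template on this CA so clients can enroll against it.
certutil -SetCATemplates +RDSAuthCert
```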
Configure the Group Policy
After working through getting the certificate template created and configured to your liking, the next step in the process is to set up the Group Policy Object properly. The group policy setting that needs to be configured is located at: Computer Configuration\Policies\Administrative Templates\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Security
The policy to configure is “Server authentication certificate template”. When adding the template name to this group policy, it will accept one of two things:
The certificate template name; again, this is NOT the certificate template display name.
The certificate template's Object Identifier value. Using this is not common; however, some engineers recommend it over the template name.
If you use the certificate template display name, the Remote Desktop Configuration service (SessionEnv) will still successfully enroll for the certificate; however, the next time the policy applies, it will enroll for a new certificate again. This causes repeated enrollments and can make a CA very busy.
Troubleshooting certificate issuance issues
Troubleshooting problems with certificate issuance is usually easy once you have a good understanding of how Remote Desktop Services goes about doing the enrollment, and there are only a few things to check out.
Investigating what certificate Remote Desktop Services is configured to use
The first thing to investigate is what certificate, if any, Remote Desktop Services is currently configured to use. This is done by running a WMI query, via PowerShell or good ol' WMIC. (Note: WMIC is deprecated and will be removed at a future date.)
WMIC: wmic /namespace:\\root\cimv2\TerminalServices PATH Win32_TSGeneralSetting Get SSLCertificateSHA1Hash
We are interested in the SSLCertificateSHA1Hash value that is returned. This will tell us the thumbprint of the certificate it is attempting to load.
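Since the same query can also be run from PowerShell, here is a minimal equivalent using the CIM cmdlets (Get-WmiObject works as well, but it is deprecated in newer PowerShell versions):

```powershell
# Query the same WMI class for the thumbprint RDS is configured to use.
Get-CimInstance -Namespace "root\cimv2\TerminalServices" `
    -ClassName Win32_TSGeneralSetting -Filter "TerminalName='rdp-tcp'" |
    Select-Object TerminalName, SSLCertificateSHA1Hash
```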
Keep in mind that if the Remote Desktop Service is still using the self-signed certificate, it can be found as follows:
1. Launch the local computer certificate store (CertLM.msc).
2. Once the Computer store is opened, look for the store named: Certificates – Local Computer\Remote Desktop\Certificates.
3. Double-click the certificate, click on the Details tab, and find the field named Thumbprint.
4. Validate whether this value matches the value of SSLCertificateSHA1Hash from the output.
If there is no certificate in the Remote Desktop store, or if the SSLCertificateSHA1Hash value does not match any of the certificates' Thumbprint fields in that store, then it would be best to visit the Certificates – Local Computer\Personal\Certificates store next. Look for a certificate whose Thumbprint field matches the SSLCertificateSHA1Hash value.
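Rather than eyeballing thumbprints store by store, a small PowerShell sketch can do the comparison for you; this is just a convenience built on the same WMI class and certificate stores described above:

```powershell
# Read the thumbprint RDS is configured to use, then search the Remote Desktop
# store and the computer's Personal store for a matching certificate.
$hash = (Get-CimInstance -Namespace "root\cimv2\TerminalServices" `
    -ClassName Win32_TSGeneralSetting -Filter "TerminalName='rdp-tcp'").SSLCertificateSHA1Hash

Get-ChildItem "Cert:\LocalMachine\Remote Desktop", "Cert:\LocalMachine\My" |
    Where-Object { $_.Thumbprint -eq $hash } |
    Select-Object PSParentPath, Subject, Thumbprint, NotAfter
```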
Does the Remote Desktop Service have permission to the certificate's private key?
Once the certificate has been tracked down, we must figure out if the certificate has a private key and, if so, whether the account running the service has permission to it.
If you are using Group Policy to deploy the certificate template information and the computer has permission to enroll for the certificate, then the private key permissions should, in theory, be configured properly, with NT AUTHORITY\NetworkService granted Allow – Read permissions on the private key.
If you are having this problem, then more than likely the environment is NOT configured to deploy the certificate template via the group policy setting, and it is just relying on computer certificate autoenrollment and a certificate that is valid for Server Authentication. Relying on certificate autoenrollment will not configure the correct permissions for the private key, nor add the Network Service account permissions.
To check this, follow these steps:
1. Launch the local computer certificate store (CertLM.msc).
2. Once the Computer store is opened, look for the store named: Certificates – Local Computer\Personal\Certificates.
3. Right-click on the certificate that you are interested in, then select All Tasks, and click on Manage Private Keys.
4. Verify that the Network Service account has Allow – Read permissions. If not, add it:
a. Click the Add button.
b. In the Select Users or Groups dialog, click the Locations button, and select the local computer in the list.
c. Type in the name “Network Service”.
d. Then click the Check Names button, and then click the OK button.
5. If the certificate does not appear to have a private key associated with it in the Local Computer certificate store snap-in, then you may want to run the following CertUtil command to see if you can repair the association: CertUtil -RepairStore My [CertThumbprint | *].
How to change the certificate that Remote Desktop Services is using
If you have determined that Remote Desktop Services is using the wrong certificate, there are a couple of things that we can do to resolve this.
1. We can delete the certificate from the Computer Personal store and then cycle the Remote Desktop Configuration (SessionEnv) service. This would cause immediate enrollment of a certificate using the certificate template defined in the group policy.
PowerShell:
$RDPSettings = Get-WmiObject -Class "Win32_TSGeneralSetting" -Namespace "Root\cimv2\TerminalServices" -Filter "TerminalName='rdp-tcp'"
CertUtil -DelStore My $RDPSettings.SSLCertificateSHA1Hash
Net Stop SessionEnv
Net Start SessionEnv
2. We could update the thumbprint value in WMI to reference another certificate's thumbprint.
WMIC: wmic /namespace:\\root\cimv2\TerminalServices PATH Win32_TSGeneralSetting Set SSLCertificateSHA1Hash="CERTIFICATETHUMBPRINT"
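If you prefer PowerShell here as well, a minimal sketch of the same thumbprint update follows; CERTIFICATETHUMBPRINT is a placeholder, and cycling the service afterward works the same as in option 1:

```powershell
# Point RDS at a different certificate by writing its thumbprint into WMI.
$ts = Get-WmiObject -Class "Win32_TSGeneralSetting" `
    -Namespace "root\cimv2\TerminalServices" -Filter "TerminalName='rdp-tcp'"
Set-WmiInstance -InputObject $ts -Arguments @{ SSLCertificateSHA1Hash = "CERTIFICATETHUMBPRINT" }
```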
Conclusion
1. Deploying certificates for Remote Desktop Services is best done via the Group Policy setting; do NOT set up the certificate template for autoenrollment. Setting the template up for autoenrollment will cause certificate issuance problems within the environment from multiple angles.
Unless you modify the certificate template's default Key Permissions setting, found on the Request Handling tab, the account running the Remote Desktop Service will not have permission to the private key if the certificate is acquired via autoenrollment. This is not something that we would recommend.
This creates a scenario where, even if the SSLCertificateSHA1Hash value is correct, the service will not be able to use the certificate because it does not have permission to use the private key. And even if you do have the template configured for custom private key permissions, you could still have issues with the WMI SSLCertificateSHA1Hash value not being correct.
2. Configure the group policy setting properly, as well as the certificate template. It is best to manage this configuration via group policy so you can ensure a consistent experience for all RDS connections.
I know that a lot of you might have deeper questions about how the Remote Desktop Configuration service does this enrollment process; however, please keep in mind that Remote Desktop Services is really owned by the Windows User Experience team in CSS, so we Windows Directory Services engineers may not have that deeper-level knowledge. We just get called in when the certificates do not work or fail to get issued. This is how we tend to know so much about the most common misconfigurations for this solution.
Rob “Why are RDS Certificates so complicated” Greene
Mv3 High Memory General Availability
Executing on our plan to have our third version of M-series (Mv3) powered by 4th generation Intel® Xeon® processors (Sapphire Rapids) across the board, we’re excited to announce that Mv3 High Memory (HM) virtual machines (VMs) are now generally available. These next-generation M-series High Memory VMs give customers faster insights, more uptime, lower total cost of ownership, and improved price-performance for their most demanding workloads. Mv3 HM VMs are supported for RISE with SAP customers as well. With the release of this Mv3 sub-family and the sub-family that offers around 32TB of memory, Microsoft is the only public cloud provider offering HANA-certified VMs from around 1TB to around 32TB of memory, all powered by 4th generation Intel® Xeon® processors (Sapphire Rapids).
Key features on the new Mv3 HM VMs
The Mv3 HM VMs can scale for workloads from 6TB to 16TB.
Mv3 delivers up to 40% more throughput than our Mv2 High Memory (HM) VMs, enabling significantly faster SAP HANA data load times for SAP OLAP workloads and significantly higher performance per core for SAP OLTP workloads over the previous generation Mv2.
Powered by Azure Boost, Mv3 HM provides up to 2x more throughput to Azure premium SSD storage and up to 25% improvement in network throughput over Mv2, with more deterministic performance.
Designed from the ground up for increased resilience against failures in memory, disks, and networking based on intelligence from past generations.
Available in both disk and diskless offerings allowing customers the flexibility to choose the option that best meets their workload needs.
During our private preview, several customers such as SwissRe unlocked gains from the new VM sizes. In their own words:
“Mv3 High Memory VM results are promising – in average we see a 30% increase in the performance without any big adjustment.”
SwissRe
Msv3 High Memory series (NVMe)
| Size | vCPU | Memory in GiB | Max data disks | Max uncached Premium SSD throughput: IOPS/MBps | Max uncached Ultra Disk and Premium SSD V2 disk throughput: IOPS/MBps | Max NICs | Max network bandwidth (Mbps) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Standard_M416s_6_v3 | 416 | 5,696 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M416s_8_v3 | 416 | 7,600 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M624s_12_v3 | 624 | 11,400 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832s_12_v3 | 832 | 11,400 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832s_16_v3 | 832 | 15,200 | 64 | 130,000/8,000 | 260,000/8,000 | 8 | 40,000 |
Msv3 High Memory series (SCSI)
| Size | vCPU | Memory in GiB | Max data disks | Max uncached Premium SSD throughput: IOPS/MBps | Max uncached Ultra Disk and Premium SSD V2 disk throughput: IOPS/MBps | Max NICs | Max network bandwidth (Mbps) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Standard_M416s_6_v3 | 416 | 5,696 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M416s_8_v3 | 416 | 7,600 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M624s_12_v3 | 624 | 11,400 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832s_12_v3 | 832 | 11,400 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832s_16_v3 | 832 | 15,200 | 64 | 130,000/8,000 | 130,000/8,000 | 8 | 40,000 |
Mdsv3 High Memory series (NVMe)
| Size | vCPU | Memory in GiB | Temp storage (SSD) GiB | Max data disks | Max cached* and temp storage throughput: IOPS/MBps | Max uncached Premium SSD throughput: IOPS/MBps | Max uncached Ultra Disk and Premium SSD V2 disk throughput: IOPS/MBps | Max NICs | Max network bandwidth (Mbps) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Standard_M416ds_6_v3 | 416 | 5,696 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M416ds_8_v3 | 416 | 7,600 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M624ds_12_v3 | 624 | 11,400 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832ds_12_v3 | 832 | 11,400 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832ds_16_v3 | 832 | 15,200 | 400 | 64 | 250,000/1,600 | 130,000/8,000 | 260,000/8,000 | 8 | 40,000 |
Mdsv3 High Memory series (SCSI)
| Size | vCPU | Memory in GiB | Temp storage (SSD) GiB | Max data disks | Max cached* and temp storage throughput: IOPS/MBps | Max uncached Premium SSD throughput: IOPS/MBps | Max uncached Ultra Disk and Premium SSD V2 disk throughput: IOPS/MBps | Max NICs | Max network bandwidth (Mbps) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Standard_M416ds_6_v3 | 416 | 5,696 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M416ds_8_v3 | 416 | 7,600 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M624ds_12_v3 | 624 | 11,400 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832ds_12_v3 | 832 | 11,400 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832ds_16_v3 | 832 | 15,200 | 400 | 64 | 250,000/1,600 | 130,000/8,000 | 130,000/8,000 | 8 | 40,000 |
*Read IOPS are optimized for sequential reads.
Regional Availability and Pricing
The VMs are now available in West Europe, North Europe, East US, and West US 2. For pricing details, please take a look here for Windows and Linux.
We are thrilled to unveil the latest and largest additions to our Mv3-Series, Standard_M896ixds_32_v3 and Standard_M1792ixds_32_v3 VM SKUs. These new VM SKUs are the result of a close collaboration between Microsoft, SAP, experienced hardware partners, and our valued customers.
Key features on the new Mv3 VHM VMs
Unmatched Memory Capacity: With close to 32TB of memory, both the Standard_M896ixds_32_v3 and Standard_M1792ixds_32_v3 VMs are ideal for supporting very large in-memory databases and workloads.
High CPU Power: Featuring 896 cores in the Standard_M896ixds_32_v3 VM and 1,792 vCPUs in the Standard_M1792ixds_32_v3 VM, these VMs are designed to handle high-end S/4HANA workloads, providing more CPU power than other public cloud offerings.
Enhanced Network and Storage Bandwidth: Both VM types provide the highest network and storage bandwidth available in Azure for a full node VM, including up to 200-Gbps network bandwidth with Azure Boost.
Optimal Performance for SAP HANA: Certified for SAP HANA, these VMs adhere to the SAP prescribed socket-to-memory ratio, ensuring optimal performance for in-memory analytics and relational database servers.
Microsoft Fabric seamlessly integrates with generative AI to enhance data-driven decision-making across your organization. It unifies data management and analysis, allowing for real-time insights and actions.
With Real Time Intelligence, keeping grounding data for large language models (LLMs) up-to-date is simplified. This ensures that generative AI responses are based on the most current information, enhancing the relevance and accuracy of outputs. Microsoft Fabric also infuses generative AI experiences throughout its platform, with tools like Copilot in Fabric and Azure AI Studio enabling easy connection of unified data to sophisticated AI models.
Check out GenAI experiences with Microsoft Fabric.
00:00 — Unify data with Microsoft Fabric
00:35 — Unified data storage & real-time analysis
01:08 — Security with Microsoft Purview
01:25 — Real-Time Intelligence
02:05 — Integration with Azure AI Studio
This is part of Microsoft’s official video series for IT, where you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.
-If you want to bring custom gen AI experiences to your app so that users can interact with them using natural language, the better the quality and recency of the data used to ground responses, the more relevant and accurate the generated outcome will be.
-The challenge, of course, is that your data may be sitting across multiple clouds, in your own data center and also on the edge. Here’s where the complete analytics platform Microsoft Fabric helps you to unify data wherever it lives at unlimited scale, without you having to move it.
-It incorporates a logical multi-cloud data lake, OneLake, for unified data storage and access and separately provides a real-time hub optimized for event-based streaming data, where change data capture feeds can be streamed from multiple cloud sources for analysis in real time without the need to pull your data. Then with your data unified, data professionals can work together in a collaborative workspace to ingest and transform it, analyze it, and also endorse it as they build quality data sets.
-And when used with Microsoft Purview, this can be achieved with an additional layer of security, where you can classify and protect your schematized data, with protections flowing as everyone from your engineers and data analysts to your business users works with data in the Fabric workspace. Keeping grounding data for your LLMs up to date is also made easier by being able to act on it with Real Time Intelligence.
-For example, you might have a product recommendation engine on an e-commerce site and using Real Time Intelligence, you can create granular conditions to listen for changes in your data, like new stock coming in, and update data pipelines feeding the grounding data for your large language models.
-So now, whereas before the gen AI may not have had the latest inventory data available to it to ground responses, with Real Time Intelligence, generated responses can benefit from the most real-time, up-to-date information so you don’t lose out on sales. And as you work with your data, gen AI experiences are infused throughout Fabric. In fact, Copilot in Fabric experiences are available for all Microsoft Fabric workloads to assist you as you work.
-And once your data set is complete, connecting it from Microsoft Fabric to ground large language models in your gen AI apps is made easy with Azure AI Studio, where you can bring in data from OneLake seamlessly and choose from some of the most sophisticated large language models hosted in Azure to build custom AI experiences on your data, all of which is only made possible when you unify your data and act on it with Microsoft Fabric.
Discover how Copilot for Dynamics 365 Commerce can help you deliver personalized customer experiences, optimize product management, and streamline retail operations for store associates, managers, and back-office staff with AI.
Transformative AI in Dynamics 365 Commerce
The retail industry is facing significant challenges and opportunities in today’s digital world. AI can help you create value for your customers and stand out from your competitors. It can help you solve critical challenges such as improving customer service, refining product management, increasing store associate productivity, and simplifying finances.
Dynamics 365 Commerce now includes Copilot, which helps you improve customer satisfaction, boost sales, increase profit margins, and enhance workforce productivity with automated insights, validations, and summaries that reduce the clicks and searches needed to find information, creating a near “one-click” retail experience.
Watch this brief video to see Copilot in action.
Video of Copilot in Dynamics 365 Commerce announcement on YouTube.
Customer insights
Copilot simplifies the process of understanding your customers. It aggregates data on customer preferences from Dynamics 365 Commerce, giving you an in-depth and comprehensive view of your customers, such as their favorite product categories, preferred price ranges, and lifetime value based on recency, frequency, and monetary metrics. You can view their previous interactions with a quick glance, making it easier to resume conversations or give customized follow-ups for better customer relationships and more personalized and effective engagement. Learn more about Copilot customer insights.
Copilot customer insights in Dynamics 365 Commerce.
Product insights
Whether you’re introducing new products or welcoming new store employees, keeping everyone informed and prepared is essential. Copilot provides comprehensive product insights, including clear and concise descriptions, key benefits, inventory levels, and discount details, empowering your staff to elevate product sales. Additionally, employees can access information on related items like accessories and bundles, promoting a cross-selling environment that enhances the shopping experience and boosts sales. By equipping store employees with the knowledge and confidence to engage customers effectively, Copilot turns interactions into opportunities, increasing customer satisfaction, sales, and the all-important average order value.
Copilot product insights in Dynamics 365 Commerce.
Report insights
Envision a scenario where synthesizing insights to assess the performance of your retail channels becomes seamless. With Copilot, it’s easy. Copilot provides instant insights, generating narrative summaries for channel reports. You’ll receive a precise and succinct overview of critical metrics such as sales, revenue, profit, margin, and overall store performance—right at your fingertips. Copilot’s real-time analysis keeps you ahead, updating summaries as new data arrives, empowering your store associates to communicate results effectively, accurately, and instantly. Embrace the future of retail intelligence with Copilot and revolutionize the way you interact with your data. Learn more about Copilot report insights in Dynamics 365 Commerce.
Copilot report insights in Dynamics 365 Commerce.
Retail statement insights
Copilot can summarize posted and unposted retail statements, highlighting key insights such as the number of affected transactions, total sales amount, and risks like returns without receipts, expense transactions, and price overrides. These insights into retail statements allow for straightforward and efficient management of financial reports and help you detect and correct discrepancies and risks in your retail statements by providing a clear summary of anomalies in transactional activity. By using Copilot-powered insights, you can identify issues without wading through numerous forms, promptly take corrective measures, and reduce the need for support inquiries to solve problems. Learn more about Copilot retail statement insights in Dynamics 365 Commerce.
Copilot statement insights in Dynamics 365 Commerce.
Merchandise more efficiently
Merchandising is a complex and time-consuming process that involves configuring products, categories, catalogs, and attributes for each channel. Merchandisers need to ensure that their products are displayed correctly and accurately on the online store, and that they comply with each channel’s business rules and policies. However, manual configuration is prone to human error, and it doesn’t scale for businesses that have millions of products, thousands of attributes, and hundreds of categories and catalogs across hundreds of stores.
Copilot enhances merchandiser efficiency by streamlining merchandising workflows, summarizing product settings, and automating data validation by checking for errors and inconsistencies in your product merchandising data. From the Copilot summaries, you can navigate to a list of issues and act without losing context to address problems promptly and efficiently. Your products are always correctly configured and displayed, enhancing customer satisfaction and boosting sales. Learn more about Copilot merchandising insights in Dynamics 365 Commerce.
Copilot merchandising insights in Dynamics 365 Commerce.
Ensuring the ethical use of AI technology
Microsoft is committed to the ethical deployment of AI technologies. Through our Responsible AI practices, we ensure that all AI-powered features in Dynamics 365 adhere to stringent data privacy laws and ethical AI usage standards, promoting transparency, fairness, and accountability.
Conclusion
Copilot features for Dynamics 365 Commerce are revolutionizing the retail experience by bringing AI to store associates, store managers, channel managers in the back office, and merchandisers. They’re simplifying complex data analysis, personalizing customer service, optimizing product management, and driving business growth by improving customer loyalty, increasing sales, and enhancing profitability.
If you’re ready to take your retail business to the next level, contact us today to learn more about how Copilot can help you transform your retail business.
Copilot functionalities in Store Commerce are available starting with the following versions:
10.0.39, from Proactive Quality Update 4 (PQU-4) onwards (CSU: 9.49.24184.3, Store Commerce App 9.49.24193.1)
10.0.40, from Proactive Quality Update 1 (PQU-1) onwards (CSU: 9.50.24184.2, Store Commerce App 9.50.24189.1)
Copilot functionalities in back office (Headquarters) are available starting with the following versions:
10.0.38 from Proactive Quality Update 5 (PQU-5) or subsequent updates
10.0.39 from Proactive Quality Update 3 (PQU-3) or later versions
This is the next segment of our blog series highlighting Microsoft Learn Student Ambassadors who achieved the Gold milestone, the highest level attainable, and have recently graduated from university. Each blog in the series features a different student and highlights their accomplishments, their experience with the Student Ambassador community, and what they’re up to now.
Today we meet Flora, who recently graduated with a bachelor’s degree in biotechnology from the Federal University of Technology Akure in Nigeria.
Responses have been edited for clarity and length.
When did you join the Student Ambassadors community?
July 2021
What was being a Student Ambassador like?
Being a student ambassador was an amazing experience for me. I joined the program at a crucial time when I was just beginning my tech journey and was on the verge of giving up on a tech career, thinking it might not be for me. Over the three years I served as an ambassador, I not only enhanced my technical skills but also grew as an individual. I transformed from a shy person into someone who could confidently address an audience, developing strong presentation and communication skills along the way. As an ambassador, I made an impact on both small and large scales, excelled in organizing events, and mentored other students to embark on their own tech career paths.
Was there a specific experience you had while you were in the program that had a profound impact on you and why?
One significant impact I made as an ambassador was organizing a Global Power Platform event in Nigeria, presumed to be the largest in West Africa, with around 700 students attending. During this event, I collaborated with MVPs in the Power Platform domain to upskill students in Power BI and Power Apps technology. Leveraging my position as a Microsoft ambassador, I secured access to school facilities, including computer systems for students to use for learning. This pivotal experience paved the way for me to organize international events outside the ambassador program.
These experiences also helped me develop skills in project management, networking, and making a large-scale impact.
Tell us about a technology you had the chance to gain a skillset in as a Student Ambassador. How has this skill you acquired helped you in your post-university journey?
During my time as an ambassador, I developed a strong skillset in data analytics. I honed my abilities using various Microsoft technologies, including Power BI, Excel, and Azure for Data Science. I shared this knowledge with my community through classes, which proved invaluable in my post-university journey. Additionally, I honed my technical writing skills by contributing to the Microsoft Blog, with one of my articles becoming one of the most viewed blogs of the year. This experience helped me secure an internship while in school and side gigs via freelancing, and ultimately land a job before graduating.
What is something you want all students, globally, to know about the Microsoft Learn Student Ambassador Program?
I want students worldwide to know that the Microsoft Learn Student Ambassador program is for everyone, regardless of how new they are to tech. It offers opportunities to grow, learn, and expand their skills, preparing them for success in the job market. They shouldn’t view it as a program only for geniuses but as a place that will shape them in ways that traditional academics might not.
Flora and other Microsoft Learn Student Ambassadors in her university.
What advice would you give to new Student Ambassadors, who are just starting in the program?
I would advise students just starting in the program to give it their best and, most importantly, to look beyond the SWAG! Many people focus on the swag and merchandise, forgetting that there’s much more to gain, including developing both soft and technical skills. So, for those just starting out, come in, make good connections, and leverage those connections while building your skills in all areas.
Share a favorite quote with us! It can be from a movie, a song, a book, or someone you know personally. Tell us why you chose this. What does it mean for you?
Maya Angelou’s words deeply resonate with me: ‘Whatever you want to do, if you want to be great at it, you have to love it and be willing to make sacrifices.’ This truth became evident during my journey as a student ambassador. I aspired to be an effective teacher, presenter, and communicator. To achieve that, I knew I had to overcome my shyness and embrace facing the crowd. Making an impact on a large scale requires stepping out of my comfort zone. Over time, I transformed into a different person from when I first joined the program.
Tell us something interesting about you, about your journey.
One fascinating aspect of my involvement in the program and my academic journey was when I assumed the role of community manager. Our goal was to elevate the MLSA community to a prominent position within the school, making it recognizable to both students and lecturers. Through collaborative efforts and teamwork with fellow ambassadors, we achieved significant growth: the community expanded to nearly a thousand members, and we successfully registered it as an official club recognized by the Vice-Chancellor and prominent lecturers. I owe a shout-out to Mahmood Ademoye and the other ambassadors from FUTA who played a pivotal role in shaping our thriving community.
Introducing exciting new features to help you better understand and improve adoption and impact of Copilot for Microsoft 365 through the Copilot Dashboard. These features will help you track Copilot adoption trends, estimate impact, interpret results, delegate access to others for improved visibility, and query Copilot assisted hours more effectively. This month, we have released four new features:
Updates to Microsoft Copilot Dashboard:
Trendlines
Copilot Value Calculator
Metric Guidance for Comparisons
Delegate Access to Copilot Dashboard
We have also expanded the availability of the Microsoft Copilot Dashboard. As recently announced, the Microsoft Copilot Dashboard is now available as part of Copilot for Microsoft 365 licenses and no longer requires a Viva Insights premium license. The rollout of the Microsoft Copilot Dashboard to Copilot for Microsoft 365 customers started in July. Customers with over 50 assigned Copilot for Microsoft 365 licenses or 10 assigned premium Viva Insights licenses have begun to see the Copilot Dashboard. Customers with fewer than 50 assigned Copilot for Microsoft 365 licenses will continue to have access to a limited Copilot Dashboard that features tenant-level metrics.
Let’s take a closer look at the four new features in the Copilot Dashboard as well as an update to more advanced reporting options in Viva Insights.
Trendline Feature
Supercharge your insights with our new trendline feature. Easily track your company’s Copilot adoption trends over the past 6 months. See overall adoption metrics like the number of Copilot-licensed employees and active users. Discover the impact of Copilot over time – find out how many hours Copilot has saved, how many emails were sent with its assistance, and how many meetings it summarized. Stay ahead with trendline and see how Copilot usage changes over time at your organization. For detailed views of Copilot usage within apps and Copilot impact across groups for timeframes beyond 28 days, use Viva Insights Analyst Workbench (requires premium Viva Insights license).
Copilot Value Calculator
Customize and estimate the value of Copilot at your organization. This feature estimates Copilot’s impact over a given period by multiplying Copilot-assisted hours by an average hourly rate. By default, this rate is set to $72, based on data from the U.S. Bureau of Labor Statistics. You can customize it by updating and saving your own average hourly rate and currency settings to get a personalized view. This feature is enabled by default, but your Global admin can manage it using Viva feature access management. See our Learn article for more information on Copilot-assisted hours and value.
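As a hypothetical illustration of the calculation: if your organization logged 500 Copilot-assisted hours in a month at the default rate, the dashboard would report an estimated value of 500 × $72 = $36,000; saving a custom rate of $60 instead would change that estimate to $30,000.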
Metric Guidance for Comparisons
Discover research-backed metric guidance when comparing different groups of Copilot usage, for example, Copilot active users and non-Copilot users. This guidance is based on comprehensive research compiled in our e-book and helps users interpret changes to meetings, email and chat metrics. For the best results, compare two similar groups, such as employees with similar job functions or ranks. Use our in-product metric guidance to interpret results and make informed decisions with confidence. Click here for more information.
Delegate Access to Copilot Dashboard
Leaders can now delegate access to their Microsoft Copilot Dashboard to others in their company to improve visibility and efficiency. Designated delegates, such as the leader’s chief of staff or direct reports, will be able to view Copilot Dashboard insights and use them to make data-driven decisions. Learn more about the delegate access feature here. Admins can control access to the delegation feature by applying feature management policies.
Go Deeper with Viva Insights – Copilot Assisted Hours Metric in Analyst Workbench
For customers wanting a more advanced, customizable Copilot reporting experience, Viva Insights is available with a premium Viva Insights license. With Viva Insights, customers can build custom views and reports, view longer data sets of Copilot usage, compare usage against third party data, and customize the definition of active Copilot users and other metrics.
The Copilot assisted hours metric featured in the Microsoft Copilot Dashboard is now also available to query in the Viva Insights Analyst Workbench. When running a person query and adding new metrics, Viva Insights analysts will be able to find this metric under the “Microsoft 365 Copilot” metric category. The metric is computed based on your employees’ actions in Copilot and multipliers derived from Microsoft’s research on Copilot users. Use this new available metric for your own custom Copilot reports.
Summary
We hope you enjoy these new enhancements to Copilot reporting to help you accelerate adoption and impact of AI at your organization. We’ll keep you posted as more enhancements become available to measure Copilot.
Enterprise resource planning (ERP) platforms were designed to help integrate the fragmented processes that comprise the operation of a large enterprise. But the way we do business keeps fundamentally changing. New business models disrupt the way companies sell products and services, blurring industry lines and transforming customer experiences.
Business complexity continues to intensify, and the rise of data as a driver of business—plus the attendant proliferation of data streams—means reaping the full promise of comprehensive ERP platforms can still be elusive.
In fact, according to a Gartner® Research report, “by 2027, less than 30% of customers who adopted new solutions from ERP platform mega-vendors will have proven their original adoption business case.”1
The arrival of generative AI brings hope of renewed promise. AI is elevating performance and creating advantages for those who understand how to apply it to data-centric systems like ERP platforms. By 2027, according to the same Gartner report, at least 50 percent of ERP systems with AI-enabled features will be enabled through generative AI capabilities.1
People often think of generative AI as a tool to automate routine tasks, but its capabilities are so much broader. Improved decision-making is an area where AI becomes a valuable tool. In fact, a report from the market research firm IDC found that, by mid-2024, 30% of global organizations will take advantage of human-like interfaces in their enterprise applications to gain more insights quickly, improving decision velocity.2
While AI can inform and enhance any number of operations across an enterprise, it’s worth looking at some specific processes in detail to see how much AI can elevate ERP solutions. Learn more about current trends in ERP platform modernization in the age of generative AI in this webinar.
How AI creates a better plan-to-produce process
Most manufacturing firms implement a plan for how they will schedule production runs to meet materials capacity, deliver quality products on time, and maintain cost-effectiveness.
Sometimes, though, this plan-to-produce process becomes an accretion of ideas that seemed good at the time, a fragmented assemblage of tools and strategies trying to work together to paint the big picture of what’s happening on the production floor. This can lead to quality control issues, manpower and equipment shortages that fail to meet production surges, or inaccurate forecasts that waste resources or leave customers high and dry—among other issues.
Generative AI integrated with a robust ERP system can aggregate data from across an enterprise—even data residing in multiple clouds—in real time, so managers have a clear picture of the state of play at any given moment, allowing them to reduce lead times necessary to plan or alter production runs.
The complicated interdependence of tasks on a manufacturing floor—for example, Part A must be installed before Part B can be attached—is a perfect puzzle for AI to help solve. The predictive analytics capacity of generative AI allows it to better forecast demand and synch production with supplies, and then optimize timing to match resource availability with manpower. AI can also forecast and build scenarios for supply chain disruption or changes in demand.
Whether a manufacturer needs to increase production volume to meet increasing demand or build whole new facilities, AI excels at building scalable networks, finding efficiencies, and reducing costly interventions.
How AI improves the procure-to-pay process
Another process most large organizations seek to optimize is the integration of procurement with accounts payable. When you need to spend the money, it’s good to know that you have the money. IDC reports that, by mid-2025, 70% of global businesses will use embedded financing to collect and make payments.2
More than most industries, healthcare organizations must reckon with a complex field of myriad payers, purchasers, and suppliers. Healthcare organizations face layers of challenging regulatory compliance and the need to control ever-rising costs. Many organizations in this field still rely on antiquated, paper-based invoicing and payables.
Fragmented processes and siloed data make regulatory issues more fraught, while also increasing attack surfaces to create security risks. AI can remove complexities by integrating processes in one ERP platform, helping to reduce vulnerabilities. By mapping operations to standards, AI supports compliance efforts, efficiently creating the audit trails and tedious reports that often take staff hours to produce.
AI streamlines procurement, reducing the potential for human error present when ordering supplies and equipment from a diverse range of providers. It tracks expenses to help control costs, providing easily accessible price information about competing products and services so the organization can continually find cost efficiencies.
An ERP solution enhanced with AI allows planners to automate the maintenance of inventory with both real-time and predictive information, reducing the risk of stockouts or overstock situations and more effectively communicating with suppliers.
Quicker quote-to-cash with personalization and automation
AI integrates and improves the sales, finance, and supply elements of an ERP platform by increasing automation in negotiations, contract lifecycles, production, order management, billing, and delivery. For businesses with retail components, making the quote-to-cash cycle faster and more accurate creates efficiencies—which can help keep customers happier.
Automating price and quote information speeds up the resolution of even highly complex deals. The same is true once a quote is accepted—an accurate, automatically generated proposal follows immediately. AI-generated purchase orders and invoices free sellers to spend more time interacting with customers and accounting teams to focus on tasks that increase the organization’s productivity.
AI’s predictive analytics ensure on-time delivery of products but also allow firms to quickly identify current and future trends and make data-driven decisions about ordering and pricing. Automating invoicing tracks payments accurately and creates a real-time picture of cash flow. AI can continually improve cash flow forecasts by comparing projections with results and adjusting from the outcomes over time. And analytics enabled by AI offer suggestions for improving sales performance and strategic decisions.
Get more from your business data with AI-enabled ERP processes
Across the organization, optimizing finance and supply chains can create a connected enterprise that allows you to infuse AI, automation, and analytics into ERP processes. Today, companies can confidently move to the cloud with AI-powered ERP solutions, modernize business processes, and unlock the agility needed to lead the way in today’s rapidly evolving marketplace.
A recent Forrester Research study interviewed IT leaders and professionals who had experience using Microsoft Dynamics 365 ERP software.3 Forrester aggregated the interviewees’ experiences and combined the results into a single composite organization that has 5,000 employees and generates USD1 billion in annual revenue. Forrester found that, over a three-year period, the value added to the composite included:
USD1.2 million in increased profitability from real-time visibility and enhanced decision-making.
USD8.9 million in increased productivity from unified data access, streamlined processes, automated workflows, and other gained efficiencies.
USD3.9 million in reduced infrastructure and IT operations spend from cloud migration.
USD8.9 million in productivity improvements in finance/accounting, supply chain/logistics, and other personnel.
The study estimated a net present value of USD8.1 million and an ROI of 106%, as well as additional benefits like an improved cybersecurity posture and enhanced employee experiences. The composite organization would pay back its investment in Microsoft Dynamics 365 ERP software in 17 months.
Learn more about the total economic impact of Microsoft Dynamics 365 ERP software
AI-enabled ERP platforms allow you to protect, connect, and get more from your business data while gaining security. With the right ERP solution, you can scale globally to drive business expansion and environmental, social, and governance (ESG) while ensuring regulatory compliance, supercharging productivity, and realizing the business impacts of generative AI even faster.
Microsoft Dynamics 365 Virtual Training Day
Join us at a Microsoft Dynamics 365 Virtual Training Day to gain the skills needed to help your organization sell, service, and deliver on the customer expectations of tomorrow. Register for free, in-depth training events, where you’ll uncover new efficiencies of scale, discover smarter connections, and utilize built-in intelligence for deeper insights into your business.
GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.
In a previous blog, we discussed how modern, AI-enabled customer relationship management (CRM) platforms and enterprise resource planning (ERP) systems help drive new, more effective ways of working for employees and more satisfying outcomes for the customers they serve in three key ways: by streamlining operations, by empowering more informed and insightful decisions, and by elevating customer and employee experiences.
In this blog post, we’re going to focus on the third item: elevating customer and employee experiences by showing how AI-enabled CRM solutions help increase productivity and provide unprecedented levels of personalized service across three key business functions: marketing, sales, and customer service. We’ll also provide insights and best practices for how to help employees get the most from AI, including how they can be empowered to create personalized experiences for their customers.
Revolutionizing marketing: How AI-enabled CRM software drives personalized experiences and enhances customer engagement
Marketing is one area where generative AI is already in active use. According to a Forrester survey of CMOs, more than half (56%) of B2C marketing or advertising decision makers have been using generative AI in three key ways:
To help employees minimize tedious tasks, allowing for more time to focus on strategically important work.
To summarize insights and enable swift action without the need to dig through data manually.
To boost the scale of creative output by generating starter ideas along with visuals and copy.1
AI is particularly capable of delivering personalized experiences in marketing, where AI-enabled CRM platforms can marry customer data to messaging to create memorable moments and impact sales. One example of a company using AI-enabled CRM solutions to generate marketing content is North Carolina-based sports club NC Fusion, which used Microsoft Copilot in Dynamics 365 Customer Insights to help its marketers create personalized messaging tailored to its audience segments, increasing the reach of the brand. Using AI-enabled content ideas, descriptions, and summaries has provided significant time savings, and personalizing campaigns has been more effective with Copilot.2
“For families, we are able to tailor the message they receive. This means a family will only receive messages that apply to their situation, and not a multitude of emails that have no application to their family situation. With AI-assisted content production, our customer engagement has increased from 10% to 30%.”
Chris Barnhart, Head of IT and Data Systems at NC Fusion
Empowering sales teams: How AI-driven personalization can transform customer interactions and boost revenue
Another area where personalization can impact an organization’s bottom line is sales, where making authentic connections with customers at the right time is paramount. Few organizations know this better than superyacht brokerage Northrop & Johnson, which has used AI to deliver highly personal sales experiences tailored to the wants and needs of its high-value clients.
“In this market, we have high-wealth customers who are considering very high-value purchases, and we can’t afford any interactions that leave them feeling anything less than special.”
Keith Perfect, Director of Technology & Intelligence at Northrop & Johnson
Microsoft Dynamics 365 Sales, Microsoft Dynamics 365 Customer Insights, and Copilot provide Northrop & Johnson sales teams with comprehensive and timely data for each client, which helps them deliver personalized conversations at precisely the right time to engage.
“Clients at this level want to know they are taken care of. And when you must make an impact in minutes, which is all you have with these busy clients, you need to be very attuned to them. Otherwise, it could cost you the sale. So, having a solution at your fingertips that connects the entire journey is huge for our sales team.”
Daniel Ziriakus, President & Chief Operating Officer at Northrop & Johnson
Sales teams using AI-enabled CRM software also realize significant time savings as salespeople assign more tedious tasks to their AI assistants. In fact, according to new Microsoft research, 79% of Microsoft Copilot for Sales users say it reduces the amount of administrative work they have to do, 67% say it helps them spend more time with customers, and 64% say it allows them to better personalize customer engagements.3
Transforming customer support: How AI-driven assistants enhance productivity, satisfaction, and retention
Customer service is still another area where AI-enabled CRM platforms can make an immediate impact. According to a November 2023 study from the National Bureau of Economic Research (NBER), customer service agents using a generative AI-based conversational assistant were able to increase productivity—specifically measured by the number of issues resolved per hour—by an average of 14%. The effect was even more pronounced with novice and low-skill workers, who experienced productivity increases of 34%. Researchers also found that AI assistance improves customer sentiment and increases employee retention.4
One company using an AI-enabled CRM solution in customer service is Microsoft. We operate one of the largest customer support teams in the world and process more than 145 million contacts per year. We use Microsoft Dynamics 365 Customer Service to help utilize the full expertise of the engineers on staff and provide better resolution of customer issues across the board.5
“The challenge for every support engineer is to connect with the human being on the other end of the call who has a problem that needs solving. You want to connect with them, but you also need to be able to pull in a great deal of technical information. Copilot provides us the support to offer the customer understanding while also sorting out their technical problems.”
Ric Todd, Director of Support Strategy at Microsoft
For leaders looking to roll out AI solutions in their organizations, we have some encouraging news: people new to AI begin recognizing its value quickly. Recent Microsoft research shows it takes a time savings of just 11 minutes per day for most people to see its usefulness (a key factor in getting new work habits to stick).6
Encouragingly, most respondents report having saved more than 11 minutes. The most efficient among them are saving up to 30 minutes per day—the equivalent of 10 hours per month—and the average person is saving 14 minutes per day for a time savings of almost five hours per month.
What’s more, the breakthrough moment at which respondents report seeing improvements in productivity (75%), work enjoyment (57%), work-life balance (34%), and the ability to attend fewer meetings (37%) happens within one business quarter, or 11 weeks.7
While personal productivity gains from Copilot are real and significant, building an AI-powered organization requires committing to working in a new way. Some best practices to consider include:
Encourage daily use. Realizing productivity gains from AI will take intentional everyday practice. Those who start building the habit early will pull ahead. And don’t forget—11 weeks is all it takes for people to recognize the effect.
Help people manage their new assistants. Employees taught to treat their generative AI tools as assistants, not search engines, will get the most value. Teach team members to manage their new assistant and to recognize when to delegate a task to AI and when to apply their human intelligence, judgment, and skill.
Find good use of reclaimed time. Help your team take advantage of time savings to focus on the higher-order and creative tasks only people can do. Salespeople can devote more time to building relationships with customers and closing deals. Marketers can carve out time to dream up new solutions. Customer service teams can focus on solving problems, and managers across the organization can spend more time coaching and caring for their teams.
Taking the next step forward
Take the next step in your AI adoption journey by learning more about Copilot and other AI-powered capabilities in Microsoft Dynamics 365. Discover how to keep your organization on the cutting edge with a new paradigm of customer engagement: AI-enabled personalization that empowers both customers and employees.
Join us at a Microsoft Dynamics 365 Virtual Training Day to gain the skills needed to help your organization sell, service, and deliver on the customer expectations of tomorrow. Register for free, in-depth training events, where you’ll uncover new efficiencies of scale, discover smarter connections, and utilize built-in intelligence for deeper insights into your business. Register now!
This article is contributed. See the original author and article here.
In this blog post, I am going to talk about splitting logs across multiple tables and opting for the Basic tier to save cost in Microsoft Sentinel. Before we delve into the details, let’s understand what problem this approach solves.
Azure Monitor offers several log plans which our customers can opt for depending on their use cases. These log plans include:
Analytics Logs – This plan is designed for frequent, concurrent access and supports interactive usage by multiple users. It drives the features in Azure Monitor Insights and powers Microsoft Sentinel, and it is built to manage critical, frequently accessed logs optimized for dashboards, alerts, and advanced queries.
Basic Logs – Improved to support richer troubleshooting and incident response with fast queries while saving costs. Now available with a longer retention period and additional KQL operators for aggregation and lookup.
Auxiliary Logs – Our new, inexpensive log plan that enables ingestion and management of verbose logs needed for auditing and compliance scenarios. These may be queried with KQL on an infrequent basis and used to generate summaries.
The following diagram provides detailed information about the log plans and their use cases:
I would also recommend going through our public documentation for a detailed, feature-by-feature comparison of the log plans, which should help you choose the right plan.
**Note:** Auxiliary Logs are out of scope for this blog post; I will cover them in a separate post.
So far, we have covered the available log plans and their use cases.
The next question is: which tables support the Analytics and Basic log plans?
Analytics Logs: All tables support the Analytics plan.
Basic Logs: DCR-based custom tables (such as the Syslog_CL table we create later in this post) and a specific set of Azure tables support the Basic plan.
You can switch between the Analytics and Basic plans; the change takes effect on existing data in the table immediately.
When you change a table’s plan from Analytics to Basic, Azure Monitor treats any data older than 30 days as long-term retention data, based on the total retention period set for the table. In other words, the total retention period of the table remains unchanged unless you explicitly modify the long-term retention period. For example, a table with 90 days of total retention that is switched to Basic keeps its most recent 30 days of data interactive and the remaining 60 days in long-term retention.
In this blog, I will focus on splitting the Syslog table and setting the resulting DCR-based table to the Basic tier.
Firewall logs typically contribute a high volume of ingestion into a SIEM solution.
To manage cost in Microsoft Sentinel, it’s highly recommended to review your logs thoroughly and identify which of them can be moved to the Basic log plan.
At a high level, the following steps should be enough to achieve this task:
Ingest firewall logs into Microsoft Sentinel with the help of a Linux log forwarder via the Azure Monitor Agent.
Assuming the logs are ingested into the Syslog table, create a custom table with the same schema as the Syslog table.
Update the DCR template to split the logs.
Set the table plan to Basic for the identified DCR-based custom table.
Set the required retention period of the table.
At this point, I assume you already have a log forwarder set up and are able to ingest firewall logs into your Microsoft Sentinel workspace.
Let’s focus on creating a custom table now
This part used to be cumbersome, but not anymore, thanks to my colleague Marko Lauren, who has done a fantastic job creating this PowerShell script, which makes creating a custom table easy. All you need to do is enter the name of a pre-existing table, and the script will create a new DCR-based custom table with the same schema.
Let’s see it in action:
Download the script locally.
Open the script in PowerShell ISE and update workspace ID & resource ID details as shown below.
Save it locally and upload to Azure PowerShell.
Load the file and enter the table name from which you wish to copy the schema.
Provide the new table name of your choice, ensuring the name has the suffix “_CL”, as shown below:
This should create a new DCR-based custom table which you can check in Log Analytics Workspace > Table blade as shown below:
**Note:** We highly recommend reviewing the PowerShell script thoroughly and testing it properly before executing it in production. We don’t take any responsibility for the script.
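If you would rather see the moving parts than run a script, the sketch below shows one way to create a DCR-based custom table through the Tables REST API with Invoke-AzRestMethod. Treat it as a minimal illustration, not the script’s exact behavior: the subscription, resource group, and workspace values are placeholders, and the column list is an abbreviated Syslog-like schema rather than the full schema the script copies.

```powershell
# Minimal sketch: create a DCR-based custom table with a Syslog-like schema.
# Placeholder values (<...>) and an abbreviated column list; adjust to your environment.
$tableName = "Syslog_CL"
$path = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
        "/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>" +
        "/tables/${tableName}?api-version=2022-10-01"

$payload = @'
{
  "properties": {
    "schema": {
      "name": "Syslog_CL",
      "columns": [
        { "name": "TimeGenerated", "type": "datetime" },
        { "name": "Computer",      "type": "string" },
        { "name": "Facility",      "type": "string" },
        { "name": "SeverityLevel", "type": "string" },
        { "name": "SyslogMessage", "type": "string" },
        { "name": "ProcessName",   "type": "string" }
      ]
    }
  }
}
'@

# Requires the Az module and an authenticated session (Connect-AzAccount).
Invoke-AzRestMethod -Path $path -Method PUT -Payload $payload
```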
The next step is to update the Data Collection Rule template to split the logs
Since we have already created the custom table, we now need transformation logic that splits the logs and sends the less relevant ones to the custom table, which we will set to the Basic log tier.
For demo purposes, I’m going to split the logs based on SeverityLevel: I will drop “info” logs from the Syslog table and stream them to the Syslog_CL table instead.
Let’s see how it works:
Browse to the Data Collection Rules blade.
Open the DCR for the Syslog table, then click Export template > Deploy > Edit template, as shown below:
In the dataFlows section, I’ve created two streams to split the logs (see the sketch after these steps). Details about the streams are as follows:
1st stream: It drops the Syslog messages whose SeverityLevel is “info” and sends the remaining logs to the Syslog table.
2nd stream: It captures all Syslog messages whose SeverityLevel is “info” and sends them to the Syslog_CL table.
Save and deploy.
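For illustration, here is roughly what those two entries could look like in the exported template’s dataFlows section. This is a sketch, not my exact template; “myWorkspace” stands in for whatever destination name your DCR already defines.

```json
"dataFlows": [
  {
    "streams": [ "Microsoft-Syslog" ],
    "destinations": [ "myWorkspace" ],
    "transformKql": "source | where SeverityLevel != 'info'",
    "outputStream": "Microsoft-Syslog"
  },
  {
    "streams": [ "Microsoft-Syslog" ],
    "destinations": [ "myWorkspace" ],
    "transformKql": "source | where SeverityLevel == 'info'",
    "outputStream": "Custom-Syslog_CL"
  }
]
```

The “Custom-” prefix on the second output stream is how a DCR refers to a custom table.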
Let’s validate if it really works
Go to the Log Analytics Workspace > Logs and check whether each table contains the data we defined for it.
In my case, as we can see, the Syslog table contains all logs except those where SeverityLevel is “info”.
Additionally, our custom table, Syslog_CL, contains the Syslog data where SeverityLevel is “info”.
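If you want to verify the split from the query side as well, a quick sanity check along these lines should do (run each query separately; table and column names as used above):

```kusto
// Expect every severity except "info" in the Analytics-tier table
Syslog
| summarize count() by SeverityLevel

// Expect only "info" rows in the custom table
Syslog_CL
| summarize count() by SeverityLevel
```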
Now the next part is to set the Syslog_CL table to the Basic log plan
Since Syslog_CL is a DCR-based custom table, we can set it to the Basic log plan. The steps are straightforward:
Go to the Log Analytics Workspace > Tables
Search for the table: Syslog_CL
Click on the ellipsis on the right side and click on Manage table as shown below:
Set the table plan to Basic and choose the desired retention period.
Save the settings.
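If you prefer to script this step instead of clicking through the portal, the Az.OperationalInsights PowerShell module exposes the same setting. A minimal sketch, with placeholder names and an example retention value:

```powershell
# Sketch: switch the Syslog_CL table to the Basic plan and set total retention.
# Placeholder names (<...>); requires the Az.OperationalInsights module.
Update-AzOperationalInsightsTable `
    -ResourceGroupName "<resource-group>" `
    -WorkspaceName "<workspace-name>" `
    -TableName "Syslog_CL" `
    -Plan "Basic" `
    -TotalRetentionInDays 90   # example: keep the data for 90 days in total
```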
Now you can enjoy some cost benefits. I hope this helps!