In today’s data-driven world, businesses rely on customer data to fuel their marketing strategies. They need to access, analyze, and act on this data to power personalized experiences that drive return on marketing investments. However, this comes with the challenges of (1) configuring systems like a Customer Data Platform correctly and (2) ensuring high data quality within these systems.
A Gartner research study¹ reported that high-quality data provides “better leads, better understanding of customers, and better customer relationships” and that “every year, poor quality data costs organizations an average of $12.9 million.” This is why it is crucial to understand the current configuration state of your Customer Insights – Data environment and the quality of your data; addressing these challenges is the key to unlocking the most relevant and impactful insights about your customers.
We recently shipped generative AI-powered features in D365 Customer Insights – Data to help organizations improve data quality and configuration with Copilot, so they can empower business users with the best insights to deliver highly personalized customer experiences.
This blog post will share more information on how you can improve data quality and configuration with Copilot. With these features you can:
Review the current status of your Customer Insights – Data environment
Understand the overall health of your data
See which insights can be generated successfully from your data
Act on recommendations to unlock more insights
To illustrate how these features work, let’s see how they can be used to improve the speed and quality of an email marketing campaign to target high lifetime value customers with a ‘thank you’ discount on their next purchase.
Quickly know if your jobs have run successfully, and where to go if not, with Copilot
Contoso Coffee recently implemented Customer Insights – Data, which involved integrating source data from various systems and creating unified customer profiles. To ensure that everything was running smoothly, they checked the system settings. Environment Status Summary, a Copilot feature, not only highlighted a recent issue but also used AI to identify where the issue occurred and provided a direct link to investigate. Thanks to this feature, Contoso’s IT team was able to quickly fix a skipped customer profile job that would otherwise have blocked them from generating insights for an upcoming email marketing campaign. With the problem resolved in minutes, they could focus on re-engaging high lifetime value customers in a timely manner.
Understand your overall data quality with Copilot
Now that Contoso’s environment is running smoothly, they want to quickly understand the general health of their data estate.
On the Home Page, they review a summary of their data quality generated by the Data Prep Report, a Copilot feature. This summary includes a data quality grade, which insights are available, the most critical data quality issues, and a link to a detailed data prep report to learn more. Using this summary, Contoso can see that their data quality is medium, with a 75% score. They are able to generate some insights, but not the customer lifetime value prediction they want for their email marketing campaign.
Without this summary, Contoso would have attempted to configure, train, score, and run a customer lifetime value prediction that would have failed outright or produced low-grade results. The summary shows where their data stands, so they don’t have to go through the frustration of trying to generate insights from unusable data.
See which insights can be generated successfully from your data
Next, Contoso wants to dive deeper into the report to understand the next steps for building their email campaign. They click into the full Data Prep Report, which informs them that they can generate churn predictions, segments, or measures based on their current data. However, they want to pursue a customer lifetime value prediction to support their campaign. They filter the report to review the detailed issues and recommendations specific to customer lifetime value and see the issues listed in priority order from highest to lowest severity. The report gives them the targeted, easy-to-digest information they need to know how to proceed.
Act on recommendations to unlock more insights
Finally, Contoso engages their IT team to act on the detailed issues and recommendations. The IT team follows the recommendations by taking the suggested actions, such as adding more data that incorporates products with a sufficient number of purchases. With minimal time, effort, and ambiguity, they are able to improve their data and light up the customer lifetime value prediction they want for their marketing campaign.
Create and use high-impact insights in marketing campaigns
With the help of Environment Status Summary and Data Prep Report, Contoso Coffee is able to get their Customer Data Platform environment set up correctly and resolve their top data quality issues effectively. By improving data quality and configuration with Copilot, they are able to instantly get rich insights, such as customer lifetime value predictions, which are conveniently available out of the box in Customer Insights – Data. This lets their marketing team focus on launching an effective email campaign that provides relevant, in-the-moment offers to their highest-value customers to drive business results. Consult our product documentation and start using these AI-powered features today to achieve similar results!
What are some ways to engage further with Customer Insights – Data?
If you’re a new user, or want to test with demo data: Start a trial of Customer Insights
We are constantly evolving the Microsoft 365 platform by introducing new experiences like Microsoft Clipchamp and Microsoft Loop—available now for Microsoft 365 Business Standard or Microsoft 365 Business Premium subscribers.
The Viva Engage Festival, hosted by SWOOP Analytics, is an interactive virtual event that brings together Viva Engage thought leaders, communication innovators, and community enthusiasts from around the globe. This is not just another webinar; it’s an opportunity to dive deep into the future of employee engagement, learn about new tech, explore the latest Viva Engage experiences, and connect with a community passionate about driving change in their businesses.
Hear from leading customers and directly from Microsoft
The Viva Engage Festival includes customer speakers and industry experts from Comcast, NSW Government, Johnson & Johnson, Vestas, and more, who will share knowledge and expertise on a wide range of topics around Viva Engage. Join us for an exclusive look into Microsoft’s journey with Viva Engage and communities as we share our own experiences.
We hope you join us to connect with like-minded individuals who share a passion for driving meaningful engagement. Whether you’re a business leader, a professional, or an enthusiast, you’ll leave the festival with the inspiration and knowledge needed to take your Viva Engage investments to the next level.
Nominate a Viva Engage Community Champion!
As part of our 2023 Viva Engage Festival, Microsoft and SWOOP Analytics will announce this year’s regional winners of the Community Champion Award. The Viva Engage Community Champion Award is an opportunity to recognize passionate community managers around the world who are committed to employee engagement, knowledge sharing, and collaboration in their Viva Engage networks. Can you think of anyone who deserves this title? Let us know who it might be! The 2023 Viva Engage Community Champion will be announced for each region during the festival. Nominations close November 30, 2023.
Ignite has come to an end, but that doesn’t mean you can’t still get in on the action!
Display Your Skills and Earn a New Credential with Microsoft Applied Skills
Advancements in AI, cloud computing, and emerging technologies have increased the importance of showcasing proficiency in sought-after technical skills. Organizations are now adopting a skills-based approach to quickly find the right people with the appropriate skills for specific tasks. With this in mind, we are thrilled to announce Microsoft Applied Skills, a new platform that enables you to demonstrate your technical abilities for real-world situations.
Microsoft Applied Skills gives you a new opportunity to put your skills center stage, empowering you to showcase what you can do and what you can bring to key projects in your organization. This new verifiable credential validates that you have the targeted skills needed to implement critical projects aligned to business goals and objectives.
Two Security Applied Skills have been introduced:
Learners should have expertise in Azure infrastructure as a service (IaaS) and platform as a service (PaaS) and must demonstrate the ability to implement regulatory compliance controls as recommended by the Microsoft cloud security benchmark by performing the following tasks:
Learners should be familiar with Microsoft Security, compliance, and identity products, the Azure portal, and administration, including role-based access control (RBAC), and must display their ability to set up and configure Microsoft Sentinel by demonstrating the following:
Create and configure a Microsoft Sentinel workspace
Deploy a Microsoft Sentinel content hub solution
Configure analytics rules in Microsoft Sentinel
Configure automation in Microsoft Sentinel
Earn these two credentials for free for a limited time only.
View the Learn Live Sessions at Microsoft Ignite On-demand
Learn Live episodes guide learners through a module on Microsoft Learn, working through it in real time. Microsoft experts lead each episode, providing helpful commentary and insights and answering questions live.
The Microsoft Ignite Edition of Microsoft Learn Cloud Skills Challenge is underway. There are several challenges to choose from, including the security-focused challenge Microsoft Ignite: Optimize Azure with Defender for Cloud. If you complete the challenge, you can earn an entry into a drawing for VIP tickets to Ignite next year. You have until January 15th to complete the challenge. Get started today!
Keep up to date on Microsoft Security with our Collections
Have you ever wondered why some SQL queries take forever to execute, even when the CPU usage is relatively low? In a recent support case, we encountered a fascinating scenario: a client was puzzled by a persistently slow query. Initially, the suspicion fell on CPU performance, but the real culprit lay elsewhere. Through a deep dive into the query’s behavior, we uncovered that the delay was not due to CPU processing time; instead, it was the sheer volume of data being processed, a fact that became crystal clear when we looked at the elapsed time. The eye-opener was our use of SET STATISTICS TIME, which revealed a telling tale: SQL Server Execution Times: CPU time = 187 ms, elapsed time = 10768 ms. Join us as we unravel the intricacies of SQL query performance, emphasizing the critical distinction between CPU time and elapsed time, and how understanding this can transform your database optimization strategies.
Introduction
In the realm of database management, performance tuning is a critical aspect that can significantly impact the efficiency of operations. Two key metrics often discussed in this context are CPU time and elapsed time. This article aims to shed light on these concepts, providing practical SQL scripts to aid database administrators and developers in monitoring and optimizing query performance.
What is CPU Time?
CPU time refers to the amount of time the CPU spends executing the instructions of a SQL query. In simpler terms, it’s the actual processing time the CPU devotes to the query. This metric is essential in understanding the computational intensity of a query.
What is Elapsed Time?
Elapsed time, on the other hand, is the total time taken to complete the execution of a query. It includes CPU time and any additional time spent waiting for resources (like IO, network latency, or lock waits). Elapsed time gives a more comprehensive overview of how long a query takes to run from start to finish.
Why Are These Metrics Important?
Understanding the distinction between CPU time and elapsed time is crucial for performance tuning. A query with high CPU time could indicate computational inefficiency, whereas a query with high elapsed time but low CPU time might be suffering from resource waits or other external delays. Optimizing queries based on these metrics can lead to more efficient use of server resources and faster query responses.
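To make the contrast concrete, here is a minimal illustrative sketch (not part of the support case above): with timing statistics enabled, a statement that does nothing but wait should report near-zero CPU time while its elapsed time tracks the wait.
SET STATISTICS TIME ON;
-- Illustrative only: this statement performs no work, so CPU time should
-- report near 0 ms while elapsed time comes out at roughly 3,000 ms.
WAITFOR DELAY '00:00:03';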
Practical SQL Scripts
Let’s delve into some practical SQL scripts to observe these metrics in action.
Script 1: Table Creation and Data Insertion
-- Create a sample table with a wide VARCHAR column to inflate the row size.
CREATE TABLE EjemploCPUvsElapsed (
    ID INT IDENTITY(1,1) PRIMARY KEY,
    Nombre VARCHAR(5000),
    Valor INT,
    Fecha DATETIME
);
-- Populate it with 200,000 rows, one INSERT per loop iteration.
DECLARE @i INT = 0;
WHILE @i < 200000
BEGIN
    INSERT INTO EjemploCPUvsElapsed (Nombre, Valor, Fecha)
    VALUES (CONCAT(REPLICATE('N', 460), @i), RAND()*(100-1)+1, GETDATE());
    SET @i = @i + 1;
END;
This script creates a table and populates it with sample data, setting the stage for our performance tests.
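As a side note, if the row-by-row loop above is slow on your machine, a set-based insert builds the same test data in a single statement. This is a sketch of an alternative, not part of the original script; RAND() is swapped for CHECKSUM(NEWID()) because RAND() is evaluated once per statement and would otherwise give every row the same value.
-- Alternative (sketch): generate 200,000 rows in one set-based INSERT.
;WITH N AS (
    SELECT TOP (200000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS i
    FROM sys.all_columns AS a CROSS JOIN sys.all_columns AS b
)
INSERT INTO EjemploCPUvsElapsed (Nombre, Valor, Fecha)
SELECT CONCAT(REPLICATE('N', 460), i),
       ABS(CHECKSUM(NEWID())) % 100 + 1,  -- pseudo-random value from 1 to 100
       GETDATE()
FROM N;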
Script 2: Enabling Statistics
Before executing our queries, we enable statistics for detailed performance insights.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;
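Both settings apply only to the current session; when you are done profiling, turn them back off:
SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;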
Script 3: Query Execution
We execute a sample query to analyze CPU and elapsed time.
-- Read every row and force an expensive sort over random GUIDs.
SELECT *
FROM EjemploCPUvsElapsed
ORDER BY NEWID() DESC;
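With both statistics enabled, the Messages tab reports IO and timing figures for each statement. Reusing the numbers from the support case in the introduction as an illustration (your values will differ), the timing portion of the output looks like this:
SQL Server Execution Times:
   CPU time = 187 ms,  elapsed time = 10768 ms.
Note the gap between the two values: the CPU needed well under a second, and the rest of the elapsed time went to reading, sorting, and streaming the bulky rows.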
Script 4: Fetching Performance Metrics
Finally, we use the following script to fetch the CPU and elapsed time for our executed queries.
SELECT
sql_text.text,
stats.execution_count,
stats.total_elapsed_time / stats.execution_count AS avg_elapsed_time,
stats.total_worker_time / stats.execution_count AS avg_cpu_time
FROM
sys.dm_exec_query_stats AS stats
CROSS APPLY
sys.dm_exec_sql_text(stats.sql_handle) AS sql_text
ORDER BY
avg_elapsed_time DESC;
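One caveat worth knowing: total_worker_time and total_elapsed_time in sys.dm_exec_query_stats are reported in microseconds, so the averages above are not directly comparable with the millisecond output of SET STATISTICS TIME. The following variant (a sketch building on the script above) converts to milliseconds and also surfaces the difference between elapsed and CPU time, which approximates time spent waiting rather than processing:
SELECT
    sql_text.text,
    stats.execution_count,
    (stats.total_elapsed_time / stats.execution_count) / 1000 AS avg_elapsed_ms,
    (stats.total_worker_time / stats.execution_count) / 1000 AS avg_cpu_ms,
    -- Elapsed minus CPU: time not spent on the processor. Can be negative
    -- for parallel plans, where worker time across threads exceeds elapsed time.
    ((stats.total_elapsed_time - stats.total_worker_time)
        / stats.execution_count) / 1000 AS avg_wait_ms
FROM
    sys.dm_exec_query_stats AS stats
CROSS APPLY
    sys.dm_exec_sql_text(stats.sql_handle) AS sql_text
ORDER BY
    avg_wait_ms DESC;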
Conclusion
Understanding and differentiating between CPU time and elapsed time in SQL query execution is vital for database performance optimization. By utilizing the provided scripts, database professionals can start analyzing and improving the efficiency of their queries, leading to better overall performance of the database systems.