This article is contributed. See the original author and article here.
In email marketing, tracking metrics has been the key to understanding and improving campaign effectiveness. For years, one of the most reliable metrics has been the open rate, but as the digital landscape evolves, so do the challenges of accurately measuring this essential statistic. In this article, we’ll explore why open rates are becoming less reliable and how you can adapt.
The open rate and the challenges of modern email clients
Traditionally, the open rate—the percentage of recipients who open an email—has been a fundamental metric in email marketing. This metric has been invaluable for marketers, helping them gauge the success of their campaigns and make data-driven decisions.
Email opens are tracked using tracking pixels, tiny 1×1-pixel images embedded in the email content. When the recipient opens the email, the pixel loads from a remote server, sending data back to the sender. However, the reliability of open rates is increasingly under threat due to privacy concerns and changes in how email clients handle images. Here’s why open rates are becoming less dependable:
Image blocking: Many email clients now block image loading by default. Recipients can open the email and consume its content without loading images. Such opens aren’t counted, resulting in a lower-than-actual open rate being reported. Image blocking is even more prevalent now because of mobile devices that automatically block image loading for privacy, speed, and conserving data usage. This means that a significant portion of your audience might be missed in the open rate calculation.
Preview panes: Some email clients allow users to preview an email without actually opening it. In these cases, the open rate may register false positives, counting emails as opened when they were merely previewed.
Privacy concerns: To protect user privacy, email clients and webmail services are increasingly blocking tracking pixels, making it harder to track open rates accurately.
Apple privacy changes: Apple's Mail Privacy Protection, introduced in iOS 15, preloads email content (including tracking pixels), so messages can register as opened even when the recipient never reads them. This inflates reported open rates.
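To make the tracking-pixel mechanism described above concrete, here is a minimal sketch of how a sender might build a per-recipient pixel; the URL, IDs, and tag format are hypothetical illustrations, not any vendor's actual implementation:

```python
from urllib.parse import urlencode

def tracking_pixel_tag(base_url: str, recipient_id: str, campaign_id: str) -> str:
    """Build the HTML <img> tag for a 1x1 tracking pixel.

    When an email client loads this image, the request hits the sender's
    server, which records an "open" for this recipient and campaign. If the
    client blocks images, or a privacy proxy pre-fetches them, the recorded
    opens diverge from real ones.
    """
    query = urlencode({"r": recipient_id, "c": campaign_id})
    return f'<img src="{base_url}?{query}" width="1" height="1" alt="" />'

tag = tracking_pixel_tag("https://example.com/open.gif", "user-123", "fall-sale")
print(tag)
```

Every factor listed above (image blocking, preview panes, pixel blocking, pre-fetching) interferes with exactly this one request, which is why the metric degrades.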
The future of email metrics is a dual approach
So, what’s the way forward for email marketers like you who rely on open rates to measure engagement and success? It’s essential to take a dual approach: improve how you measure engagement, and apply strategies that improve engagement itself. Here’s how you can adapt.
Measure engagement more effectively
Engagement metrics in Dynamics 365 Customer Insights – Journeys offer a more complete picture than open rates alone.
Diversify engagement metrics: Instead of relying solely on open rates, evaluate other indicators such as click-through rates, conversion rates, and ROI. These metrics offer a more comprehensive view of your email marketing performance.
Implement email authentication: Email authentication protocols like DMARC, SPF, and DKIM improve email deliverability and enhance your sender reputation, indirectly affecting engagement rates.
Use alternative metrics: Consider using alternative metrics like measuring conversion attribution through unique coupon codes or UTM parameters. These tools can help track the direct impact of your emails beyond the open rate.
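As a sketch of the UTM approach (the URL and parameter values here are hypothetical), a campaign tool can tag every link in an email so that web analytics attribute the resulting visits to the campaign, regardless of whether the open itself was tracked:

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard UTM parameters to a URL, preserving any existing query."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

link = add_utm("https://example.com/offer?id=42", "newsletter", "email", "fall-sale")
print(link)
```

Because attribution happens when the recipient clicks through, this measurement is unaffected by image blocking or pixel pre-fetching.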
Strategies for improving engagement
Personalization and segmentation: Tailoring your email content to individual recipients’ preferences and behaviors can drive higher engagement. By segmenting your audience and sending personalized content, you can increase the chances of your emails being opened and acted on.
Expand your messaging channels: SMS messages are reported to reach open rates as high as 98%, far above typical email. Start taking advantage of the SMS channel today!
Test your content: Testing and refining content allows you to continuously improve email performance by identifying what resonates best with your audience using real data. Evaluating device data analytics, such as OS, browser, and device type, along with click heatmap analytics provides deeper insights into how recipients interact with your emails. Using this information, you can optimize design and content for the best user experience across all devices, boosting overall effectiveness and engagement.
Analytics by device type in Dynamics 365 Customer Insights – Journeys can help you optimize design and content for a more engaging experience across all devices.
Use your own data: Rely on first-party data such as transactions and in-store visits that can be collected using a customer data platform like Dynamics 365 Customer Insights – Data.
While email opens as a metric is not going away, it’s certainly less reliable. Dynamics 365 Customer Insights already provides an entire suite of capabilities for you to easily tackle challenges like this, and continues to invest in finding solutions that align with the evolving privacy landscape. Together, we’ll navigate these changes and continue to deliver successful email marketing campaigns, maintaining your connection with your audience while respecting their privacy in this new era of digital marketing.
We’re honored to announce that Microsoft has, once again, been recognized as a Leader in the 2024 Gartner® Magic Quadrant™ for Unified Communications as a Service (UCaaS), Worldwide. This is the sixth year we’ve received this recognition, and we’re thrilled to be positioned highest on the ability-to-execute axis and furthest on the completeness-of-vision axis.
As Microsoft Copilot features continue to roll out across Microsoft Dynamics 365 and Microsoft Power Platform, it can be easy to get overwhelmed and lose track of critical new capabilities. Thankfully, the Microsoft Business Applications Launch Event is just around the corner.
Register today for the virtual launch event on Tuesday, October 29, 2024—a showcase of new and enhanced capabilities releasing between October 2024 and March 2025. Packed with demos and a live Q&A chat with Microsoft experts, the event gives you a sneak peek at innovations that can empower your workforce, optimize business processes, and enhance customer engagement.
Microsoft product leaders and engineers will be live at the event to give you an in-depth look at the latest Copilot capabilities for Dynamics 365 and Microsoft Power Platform, including new ways to automate business processes across your organization and scale your team. Our team will also showcase organizations across industries that are using new Copilot and Dynamics 365 features to drive transformation.
Top 4 reasons to attend the launch event
Twice a year, the Business Applications Launch Event gives you a sneak peek at product news, demos and insights into upcoming features and capabilities across Dynamics 365, Microsoft Power Platform, and Copilot. Here are four top reasons to attend the October 2024 event:
Get a sneak peek at highlights from the 2024 release wave 2. Discover what’s new and improved in Dynamics 365 and Microsoft Power Platform. Hear from Charles Lamanna, Corporate Vice President of Business and Industry Copilot at Microsoft, and other leaders as they guide you through dozens of new Copilot and core platform capabilities releasing over the next six months.
Personalize sales and service experiences. Learn how to elevate customer experiences with demonstrations of new capabilities across Microsoft Dynamics 365 Customer Service, Microsoft Dynamics 365 Contact Center, and Microsoft Dynamics 365 Sales. You’ll also discover how the Sweden-based automotive company Lynk & Co is using Dynamics 365 to drive highly personalized experiences.
Transform business operations with AI-enabled enterprise resource planning (ERP) processes. Get a sneak peek at enhancements that improve both core functionality and autonomous capabilities across ERP applications such as Microsoft Dynamics 365 Finance and Microsoft Dynamics 365 Supply Chain Management, seen through the lens of our customer Lifetime Products, as well as the latest features for Microsoft Dynamics 365 Business Central.
Explore the future of Microsoft Power Platform. Learn how Copilot is transforming how you build, what you build, and how you automate, and get a firsthand look at how Applied Information Sciences is innovating business solutions using the newest capabilities for Microsoft Power Apps, Microsoft Power Automate, and Microsoft Copilot Studio.
That’s not all. You’ll also hear from other Microsoft leaders about their roadmap for the future of AI, customer service, and operations and how to use these new technologies to take on your organization’s most time-consuming tasks.
The Business Applications Launch Event streams live on Tuesday, October 29, 2024, starting at 9:00 AM Pacific Time, and will be available on demand afterward. Be sure to register for updates and reminders as the event day approaches. We’ll see you there!
Microsoft Business Applications Launch Event
Tuesday, October 29, 2024 9:00 AM-10:00 AM Pacific Time (UTC-7)
We’re excited to introduce the cross-location shifts feature in the Microsoft Shifts app, designed to enhance flexibility and efficiency for both frontline managers and workers. Currently in public preview, this feature empowers businesses to better manage staffing across multiple locations while giving employees more control over their schedules.
For Frontline Managers
With cross-location shifts, managers can offer open shifts across various teams and locations, helping to balance workforce needs, fill last-minute gaps, and improve customer satisfaction. By turning on the cross-location feature in Shifts settings, managers can post open shifts that employees from other locations can request, ensuring their store or site is always fully staffed.
Managers are notified when employees from other locations request shifts and can easily approve or decline the requests. Once a request is approved, the worker appears in your schedule as an external employee, making it seamless to track staff across locations.
For Frontline Workers
The cross-location feature provides more flexibility for employees, allowing them to pick up open shifts at different locations that suit their schedules. Workers can view open shifts at other sites and submit a request to pick up the shift. Once approved by the manager at that location, the shift will appear in their schedule.
In addition to approval by the manager at the target location, an employee’s home manager can opt in to approve requests for their employees to work at other locations.
The home manager can also see the location name in the team schedule when an employee is working at another location.
This powerful new feature helps businesses optimize staffing, enhance worker flexibility, and improve overall operational efficiency. Stay tuned as we refine this feature during its public preview phase, and we encourage you to share your feedback!
Please take a moment to share your feedback and questions on this feature via this brief survey (https://aka.ms/CrossLocationShifts), and include your email address for any follow-up. We’re eager to connect with you!
Introduction
In today’s data-driven world, the ability to act on data as soon as it’s generated is crucial for businesses to make informed decisions quickly. Organizations seek to harness the power of up-to-the-minute data to drive their operations, marketing strategies, and customer interactions.
This is challenging with real-time data, where it isn’t always possible to perform every transformation while the data is being streamed. You therefore need a flow that is fast and doesn’t impact the data stream itself.
This is where Microsoft Fabric comes into play. Fabric offers a comprehensive suite of services including Data Engineering, Data Factory, Data Science, Real-Time Intelligence, Data Warehouse, and Databases. But today, we are going to focus on Real-Time Intelligence.
Use-Cases
This setup is useful whenever incoming data needs to be transformed before it is consumed by downstream processing or analytical workloads. One example is enabling OneLake availability on a KQL table so that the transformed data can be accessed by other Fabric engines, such as Notebooks and Lakehouse, for analytics or for training machine learning models.
As another example, say your streaming data includes a timestamp column whose format doesn’t match your standard. You can use an update policy to reformat the timestamp and store the result.
Overview
Fabric Real-Time Intelligence supports KQL database as its datastore which is designed to handle real-time data streams efficiently. After ingestion, you can use Kusto Query Language (KQL) to query the data in the database.
A KQL table is a Fabric item that belongs to a KQL database, and both entities are housed within an Eventhouse. An Eventhouse is a workspace of databases that might be shared across a project. It lets you manage multiple databases at once, sharing capacity and resources to optimize performance and cost, and it provides unified monitoring and management both across all databases and per database.
Figure 1: Hierarchy of Fabric items in an Eventhouse
Update policies are automated processes activated when new data is added to a table. They automatically transform the incoming data with a query and save the result in a destination table, removing the need for manual orchestration. A single table can have multiple update policies for various transformations, saving data to different tables simultaneously. These target tables can have distinct schemas, retention policies, and other configurations from the source table.
Scope
In this blog, we walk through a scenario that enriches data as it lands in a KQL table. Here we simply drop the columns we don’t need, but you can apply any other transformation that KQL supports.
A real-time stream pushes data to a KQL table. Once the data is loaded into that source table, an update policy drops the unneeded columns and pushes the columns of interest to the destination table.
Prerequisites
A Microsoft account or a Microsoft Entra user identity. An Azure subscription isn’t required.
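The original post’s setup commands are not reproduced here, so as a sketch only: an update policy of the kind this walkthrough describes might look like the following, where the table names SourceTable and DestinationTable and the function name TransformData are hypothetical placeholders.

```kusto
// Hypothetical names: SourceTable, DestinationTable, TransformData.
// Transformation function: keep every column except the location columns.
.create-or-alter function TransformData() {
    SourceTable
    | project-away Longitude, Latitude
}

// Attach the update policy to the destination table. From now on, each
// ingestion into SourceTable triggers TransformData() and stores the
// result in DestinationTable.
.alter table DestinationTable policy update
@'[{"IsEnabled": true, "Source": "SourceTable", "Query": "TransformData()", "IsTransactional": false}]'
```

Note that the destination table must already exist, with a schema matching the function’s output, before the policy is applied.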
The above policy drops the Longitude and Latitude columns and stores the rest of the columns in the destination table. You can do more transformations as per your requirements, but the workflow remains the same.
After running the above command, your destination table starts populating with new data as soon as the source table receives it. To review the policy on the destination table, you can run the following command (substituting your destination table’s name):
.show table DestinationTable policy update
Conclusion
To summarize, we took a real-time data stream, stored the data in a KQL database, enriched it with an update policy, and stored the result in a destination table. This flow caters to scenarios where you want to process data as soon as it’s ingested from the stream.