TL;DR: This post navigates the intricate world of AI model upgrades, with a spotlight on Azure OpenAI’s embedding models like text-embedding-ada-002. We emphasize the critical importance of consistent model versioning in ensuring accuracy and validity in AI applications. The post also addresses the challenges and strategies essential for effectively managing model upgrades, with a focus on compatibility and performance testing.
Introduction
What are Embeddings?
Embeddings in machine learning are more than just data transformations. They are the cornerstone of how AI interprets the nuances of language, context, and semantics. By converting text into numerical vectors, embeddings allow AI models to measure similarities and differences in meaning, paving the way for advanced applications in various fields.
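To make this concrete, here is a minimal, illustrative Python sketch (not part of the original post) that generates embeddings with Azure OpenAI and compares texts by cosine similarity. It assumes the openai Python package (v1.x), numpy, and an Azure OpenAI deployment of text-embedding-ada-002; the endpoint, key, API version, and deployment name shown are placeholders.
import numpy as np
from openai import AzureOpenAI

# Placeholder endpoint, key, and API version for an Azure OpenAI resource (assumptions).
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2023-05-15",
)

def embed(text: str) -> np.ndarray:
    # Call the embeddings endpoint; "text-embedding-ada-002" is the assumed deployment name.
    response = client.embeddings.create(model="text-embedding-ada-002", input=[text])
    return np.array(response.data[0].embedding)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # 1.0 means the vectors point in the same direction; values near 0 mean little semantic overlap.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

v1 = embed("How do I reset my password?")
v2 = embed("I forgot my login credentials.")
v3 = embed("What is the weather like today?")
print(f"related pair:   {cosine_similarity(v1, v2):.3f}")   # expected to be relatively high
print(f"unrelated pair: {cosine_similarity(v1, v3):.3f}")   # expected to be lower
The same pattern underpins the use cases discussed later in this post; the important point is that both vectors in any comparison should come from the same model version.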
Importance of Embeddings
In the complex world of data science and machine learning, embeddings are crucial for handling intricate data types like natural language and images. They transform these data into structured, vectorized forms, making them more manageable for computational analysis. This transformation isn’t just about simplifying data; it’s about retaining and emphasizing the essential features and relationships in the original data, which are vital for precise analysis and decision-making.
Embeddings significantly enhance data processing efficiency. They allow algorithms to swiftly navigate through large datasets, identifying patterns and nuances that are difficult to detect in raw data. This is particularly transformative in natural language processing, where comprehending context, sentiment, and semantic meaning is complex. By streamlining these tasks, embeddings enable deeper, more sophisticated analysis, thus boosting the effectiveness of machine learning models.
Implications of Model Version Mismatches in Embeddings
Let’s discuss the potential impacts and challenges that arise when different versions of embedding models are used within the same domain, focusing specifically on Azure OpenAI embeddings. When embeddings generated by one version of a model are applied to or compared with data processed by a different version, various issues can arise. These issues are not only technical but also have practical implications for the efficiency, accuracy, and overall performance of AI-driven applications.
Compatibility and Consistency Issues
Vector Space Misalignment: Different versions of embedding models might organize their vector spaces differently. This misalignment can lead to inaccurate comparisons or analyses when embeddings from different model versions are used together.
Semantic Drift: Over time, models might be trained on new data or with updated techniques, causing shifts in how they interpret and represent language (semantic drift). This drift can cause inconsistencies when integrating new embeddings with those generated by older versions.
Impact on Performance
Reduced Accuracy: Inaccuracies in semantic understanding or context interpretation can occur when different model versions process the same text, leading to reduced accuracy in tasks like search, recommendation, or sentiment analysis.
Inefficiency in Data Processing: Mismatches in model versions can require additional computational resources to reconcile or adjust the differing embeddings, leading to inefficiencies in data processing and increased operational costs.
Best Practices for Upgrading Embedding Models
Upgrading Embedding – Overview
Now let’s move on to the process of upgrading an embedding model, focusing on the steps you should take before making a change, important questions to consider, and key areas for testing.
Pre-Upgrade Considerations
Assessing the Need for Upgrade:
Why is the upgrade necessary?
What specific improvements or new features does the new model version offer?
How will these changes impact the current system or process?
Understanding Model Changes:
What are the major differences between the current and new model versions?
How might these differences affect data processing and results?
Data Backup and Version Control:
Ensure that current data and model versions are backed up.
Implement version control to maintain a record of changes.
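As an illustration of the backup and version-control point above, the following Python sketch stores each embedding together with the model and API version that produced it, so vectors generated by an older version can later be found and regenerated. The file name, field names, and helper functions are invented here for illustration and are not part of any official tooling.
import json
import datetime

EMBEDDING_STORE = "embeddings_store.jsonl"  # hypothetical append-only store

def save_embedding(doc_id: str, vector: list, model: str, api_version: str) -> None:
    # Record the embedding alongside the model version that created it.
    record = {
        "doc_id": doc_id,
        "vector": vector,
        "model": model,              # e.g. "text-embedding-ada-002"
        "api_version": api_version,  # e.g. "2023-05-15"
        "created_utc": datetime.datetime.utcnow().isoformat(),
    }
    with open(EMBEDDING_STORE, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def find_stale(expected_model: str) -> list:
    # Return the doc_ids whose embeddings were produced by a different model version.
    stale = []
    with open(EMBEDDING_STORE, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record["model"] != expected_model:
                stale.append(record["doc_id"])
    return stale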
Questions to Ask Before Upgrading
Compatibility with Existing Systems:
Is the new model version compatible with existing data formats and infrastructure?
What adjustments, if any, will be needed to integrate the new model?
Cost-Benefit Analysis:
What are the anticipated costs (monetary, time, resources) of the upgrade?
How do these costs compare to the expected benefits?
Long-Term Support and Updates:
Does the new model version have a roadmap for future updates and support?
How will these future changes impact the system?
Key Areas for Testing
Performance Testing:
Test the new model version for performance improvements or regressions.
Compare accuracy, speed, and resource usage against the current version.
Compatibility Testing:
Ensure that the new model works seamlessly with existing data and systems.
Test for any integration issues or data format mismatches.
Fallback Strategies:
Develop and test fallback strategies in case the new model does not perform as expected.
Ensure the ability to revert to the previous model version if necessary.
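As one hypothetical example of a fallback strategy, the Python sketch below prefers a new embedding deployment and falls back to the previous one if the call fails. The deployment names are placeholders, and a real rollback plan would also cover re-indexing any vectors already produced by the new model.
NEW_DEPLOYMENT = "text-embedding-ada-002-v2"   # assumed name of the new deployment
OLD_DEPLOYMENT = "text-embedding-ada-002"      # assumed name of the previous deployment

def embed_with_fallback(client, text: str) -> list:
    # Try the new deployment first; on failure, retry against the previous deployment.
    last_error = None
    for deployment in (NEW_DEPLOYMENT, OLD_DEPLOYMENT):
        try:
            response = client.embeddings.create(model=deployment, input=[text])
            return response.data[0].embedding
        except Exception as err:  # in practice, catch the SDK's specific exception types
            last_error = err
            print(f"{deployment} failed ({err}); trying fallback")
    raise RuntimeError("All embedding deployments failed") from last_error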
Post-Upgrade Best Practices
Monitoring and Evaluation:
Continuously monitor the system’s performance post-upgrade.
Evaluate whether the upgrade meets the anticipated goals and objectives.
Feedback Loop:
Establish a feedback loop to collect user and system performance data.
Use this data to make informed decisions about future upgrades or changes.
Upgrading Embedding – Conclusion
Upgrading an embedding model involves careful consideration, planning, and testing. By following these guidelines, customers can ensure a smooth transition to the new model version, minimizing potential risks and maximizing the benefits of the upgrade.
Use Cases in Azure OpenAI and Beyond
Embeddings can significantly enhance the performance of various AI applications by enabling more efficient data handling and processing. Here’s a list of use cases where embeddings can be effectively utilized:
Enhanced Document Retrieval and Analysis: By first performing embeddings on paragraphs or sections of documents, you can store these vector representations in a vector database. This allows for rapid retrieval of semantically similar sections, streamlining the process of analyzing large volumes of text. When integrated with models like GPT, this method can reduce the computational load and improve the efficiency of generating relevant responses or insights.
Semantic Search in Large Datasets: Embeddings can transform vast datasets into searchable vector spaces. In applications like eCommerce or content platforms, this can significantly improve search functionality, allowing users to find products or content based not just on keywords, but on the underlying semantic meaning of their queries.
Recommendation Systems: In recommendation engines, embeddings can be used to understand user preferences and content characteristics. By embedding user profiles and product or content descriptions, systems can more accurately match users with recommendations that are relevant to their interests and past behavior.
Sentiment Analysis and Customer Feedback Interpretation: Embeddings can process customer reviews or feedback by capturing the sentiment and nuanced meanings within the text. This provides businesses with deeper insights into customer sentiment, enabling them to tailor their services or products more effectively.
Language Translation and Localization: Embeddings can enhance machine translation services by understanding the context and nuances of different languages. This is particularly useful in translating idiomatic expressions or culturally specific references, thereby improving the accuracy and relevancy of translations.
Automated Content Moderation: By using embeddings to understand the context and nuance of user-generated content, AI models can more effectively identify and filter out inappropriate or harmful content, maintaining a safe and positive environment on digital platforms.
Personalized Chatbots and Virtual Assistants: Embeddings can be used to improve the understanding of user queries by virtual assistants or chatbots, leading to more accurate and contextually appropriate responses and thus enhancing the user experience. Using similar logic, they can also help route natural language requests to specific APIs; see the CompactVectorSearch repository for an example.
Predictive Analytics in Healthcare: In healthcare data analysis, embeddings can help in interpreting patient data, medical notes, and research papers to predict trends, treatment outcomes, and patient needs more accurately.
In all these use cases, the key advantage of using embeddings is their ability to process and interpret large and complex datasets more efficiently. This not only improves the performance of AI applications but also reduces the computational resources required, especially for high-cost models like GPT. This approach can lead to significant improvements in both the effectiveness and efficiency of AI-driven systems.
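To make the document retrieval and semantic search use cases above concrete, here is a toy Python sketch that ranks stored paragraph embeddings against a query embedding by cosine similarity. The vectors are tiny made-up examples; in practice, the query and the corpus would be embedded with the same embedding model, and the same model version, end to end.
import numpy as np

def top_k(query_vec, corpus, k=2):
    # Score every stored paragraph against the query and return the k best matches.
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(cosine(query_vec, vec), text) for text, vec in corpus.items()]
    return sorted(scored, reverse=True)[:k]

corpus = {
    "Resetting a forgotten password": np.array([0.9, 0.1, 0.0]),
    "Updating billing information": np.array([0.1, 0.9, 0.1]),
    "Troubleshooting sign-in errors": np.array([0.8, 0.2, 0.1]),
}
query = np.array([0.85, 0.15, 0.05])  # stand-in for the embedding of "I can't log in"

for score, text in top_k(query, corpus):
    print(f"{score:.3f}  {text}")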
Specific Considerations for Azure OpenAI
Model Update Frequency: Understanding how frequently Azure OpenAI updates its models and the nature of these updates (e.g., major vs. minor changes) is crucial.
Backward Compatibility: Assessing whether newer versions of Azure OpenAI’s embedding models maintain backward compatibility with previous versions is key to managing version mismatches.
Version-Specific Features: Identifying features or improvements specific to certain versions of the model helps in understanding the potential impact of using mixed-version embeddings.
Strategies for Mitigation
Version Control in Data Storage: Implementing strict version control for stored embeddings ensures that data remains consistent and compatible with the model version used for its generation.
Compatibility Layers: Developing compatibility layers or conversion tools to adapt older embeddings to newer model formats can help mitigate the effects of version differences.
Baseline Tests: Create a few simple baseline tests that can identify any drift in the embeddings, as sketched below.
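A minimal sketch of such a baseline test follows; it assumes you keep a small set of reference sentence pairs plus the similarity scores the current model produced for them, and it re-scores the same pairs after any model or deployment change. The stand-in embedding function at the bottom exists only to keep the example self-contained; a real test would call your embedding deployment and load the baseline scores from storage.
import numpy as np

REFERENCE_PAIRS = [
    ("refund my order", "I want my money back"),
    ("reset password", "change my login credentials"),
    ("store opening hours", "what time do you open"),
]

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def score_pairs(embed_fn):
    # Similarity of each reference pair under a given embedding function.
    return [cosine(embed_fn(a), embed_fn(b)) for a, b in REFERENCE_PAIRS]

def check_drift(baseline_scores, embed_fn, tolerance=0.05):
    # Return the pairs whose similarity moved more than `tolerance` from the baseline.
    drifted = []
    for (a, b), old, new in zip(REFERENCE_PAIRS, baseline_scores, score_pairs(embed_fn)):
        if abs(old - new) > tolerance:
            drifted.append((a, b, old, new))
    return drifted

# Stand-in embedding function for demonstration only.
fake_embed = lambda text: np.array([len(text), text.count(" "), 1.0])
baseline = score_pairs(fake_embed)
print("drifted pairs:", check_drift(baseline, fake_embed))  # empty list: no drift against itself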
Azure OpenAI Model Versioning: Understanding the Process
Azure OpenAI provides a systematic approach to model versioning, applicable to models like text-embedding-ada-002:
Regular Model Releases:
New models are released periodically with improvements and new features.
Model version mismatches in embeddings, particularly in the context of Azure OpenAI, pose significant challenges that can impact the effectiveness of AI applications. Understanding these challenges and implementing strategies to mitigate their effects is crucial for maintaining the integrity and efficiency of AI-driven systems.
References
“Learn about Azure OpenAI Model Version Upgrades.” Microsoft Tech Community.
In recent scenarios encountered with our customers, we have come across a specific need: restricting certain users from using SQL Server Management Studio (SSMS) or other applications to connect to a designated database in Azure SQL Database. A common solution in traditional SQL Server environments, like the use of LOGIN TRIGGERS, is not available in Azure SQL Database. This limitation poses a unique challenge in database management and security.
To address this challenge, I’d like to share an alternative that combines the power of Extended Events in Azure SQL Database with PowerShell scripting. This method effectively captures and monitors login events, providing administrators with timely alerts whenever a specified user connects to the database using a prohibited application, such as SSMS.
How It Works
Extended Events Setup: We start by setting up a database-scoped Extended Events session in Azure SQL Database. The session captures batch activity together with the application name used for the connection, which serves as a practical stand-in for login tracking at the database scope. By filtering for certain applications (like SSMS), we can track unauthorized access attempts.
PowerShell Script: A PowerShell script is then employed to query these captured events at regular intervals. This script connects to the Azure SQL Database, retrieves the relevant event data, and checks for any instances where the specified users have connected via the restricted applications.
Email Alerts: Upon detecting such an event, the PowerShell script automatically sends an email notification to the database administrator. This alert contains details of the unauthorized login attempt, such as the timestamp, username, and application used. This prompt information allows the administrator to take immediate corrective measures.
Advantages
Proactive Monitoring: This approach provides continuous monitoring of the database connections, ensuring that any unauthorized access is quickly detected and reported.
Customizable: The method is highly customizable. Administrators can specify which applications to monitor and can easily adjust the script to cater to different user groups or connection parameters.
No Direct Blocking: While this method does not directly block the connection, it provides immediate alerts, enabling administrators to react swiftly to enforce compliance and security protocols.
This article provides a high-level overview of how to implement this solution. For detailed steps and script examples, administrators are encouraged to tailor the approach to their specific environment and requirements.
Extended Event
CREATE EVENT SESSION Track_SSMS_Logins
ON DATABASE
ADD EVENT sqlserver.sql_batch_starting(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.username, sqlserver.session_id)
WHERE (sqlserver.client_app_name LIKE '%Management Studio%')
)
ADD TARGET package0.ring_buffer
(SET max_events_limit = 1000, max_memory = 4096)
WITH (EVENT_RETENTION_MODE = NO_EVENT_LOSS, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION Track_SSMS_Logins ON DATABASE STATE = START;
Query to run using ring buffers
SELECT
n.value('(@timestamp)[1]', 'datetime2') AS TimeStamp,
n.value('(action[@name="client_app_name"]/value)[1]', 'varchar(max)') AS Application,
n.value('(action[@name="username"]/value)[1]', 'varchar(max)') AS Username,
n.value('(action[@name="client_hostname"]/value)[1]', 'varchar(max)') AS HostName,
n.value('(action[@name="session_id"]/value)[1]', 'int') AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = 'Track_SSMS_Logins')
AND target_name = 'ring_buffer') AS tab
CROSS APPLY event_data.nodes('/RingBufferTarget/event') AS q(n);
PowerShell Script
# Connection configuration
$Database = "DBName"
$Server = "Servername.database.windows.net"
$Username = "username"
$Password = "pwd!"
$emailFrom = "EmailFrom@ZYX.com"
$emailTo = "EmailTo@XYZ.com"
$smtpServer = "smtpservername"
$smtpUsername = "smtpusername"
$smtpPassword = "smtppassword"
$smtpPort=25
$ConnectionString = "Server=$Server;Database=$Database;User Id=$Username;Password=$Password;"
# Last check date
$LastCheckFile = "C:\temp\LastCheck.txt"
$LastCheck = Get-Content $LastCheckFile -ErrorAction SilentlyContinue
if (!$LastCheck) {
$LastCheck = [DateTime]::MinValue
}
# SQL query
$Query = @"
SELECT
n.value('(@timestamp)[1]', 'datetime2') AS TimeStamp,
n.value('(action[@name="client_app_name"]/value)[1]', 'varchar(max)') AS Application,
n.value('(action[@name="username"]/value)[1]', 'varchar(max)') AS Username,
n.value('(action[@name="client_hostname"]/value)[1]', 'varchar(max)') AS HostName,
n.value('(action[@name="session_id"]/value)[1]', 'int') AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = 'Track_SSMS_Logins')
AND target_name = 'ring_buffer') AS tab
CROSS APPLY event_data.nodes('/RingBufferTarget/event') AS q(n)
WHERE
n.value('(@timestamp)[1]', 'datetime2') > '$LastCheck'
"@
# Create and open SQL connection
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $ConnectionString
$SqlConnection.Open()
# Create SQL command
$SqlCommand = $SqlConnection.CreateCommand()
$SqlCommand.CommandText = $Query
# Execute SQL command
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $SqlCommand
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
# Process the results
$Results = $DataSet.Tables[0]
# Check for new events
if ($Results.Rows.Count -gt 0) {
# Prepare email content
$EmailBody = $Results | Out-String
$smtp = New-Object Net.Mail.SmtpClient($smtpServer, $smtpPort)
$smtp.EnableSsl = $true
$smtp.Credentials = New-Object System.Net.NetworkCredential($smtpUsername, $smtpPassword)
$mailMessage = New-Object Net.Mail.MailMessage($emailFrom, $emailTo)
$mailMessage.Subject = "Alert: SQL Access in database $Database"
# Include the captured events in the message body
$mailMessage.Body = "SQL Access Alert in database $Database on server $Server.`r`n`r`n$EmailBody"
$smtp.Send($mailMessage)
# Save the current timestamp (UTC, to match Extended Events timestamps) for the next check
(Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss") | Out-File $LastCheckFile
}
# Remember to schedule this script to run every 5 minutes using Windows Task Scheduler
Of course, using SQL Auditing or Log Analytics would be another alternative.
You’re in for a treat! The world of e-commerce has undergone a massive transformation over the past few years, and it’s all thanks to the revolutionary concept of composable commerce. This approach has taken the industry by storm, and it’s not hard to see why: composable commerce is a versatile, scalable, and innovative approach that allows businesses of all sizes to provide exceptional customer experiences across various platforms and devices.
In this article, we’ll take a closer look at the intricacies of composable commerce, exploring its core benefits and examining how it’s changing the game for the e-commerce industry. Get ready to be blown away by the possibilities of composable commerce!
Image: Multiple Ecommerce Channels
Many organizations have started adopting Dynamics 365 Commerce, a composable commerce engine that enables customers to unify back-office, in-store, and e-commerce channels while also serving as the single integration point for third-party channel solutions. This gives customers the key advantage of using a variety of best-of-breed commerce solutions to engage their customers and deliver goods and services.
What is Composable Commerce:
Composable commerce is a contemporary approach to e-commerce that separates the front-end (presentation layer) and back-end (commerce logic) of an e-commerce platform. Unlike traditional e-commerce systems, where changes to one component can affect the other, composable commerce decouples these two layers, enabling independent development and greater flexibility. This separation allows for greater agility, faster innovation, and the ability to adapt quickly to changing market demands.
Image: Composable Commerce Diagram
In contrast, traditional e-commerce systems often have monolithic front and back ends, leading to certain limitations. Modifying the underlying codebase to change the front-end design or user experience can be complex and time-consuming. Additionally, traditional systems are not easily scalable across different devices or channels. Composable commerce addresses these challenges by allowing businesses to easily update their website’s design or incorporate new features without disrupting the core e-commerce functionality.
What options do companies have:
Businesses have two powerful options to customize their e-commerce experiences: headless commerce and composable commerce. Headless commerce allows companies to develop and update front-end and back-end components independently, enabling quick adaptation to market changes and experimentation with innovative features. Composable commerce takes flexibility and customization to the next level by enabling businesses to select modular components from different vendors, providing the ultimate flexibility to create an e-commerce ecosystem that is tailored to their unique needs.
Benefits of Composable Commerce:
To start with, the digital environment is continuously evolving, so with a decoupled architecture businesses can quickly adapt to customers’ changing preferences. Separating the front end from the back end ensures that branding, user experience, and functionality stay consistent across various channels. A cohesive experience across web, mobile, social media, voice assistants, and emerging technologies such as artificial intelligence (AI), virtual reality (VR), augmented reality (AR), and voice commerce leads to higher customer satisfaction, engagement, and loyalty.
Image: With composable commerce, businesses can provide a cohesive experience to customers across various channels
In addition, scalability and performance are also greatly enhanced because businesses can independently scale each layer, resulting in better resource allocation. Websites can now handle increased traffic, sales volume, and complex operations, leading to faster page load times and a better user experience.
End-user Benefits:
Whether customers interact with your brand through a website, mobile app, voice assistant, marketplace, or social media platform, composable commerce ensures a seamless and tailored experience. In addition, faster loading times and improved website performance reduce the long wait for the entire page to load, resulting in a smoother and more responsive user interface. More importantly, customers can browse via desktop, smartphone, or tablet, or use voice assistants, and access your products and services seamlessly. This omni-channel capability enhances convenience and accessibility for customers, meeting their expectations for a seamless cross-channel experience. Dynamics 365 Commerce enables businesses to build this experience.
Empowering Vision: ABB Optical Group’s Intelligent Contact Lens Ordering Platform with Microsoft Dynamics 365
Embarking on a technological evolution, ABB Optical Group introduces its Intelligent Contact Lens Ordering Platform, a game-changer crafted in collaboration with Visionet Systems Inc. and Microsoft. This innovation involved the implementation of Microsoft Dynamics 365 Finance and Operations, Azure Cloud, and Data Lake, providing a solid technological foundation. ABB Optical aimed to transcend its legacy Patient Ordering Platform, yourlens.com, seeking a modern, intelligent, and scalable user experience. This vision materialized through the development of a robust Minimum Viable Product (MVP), introducing a microservices headless experience and harnessing the capabilities of Microsoft D365 Retail and HQ APIs, alongside Proof of Concepts.
The outcome was nothing short of transformative. The MVP’s successful pilot garnered positive feedback, propelling the rapid development of additional customer-demanded features. In just six months, Visionet spearheaded the launch of phase two of the Abby Platform, seamlessly integrating a data analytics component through Data Lake with Dynamics 365 F&O and Power BI. ABB Optical Group now stands at the forefront of innovation, offering eyecare providers and patients an intelligent, forward-thinking ordering system.
Conclusion:
In conclusion, the emergence of composable commerce signifies a pivotal shift in the digital marketplace. This approach, distinguished by its modular structure, cloud-native integration, and technology-independent capabilities, provides businesses with unparalleled flexibility and adaptability. It enables businesses to customize their digital experiences, integrate seamlessly with best of breed solution providers for individual capabilities, and respond swiftly to market changes and complexities.
Learn more
Dynamics 365 Commerce delivers a comprehensive, yet composable, set of capabilities for both consumer and business-facing organizations seeking to expand beyond traditional digital commerce limitations and improve customer engagement, build brand awareness, streamline purchasing, and deliver exceptional customer experiences.
We’re looking for Microsoft Fabric Analytics Engineers to take our new beta exam. Do you have subject matter expertise in designing, creating, and deploying enterprise-scale data analytics solutions? If so, and if you know how to transform data into reusable analytics assets by using Microsoft Fabric components, such as lakehouses, data warehouses, notebooks, dataflows, data pipelines, semantic models, and reports, be sure to check out this exam. Other helpful qualifications include the ability to implement analytics best practices in Fabric, including version control and deployment.
This certification could be a great fit if you have in-depth familiarity with the Fabric solution and you have experience with data modeling, data transformation, Git-based source control, exploratory analytics, and languages, including Structured Query Language (SQL), Data Analysis Expressions (DAX), and PySpark.
Review the Exam DP-600 (beta) page for details, and check out the self-paced learning paths and instructor-led training there. The Exam DP-600 study guide alerts you to the key topics covered on the exam.
Ready to prove your skills?
Take advantage of the discounted beta exam offer. The first 300 people who take Exam DP-600 (beta) on or before January 25, 2024, can get 80 percent off market price.
To receive the discount, when you register for the exam and are prompted for payment, use code DP600Winfield. This is not a private access code. The seats are offered on a first-come, first-served basis. As noted, you must take the exam on or before January 25, 2024. Please note that this beta exam is not available in Turkey, Pakistan, India, or China.
The rescore process starts on the day an exam goes live—8 to 12 weeks after the beta period, and final scores for beta exams are released approximately 10 days after that. For details on the timing of beta exam rescoring and results, read my post Creating high-quality exams: The path from beta to live.
Get ready to take Exam DP-600 (beta)
Explore the Fabric Career Hub. Access live training, skills challenges, group learning, and career insights.
Join the Fabric Cloud Skills Challenge. Complete all modules in the challenge within 30 days and become eligible for 50% off the cost of a Microsoft Certification exam. This 50% discount can’t be used toward the Exam DP-600 (beta). If you miss the beta period, you can use it later once the exam goes live or for another live certification exam.
Did you know that you can take any role-based exam online? Online delivered exams—taken from your home or office—can be less hassle, less stress, and even less worry than traveling to a test center, especially if you’re adequately prepared for what to expect. To find out more, check out my blog post Online proctored exams: What to expect and how to prepare.
Ready to get started?
Remember, the number of spots for the discounted beta exam offer is limited to the first 300 candidates taking Exam DP-600 (beta) on or before January 25, 2024.
As we ring in the start of 2024, we’re gearing up to showcase a host of new innovations across Microsoft Teams at the annual National Retail Federation (NRF) conference, taking place January 14th – January 16th in New York City.
We’re announcing new solutions designed to enable store teams to efficiently meet customers’ expectations and improve the retail experience in this new era of AI.
Keep reading below for the latest product and feature capabilities coming to Teams to help simplify operations and enable first-class retail experiences for all retail workers – including the frontline.
Enhanced Store Team Communication and Collaboration
Route announcements to frontline teams by location, department, and role
Target important announcements to the right frontline employees based on location, department, and job role information. Targeted announcements will surface on the Teams home experience so your frontline employees will never miss an important communication. This feature will be generally available in March 2024. Learn more.
Boost frontline teamwork with auto-generated role and department tagging
Reach the right person at the right time with automatic tags for your frontline teams. Tags for department and job roles can be configured and created automatically for your frontline workers in the Teams Admin Center. Frontline employees can leverage these automatic tags in their frontline teams to connect with the right person every time. This feature will be in public preview in February 2024. Learn more.
Bring answers to communities for easier information sharing
In Viva Engage in Teams, answers from Q&A conversations will now be available in communities, better enabling frontline workers to easily source needed information. This feature will be generally available January 2024.
Monitor how employee engagement drives business performance
Also coming to Viva Engage in Teams, network analytics will bring AI-powered theme extraction and employee retention metrics to users to help enhance insights into workforce dynamics and help drive informed decision making. This feature will be generally available in February 2024. Learn more.
Automatically hear push-to-talk transmissions from multiple channels
Frontline workers using Walkie Talkie in Teams now have the option to automatically hear incoming transmissions from any of their pinned favorite Teams channels. With this new feature, users can stay better connected to multiple channels without needing to switch channels manually. This feature will be generally available by end of month. Learn more on how to get started.
Use any generic wired (USB-C and 3.5mm) headset for instant team communication on Android
Frontline workers often need to instantly communicate with each other even when their phones are locked. We integrated Walkie Talkie in Teams with audio accessories partners to make this experience possible with the dedicated push-to-talk (PTT) button on headsets, which instantly brings up walkie talkie for clear and secure voice communication. In addition to select specialized headsets, we are excited to announce that Walkie Talkie in Teams will now work with any generic wired (USB-C and 3.5mm) headsets on Android.
As long as the generic headset has a play/pause or accept/decline-call control, frontline workers can tap that button to start and stop transmissions on walkie talkie. Frontline organizations will be able to easily start using walkie talkie with these lower-cost generic headsets. This feature will be generally available starting February 2024. Learn more.
Streamline Retail Store Operations
Allow frontline teams to set their shift availability for specific dates
Frontline workers will now have the flexibility to set their availability preferences on specific dates, enhancing their ability to manage unique scheduling needs. This added feature complements existing options for recurring weekly availability. This feature is available in January 2024. To learn more about recent enhancements to Shifts in Teams, read the latest blog – Discover the latest enhancements in Microsoft Shifts.
Easily deploy shifts at scale for your frontline
Teams admins can now standardize Shifts settings across all frontline teams and manage them centrally by deploying Shifts to frontline teams at scale in the Teams admin center. You can select which capabilities to turn on or off, such as showing open shifts, swap shift requests, offer shift requests, time off requests, and time clock.
Admins can also identify schedule owners and create scheduling groups uniformly for all frontline teams at the tenant level and create schedule groups and time-off reasons that will be set uniformly across all frontline teams. Your frontline managers are able to start using Shifts straight out-of-the-box with minimal setup required. This feature is currently in public preview and will be generally available in March 2024. Learn more.
Streamline Teams deployment for your frontline and manage at scale
Whether due to seasonality or the natural turnover seen on the frontline in retail, simplifying user membership is key to easing management needs. Now generally available, Microsoft has added new capabilities in the Teams Admin Center to deploy frontline dynamic teams at scale for your entire frontline workforce. Through the power of dynamic teams, team membership is automatically managed and always up to date with the right users as people enter, move within, or leave the organization using dynamic groups from Entra ID.
This deployment tool streamlines the admin experience to create a Teams structure that maps the frontline workforce’s real-world organization into the digital world and makes it easy to set up a consistent channel structure to optimize for strong frontline collaboration on day one. Available in February, customers can use custom user attributes in Entra ID to define frontline and location attributes, with additional enhancements that make it easier to assign team owners by adding a people picker to the setup wizard.
Map your operational hierarchy to frontline teams
Admins will be able to set up their frontline operational hierarchy to map their organization’s structure of frontline locations and teams to a hierarchy in the Teams Admin Center. Admins can also define attributes for their teams that range from department information to brand information. The operational hierarchy coupled with this added metadata will enable frontline apps and experiences in the future like task publishing. This feature will be in public preview in January 2024. Learn more.
Leverage generative AI to streamline in-store shift management
Store managers can also identify items such as open shifts, time off, and existing shifts with a new Shifts plug-in for Microsoft 365 Copilot. Microsoft 365 Copilot can now ground prompts and retrieve insights for frontline managers leveraging data from the Shifts app in addition to user and company data it has access to such as Teams chat history, SharePoint, emails, and more.
Automate and simplify corporate to store task publishing
With task publishing, you can now create a list of tasks and schedule them to be automatically published to your frontline teams on a regular cadence, such as every month on the 15th. Once you publish a list, the task publishing feature will handle the scheduling and ensure that the list is published at the desired cadence. This feature is useful for tasks that need to be done regularly, such as store opening and closing processes or conducting periodic inspections and compliance checks. This feature will be generally available in March 2024.
Publish a task that everyone in the team must complete
This new capability provides the option to create a task that every member of the recipient team must complete. Organizations can assign tasks like complete training or review a new policy to all or a specific set of frontline workers. The task will be created for each worker at the designated location. This feature will become generally available in March 2024.
Require additional completion requirements for submitting tasks
When you create a task within the task publishing feature, you have the option to request a form and/or photo completion. When you publish that task, each recipient team will be unable to mark the task complete until the form is submitted by a member of the team. This ensures that the task is completed properly by each team member.
Additionally, with approval completion requirements, organizations can hold frontline managers and their teams accountable for verifying the work was done to standard before reflecting that work as completed. This allows an organization to increase attention to detail and accountability for important tasks. These features will become generally available in March 2024.
Secure and Manage your Business
Simplify authentication with domain-less sign-in
Since a single device is often shared among multiple frontline workers, they need to sign in and out multiple times a day throughout a shift or across shifts. Typing out long usernames with a domain is prone to mistakes and can be time consuming. With domain-less sign-in, frontline workers can now sign in to Teams more quickly using only the first part of their username (i.e., without the domain) and then enter the password to access Teams on shared and corporate-managed devices. For example, if the username is 123456@microsoft.com or alland@microsoft.com, users can now sign in with only “123456” or “alland”, respectively.
We’re excited to share more updates and new features throughout the calendar year. To learn more about how Microsoft Teams empowers frontline workers, please visit our webpage to learn how.
In an industry defined by both growth and disruption, retailers are depending on technology to navigate challenges ranging from shifting purchase habits to supply chain complexities. Next week, at the National Retail Federation (NRF) Big Show, Microsoft will demonstrate Dynamics 365 solutions powered by AI to help accelerate retail agility and innovation in the next decade.
Gain valuable AI insights for your business
Learn more at the National Retail Federation Big Show
Microsoft Dynamics 365 Customer Insights, providing retailers with AI-powered experiences to transform daily marketing workflows.
Microsoft Dynamics 365 Supply Chain Management, providing AI-powered guidance for demand planning, streamlining procurement, and enhancing supply chain visibility.
NRF attendees can learn more about the transformative power of AI across the retail industry by attending two Big Ideas Sessions hosted by Shelley Bransten, Corporate Vice President, Global Retail, Consumer Goods, and Gaming Industries, and Kathleen Mitford, Corporate Vice President, Global Industry Marketing.
Helping retailers personalize the shopping experience
Retailers often tell us that they’re under pressure to get marketing and customer experience projects and campaigns to market faster and are asked to do more with less. Yet, the processes and tools they use haven’t evolved to meet this demand.
Deploying a project to market requires various roles or specialists, costly third-party agencies, and siloed applications to review data and create content. Monitoring results for optimization also becomes a timely and tedious task, having to track down the right people with the right application and the right data. These challenges not only hinder a campaign’s time to market and employee productivity, but can also result in a disjointed customer experience.
It’s not just our customers who are feeling the burden of these challenges. The market is feeling it too. For instance, 63% of surveyed retailers said they hope they can improve their marketing with AI in the next 18 to 24 months.1 In the age of AI, shouldn’t it be easier to get your campaigns to market?
We are announcing new Copilot features in Dynamics 365 Customer Insights that will transform how marketers manage and maintain projects and campaigns, increasing productivity, efficiency, and speed to market. These new capabilities build on Copilot features introduced in the past year, including, but not limited to, the ability to generate content ideas, query customer data using natural language, and create customer segments and journeys using next-generation AI.
Marketers can kick-start their marketing project by writing their campaign objective in natural language, or by uploading an existing creative brief. The project board is then generated using the prompt or brief, connected organizational data, and previous campaigns in Customer Insights. The project board streamlines and connects all workflows into one place for building and managing marketing assets.
“These new copilot capabilities in Dynamics 365 Customer Insights will enable us to focus our time and energy in the right places—better informing us on optimization priorities without the need to dig into details manually. That alone saves so much time.”
—Hannah Harper, Leatherman, Digital Marketing Manager
From the project board, marketers can view the campaign’s target audience and segments, as well as recommendations from Copilot for additional segments that may not have been previously considered. Selecting a suggested audience segment automatically generates a complementary customer journey, saving marketers time while also helping them deliver a personalized customer experience.
End-to-end customer journeys containing personalized touchpoints, such as promotional emails or event invitations, are generated using Copilot. Through our partnership with Typeface and its enterprise-grade generative AI capabilities, marketers can produce brand-authentic images across assets, supercharging personalized content for greater impact—all from within Dynamics 365 Customer Insights. Additionally, Typeface helps align content to the organization’s brand guidelines, including themes, fonts, and product images—extracted from a central asset library.
“Every aspect of the enterprise is already being redefined with generative AI, from developer to product to sales experiences. By combining Dynamics 365 Customer Insights with Typeface’s powerful storytelling engine, we’re fundamentally reshaping campaign workflows with generative AI by starting with just a goal. This means personalizing content at an unprecedented scale, bridging the gap between content and data, and ushering in a new era of marketing creativity and productivity.”
—Abhay Parasnis, Founder and CEO of Typeface
These Copilot capabilities will be available in preview in the first quarter of 2024, with general availability by the third quarter of 2024. Existing Customer Insights customers can sign up now for the early access public preview program here.
This is just the beginning; we will be delivering further content curation, journey testing, and metrics monitoring to optimize campaigns. Our vision is that, together, this new AI-first experience will transform how marketers work by reducing the complexities of end-to-end campaign management and enhancing marketer productivity and ROI.
Build a real-time retail supply chain
In 2024, retail supply chains face countless challenges, from labor shortages and increasing costs to complexities across omnichannel retail experiences. Enterprise AI solutions, now readily available for retailers, can power greater efficiency, productivity, and innovation across the supply chain.
At Microsoft, we aim to deliver new supply chain innovations powered by Copilot to our customers through our open, flexible, and collaborative Microsoft platform, helping organizations reduce risk, manage inventory, plan with flexibility, and make quick decisions across the whole supply chain.
New copilot capabilities to improve demand planning
A retailer’s success hinges on having the right inventory at the right place at the right time, and that starts with successful demand planning. We recently announced new demand planning capabilities in Dynamics 365 in November 2023 that use AI, machine learning, and external signals to predict demand accurately, and now we are enhancing them with Copilot. This will help planners understand how a forecast was generated and help them find patterns and anomalies.
Copilot will also help them make sense of complex relationships across datasets using natural language interactions, and it will also assist with the routine tasks of making demand review reports, saving the planners time to focus on high-priority activities.
Some of our customers, including Domino’s Pizza UK & Ireland, can use the new demand planning capabilities to make smart predictions from the data and insights.
“The demand planning capabilities in Dynamics 365 are helping us make the right decisions to lower wastage, avoid unnecessary deliveries, and be cybersafe.”
—Neha Batra, Head of Business Solutions, Domino’s Pizza UK & Ireland
The new demand planning capabilities create a more flexible, simplified, and intuitive user experience. Planners have an increased level of trust and can rely more on the forecast, knowing how it’s generated. The latest demand planning capabilities help reduce excess inventory and increase working capital for retailers.
New Copilot capabilities to improve productivity and proactively mitigate disruptions
In November 2023, we also announced new Copilot capabilities in preview for Dynamics 365 that enable supply chain teams to take actions based on insights with conversational help while in the flow of work. This helps increase productivity and improve collaboration among employees across the supply chain and other cross-functional teams, so they can proactively mitigate disruptions and further automate their workflows. See the capabilities in action.
We also added new Copilot capabilities that will enhance inventory visibility and enable businesses to promise orders with improved accuracy, significantly helping brands elevate their consumers’ buying experience.
In addition, a new Copilot capability that helps streamline procurement is now generally available. Procurement teams can seamlessly handle purchase order changes in a scalable and efficient manner and assess the downstream impact of changes on production and distribution before making the right decision.
Generate product enrichment content for e-commerce sites with Copilot
Informative, story-rich product content can drive customer engagement and sales on e-commerce sites. Creating that content, however, can be time-consuming and challenging. In October 2023, we launched in preview the ability for business-to-business and business-to-consumer online retailers to use Copilot in Dynamics 365 Commerce to generate enriched product marketing content for their websites. This helps to decrease the time it takes to create compelling marketing content, while increasing productivity and increasing the overall number of online orders.
Visit our Microsoft booth at NRF this year to see these innovations in action.
Sunday, January 14, 2024 | 1:00 – 1:30 PM Eastern Standard Time (EST)
Join this interactive session to hear about one retailer’s AI journey to date. Hosted by Microsoft’s Corporate Vice President, Retail, Consumer Goods & Gaming Industries, Shelley Bransten, you’ll also learn about new AI-focused findings from Futurum Research and all new AI capabilities in Microsoft Cloud for Retail that will help power your AI transformation.
Generative AI and large language models have captured the attention of executives across industries. While the technology’s use cases seem endless, smart retailers and brands must identify and prioritize the applications of generative AI that will be most valuable to their organization and partner with organizations who will treat their data with the highest privacy standards. Join us to hear how Microsoft is helping organizations large and small maximize their generative AI opportunities safely and responsibly.
Retailers are swimming in data all day, every day. Even with sophisticated legacy technologies and cutting-edge data science, the majority of that data goes uncollected. Insights stay hidden—often in plain sight. But that’s starting to change. AI tools are enabling retailers to understand their customers, merchandising, supply chains, operations, and workforces better than ever before. Join us to hear about the myriad insights that retailers are drawing from newfound and increasingly precise data sources to run leaner, smarter stores.