Announcing automatic Copilot enablement in Customer Service


Starting January 19, 2024, Microsoft Copilot in Dynamics 365 Customer Service will be automatically installed and enabled in your Dynamics 365 Customer Service environment. This update installs the case summarization and conversation summarization features. These features are available to all users with a Dynamics 365 Customer Service Enterprise license; a Digital Messaging or Voice add-on license is required for conversation summarization.

If your organization has already enabled Copilot in Customer Service, there will be no change to your environment.

Key dates

  • Disclosure date: December 2023
    Administrators received a notification about the change in the Microsoft 365 admin center and Power Platform admin center.
  • Installation date: January 19 – February 2, 2024
    Copilot in Customer Service is installed and enabled by default.

Please note that specific dates for messages and auto-installation will vary based on the geography of your organization. The date applicable to your organization appears in the messages in the Microsoft 365 admin center and Power Platform admin center. Copilot auto-installation will occur only if your organization is in a geography where all Copilot data handling occurs “in geo.” These regions are currently Australia, the United Kingdom, and the United States. Organizations where Copilot data handling does not occur “in geo” must opt in to cross-geo data transmission to receive these capabilities.

What is Copilot in Dynamics 365 Customer Service?

Copilot in Customer Service is a key part of the Dynamics 365 Customer Service experience. Copilot provides real-time, AI-powered assistance to help customer support agents solve issues faster. By relieving them from mundane tasks such as searching and note-taking, Copilot gives them time for more high-value interactions with customers. Contact center managers can also use Copilot analytics to view Copilot usage and better understand how it impacts the business.

Why is Microsoft deploying this update?

We believe this update presents a significant opportunity to fundamentally alter the way your organization approaches service by quickly improving and enhancing the agent experience. The feedback we have received from customers who are already using Copilot has been overwhelmingly positive. Generative AI-based service capabilities have a profound impact on efficiency and customer experience, leading to improved customer satisfaction. This update applies only to the Copilot summarization capabilities, which integrate with service workflows and require minimal change management.

Learn more about Copilot in Dynamics 365 Customer Service

For more information, read the documentation: Enable copilots and generative AI features – Power Platform | Microsoft Learn


The Publisher failed to allocate a new set of identity ranges for the subscription


Problem:


===========


Assume that you have tables with identity columns declared as INT and you are using automatic identity range management for those articles in a merge publication.

This publication has one or more Subscribers, and you tried to re-initialize one Subscriber using a new snapshot.

The Merge Agent fails with this error:


>>
Source: Merge Replication Provider
Number: -2147199417
Message: The Publisher failed to allocate a new set of identity ranges for the subscription. This can occur when a Publisher or a republishing Subscriber has run out of identity ranges to allocate to its own Subscribers or when an identity column data type does not support an additional identity range allocation. If a republishing Subscriber has run out of identity ranges, synchronize the republishing Subscriber to obtain more identity ranges before restarting the synchronization. If a Publisher runs out of identit


 


 


Cause:


============


The identity range that the Merge Agent is trying to allocate exceeds the maximum value that the INT data type can hold.


 


Resolution


=================


Assume that the publisher database has only one merge publication with two Subscribers, and your merge articles have this definition:


>>>
exec sp_addmergearticle @publication = N'MergeRepl_ReproDB', @article = N'tblCity', @source_owner = N'dbo', @source_object = N'tblCity', @type = N'table', @description = N'', @creation_script = N'', @pre_creation_cmd = N'drop', @schema_option = 0x000000004C034FD1, @identityrangemanagementoption = N'auto', @pub_identity_range = 1000, @identity_range = 1000, @threshold = 90, @destination_owner = N'dbo', @force_reinit_subscription = 1, @column_tracking = N'false', @subset_filterclause = N'', @vertical_partition = N'false', @verify_resolver_signature = 1, @allow_interactive_resolver = N'false', @fast_multicol_updateproc = N'true', @check_permissions = 0, @subscriber_upload_options = 0, @delete_tracking = N'true', @compensate_for_errors = N'false', @stream_blob_columns = N'false', @partition_options = 0

exec sp_addmergearticle @publication = N'MergeRepl_ReproDB', @article = N'tblCity1', @source_owner = N'dbo', @source_object = N'tblCity1', @type = N'table', @description = N'', @creation_script = N'', @pre_creation_cmd = N'drop', @schema_option = 0x000000004C034FD1, @identityrangemanagementoption = N'auto', @pub_identity_range = 1000, @identity_range = 1000, @threshold = 90, @destination_owner = N'dbo', @force_reinit_subscription = 1, @column_tracking = N'false', @subset_filterclause = N'', @vertical_partition = N'false', @verify_resolver_signature = 1, @allow_interactive_resolver = N'false', @fast_multicol_updateproc = N'true', @check_permissions = 0, @subscriber_upload_options = 0, @delete_tracking = N'true', @compensate_for_errors = N'false', @stream_blob_columns = N'false', @partition_options = 0


 


 


You can run the following query against the published database to see which article ranges are full or have very few values left:


>>>
select a.name,
       max_used = max_used,
       diff_pub_range_end_max_used = range_end - max_used, -- this tells how many values are left
       pub_range_begin = range_begin,
       pub_range_end = range_end
from dbo.MSmerge_identity_range b,
     sysmergearticles a
where a.artid = b.artid
      and is_pub_range = 1
order by max_used desc


 


 


name           max_used      diff_pub_range_end_max_used   pub_range_begin   pub_range_end
-------------- ------------- ----------------------------- ----------------- -------------
tblCity        2147483647    0                             2147477647        2147483647
tblCity1       6001          2147477646                    1                 2147483647


 


 


 


As you can see above, the diff_pub_range_end_max_used column is zero for tblCity.

When the Merge Agent runs, it has to allocate two ranges for each server involved. In the example above we have a Publisher and two Subscribers, and @identity_range is 1000, so ranges must be allocated for 3 servers, i.e., 3 * (2 * 1000) = 6000 values. diff_pub_range_end_max_used must therefore be greater than 6000 before a new range can be allocated for all the servers.

To resolve the issue:


 



  1. Remove the tblCity table from the publication.

  2. Change the data type of the identity column from int to bigint and add the table back to the publication.

  3. Generate a new snapshot. It will generate snapshots for all articles, but only this one table will be re-added to the existing Subscribers. A hedged T-SQL sketch of these steps follows below.
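
The exact commands depend on your schema, but a minimal T-SQL sketch of these steps, using the publication and article names from the definition above and a placeholder identity column name (CityID), might look like the following. Constraints or indexes that reference the identity column may need to be dropped and recreated around the ALTER COLUMN step.

-- 1. Remove tblCity from the publication (force snapshot invalidation because subscriptions exist)
exec sp_dropmergearticle @publication = N'MergeRepl_ReproDB', @article = N'tblCity', @force_invalidate_snapshot = 1

-- 2. Widen the identity column from INT to BIGINT (CityID is a placeholder column name)
alter table dbo.tblCity alter column CityID bigint not null

-- 3. Add the table back to the publication (options abbreviated from the definition shown above)
exec sp_addmergearticle @publication = N'MergeRepl_ReproDB', @article = N'tblCity', @source_owner = N'dbo', @source_object = N'tblCity', @type = N'table', @identityrangemanagementoption = N'auto', @pub_identity_range = 1000, @identity_range = 1000, @threshold = 90, @force_reinit_subscription = 1, @force_invalidate_snapshot = 1

-- 4. Generate a new snapshot for the publication
exec sp_startpublication_snapshot @publication = N'MergeRepl_ReproDB'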

Future-Proofing AI: Strategies for Effective Model Upgrades in Azure OpenAI


TL;DR: This post navigates the intricate world of AI model upgrades, with a spotlight on Azure OpenAI’s embedding models like text-embedding-ada-002. We emphasize the critical importance of consistent model versioning in ensuring accuracy and validity in AI applications. The post also addresses the challenges and strategies essential for effectively managing model upgrades, focusing on compatibility and performance testing.


 


Introduction


What are Embeddings?


 


Embeddings in machine learning are more than just data transformations. They are the cornerstone of how AI interprets the nuances of language, context, and semantics. By converting text into numerical vectors, embeddings allow AI models to measure similarities and differences in meaning, paving the way for advanced applications in various fields.


 


Importance of Embeddings


 


In the complex world of data science and machine learning, embeddings are crucial for handling intricate data types like natural language and images. They transform these data into structured, vectorized forms, making them more manageable for computational analysis. This transformation isn’t just about simplifying data; it’s about retaining and emphasizing the essential features and relationships in the original data, which are vital for precise analysis and decision-making.


Embeddings significantly enhance data processing efficiency. They allow algorithms to swiftly navigate through large datasets, identifying patterns and nuances that are difficult to detect in raw data. This is particularly transformative in natural language processing, where comprehending context, sentiment, and semantic meaning is complex. By streamlining these tasks, embeddings enable deeper, more sophisticated analysis, thus boosting the effectiveness of machine learning models.
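
To make this concrete, here is a minimal illustrative sketch, assuming the openai Python SDK v1.x, an Azure OpenAI resource configured through the AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY environment variables, and an embedding deployment named text-embedding-ada-002; adjust these to your environment.

import os
import math
from openai import AzureOpenAI

# Client configuration; the endpoint, key, and API version are placeholders.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

def embed(text, deployment="text-embedding-ada-002"):
    # Turn a piece of text into its numerical vector representation.
    response = client.embeddings.create(model=deployment, input=text)
    return response.data[0].embedding

def cosine_similarity(a, b):
    # 1.0 means the two vectors point in the same direction (semantically very close).
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

v1 = embed("Reset a customer's password")
v2 = embed("How do I change my login credentials?")
print(f"similarity: {cosine_similarity(v1, v2):.4f}")

Values near 1.0 indicate that the two texts sit close together in the model’s vector space, which is the property the applications discussed below rely on.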


 


Implications of Model Version Mismatches in Embeddings


 


Let’s discuss the potential impacts and challenges that arise when different versions of embedding models are used within the same domain, specifically focusing on Azure OpenAI embeddings. When embeddings generated by one version of a model are applied or compared with data processed by a different version, various issues can arise. These issues are not only technical but also have practical implications for the efficiency, accuracy, and overall performance of AI-driven applications.


 


Compatibility and Consistency Issues



  • Vector Space Misalignment: Different versions of embedding models might organize their vector spaces differently. This misalignment can lead to inaccurate comparisons or analyses when embeddings from different model versions are used together.

  • Semantic Drift: Over time, models might be trained on new data or with updated techniques, causing shifts in how they interpret and represent language (semantic drift). This drift can cause inconsistencies when integrating new embeddings with those generated by older versions.


 


Impact on Performance



  • Reduced Accuracy: Inaccuracies in semantic understanding or context interpretation can occur when different model versions process the same text, leading to reduced accuracy in tasks like search, recommendation, or sentiment analysis.

  • Inefficiency in Data Processing: Mismatches in model versions can require additional computational resources to reconcile or adjust the differing embeddings, leading to inefficiencies in data processing and increased operational costs.


 


Best Practices for Upgrading Embedding Models


 


Upgrading Embedding – Overview


 


Now let’s move to the process of upgrading an embedding model, focusing on the steps you should take before making a change, important questions to consider, and key areas for testing.


Pre-Upgrade Considerations




  • Assessing the Need for Upgrade:



    • Why is the upgrade necessary?

    • What specific improvements or new features does the new model version offer?

    • How will these changes impact the current system or process?




  • Understanding Model Changes:



    • What are the major differences between the current and new model versions?

    • How might these differences affect data processing and results?




  • Data Backup and Version Control:



    • Ensure that current data and model versions are backed up.

    • Implement version control to maintain a record of changes.




Questions to Ask Before Upgrading




  • Compatibility with Existing Systems:



    • Is the new model version compatible with existing data formats and infrastructure?

    • What adjustments, if any, will be needed to integrate the new model?




  • Cost-Benefit Analysis:



    • What are the anticipated costs (monetary, time, resources) of the upgrade?

    • How do these costs compare to the expected benefits?




  • Long-Term Support and Updates:



    • Does the new model version have a roadmap for future updates and support?

    • How will these future changes impact the system?




Key Areas for Testing




  • Performance Testing:



    • Test the new model version for performance improvements or regressions.

    • Compare accuracy, speed, and resource usage against the current version.




  • Compatibility Testing:



    • Ensure that the new model works seamlessly with existing data and systems.

    • Test for any integration issues or data format mismatches.




  • Fallback Strategies:



    • Develop and test fallback strategies in case the new model does not perform as expected.

    • Ensure the ability to revert to the previous model version if necessary.




Post-Upgrade Best Practices




  • Monitoring and Evaluation:



    • Continuously monitor the system’s performance post-upgrade.

    • Evaluate whether the upgrade meets the anticipated goals and objectives.




  • Feedback Loop:



    • Establish a feedback loop to collect user and system performance data.

    • Use this data to make informed decisions about future upgrades or changes.




Upgrading Embedding – Conclusion


Upgrading an embedding model involves careful consideration, planning, and testing. By following these guidelines, customers can ensure a smooth transition to the new model version, minimizing potential risks and maximizing the benefits of the upgrade.


Use Cases in Azure OpenAI and Beyond


Embeddings can significantly enhance the performance of various AI applications by enabling more efficient data handling and processing. Here’s a list of use cases where embeddings can be effectively utilized:




  1. Enhanced Document Retrieval and Analysis: By first performing embeddings on paragraphs or sections of documents, you can store these vector representations in a vector database. This allows for rapid retrieval of semantically similar sections, streamlining the process of analyzing large volumes of text. When integrated with models like GPT, this method can reduce the computational load and improve the efficiency of generating relevant responses or insights.




  2. Semantic Search in Large Datasets: Embeddings can transform vast datasets into searchable vector spaces. In applications like eCommerce or content platforms, this can significantly improve search functionality, allowing users to find products or content based not just on keywords, but on the underlying semantic meaning of their queries.




  3. Recommendation Systems: In recommendation engines, embeddings can be used to understand user preferences and content characteristics. By embedding user profiles and product or content descriptions, systems can more accurately match users with recommendations that are relevant to their interests and past behavior.




  4. Sentiment Analysis and Customer Feedback Interpretation: Embeddings can process customer reviews or feedback by capturing the sentiment and nuanced meanings within the text. This provides businesses with deeper insights into customer sentiment, enabling them to tailor their services or products more effectively.




  5. Language Translation and Localization: Embeddings can enhance machine translation services by understanding the context and nuances of different languages. This is particularly useful in translating idiomatic expressions or culturally specific references, thereby improving the accuracy and relevancy of translations.




  6. Automated Content Moderation: By using embeddings to understand the context and nuance of user-generated content, AI models can more effectively identify and filter out inappropriate or harmful content, maintaining a safe and positive environment on digital platforms.




  7. Personalized Chatbots and Virtual Assistants: Embeddings can be used to improve the understanding of user queries by virtual assistants or chatbots, leading to more accurate and contextually appropriate responses, thus enhancing user experience. Using similar logic, they could help route natural-language requests to specific APIs; see the CompactVectorSearch repository for an example.




  8. Predictive Analytics in Healthcare: In healthcare data analysis, embeddings can help in interpreting patient data, medical notes, and research papers to predict trends, treatment outcomes, and patient needs more accurately.




In all these use cases, the key advantage of using embeddings is their ability to process and interpret large and complex datasets more efficiently. This not only improves the performance of AI applications but also reduces the computational resources required, especially for high-cost models like GPT. This approach can lead to significant improvements in both the effectiveness and efficiency of AI-driven systems.
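
As an illustration of the document-retrieval and semantic-search patterns above, the following minimal sketch embeds a handful of text chunks, keeps the normalized vectors in memory as a stand-in for a real vector database, and returns the chunks most similar to a query. It assumes numpy is available and reuses the hypothetical embed() helper from the earlier sketch.

import numpy as np
# embed() is the hypothetical Azure OpenAI helper defined in the earlier sketch.

documents = [
    "Our return policy allows refunds within 30 days of purchase.",
    "Shipping typically takes 3 to 5 business days.",
    "Contact lenses should be replaced on the schedule in your prescription.",
]

# One-time indexing step: embed every chunk and normalize the vectors,
# standing in for writes to a real vector database.
doc_matrix = np.array([embed(d) for d in documents])
doc_matrix = doc_matrix / np.linalg.norm(doc_matrix, axis=1, keepdims=True)

def search(query, top_k=2):
    # Embed the query, then rank chunks by cosine similarity (dot product of unit vectors).
    q = np.array(embed(query))
    q = q / np.linalg.norm(q)
    scores = doc_matrix @ q
    best = np.argsort(scores)[::-1][:top_k]
    return [(float(scores[i]), documents[i]) for i in best]

print(search("How long until my order arrives?"))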


Specific Considerations for Azure OpenAI



  • Model Update Frequency: Understanding how frequently Azure OpenAI updates its models and the nature of these updates (e.g., major vs. minor changes) is crucial.

  • Backward Compatibility: Assessing whether newer versions of Azure OpenAI’s embedding models maintain backward compatibility with previous versions is key to managing version mismatches.

  • Version-Specific Features: Identifying features or improvements specific to certain versions of the model helps in understanding the potential impact of using mixed-version embeddings.


Strategies for Mitigation



  • Version Control in Data Storage: Implementing strict version control for stored embeddings ensures that data remains consistent and compatible with the model version used for its generation.

  • Compatibility Layers: Developing compatibility layers or conversion tools to adapt older embeddings to newer model formats can help mitigate the effects of version differences.

  • Baseline Tests: Create a few simple baseline tests that would identify any drift in the embeddings (a sketch follows this list).
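
As a minimal sketch of the version-control and baseline-test strategies above, the following example tags every stored vector with the model version that produced it and re-embeds a small set of canary sentences to detect drift after an upgrade. The embed() helper is the hypothetical one from the earlier sketch, and the model-version label, canary sentences, and similarity threshold are illustrative placeholders.

import json
import math
# embed() is the hypothetical Azure OpenAI helper defined in the earlier sketch.

MODEL_VERSION = "text-embedding-ada-002/2"   # assumed label for the deployed model version

def store_embedding(doc_id, text, path="embeddings.jsonl"):
    # Persist the vector together with the model version that produced it,
    # so mixed-version vectors can be detected later.
    record = {"id": doc_id, "model_version": MODEL_VERSION, "vector": embed(text)}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

CANARIES = [
    "The order was delivered two days late.",
    "How do I reset my password?",
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def drift_check(baseline_vectors, threshold=0.99):
    # Re-embed the canary sentences and flag drift if similarity to the stored
    # baseline drops below the threshold.
    for text, old_vector in zip(CANARIES, baseline_vectors):
        if cosine(embed(text), old_vector) < threshold:
            return True   # drift detected: re-embedding and re-testing is advisable
    return False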


Azure OpenAI Model Versioning: Understanding the Process


Azure OpenAI provides a systematic approach to model versioning, applicable to models like text-embedding-ada-002:




  1. Regular Model Releases:





  2. Version Update Policies:



    • Options for auto-updating to new versions or deploying specific versions.

    • Customizable update policies for flexibility.

    • Details on update options.




  3. Notifications and Version Maintenance:





  4. Upgrade Preparation:



    • Recommendations to read the latest documentation and test applications with new versions.

    • Importance of updating code and configurations for new features.

    • Preparing for version upgrades.




Conclusion


Model version mismatches in embeddings, particularly in the context of Azure OpenAI, pose significant challenges that can impact the effectiveness of AI applications. Understanding these challenges and implementing strategies to mitigate their effects is crucial for maintaining the integrity and efficiency of AI-driven systems.


 



Lesson Learned #474: Identifying and Preventing Unauthorized Application Access to Azure SQL Database


In recent scenarios encountered with our customers, we have come across a specific need: restricting certain users from using SQL Server Management Studio (SSMS) or other applications to connect to a designated database in Azure SQL Database. A common solution in traditional SQL Server environments, like the use of LOGIN TRIGGERS, is not available in Azure SQL Database. This limitation poses a unique challenge in database management and security.


 


To address this challenge, I’d like to share an alternative that combines the power of Extended Events in Azure SQL Database with PowerShell scripting. This method effectively captures and monitors login events, providing administrators with timely alerts whenever a specified user connects to the database using a prohibited application, such as SSMS.


 


How It Works




  1. Extended Events Setup: We start by setting up an Extended Event in Azure SQL Database. This event is configured to capture login activities, specifically focusing on the application name used for the connection. By filtering for certain applications (like SSMS), we can track unauthorized access attempts.




  2. PowerShell Script: A PowerShell script is then employed to query these captured events at regular intervals. This script connects to the Azure SQL Database, retrieves the relevant event data, and checks for any instances where the specified users have connected via the restricted applications.




  3. Email Alerts: Upon detecting such an event, the PowerShell script automatically sends an email notification to the database administrator. This alert contains details of the unauthorized login attempt, such as the timestamp, username, and application used. This prompt information allows the administrator to take immediate corrective measures.




 


Advantages




  • Proactive Monitoring: This approach provides continuous monitoring of the database connections, ensuring that any unauthorized access is quickly detected and reported.




  • Customizable: The method is highly customizable. Administrators can specify which applications to monitor and can easily adjust the script to cater to different user groups or connection parameters.




  • No Direct Blocking: While this method does not directly block the connection, it provides immediate alerts, enabling administrators to react swiftly to enforce compliance and security protocols.




 


This article provides a high-level overview of how to implement this solution. For detailed steps and script examples, administrators are encouraged to tailor the approach to their specific environment and requirements.


 


Extended Event


 

CREATE EVENT SESSION Track_SSMS_Logins
ON DATABASE
ADD EVENT sqlserver.sql_batch_starting(
    ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.username, sqlserver.session_id)
    WHERE (sqlserver.client_app_name LIKE '%Management Studio%')
)
ADD TARGET package0.ring_buffer
(SET max_events_limit = 1000, max_memory = 4096)
WITH (EVENT_RETENTION_MODE = NO_EVENT_LOSS, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO

ALTER EVENT SESSION Track_SSMS_Logins ON DATABASE STATE = START;

 


 


Query to run using ring buffers


 

	 SELECT 
    n.value('(@timestamp)[1]', 'datetime2') AS TimeStamp,
    n.value('(action[@name="client_app_name"]/value)[1]', 'varchar(max)') AS Application,
    n.value('(action[@name="username"]/value)[1]', 'varchar(max)') AS Username,
    n.value('(action[@name="client_hostname"]/value)[1]', 'varchar(max)') AS HostName,
    n.value('(action[@name="session_id"]/value)[1]', 'int') AS SessionID
FROM 
    (SELECT CAST(target_data AS xml) AS event_data
     FROM sys.dm_xe_database_session_targets
     WHERE event_session_address = 
         (SELECT address FROM sys.dm_xe_database_sessions WHERE name = 'Track_SSMS_Logins')
     AND target_name = 'ring_buffer') AS tab
     CROSS APPLY event_data.nodes('/RingBufferTarget/event') AS q(n);

 


 


Powershell Script


 

# Connection configuration
$Database = "DBName"
$Server = "Servername.database.windows.net"
$Username = "username"
$Password = "pwd!"

$emailFrom = "EmailFrom@ZYX.com"
$emailTo = "EmailTo@XYZ.com"
$smtpServer = "smtpservername"
$smtpUsername = "smtpusername"
$smtpPassword = "smtppassword"
$smtpPort=25


$ConnectionString = "Server=$Server;Database=$Database;User Id=$Username;Password=$Password;"

# Last check date
$LastCheckFile = "C:\temp\LastCheck.txt"
$LastCheck = Get-Content $LastCheckFile -ErrorAction SilentlyContinue
if (!$LastCheck) {
    $LastCheck = [DateTime]::MinValue
}


# SQL query
$Query = @"
SELECT 
    n.value('(@timestamp)[1]', 'datetime2') AS TimeStamp,
    n.value('(action[@name="client_app_name"]/value)[1]', 'varchar(max)') AS Application,
    n.value('(action[@name="username"]/value)[1]', 'varchar(max)') AS Username,
    n.value('(action[@name="client_hostname"]/value)[1]', 'varchar(max)') AS HostName,
    n.value('(action[@name="session_id"]/value)[1]', 'int') AS SessionID
FROM 
    (SELECT CAST(target_data AS xml) AS event_data
     FROM sys.dm_xe_database_session_targets
     WHERE event_session_address = 
         (SELECT address FROM sys.dm_xe_database_sessions WHERE name = 'Track_SSMS_Logins')
     AND target_name = 'ring_buffer') AS tab
     CROSS APPLY event_data.nodes('/RingBufferTarget/event') AS q(n)
WHERE 
    n.value('(@timestamp)[1]', 'datetime2') > '$LastCheck'
"@

# Create and open SQL connection
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $ConnectionString
$SqlConnection.Open()

# Create SQL command
$SqlCommand = $SqlConnection.CreateCommand()
$SqlCommand.CommandText = $Query

# Execute SQL command
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $SqlCommand
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()

# Process the results
$Results = $DataSet.Tables[0]

# Check for new events
if ($Results.Rows.Count -gt 0) {
    # Prepare email content
    $EmailBody = $Results | Out-String
    $smtp = New-Object Net.Mail.SmtpClient($smtpServer, $smtpPort)
    $smtp.EnableSsl = $true
    $smtp.Credentials = New-Object System.Net.NetworkCredential($smtpUsername, $smtpPassword)
    $mailMessage = New-Object Net.Mail.MailMessage($emailFrom, $emailTo)
    $mailMessage.Subject = "Alert: SQL Access in database $Database"
    # Include the captured event details in the body and send the MailMessage object
    $mailMessage.Body = "SQL Access Alert in database $Database on server $Server since $LastCheck.`n`n$EmailBody"
    $smtp.Send($mailMessage)

    # Save the current timestamp for the next check
    Get-Date -Format "o" | Out-File $LastCheckFile
}

# Remember to schedule this script to run every 5 minutes using Windows Task Scheduler

 


Of course, using SQL Auditing or Log Analytics would be another alternative.


 

The Composable Commerce Revolution, the Future of E-Commerce, Dynamics 365 Commerce has arrived!


Editor:
@denisconway

Introduction:

You’re in for a treat! The world of e-commerce has undergone a massive transformation over the past few years, and it’s all thanks to the revolutionary concept of composable commerce. This approach has taken the industry by storm, and it’s not hard to see why. Composable commerce is a versatile, scalable, and innovative approach, allowing businesses of all sizes to provide exceptional customer experiences across various platforms and devices.

In this article, we’ll look closer at the intricacies of composable commerce, exploring its core benefits and examining how it’s changing the game for the e-commerce industry. Get ready to be blown away by the possibilities of composable commerce!

Image: Multiple Ecommerce Channels

Many organizations have started adopting Dynamics 365 Commerce, a composable commerce engine that enables customers to unify back-office, in-store, and e-commerce channels while also serving as the single integration point for third-party channel solutions. This gives customers the key advantage of using a variety of best-of-breed commerce solutions to engage and deliver goods and services to their customers.

What is Composable Commerce:

Composable commerce is a contemporary approach to e-commerce that separates the front-end (presentation layer) and back-end (commerce logic) of an e-commerce platform. Unlike traditional e-commerce systems, where changes to one component can affect the other, composable commerce decouples these two layers, enabling independent development and greater flexibility. This separation allows for greater agility, faster innovation, and the ability to adapt quickly to changing market demands.

Image: Composable Commerce Diagram

In contrast, traditional e-commerce systems often have monolithic front and back ends, leading to certain limitations. Modifying the underlying codebase to change the front-end design or user experience can be complex and time-consuming. Additionally, traditional systems are not easily scalable across different devices or channels. Composable commerce addresses these challenges by allowing businesses to easily update their website’s design or incorporate new features without disrupting the core e-commerce functionality.

What options do companies have:

Businesses have two powerful options to customize their e-commerce experiences: headless commerce and composable commerce. Headless commerce allows companies to develop and update front-end and back-end components independently, enabling quick adaptation to market changes and experimentation with innovative features. Composable commerce takes flexibility and customization to the next level by enabling businesses to select modular components from different vendors, providing the ultimate flexibility to create an e-commerce ecosystem that is tailored to their unique needs.

Benefits of Composable Commerce:

To start with, flexibility and agility matter because the digital environment is continuously evolving; with a decoupled architecture, businesses can quickly adapt to customers’ changing preferences. Separating the front end from the back end ensures that branding, user experience, and functionality stay consistent across various channels. A cohesive experience across web, mobile, social media, voice assistants, and emerging technologies such as Artificial Intelligence (AI), Virtual Reality (VR), Augmented Reality (AR), and voice commerce leads to higher customer satisfaction, engagement, and loyalty.

Image: With composable commerce, businesses can provide cohesive experience to customers on various channels

In addition, scalability and performance are greatly enhanced because businesses can independently scale each layer, resulting in better resource allocation. Websites can handle increased traffic, sales volume, and complex operations, leading to faster page load times and a better user experience.

End-user Benefits:

Whether customers interact with your brand through a website, mobile app, voice assistant, marketplace, or social media platform, composable commerce ensures a seamless and tailored experience. In addition, faster loading times and improved website performance eliminate long waits for the entire page to load, resulting in a smoother and more responsive user interface. More importantly, whether customers are browsing via desktop, smartphone, or tablet, or using voice assistants, they can access your products and services seamlessly. This omni-channel capability enhances convenience and accessibility for customers, meeting their expectations for a seamless cross-channel experience. Dynamics 365 Commerce enables businesses to build this experience.

Customer Image: Front page ABBY Site | Easy, online contact lens ordering | Doctor Trusted | Patient Approved | Free Shipping | HelloAbby

Customer Story:

Empowering Vision: ABB Optical Group’s Intelligent Contact Lens Ordering Platform with Microsoft Dynamics 365

Embarking on a technological evolution, ABB Optical Group introduces its Intelligent Contact Lens Ordering Platform, a game-changer crafted in collaboration with Visionet Systems Inc. and Microsoft. This innovation involved the implementation of Microsoft Dynamics 365 Finance and Operations, Azure Cloud, and Data Lake, providing a solid technological foundation. ABB Optical aimed to transcend its legacy Patient Ordering Platform, yourlens.com, seeking a modern, intelligent, and scalable user experience. This vision materialized through the development of a robust Minimum Viable Product (MVP), introducing a microservices headless experience and harnessing the capabilities of Microsoft D365 Retail and HQ APIs, alongside proofs of concept.

The outcome was nothing short of transformative. The MVP’s successful pilot garnered positive feedback, propelling the rapid development of additional customer-demanded features. In just six months, Visionet spearheaded the launch of phase two of the Abby Platform, seamlessly integrating a data analytics component through Data Lake with Dynamics 365 F&O and Power BI. ABB Optical Group now stands at the forefront of innovation, offering eyecare providers and patients an intelligent, forward-thinking ordering system.

Conclusion:

In conclusion, the emergence of composable commerce signifies a pivotal shift in the digital marketplace. This approach, distinguished by its modular structure, cloud-native integration, and technology-independent capabilities, provides businesses with unparalleled flexibility and adaptability. It enables businesses to customize their digital experiences, integrate seamlessly with best of breed solution providers for individual capabilities, and respond swiftly to market changes and complexities.


Learn more

Dynamics 365 Commerce delivers a comprehensive, yet composable, set of capabilities for both consumer and business-facing organizations seeking to expand beyond traditional digital commerce limitations and improve customer engagement, build brand awareness, streamline purchasing, and deliver exceptional customer experiences.

To learn more about Dynamics 365 Commerce:

Visit our website on commerce today.
