Microsoft named as a worldwide Leader in four IDC MarketScapes for Field Service Management & Service Life-Cycle

This article is contributed. See the original author and article here.

Across industries and around the world, field service leaders face any number of challenges in areas including digitalization and modernization of traditionally paper-based processes, rising customer expectations, and employee training and retention. With these challenges top of mind, we have continually invested in Microsoft Dynamics 365 Field Service as a solution to meet the growing requirements of field service management (FSM) operations. That’s why we take great pride in sharing we’re the only vendor positioned as a Leader in the following four IDC MarketScapes:

Field service management applications

Source: IDC MarketScape: Worldwide Field Service Management Applications 2023 Vendor Assessment, Aly Pinder, December 2023, IDC Doc# US49989523.

According to the 2023 report “Product Innovation and Aftermarket Service Survey,” IDC notes that “the top metric prioritized by service leaders as determining success in service was customer satisfaction (46.2%), followed by customer retention (39.0%).” This means frontline worker roles such as service agents and field technicians are absolutely critical to ensuring the best possible customer experience. That’s why we’re continuously developing Dynamics 365 Field Service so that organizations can equip those workers with modern digital tools to make them more responsive and efficient.

Microsoft was positioned as a Leader in the 2023-2024 IDC MarketScape for worldwide field service management applications based on two strengths: “innovation at scale and pace” and “infusion of AI into field service processes.” The IDC MarketScape notes that “Microsoft’s end-to-end service experience capabilities aid field service companies in a continuous transformation journey. As customer expectations evolve, frontline workforces shift, and business models get disrupted, Microsoft leverages its platform to incorporate technologies like the Internet of Things (IoT), mixed reality, industrial metaverse, and digital twins.” In addition, the IDC MarketScape says that in the areas of AI and generative AI, which would include Copilot in Field Service, “Microsoft is enabling service organizations to realize near-term and long-term strategies around this innovative technology.”

The IDC MarketScape suggests that organizations consider Microsoft field service solutions “if they are looking for a vendor that can incorporate end-to-end capabilities with innovative technologies for transformation and growth.”

Service life-cycle management

Source: IDC MarketScape: Worldwide Service Life-Cycle Management Platforms 2023–2024 Vendor Assessment, by Aly Pinder, October 2023, IDC Doc# US49989623

IDC has noted that service is no longer something that happens only after a sale is complete. More and more, organizations are aligning services and sales to drive greater revenue through closer collaboration and new service offerings. This IDC MarketScape report highlights two key Microsoft strengths: an integrated platform supporting the front and back office, and innovation accelerators that enhance experiences.

From an integration standpoint, the IDC MarketScape notes that “the service team can no longer operate in a silo and requires tools that allow it to connect to other business functions, customers, and the wide network of partners. Microsoft’s integrated platform of back-office, midoffice, and front-office applications aids customers across their digital journey and not just within a single function.” The integration of Dynamics 365 Field Service with Microsoft 365 and Microsoft Teams is key. Dynamics 365 Field Service integrates with Outlook, Teams, and Microsoft Viva Connections so that frontline workers and managers can create, view, and manage work orders within Outlook and Teams. This integration enhances collaboration between dispatchers, frontline technicians, and managers by enabling work order data to sync automatically between Dynamics 365 and Microsoft 365. Additionally, frontline technicians can quickly start their day with access to key workday information at a glance, with work orders visible from the Viva Connections homepage in Teams. Dynamics 365 and Microsoft 365 empower technicians with the right information to resolve issues the first time, which is key to creating a positive customer experience.

When it comes to innovation, the IDC MarketScape explains, “Microsoft through its AI, GenAI, IoT, and mixed reality capabilities and tools allows service organizations to deliver enhanced experiences for the service team and the customer. Microsoft customers value this level of shared innovations, which has cemented partnerships for shared growth.” Dynamics 365 Field Service can be integrated with Microsoft Dynamics 365 Remote Assist on Microsoft HoloLens, Microsoft HoloLens 2, Android, or iOS devices to enable technicians to collaborate more efficiently by working together from different locations. This means service technicians can find and connect with technical experts working at other locations to share what they’re seeing, receive remote assistance, and quickly resolve customer issues. Dynamics 365 Field Service can also be integrated with Microsoft Dynamics 365 Guides to attach mixed reality guides to Field Service tasks. Overall, the integration between Dynamics 365 Field Service, Dynamics 365 Remote Assist, Dynamics 365 Guides, and tools like HoloLens helps to elevate field service operations by enabling them to optimize processes and deliver unparalleled customer experiences.

The IDC Life-Cycle Management report suggests organizations “consider Microsoft when searching for capabilities that will enable continuous exploration of innovation across the service life cycle and partner networks. Microsoft has enabled a broad set of innovation capabilities, which support collaboration, co-innovation, and prescriptive service at speed and a global scale.”

Field service management for utilities

Source: IDC MarketScape: Worldwide Field Service Management Solutions for Utilities 2023-2024 Vendor Assessment, by Jean-François Segalotto, John Villali, and Daniele Arenga, November 2023, IDC Doc# US50036223.

For customers in the utilities industry, the IDC MarketScape explains that a key strength for Microsoft is that “[customers] recognize Dynamics 365 Field Service as a well-engineered, flexible FSM solution, offering a solid user experience in terms of usability, configurability, ease of integration into complex landscapes, and extensibility thanks to the Microsoft portfolio.”

The IDC MarketScape also notes, “Microsoft is putting considerable resources behind the product, including significantly increasing the engineering budget this year.” It also states that “[the] ability to instantly access this innovation through a pure-play SaaS ultimately results in good value for money.” Many Field Service customers experienced this with the addition of the Copilot in Dynamics 365 Field Service Outlook add-in, which streamlines work order creation with relevant details pre-populated from emails and optimizes technician scheduling with data-driven recommendations based on factors such as travel time, availability, and skillset. Frontline managers can see relevant work orders and review them before creating new work orders, and they can easily reschedule or update those work orders as customers’ needs change.

Field service management for oil and gas

For customers in the oil and gas (O&G) industry, the IDC MarketScape stated, “Microsoft’s FSM solution comes in an integrated and comprehensive portfolio catering to core O&G field services and asset operations. By seamlessly integrating FSM with Mixed Reality, Microsoft 365, AI, IoT, and Azure, it provides customers the flexibility to tailor solutions, enhancing efficiency, driving innovation, and boosting productivity in a highly customizable manner.”

“Drawing on its long-established customer base, Microsoft works with major O&G players addressing a wide range of field service challenges. Typically, these collaborations focus on enabling frontline workers and optimizing planning and service workflow automation in vast scale operations spanning large assets such as refineries, petrochemical plants, LNG facilities, renewable gas plants, and the extensive network of gas stations.” For service technicians on the frontline, a primary benefit of Dynamics 365 Field Service is the Field Service mobile app which enables technicians to see their workdays at a glance so they can view and update work orders, customer assets, accounts, and more, no matter where they are working—even in areas with limited connectivity. Technicians can also easily access up-to-date inventory information, eliminating the need for cumbersome manual inventory checks and reducing delays caused by missing parts. The Field Service mobile app also incorporates safety checklists and real-time reporting, helping to ensure compliance with safety regulations and to improve the well-being of service technicians who often work under hazardous conditions.

We invite you to read the following IDC MarketScape report excerpts for full details:

Learn more about how Microsoft customers are optimizing service operations with Dynamics 365 Field Service:


Source: IDC MarketScape vendor analysis model is designed to provide an overview of the competitive fitness of ICT suppliers in a given market. The research methodology utilizes a rigorous scoring methodology based on both qualitative and quantitative criteria that results in a single graphical illustration of each vendor’s position within a given market. The Capabilities score measures vendor product, go-to-market and business execution in the short-term. The Strategy score measures alignment of vendor strategies with customer requirements in a 3-5-year timeframe. Vendor market share is represented by the size of the circles.  

The post Microsoft named as a worldwide Leader in four IDC MarketScapes for Field Service Management & Service Life-Cycle appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Viva People Science Industry Trends: Retail

Welcome to the fourth edition of Microsoft Viva People Science industry trends, where the Viva People Science team share learnings from customers across a range of different industries. Drawing on data spanning over 150 countries, 10 million employees, and millions of survey comments, we uncover the unique employee experience challenges and best practices for each industry. 


 


In this blog, @Jamie_Cunningham and I share our insights on the state of employee engagement in the retail industry. You can also access the recording from our recent live webinar, where we discussed this topic in depth.  


 


Let’s first look at what’s impacting the retail industry today. In summary, we are hearing about market volatility, supply chain constraints, changing consumer behavior, technological advancements, labor pressures, and rising costs. According to the Deloitte Retail Trends 2023 report, the top-of-mind issues for retail leaders are: 


 



  • Growth versus sustainability: Retailers need to balance the short-term pressures of profitability and cash flow with the long-term goals of environmental and social responsibility. 

  • Consumer confidence and retail sales: Retailers need to cope with the uncertain and volatile consumer demand, which is influenced by factors such as inflation, health concerns, and government policies. 

  • Leadership quality and brand strength: Retailers need to demonstrate strong and visionary leadership, as well as to build and maintain a distinctive and trusted brand identity. 

  • Technological innovation: Retailers need to leverage technology and data to create personalized, seamless, and omnichannel customer experiences, as well as to optimize their operations and supply chains. 


 


These issues require retailers to be agile, resilient, and innovative in their employee experience strategies and execution. The retail industry also faces some specific challenges in attracting and retaining talent, such as: 


 



  • Rewards: Retail jobs often pay lower wages and offer fewer benefits than other industries, and can lack recognition and rewards for employees’ hard work. 

  • Wellbeing: Retail employees often deal with high-stress, low-flexibility, and high-risk work environments, which can affect their physical and mental health. 

  • Growth: Retail employees often perceive limited opportunities for career advancement, skill development, and learning, which can lead to disengagement and attrition. 


 


According to Glint benchmark data (2023), employee engagement in retail has declined by two points between 2021 and 2022. It’s clear that retailers need to invest in improving the employee experience, especially for the frontline workers, who are the face of the brand and the key to customer loyalty. So, how do they do this? Here are three examples of how retailers we’ve worked with have addressed the needs of their employees with the support of Microsoft Viva: 


 


1. Create a compelling future 


 


We worked with the leadership team of a MENA (Middle East and North Africa) based retailer, helping them recognize a connection between how effectively they communicated the future direction of the organization and the degree to which employees saw a future for themselves there. The team committed to clarifying how the business initiatives they were rolling out connected to future work opportunities for their teams. 


 


2. Build bridges with frontline employees 


 


According to the Microsoft Work Trend Report (2022), sixty-three percent of all frontline workers say messages from leadership don’t make it to them. A global fashion brand recognised, after several years of employee listening, that the actions being taken by leadership were not being felt on the shop floor. We worked with them to adopt a simplified action-taking model with one clear commitment from leaders, which was efficient and effective in terms of communication and adoption. They also increased their investment in manager enablement to support better conversations within teams when results from Viva Glint were released. This simplified approach led to improved perceptions of the listening process and greater clarity at all levels on where to focus for a positive employee experience. 


 


3. One internal team, one goal 


 


Through an Executive Consultation with leaders of a UK retailer, wellbeing was identified as a business risk that, unless addressed, would severely impact their priorities. With that in mind, the team created internal alignment to prioritise wellbeing through both training investment and policy changes, resulting in a thirteen-point improvement in the wellbeing score year over year. 


 


Conclusions 


 


To succeed in this dynamic and competitive market, retailers need to focus on their most valuable asset: their employees. By investing in the employee experience, especially for the frontline workers, retailers can boost their employee engagement, customer satisfaction, and business performance. 


 


A downloadable one-page summary is also available with this blog for you to share with your colleagues and leaders. 


 


Leave a comment below to let us know if this resonates with what you are seeing with your employees in this industry. 


 



References: 


Deloitte retail trends report (2023) 


Microsoft Work Trend Index special report (2022) 


 

Forms practice mode is here to enhance your learning process

Practice mode is now available in Forms. It’s tailored for EDU users, particularly students, offering a new way to review, test, and reinforce their knowledge. Let’s check out the details of practice mode; you can try it from this template. (Note: Practice mode is only available for quizzes.) 


Practice mode


 


Instant feedback after answering each question
In practice mode, questions will be shown one at a time. Students will receive immediate feedback after submitting each question, indicating whether their answer is right or wrong.


Instant feedback after answering each question


 


Try multiple times for the correct answer
Students can reconsider and try a question multiple times if they answer it incorrectly, facilitating immediate re-learning, and consequently strengthening their grasp of certain knowledge.


Try multiple times to get the correct answer


 


Encouragement and autonomy during practice
Students will receive an encouraging message after answering a question, whether their answer is correct or not, giving them a positive practice experience. They also have the freedom to learn at their own pace: if they answer a question incorrectly, they can choose to retry, view the correct answer, or skip the question.


 


Encouragement message and other options


 


Recap questions
After completing the practice, students can review all the questions along with the correct answers, offering a comprehensive overview to assess their overall performance.


Recap questions


 


Enter practice mode
Practice mode is only available for quizzes. You can turn it on from the “…” icon in the upper-right corner. Once you distribute the quiz, recipients will automatically enter practice mode. Try out practice mode from this template now!


Enter practice mode


 

Prepare your organization for conversation intelligence data migration into Dataverse

During the 1st quarter of 2024, D365 Sales conversation intelligence data will migrate from its current storage location (Microsoft provided storage) across to each customer’s Dataverse organization. This blog post describes this change and provides answers to questions raised by admins when preparing their organizations for this data migration.  

Sales conversation intelligence data is the general term for any outcome of the processing of phone calls made through the embedded Teams dialer within Dynamics 365. This includes files, such as the audio recording file or transcript file, as well as all the insights collected during a call. Examples include: 

  • Sentiment 
  • Tracked keywords 
  • Asked questions  
  • Summary suggestions 

Important: during the migration, no data will be transferred outside of your tenant.  

Moving the data into Dataverse allows you to meet the highest data management standards, such as data encryption using Customer Managed Key (CMK) – and management of customer data using Lockbox.  

The migration also allows for granular control over the conversation intelligence data: orgs can now allow access to specific types of data only for specific security roles. For example, the admin can assign privileges to the ‘sentiment’ entity in Dataverse only for sales managers. This granular control also allows for deletion of specific types of data while retaining others. For example, the admin can store sentiment data for only 1 month while storing the transcript of the call for 1 year, thereby making the most of the Dataverse storage capacity. 
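Purely as an illustration of that differential-retention idea, recurring cleanup jobs could be defined per table with Dataverse’s BulkDelete Web API action. Everything below is a hypothetical sketch: the table logical names (`msdyn_transcript`, `msdyn_conversationsentiment`) and the exact QueryExpression JSON shape are assumptions you would need to verify against your environment and the Dataverse documentation.

```python
# Hypothetical sketch: differential retention for conversation intelligence
# tables via recurring Dataverse bulk-delete jobs. The table logical names
# and the simplified QueryExpression JSON below are illustrative only --
# check both against your environment and the Dataverse Web API docs.

def bulk_delete_payload(table: str, older_than_months: int) -> dict:
    """Build a (simplified) request body for the Dataverse BulkDelete action."""
    return {
        "QuerySet": [{
            "EntityName": table,
            "Criteria": {
                "FilterOperator": "And",
                "Conditions": [{
                    "AttributeName": "createdon",
                    "ConditionOperator": "OlderThanXMonths",
                    "Values": [older_than_months],
                }],
            },
        }],
        "JobName": f"Retention: {table} older than {older_than_months} months",
        "SendEmailNotification": False,
        "RecurrencePattern": "FREQ=DAILY;INTERVAL=1",   # re-evaluate daily
        "StartDateTime": "2024-02-01T00:00:00Z",
        "ToRecipients": [],
        "CCRecipients": [],
    }

# Keep sentiment for 1 month, transcripts for 12 months, as in the example above.
retention = {"msdyn_conversationsentiment": 1, "msdyn_transcript": 12}
payloads = [bulk_delete_payload(t, months) for t, months in retention.items()]
# Each payload would be POSTed to {org-url}/api/data/v9.2/BulkDelete.
```

The point of the sketch is the shape of the policy, not the exact API syntax: one recurring job per table, each with its own age threshold.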

Having conversation intelligence stored in Dataverse also allows organizations and 3rd party apps to consume the data per the organization’s needs. For example, organizations can create tailored dashboards and visualizations based on the data. Furthermore, the admin can allow third-party apps to access the conversation intelligence data and to provide extensible services based on it. 

Storage location by type

The following table describes the storage location of conversation intelligence data before and after the change: 

| Current storage | Type of data | Before the change | After the migration |
| --- | --- | --- | --- |
| Microsoft provided storage | Files (recording, transcript) | Microsoft provided storage | Organization’s Dataverse1 |
| Microsoft provided storage | Conversation intelligence insights | Microsoft provided storage | Organization’s Dataverse1 |
| Your own Azure blob storage | Files (recording, transcript) | Your own Azure blob storage | Your own Azure blob storage2 |
| Your own Azure blob storage | Conversation intelligence insights | Microsoft provided storage | Organization’s Dataverse1 |

1 After the data is successfully migrated, it will be deleted from the Microsoft-provided storage. 

2 No change. This data is not migrated. 
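The routing in the table above boils down to one rule: files stay where they are only if you bring your own Azure blob storage, while insights always move to Dataverse. As a sanity check, here it is as a small lookup (the key strings are just labels for this sketch, not API values):

```python
# Where conversation intelligence data lives after the migration, keyed by
# (current file storage, type of data) -- taken directly from the table above.
DESTINATION = {
    ("microsoft", "files"):    "dataverse",   # recording/transcript files move
    ("microsoft", "insights"): "dataverse",   # sentiment, keywords, etc. move
    ("own_blob",  "files"):    "own_blob",    # unchanged; not migrated
    ("own_blob",  "insights"): "dataverse",   # insights always move
}

def destination(current_storage: str, data_type: str) -> str:
    """Return the post-migration storage location for a given combination."""
    return DESTINATION[(current_storage, data_type)]
```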

After the successful migration of existing data, data from new calls will be automatically saved to Dataverse. 

Action required by admins:

For all organizations: 

  1. Check the solution version (mandatory)
     Make sure you have the latest version of the conversation intelligence solution (msdyn_Conversation_Intelligence version 9.0.1.1139 or higher) installed in your organization. 
  2. Provide access to new Dataverse entities (mandatory)
     Make sure the relevant security roles have read and write privileges to the new Dataverse entities (see the list of entities below). 
  3. Make sure you have sufficient storage space in Dataverse (mandatory)

     Find the number of calls currently in storage: 

       • Navigate to the System monitoring page. 
       • Set the time filter to ‘All time’. 
       • Note the number of total calls. 

     Calculate the amount of storage space required: 

       • Database storage: Multiply the number of calls by 160 KB. 
       • File storage (only relevant for orgs previously using Microsoft provided storage): Multiply the number of calls by 0.93 MB. 
       • For example: if you had 20,000 calls, and you previously used the Microsoft provided storage, you will need 3.2 GB of database storage and 18.6 GB of file storage for the migrated data. 

     Note: The above numbers are based on average call duration and number of insights per call. Actual sizes may vary. 

  4. Set a retention policy (optional)
     Previously, conversation intelligence data was automatically deleted according to the retention policy set by the admin in the conversation intelligence settings. By default, data saved into Dataverse does not have an automatic retention policy like this. If you wish to set a retention policy for your conversation intelligence data in Dataverse, you can do so by following this documentation.
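The sizing rule in step 3 above is simple enough to script. A minimal sketch of the arithmetic (160 KB and 0.93 MB per call are the averages quoted above; actual sizes vary):

```python
def required_storage_gb(total_calls: int, files_in_microsoft_storage: bool = True):
    """Estimate Dataverse storage needed for migrated conversation data.

    Database storage: ~160 KB per call (insights).
    File storage:     ~0.93 MB per call (recording + transcript), only needed
                      if the files currently sit in Microsoft-provided storage.
    """
    db_gb = total_calls * 160 / 1_000_000                                  # KB -> GB
    file_gb = total_calls * 0.93 / 1_000 if files_in_microsoft_storage else 0.0  # MB -> GB
    return round(db_gb, 2), round(file_gb, 2)

# 20,000 calls with files in Microsoft-provided storage:
print(required_storage_gb(20_000))  # (3.2, 18.6)
```

For 20,000 calls this yields roughly 3.2 GB of database storage plus 18.6 GB of file storage; orgs that keep files in their own Azure blob storage only need the database portion.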

For organizations currently using their own Azure blob storage: 

  1. Set up a service principal (mandatory)
     This allows conversation intelligence to access your blob storage in a more secure way. See this article to learn more about this setup. 

Opting out of migrating the existing data into Dataverse 

By default, your existing data will be migrated to Dataverse. If you wish to opt out of the migration (for example, because your organization is no longer using conversation intelligence, or you don’t want to migrate the existing files or insights), send an email containing your first and last name and the organization ID to CI-data-migration@microsoft.com before January 31st, 2024. 
The data of organizations that opted out of the migration will be permanently deleted by April 1st, 2024. 

Frequently asked questions

Here are some answers to questions you might have about this process: 

What will happen to my organization’s saved data? 
The data will be transferred from where it is stored today (Microsoft provided storage) to your organization’s Dataverse database. After verifying the transfer and customer confirmation, the data will be permanently deleted from the previous storage location (data will not be automatically deleted from your Azure blob storage). Note that data older than 90 days will not be migrated. 

What type of Dataverse storage will be used? 
Conversation intelligence uses two types of Dataverse storage: file storage for the recording and transcript files (unless they are stored in your org’s Azure blob storage), and database storage for the conversation intelligence insights. 

What are the expected implications of moving the data into Dataverse? 
Migrating data into your Dataverse will require free Dataverse storage space. See above for how to calculate the required storage space. 

Who will have access to the transferred data? 
Out-of-the-box security roles (such as Salesperson and Sales Manager) will automatically receive privileges to the new entities where the data is stored. If your org uses custom security roles, make sure you assign them the required privileges for the new Dataverse tables listed below. You can do this prior to the migration of the data. 

List of new Dataverse entities: 

  • Conversation Action item 
  • Conversation Aggregated Insights 
  • Conversation Comment 
  • Conversation Participant Insights 
  • Conversation Participant Sentiment 
  • Conversation Question 
  • Conversation Segment Sentiment 
  • Conversation Sentiment 
  • Conversation Signal 
  • Conversation Subject 
  • Conversation Summary Suggestion 
  • Conversation System Tag 
  • Conversation Tag 
  • Ocrecording 
  • Recording 
  • SCI Conversation 
  • Sci Environment Settings 
  • Sci User Settings 
  • Transcript 

Will users in my organization be able to continue using the conversation intelligence app? 

Once the data is migrated into Dataverse, the conversation intelligence app will no longer work. The aggregated conversation intelligence data will be available through a new Power BI based dashboard. 

How do I opt out of moving my organization’s existing data into Dataverse? 
You can opt out of moving the existing data by sending an email to CI-data-migration@microsoft.com before January 31st, 2024. If you choose to do so, the existing data of your organization which is currently saved in the Microsoft-provided storage will be permanently deleted by April 1st, 2024. 

What’s next? 
If you don’t opt out, your organization’s conversation intelligence data will be transferred to Dataverse between February 1st, 2024 and March 30th, 2024. You will receive an email confirming successful data migration. After the move to Dataverse, all new conversation intelligence data will be saved to Dataverse as well. 

Learn more

Understand the new conversation intelligence entities in Dataverse 

The post Prepare your organization for conversation intelligence data migration into Dataverse appeared first on Microsoft Dynamics 365 Blog.


Understand Microsoft Shifts settings as an end-user

Frontline managers have gained greater team-level control over the capabilities offered in Microsoft Shifts.

With the latest releases now available on the Shifts settings page, we have made updates to improve the end-user experience for frontline managers and workers. The updates are as follows:


             


Open shifts

Previously, when the Open Shifts setting was off, frontline managers could create but not publish open shifts. Also, they could view open and assigned shifts listed on their team’s schedule (including when workers are scheduled for time off).

Now, when the setting is turned off, frontline managers can’t create open shifts and can only view the assigned shifts on their team’s schedule (including scheduled time off).

See the differences between the past and new experiences for frontline managers:

[Image: Open shift updates]


             


Time-off requests

Previously, when the time-off request setting was turned off, frontline managers couldn’t assign time off to their team members; moreover, frontline workers couldn’t request time off.

Now, when the setting is turned off, frontline managers can continue to assign time off to their team members. However, frontline workers will not have the ability to create time-off requests while this setting remains off.

Your organization can still use Shifts as the place where frontline workers view their working and non-working schedules, even if Shifts is not your leave management tool.

See the new experience for frontline managers:

[Image: Time-off requests (after)]


             


Open shifts, swap shifts, offer shifts, and time-off requests

Previously, when any of the request-related settings was toggled from on to off, frontline managers couldn’t manage requests that had been submitted while the setting was on.

Now, frontline managers can directly manage previous requests on the Requests page, while frontline workers can view the status and details of their individual requests.

Read more about

Shifts settings: Manage settings in Shifts – Microsoft Support
Latest Shifts enhancements: Discover the latest enhancements in Microsoft Shifts – Microsoft Community Hub

Revolutionizing Demand Planning: Unleashing Cutting-Edge Features in Dynamics 365 Supply Chain Management’s January Update

Introduction:

The start of the new year has brought a wave of exciting enhancements to the Demand Planning module in Dynamics 365 Supply Chain Management. We’re thrilled to introduce you to five groundbreaking features that will redefine the way you approach demand planning. In this blog post, we’ll look at each feature, highlighting its benefits and showcasing live demos hosted by expert Anders Girke.

Feature 1: Edit on Total Level

The first feature in our January release is the revolutionary “Edit on Total Level” functionality. This empowers planners to expedite their planning workflows through effective edits on a broader scale. Let’s swiftly explore the advantages:

• Edit on Total Level: Accelerate planning with efficient edits on a larger scale.
• Date Filters: Navigate and analyze data effortlessly.
• Distribute Proportional Over Time: Streamline workflows with proportional changes.
• Allocate Proportional Amongst Dimensions: Optimize precision in planning.

             

Image: Edit on Total Level

Watch the video here: Edit on Total Level

            Feature 2: Filter in Transformation

            The second feature in our January release series is “Filter in Transformation.” This powerful tool allows precise data transformation for enhanced what-if analysis and forecasting on a focused dataset. Here are the key benefits:

            • Perform what-if forecasts on a filtered subset of data
            • Filter staging data prior to transformation
            • Ensure secure performance
            • Experiment with dimensions to refine your planning

            Witness the possibilities unfold as you perform What-if forecasts, filter staging data, ensure secure performance, and experiment with dimensions to refine your planning. Your demand planning just got a whole lot smarter!

            Image: Filter in Transformation

            Watch the video here: Filter in Transformation

            Feature 3: Comments

            The third installment of our January release series introduces “Comments.” This feature is set to transform collaboration and communication within the demand planning application. Key highlights include:

            • Enhanced Communication: Provide detailed explanations for changes, fostering transparency.
            • Real-time Collaboration: Facilitate consensus-building among team members.
            • Retrospective Analysis: Analyze changes retrospectively, identifying key decision points.
            • Organized Communication: A dedicated section for structured and organized discussions.

            Watch a live demo hosted by Anders Girke to see “Comments” in action and discover how it will elevate your team’s collaboration to new heights.

            Image: Comments

            Watch the video here: Comments

            Feature 4: System Administrator Role for Demand Planning

            In this release, we introduce the pivotal role of the System Administrator for Demand Planning. This role is responsible for installing the app, assigning roles, managing teams, and overseeing critical operations. Highlights include:

            • Role Level Access for Contributors: Empower limited users with the ability to view shared worksheets, create personalized views, and edit data within their permissions.
            • Row Level Access Rules: Define conditions for specific tables, columns, and operators for unparalleled flexibility.
            • Editing Demand Plans with Flexibility: Highlighting the power of role level access, added experience, and disaggregation in editing demand plans.

            Get a sneak peek into the upcoming February release, emphasizing the balance between limiting filters for optimal performance and ensuring an exceptional user experience.

             

            Image: Row Level Access

            Watch the video here: Role Level Access

            How to Get Started

            Ready to experience these game-changing features? Follow these steps to get started:

            1. Go to the Power Platform admin center.
            2. Choose the specific environment to open its details page.
            3. Click “Dynamics 365 apps” under Resources.
            4. In the apps list, find the Demand Planning app marked with “Update available” status.
            5. Select the respective row and click “Update” in the top ribbon.
            Image: Getting Started

            Watch the video here: Getting Started

            Conclusion

            The January release of Dynamics 365 Supply Chain Management Demand Planning brings transformative features, including “Edit on Total Level,” “Filter in Transformation,” and “Comments,” redefining the landscape for planners with tools that enhance efficiency and collaboration. The System Administrator role, Role Level Access for Contributors, Row Level Access Rules, and advanced security features position the platform as a robust and secure solution for demand planning needs. With increased flexibility in editing demand plans and promising additions in the upcoming February release, Dynamics 365 is shaping more streamlined and user-friendly demand planning experiences. Stay tuned for ongoing updates and enhancements that will continue to elevate your planning processes!


            North America Demand Planning Workshop

            Join us at the upcoming Demand Planning Workshop, hosted at Microsoft’s state-of-the-art facility in Redmond, WA (98052). This event is tailored to introduce the innovative Demand Planning application to both our valued Customers and Partners.

            Register here: Demand Planning Workshop

            The post Revolutionizing Demand Planning: Unleashing Cutting-Edge Features in Dynamics 365 Supply Chain Management’s January Update appeared first on Microsoft Dynamics 365 Blog.

            Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

            Expanding Copilot for Microsoft 365 to businesses of all sizes


            This article is contributed. See the original author and article here.

            We are updating our Microsoft Copilot product line-up with a new Copilot Pro subscription for individuals; expanding Copilot for Microsoft 365 availability to small and medium-sized businesses; and announcing no seat minimum for commercial plans.

            The post Expanding Copilot for Microsoft 365 to businesses of all sizes appeared first on Microsoft 365 Blog.



            Announcing automatic Copilot enablement in Customer Service

            This article is contributed. See the original author and article here.

            Starting January 19, 2024, Microsoft Copilot in Dynamics 365 Customer Service will be automatically installed and enabled in your Dynamics 365 Customer Service environment. This update installs the case summarization and conversation summarization features. These features are available to all users with a Dynamics 365 Customer Service Enterprise license; conversation summarization also requires a digital messaging or Voice add-on license.

            If your organization has already enabled Copilot in Customer Service, there will be no change to your environment.

            Key dates

            • Disclosure date: December 2023
              Administrators received a notification about the change in the Microsoft 365 admin center and Power Platform admin center.
            • Installation date: January 19 – February 2, 2024
              Copilot in Customer Service is installed and enabled by default.

            Please note that specific dates for messages and auto-installation will vary based on the geography of your organization. The date applicable to your organization is in the messages in Microsoft 365 admin center and Power Platform admin center. Copilot auto-installation will occur only if your organization is in a geography where all Copilot data handling occurs “in geo.” These regions are currently Australia, United Kingdom, and United States. Organizations where Copilot data handling does not occur “in geo” must opt in to cross-geo data transmission to receive these capabilities.

            What is Copilot in Dynamics 365 Customer Service?

            Copilot in Customer Service is a key part of the Dynamics 365 Customer Service experience. Copilot provides real-time, AI-powered assistance to help customer support agents solve issues faster. By relieving them from mundane tasks such as searching and note-taking, Copilot gives them time for more high-value interactions with customers. Contact center managers can also use Copilot analytics to view Copilot usage and better understand how it impacts the business.

            Why is Microsoft deploying this update?

            We believe this update presents a significant opportunity to fundamentally alter the way your organization approaches service by quickly improving and enhancing the agent experience. The feedback we have received from customers who are already using Copilot has been overwhelmingly positive. Generative AI-based service capabilities have a profound impact on efficiency and customer experience, leading to improved customer satisfaction. This update applies only to the Copilot summarization capabilities, which integrate with service workflows and require minimal change management.

            Learn more about Copilot in Dynamics 365 Customer Service

            For more information, read the documentation: Enable copilots and generative AI features – Power Platform | Microsoft Learn

            The post Announcing automatic Copilot enablement in Customer Service appeared first on Microsoft Dynamics 365 Blog.


            The Publisher failed to allocate a new set of identity ranges for the subscription

            This article is contributed. See the original author and article here.

            Problem:


            ===========


            Assume that you have tables with identity columns declared as data type INT, and that you are using automatic identity range management for those articles in a Merge publication.


            This Publication has one or more subscribers and you tried to re-initialize one subscriber using a new Snapshot.


            Merge agent fails with this error:


            >>
            Source:  Merge Replication Provider
            Number:  -2147199417
            Message: The Publisher failed to allocate a new set of identity ranges for the subscription. This can occur when a Publisher or a republishing Subscriber has run out of identity ranges to allocate to its own Subscribers or when an identity column data type does not support an additional identity range allocation. If a republishing Subscriber has run out of identity ranges, synchronize the republishing Subscriber to obtain more identity ranges before restarting the synchronization. If a Publisher runs out of identit


             


             


            Cause:


            ============


            The identity range the Merge Agent is trying to allocate exceeds the maximum value an INT data type can hold (2,147,483,647).


             


            Resolution


            =================


            Assume that the publisher database has only one Merge publication with 2 subscribers, and that your merge articles have this definition:


            >>>
            exec sp_addmergearticle @publication = N'MergeRepl_ReproDB', @article = N'tblCity', @source_owner = N'dbo', @source_object = N'tblCity', @type = N'table', @description = N'', @creation_script = N'', @pre_creation_cmd = N'drop', @schema_option = 0x000000004C034FD1, @identityrangemanagementoption = N'auto', @pub_identity_range = 1000, @identity_range = 1000, @threshold = 90, @destination_owner = N'dbo', @force_reinit_subscription = 1, @column_tracking = N'false', @subset_filterclause = N'', @vertical_partition = N'false', @verify_resolver_signature = 1, @allow_interactive_resolver = N'false', @fast_multicol_updateproc = N'true', @check_permissions = 0, @subscriber_upload_options = 0, @delete_tracking = N'true', @compensate_for_errors = N'false', @stream_blob_columns = N'false', @partition_options = 0

            exec sp_addmergearticle @publication = N'MergeRepl_ReproDB', @article = N'tblCity1', @source_owner = N'dbo', @source_object = N'tblCity1', @type = N'table', @description = N'', @creation_script = N'', @pre_creation_cmd = N'drop', @schema_option = 0x000000004C034FD1, @identityrangemanagementoption = N'auto', @pub_identity_range = 1000, @identity_range = 1000, @threshold = 90, @destination_owner = N'dbo', @force_reinit_subscription = 1, @column_tracking = N'false', @subset_filterclause = N'', @vertical_partition = N'false', @verify_resolver_signature = 1, @allow_interactive_resolver = N'false', @fast_multicol_updateproc = N'true', @check_permissions = 0, @subscriber_upload_options = 0, @delete_tracking = N'true', @compensate_for_errors = N'false', @stream_blob_columns = N'false', @partition_options = 0


             


             


            You can run this query against the published database to see which articles' ranges are full or have very few values left:


            >>>
            select a.name,
                   max_used = max_used,
                   diff_pub_range_end_max_used = range_end - max_used, -- this tells how many values are left
                   pub_range_begin = range_begin,
                   pub_range_end = range_end
            from dbo.MSmerge_identity_range b,
                 sysmergearticles a
            where a.artid = b.artid
              and is_pub_range = 1
            order by max_used desc


             


             


            name           max_used     diff_pub_range_end_max_used  pub_range_begin  pub_range_end
            -------------- ------------ ---------------------------- ---------------- -------------
            tblCity        2147483647   0                            2147477647       2147483647
            tblCity1       6001         2147477646                   1                2147483647


             


             


             


            As you can see above, the diff_pub_range_end_max_used column is zero for tblCity.


            When the Merge Agent runs, it has to allocate 2 ranges for each server involved.


            In the example above we have a Publisher and 2 subscribers, and @identity_range is 1000. So we have to allocate ranges for 3 servers, i.e., 3 * (2 * 1000) = 6000 values.


            diff_pub_range_end_max_used must be greater than 6000; only then can a new range be allocated for all the servers.
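The arithmetic above can be sketched as a quick check (a minimal Python sketch; the server count and @identity_range value come from the example, and the can_allocate helper is illustrative, not a replication API):

```python
# Check whether the publisher's remaining identity range can cover a new
# allocation round. Values mirror the example above.
IDENTITY_RANGE = 1000   # @identity_range from sp_addmergearticle
SERVERS = 3             # publisher + 2 subscribers
RANGES_PER_SERVER = 2   # the Merge Agent allocates 2 ranges per server

required = SERVERS * RANGES_PER_SERVER * IDENTITY_RANGE
print(required)  # 6000

def can_allocate(range_end: int, max_used: int, needed: int) -> bool:
    """True if diff_pub_range_end_max_used leaves room for a new allocation."""
    return (range_end - max_used) > needed

# tblCity from the query output: max_used equals pub_range_end, so no room left.
print(can_allocate(2147483647, 2147483647, required))  # False
# tblCity1 still has plenty of headroom.
print(can_allocate(2147483647, 6001, required))  # True
```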


             


            To resolve the issue:


             



            1. Remove the tblCity table from the publication.

            2. Change the data type from int to bigint and add the table back to the publication.

            3. Generate a new snapshot. It will generate snapshots for all articles, but only this one table will be added back to the existing Subscribers.

            Future-Proofing AI: Strategies for Effective Model Upgrades in Azure OpenAI

            This article is contributed. See the original author and article here.

            TL;DR: This post navigates the intricate world of AI model upgrades, with a spotlight on Azure OpenAI’s embedding models like text-embedding-ada-002. We emphasize the critical importance of consistent model versioning to ensure accuracy and validity in AI applications. The post also addresses the challenges and strategies essential for effectively managing model upgrades, focusing on compatibility and performance testing.


             


            Introduction


            What are Embeddings?


             


            Embeddings in machine learning are more than just data transformations. They are the cornerstone of how AI interprets the nuances of language, context, and semantics. By converting text into numerical vectors, embeddings allow AI models to measure similarities and differences in meaning, paving the way for advanced applications in various fields.
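As a concrete illustration, similarity between embedding vectors is typically measured with cosine similarity. Below is a minimal Python sketch using tiny hand-made 3-dimensional vectors as stand-ins for real embeddings (models like text-embedding-ada-002 return 1536-dimensional vectors):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embeddings of three phrases.
cat = [0.9, 0.1, 0.0]
kitten = [0.85, 0.15, 0.05]
invoice = [0.0, 0.2, 0.95]

# Semantically close phrases end up with higher cosine similarity.
print(cosine_similarity(cat, kitten) > cosine_similarity(cat, invoice))  # True
```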


             


            Importance of Embeddings


             


            In the complex world of data science and machine learning, embeddings are crucial for handling intricate data types like natural language and images. They transform these data into structured, vectorized forms, making them more manageable for computational analysis. This transformation isn’t just about simplifying data; it’s about retaining and emphasizing the essential features and relationships in the original data, which are vital for precise analysis and decision-making.


            Embeddings significantly enhance data processing efficiency. They allow algorithms to swiftly navigate through large datasets, identifying patterns and nuances that are difficult to detect in raw data. This is particularly transformative in natural language processing, where comprehending context, sentiment, and semantic meaning is complex. By streamlining these tasks, embeddings enable deeper, more sophisticated analysis, thus boosting the effectiveness of machine learning models.


             


            Implications of Model Version Mismatches in Embeddings


             


            Let’s discuss the potential impacts and challenges that arise when different versions of embedding models are used within the same domain, focusing specifically on Azure OpenAI embeddings. When embeddings generated by one version of a model are applied to or compared with data processed by a different version, various issues can arise. These issues are not only technical but also have practical implications for the efficiency, accuracy, and overall performance of AI-driven applications.


             


            Compatibility and Consistency Issues



            • Vector Space Misalignment: Different versions of embedding models might organize their vector spaces differently. This misalignment can lead to inaccurate comparisons or analyses when embeddings from different model versions are used together.

            • Semantic Drift: Over time, models might be trained on new data or with updated techniques, causing shifts in how they interpret and represent language (semantic drift). This drift can cause inconsistencies when integrating new embeddings with those generated by older versions.
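To see why vector space misalignment matters, the toy sketch below models a "new version" as a rotation of the old space. Real model updates differ in far more complex ways, but the effect on cross-version comparisons is similar: the numbers stop meaning anything.

```python
import math

def cos(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def rotate90(v: list[float]) -> list[float]:
    # Stand-in for a newer model version: same information, different axes.
    x, y = v
    return [-y, x]

# "Model v1" embeddings of a document and a near-synonymous query.
doc_v1 = [0.95, 0.05]
query_v1 = [0.9, 0.1]

# The same query re-embedded by "model v2" (a rotated space).
query_v2 = rotate90(query_v1)

print(cos(doc_v1, query_v1))  # high: both vectors from the same version
print(cos(doc_v1, query_v2))  # near zero: mixing versions destroys the signal
```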


             


            Impact on Performance



            • Reduced Accuracy: Inaccuracies in semantic understanding or context interpretation can occur when different model versions process the same text, leading to reduced accuracy in tasks like search, recommendation, or sentiment analysis.

            • Inefficiency in Data Processing: Mismatches in model versions can require additional computational resources to reconcile or adjust the differing embeddings, leading to inefficiencies in data processing and increased operational costs.


             


            Best Practices for Upgrading Embedding Models


             


            Upgrading Embedding – Overview


             


            Now let’s move on to the process of upgrading an embedding model, focusing on the steps you should take before making a change, important questions to consider, and key areas for testing.


            Pre-Upgrade Considerations




            • Assessing the Need for Upgrade:



              • Why is the upgrade necessary?

              • What specific improvements or new features does the new model version offer?

              • How will these changes impact the current system or process?




            • Understanding Model Changes:



              • What are the major differences between the current and new model versions?

              • How might these differences affect data processing and results?




            • Data Backup and Version Control:



              • Ensure that current data and model versions are backed up.

              • Implement version control to maintain a record of changes.
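One way to act on the version-control point is to store the producing model and version alongside every embedding, and refuse to mix versions at query time. A minimal sketch (the StoredEmbedding record and the check are illustrative, not part of any Azure SDK):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StoredEmbedding:
    """An embedding record that carries the model version that produced it."""
    vector: tuple
    model: str          # e.g. "text-embedding-ada-002"
    model_version: str  # the deployed model version used at generation time

def check_same_version(a: StoredEmbedding, b: StoredEmbedding) -> None:
    """Refuse to compare embeddings produced by different model versions."""
    if (a.model, a.model_version) != (b.model, b.model_version):
        raise ValueError(
            f"version mismatch: {a.model}/{a.model_version} "
            f"vs {b.model}/{b.model_version}"
        )

old = StoredEmbedding((0.1, 0.9), "text-embedding-ada-002", "1")
new = StoredEmbedding((0.2, 0.8), "text-embedding-ada-002", "2")
try:
    check_same_version(old, new)
except ValueError as e:
    print("blocked:", e)
```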




            Questions to Ask Before Upgrading




            • Compatibility with Existing Systems:



              • Is the new model version compatible with existing data formats and infrastructure?

              • What adjustments, if any, will be needed to integrate the new model?




            • Cost-Benefit Analysis:



              • What are the anticipated costs (monetary, time, resources) of the upgrade?

              • How do these costs compare to the expected benefits?




            • Long-Term Support and Updates:



              • Does the new model version have a roadmap for future updates and support?

              • How will these future changes impact the system?




            Key Areas for Testing




            • Performance Testing:



              • Test the new model version for performance improvements or regressions.

              • Compare accuracy, speed, and resource usage against the current version.




            • Compatibility Testing:



              • Ensure that the new model works seamlessly with existing data and systems.

              • Test for any integration issues or data format mismatches.




            • Fallback Strategies:



              • Develop and test fallback strategies in case the new model does not perform as expected.

              • Ensure the ability to revert to the previous model version if necessary.
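A fallback strategy can be as simple as trying the upgraded deployment first and reverting to the previous one on failure. The sketch below simulates this; the deployment names and the embed() stub are hypothetical stand-ins for real Azure OpenAI calls, not actual API usage.

```python
PRIMARY = "embedding-deployment-v2"    # hypothetical upgraded deployment
FALLBACK = "embedding-deployment-v1"   # hypothetical previous deployment

def embed(text: str, deployment: str) -> list[float]:
    # Placeholder for a real call to the embeddings endpoint.
    # Here we simulate the new deployment failing.
    if deployment == PRIMARY:
        raise RuntimeError("simulated outage of the new deployment")
    return [0.1, 0.2, 0.3]

def embed_with_fallback(text: str) -> tuple[list[float], str]:
    """Return (vector, deployment_used); revert to FALLBACK on failure."""
    try:
        return embed(text, PRIMARY), PRIMARY
    except RuntimeError:
        return embed(text, FALLBACK), FALLBACK

vector, used = embed_with_fallback("hello")
print(used)  # the fallback deployment, in this simulation
```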




            Post-Upgrade Best Practices




            • Monitoring and Evaluation:



              • Continuously monitor the system’s performance post-upgrade.

              • Evaluate whether the upgrade meets the anticipated goals and objectives.




            • Feedback Loop:



              • Establish a feedback loop to collect user and system performance data.

              • Use this data to make informed decisions about future upgrades or changes.




            Upgrading Embedding – Conclusion


            Upgrading an embedding model involves careful consideration, planning, and testing. By following these guidelines, customers can ensure a smooth transition to the new model version, minimizing potential risks and maximizing the benefits of the upgrade.


            Use Cases in Azure OpenAI and Beyond


            Embeddings can significantly enhance the performance of various AI applications by enabling more efficient data handling and processing. Here’s a list of use cases where embeddings can be effectively utilized:




            1. Enhanced Document Retrieval and Analysis: By first performing embeddings on paragraphs or sections of documents, you can store these vector representations in a vector database. This allows for rapid retrieval of semantically similar sections, streamlining the process of analyzing large volumes of text. When integrated with models like GPT, this method can reduce the computational load and improve the efficiency of generating relevant responses or insights.




            2. Semantic Search in Large Datasets: Embeddings can transform vast datasets into searchable vector spaces. In applications like eCommerce or content platforms, this can significantly improve search functionality, allowing users to find products or content based not just on keywords, but on the underlying semantic meaning of their queries.




            3. Recommendation Systems: In recommendation engines, embeddings can be used to understand user preferences and content characteristics. By embedding user profiles and product or content descriptions, systems can more accurately match users with recommendations that are relevant to their interests and past behavior.




            4. Sentiment Analysis and Customer Feedback Interpretation: Embeddings can process customer reviews or feedback by capturing the sentiment and nuanced meanings within the text. This provides businesses with deeper insights into customer sentiment, enabling them to tailor their services or products more effectively.




            5. Language Translation and Localization: Embeddings can enhance machine translation services by understanding the context and nuances of different languages. This is particularly useful in translating idiomatic expressions or culturally specific references, thereby improving the accuracy and relevancy of translations.




            6. Automated Content Moderation: By using embeddings to understand the context and nuance of user-generated content, AI models can more effectively identify and filter out inappropriate or harmful content, maintaining a safe and positive environment on digital platforms.




            7. Personalized Chatbots and Virtual Assistants: Embeddings can improve a virtual assistant’s or chatbot’s understanding of user queries, leading to more accurate and contextually appropriate responses and enhancing the user experience. With similar logic, they can help route natural language requests to specific APIs; see the CompactVectorSearch repository for an example.




            8. Predictive Analytics in Healthcare: In healthcare data analysis, embeddings can help in interpreting patient data, medical notes, and research papers to predict trends, treatment outcomes, and patient needs more accurately.




            In all these use cases, the key advantage of using embeddings is their ability to process and interpret large and complex datasets more efficiently. This not only improves the performance of AI applications but also reduces the computational resources required, especially for high-cost models like GPT. This approach can lead to significant improvements in both the effectiveness and efficiency of AI-driven systems.
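The semantic search use case above can be sketched end to end with a toy in-memory "vector database"; a production system would use a dedicated vector store and model-generated embeddings rather than the hand-made vectors below.

```python
import math

def cos(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Toy catalog: item name -> embedding. Stand-ins for real model output.
catalog = {
    "running shoes": [0.9, 0.1, 0.0],
    "trail sneakers": [0.8, 0.2, 0.1],
    "coffee maker": [0.0, 0.1, 0.9],
}

def search(query_vec: list[float], k: int = 2) -> list[str]:
    """Return the k catalog items closest to the query embedding."""
    ranked = sorted(catalog, key=lambda name: cos(query_vec, catalog[name]),
                    reverse=True)
    return ranked[:k]

# A query embedding near the "shoe" region of the space.
print(search([0.85, 0.15, 0.05]))  # ['running shoes', 'trail sneakers']
```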


            Specific Considerations for Azure OpenAI



            • Model Update Frequency: Understanding how frequently Azure OpenAI updates its models and the nature of these updates (e.g., major vs. minor changes) is crucial.

            • Backward Compatibility: Assessing whether newer versions of Azure OpenAI’s embedding models maintain backward compatibility with previous versions is key to managing version mismatches.

            • Version-Specific Features: Identifying features or improvements specific to certain versions of the model helps in understanding the potential impact of using mixed-version embeddings.


            Strategies for Mitigation



            • Version Control in Data Storage: Implementing strict version control for stored embeddings ensures that data remains consistent and compatible with the model version used for its generation.

            • Compatibility Layers: Developing compatibility layers or conversion tools to adapt older embeddings to newer model formats can help mitigate the effects of version differences.

            • Baseline Tests: Create a few simple baseline tests that would identify any drift in the embeddings.
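A baseline test of the kind suggested above can be as small as asserting that a known related pair stays more similar than a known unrelated pair after an upgrade. The embed_v_next() stub below is a hypothetical stand-in for the upgraded model's endpoint:

```python
import math

def cos(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Baseline pairs whose relative similarity should be stable across upgrades:
# (related pair, unrelated pair).
RELATED = ("refund my order", "return an item")
UNRELATED = ("refund my order", "weather today")

def embed_v_next(text: str) -> list[float]:
    # Stub: pretend the new model preserves the old semantic ordering.
    table = {
        "refund my order": [0.9, 0.1],
        "return an item": [0.85, 0.2],
        "weather today": [0.05, 0.95],
    }
    return table[text]

def drift_check() -> bool:
    """The related pair must stay more similar than the unrelated pair."""
    related = cos(embed_v_next(RELATED[0]), embed_v_next(RELATED[1]))
    unrelated = cos(embed_v_next(UNRELATED[0]), embed_v_next(UNRELATED[1]))
    return related > unrelated

print(drift_check())  # True
```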


            Azure OpenAI Model Versioning: Understanding the Process


            Azure OpenAI provides a systematic approach to model versioning, applicable to models like text-embedding-ada-002:




            1. Regular Model Releases:





            2. Version Update Policies:



              • Options for auto-updating to new versions or deploying specific versions.

              • Customizable update policies for flexibility.

              • Details on update options.




            3. Notifications and Version Maintenance:





            4. Upgrade Preparation:



              • Recommendations to read the latest documentation and test applications with new versions.

              • Importance of updating code and configurations for new features.

              • Preparing for version upgrades.




            Conclusion


            Model version mismatches in embeddings, particularly in the context of Azure OpenAI, pose significant challenges that can impact the effectiveness of AI applications. Understanding these challenges and implementing strategies to mitigate their effects is crucial for maintaining the integrity and efficiency of AI-driven systems.


             


            References



            1. “Learn about Azure OpenAI Model Version Upgrades.” Microsoft Tech Community. Link

            2. “OpenAI Unveils New Embedding Model.” InfoQ. Link

            3. “Word2Vec Explained.” Guru99. Link

            4. “GloVe: Global Vectors for Word Representation.” Stanford NLP. Link