Empowering Business innovation for modernizing Retailers and manufacturers with Copilot: Unleashing Dynamics 365 Commerce investments for 2024 Wave 1


This article is contributed. See the original author and article here.

Editor: Denis Conway

Introduction:

Dynamics 365 Commerce is a comprehensive omnichannel solution that empowers retailers to deliver personalized, seamless, and differentiated shopping experiences across physical and digital channels. In the 2024 release Wave 1, Dynamics 365 Commerce continues to innovate and enhance its capabilities to improve store associate productivity and meet the evolving needs of customers and businesses. Here are some of the highlights of the new features coming soon:

Copilot in Site builder is going global and multi-lingual:

Copilot in Site builder is a generative AI assistant that helps users create engaging and relevant content for their e-commerce sites. Copilot uses the product information and the user’s input to generate product enrichment content that is crafted using brand tone and tailored for targeted customer segments.

Image: Copilot Site Builder

In the 2024 release wave 1, Copilot in Site builder is expanding its language support to 23 additional locales, including German, French, Spanish, and more. This feature demonstrates Microsoft’s commitment to making Copilot accessible globally and empowering users to create multilingual content with ease and efficiency.

Strengthening our dedication to creating a comprehensive B2B solution for Digital Commerce by supporting B2B indirect commerce

Dynamics 365 Commerce supports both B2C and B2B commerce scenarios, enabling retailers to sell directly to consumers and businesses. In the 2024 release wave 1, Dynamics 365 Commerce fortifies its B2B investments by introducing support for B2B indirect commerce, which enables manufacturers selling through a network of distributors to get complete visibility into their sales and inventory.

Image: New distributor capabilities

New distributor capabilities enable manufacturers to provide a self-service platform that simplifies distributor operations and builds meaningful, long-lasting business relationships through efficient and transparent transactions. Distributors can access product catalogs and pricing specific to their partner agreements, manufacturers can place orders on behalf of their customers with a specific distributor, and outlets can track order status and history.

Dynamics 365 Commerce also streamlines multi-outlet ordering, enabling business buyers that are associated with more than one outlet organization to buy for all of them. Commerce provides the ability to seamlessly buy for multiple organizations using the same email account, enabling buyers to be more efficient.

Image: Order for Organizations

Additionally, Dynamics 365 Commerce supports advance ordering, which is a common practice in some businesses to order products in advance to ensure they have adequate stock when needed. This feature enables customers to specify the desired delivery date and include additional order notes.

We are also introducing support for a promotions page on an e-commerce site that serves as a hub to showcase the various deals and promotions that shoppers can take advantage of. The promotions page can display both active and upcoming promotions.

Image: Promotions Page
Adyen Tap to Pay is coming to Store Commerce app on iOS

The Store Commerce app is a mobile point of sale (POS) solution that enables store associates to complete transactions through a mobile device on the sales floor, pop-up store, or remote location. The Store Commerce app supports various payment methods, such as cash, card, gift card, and loyalty points.

Image: Adyen Tap to Pay

In the 2024 release wave 1, Dynamics 365 Commerce is introducing Adyen Tap to Pay capabilities into the Store Commerce app for iOS, so that retailers everywhere can accept payments directly on Apple iPhones. Adyen Tap to Pay enhances the utility and versatility of the Store Commerce app, as it eliminates the need for additional hardware or peripherals to process payments. It also enables retailers to offer a more customer-centric and engaging in-store retail experience, as store associates can interact with customers anywhere in the store and complete transactions on the spot.

Speed up your checkout process with simplified and consistent payment workflows for different payment methods on Store Commerce app

Efficiency and predictability are key to the smooth operation of a point of sale (POS) system, especially when it comes to payment processing. When store associates can process customer payments across a variety of payment types with minimal friction, customers spend less time waiting and more time shopping.

In the 2024 release wave 1, Dynamics 365 Commerce is improving the POS payment processing user experience to create more consistent workflows across payment types. The new user experience simplifies the payment selection and confirmation process, reduces the number of clicks and screens, and provides clear feedback and guidance to the store associate. The new user experience also supports split tendering, which allows customers to pay with multiple payment methods in a single transaction.

Image: Checkout process

The improved POS payment processing user experience will contribute to efficiencies in the checkout process and more satisfied customers. It will also reduce the training time and effort for store associates, as they can easily learn and master the payment workflows.

Enabling retailers to effectively monitor and track inventory of Batch-controlled products via Store Commerce app

Batch-controlled products are products that are manufactured in batches and associated with unique identifiers for quality control and traceability. Batch-controlled products are commonly used in the food, chemical, and electronics industries, where the quality and safety of the products are critical.

Image: Batch-controlled products

In the 2024 release wave 1, Dynamics 365 Commerce enhances the Store Commerce app to support batch-controlled products. This feature enables store associates to scan or enter the batch number of the products during the sales or return transactions and validate the batch information against the inventory records. This feature also enables store associates to view the batch details of the products, such as the expiration date, manufacture date, and lot number.

With these new features, Dynamics 365 Commerce aims to provide you with the best tools and solutions to grow your business and delight your customers. Whether you want to create engaging and relevant content for your e-commerce site, automate and integrate your order management workflows, expand your B2B commerce opportunities, or improve your payment processing and inventory management, Dynamics 365 Commerce has something new for you.

To learn more about Dynamics 365 Commerce:

Learn more about these and additional investments, along with their timelines, in the release plans.

Visit our Dynamics 365 Commerce website today.

The post Empowering Business innovation for modernizing Retailers and manufacturers with Copilot: Unleashing Dynamics 365 Commerce investments for 2024 Wave 1 appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Azure Permissions 101: How to manage Azure access effectively


This article is contributed. See the original author and article here.

While onboarding customers to Azure, they often ask what permissions they need to assign to their IT Ops teams or to partners. I have also seen customers get confused when we ask for an Azure AD permission for some task: they say they have already provided Owner access on the Azure subscription, and they wonder why an Azure AD permission is required and how the two are related. So I thought of writing this blog to explain the different permission domains you encounter when using Azure.


We will talk about these RBAC domains:


 




  • Classic Roles
  • Azure RBAC Roles
  • Azure AD Roles
  • EA RBAC
  • MCA RBAC
  • Reserved Instance RBAC




 


Classic Roles


 


Let us talk about classic RBAC first. When I used to work in the Azure classic portal, there were fewer roles: mostly Account Administrator, Co-Administrator, and Service Administrator. The person who created the subscription became the Service Administrator, and if that person wanted to share admin privileges, they assigned the Co-Administrator role to someone else.


So when you go to the Subscription > Access control (IAM) blade, you will still see this. I have seen customers who are trying to grant Owner access use the Add co-administrator button instead. Now you know the difference: this button is not meant for giving someone access to ARM resources.


 


Picture1.jpg


 


Azure RBAC


 


Let us talk about ARM RBAC now. When we moved from classic to Azure RBAC, we got much more fine-grained access control. Each service has its own roles, e.g. Virtual Machine Contributor for managing VMs, Network Contributor for managing networks, and so on. The user identity is stored in Azure AD itself, but the permissions are maintained at the subscription, resource group, management group, or resource level.


Each RBAC role has Actions, which define what the role is allowed to perform.


 


Picture2.jpg


The actions are part of the control plane, which gives you access to manage the service and its settings or configuration. We also have data plane actions, which give you access to the actual data. Let us take Azure Blob Storage as an example: if you have the Reader role, you can see the resource itself, but you will not be able to see the actual data in blob storage if you authenticate via Azure AD. If you want to see the actual data, you can have the Storage Blob Data Contributor role assigned to your ID. Similarly, other services such as Azure Key Vault and Service Bus also expose data actions.
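As a quick illustration of a data plane role assignment, here is a minimal Azure CLI sketch that grants the Storage Blob Data Contributor role on a single storage account; the user, subscription ID, resource group, and account name are placeholders.

# Assign a data plane role (Storage Blob Data Contributor) scoped to one storage account.
# The assignee, subscription ID, resource group, and account name below are placeholders.
az role assignment create \
  --assignee "user@contoso.com" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mystorageacct"

After this assignment, the user can work with the blob data when authenticating via Azure AD, whereas the Reader role alone would only show the account itself.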


The scopes at which these RBAC roles can be assigned (resource, resource group, subscription, or management group level) are another discussion, which I will cover in another blog post.


 


Azure AD Roles


 


This domain is used when you deal with Azure AD itself or with services whose roles are stored in Azure AD, such as SharePoint, Exchange, or Dynamics 365. Dealing with Azure AD roles may be required in multiple situations, for example when using a service that creates service principals in the backend, such as app registrations. Services like Azure Migrate and Azure Site Recovery would require Azure AD permissions to be assigned to your ID.


This RBAC domain is separate from Azure RBAC; these roles are stored in Azure AD itself and managed centrally from the Roles and administrators blade.


Picture3.jpg


The person who created the tenant gets the Global Administrator role, and from there we have fine-grained access based on the other roles.


 


Though Azure AD roles are different from the Azure RBAC roles we assign on subscriptions, a Global Administrator can elevate their access and gain access to all the subscriptions in the tenant through a toggle.


Picture4.jpg


 


Once you enable this toggle, you get the User Access Administrator role at the root scope, under which all management groups are created. As a result, you can access all the subscriptions.


 


This is a rare and exceptional procedure that requires consultation with your internal team and a clear justification for its activation.
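For reference, the same elevation can also be triggered from the command line by calling the elevate access API; below is a minimal sketch using az rest while signed in as the Global Administrator (verify the API version against the current documentation before relying on it).

# Elevate the signed-in Global Administrator to User Access Administrator at the root scope "/".
az rest --method post \
  --url "/providers/Microsoft.Authorization/elevateAccess?api-version=2016-07-01"

# Confirm the resulting role assignment at the root scope.
az role assignment list --scope "/" --role "User Access Administrator" --output table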


 


EA RBAC


 


If you are an enterprise customer and have signed an Enterprise Agreement (EA) with Microsoft, then in order to create subscriptions and manage billing you need to log on to the EA portal, which has now moved into the Azure portal. Hence we have a set of six RBAC roles that can be used from the Cost Management + Billing section in the Azure portal:




  • Enterprise administrator
  • EA purchaser
  • Department administrator
  • Account owner
  • Service administrator
  • Notification contact




 


Which set of permissions is assigned at which level of the hierarchy is explained in the image below, which is copied from the Microsoft Learn documentation mentioned below.


 


Picture5.jpg


 


Below is a sample screenshot of what you see when you open the Cost Management + Billing section of the portal. Here you will see accounts, departments, and subscriptions.


 


Picture6.jpg


 


MCA RBAC


 


If you have purchased a Microsoft Customer Agreement (MCA), then you get a hierarchy at which permissions can be assigned. Top-level permissions are assigned at the billing account scope, and then at the billing profile level.


 


Picture7.jpg


Billing account owner and Billing profile owner are the most common roles you will use. More roles are mentioned in the article below, which you can go through.
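If you want to see the billing scopes that these roles apply to, the Azure CLI includes billing commands; here is a brief sketch (a sketch only, assuming your CLI version exposes these commands, so verify with az billing -h).

# List the billing accounts you can see, then the billing profiles under one of them.
az billing account list --output table
az billing profile list --account-name "<billing-account-name>" --output table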


 


Reserved Instance RBAC


 


A common question I get from customers: “I have Contributor/Owner access to the subscription, but I still do not see the reserved instance that my colleague purchased.” A few years back, the person who purchased a reservation was the one who granted access to others by going to the individual reservation. This is still possible, but now you can also get access to all reservations in the tenant.


By default, a reservation can be seen and managed by the admin who purchased it, by an EA admin, or by a person with the Reservations Administrator role.


 


Picture8.jpg


 


Picture9.jpgPicture10.jpg


 


You can do this via PowerShell too; check this document for more information.
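As a rough illustration only, and assuming the tenant-level reservations scope accepts role assignments from the Azure CLI the same way it does from PowerShell (please verify in the linked documentation first), the assignment would look something like this:

# Hypothetical sketch: grant a user access to all reservations in the tenant.
# The assignee is a placeholder, and the tenant-wide scope should be confirmed against the docs.
az role assignment create \
  --assignee "user@contoso.com" \
  --role "Reservations Administrator" \
  --scope "/providers/Microsoft.Capacity"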


 


More information about who gets access to reserved instances is mentioned in the article below.


https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/view-reservations


Happy Learning!

You can find more such articles at www.azuredoctor.com

Microsoft Fabric AI Hack Together: Building a RAG Application on Microsoft Fabric & Azure OpenAI

This article is contributed. See the original author and article here.

Hack Together: The Microsoft Fabric Global AI Hack


The Microsoft Fabric Global AI Hack is your playground for creating and experimenting with Microsoft Fabric. With mentorship from Microsoft experts and access to the latest tech, you will learn how to build AI solutions with Microsoft Fabric! The possibilities are endless for what you can create… plus you can submit your hack for a chance to win exciting prizes!


Join the Microsoft Fabric Global AI Hackathon









Learn how to create amazing apps with RAG and Azure OpenAI



Are you ready to hack and build a RAG application using Fabric and Azure OpenAI?
 


Join us for the Fabric AI Hack Together event and learn the concepts behind RAG and how to use them effectively to empower your data with AI.


You’ll get to hear from our own experts Pamela Fox (Principal Cloud Advocate at Microsoft) and Alvaro Videla Godoy (Senior Cloud Advocate at Microsoft), who will introduce you to the challenge, provide links to get started, and give you ideas and inspiration so you can start creating amazing AI solutions with minimal code and maximum impact.


You’ll also get to network with other hackers, mentors, and experts who will help you along the way. Come with ideas or come for inspiration; we’d love to hear what you’re planning to build!

How to resolve DNS issues with Azure Database for MySQL


This article is contributed. See the original author and article here.

If you’re using Azure Database for MySQL and have encountered issues with name resolution or the Domain Name System (DNS) when attempting to connect to your server from different sources and networks, then this blog post is for you! In the next sections, I’ll explain the causes of these types of issues and what you need to do to resolve them. 


 


What are DNS issues? 


DNS is a service that translates domain names (e.g., servername.mysql.database.azure.com) into IP addresses (e.g., 10.0.0.4) to make it easier for us to identify, remember, and access websites and servers.


 


However, at times the DNS service can fail to resolve the domain name to an IP address, or it might resolve it to the wrong IP address. This can result in errors such as “Host not known” or “Unknown host” when you specify the server name while making connections.


 


Diagnosing DNS issues 


To diagnose DNS issues, use tools such as Ping or nslookup to verify that the host name is being resolved from the source. To test using ping, for example, on the source, run the following command: 


 


ping servername.mysql.database.azure.com 

 


If the server’s name is not resolving, a response similar to the following should appear: 


RaviShankar_0-1706945289306.png


Fig 1: Ping request not returning IP 


 


To test using nslookup, on the source, run the following command: 


 


nslookup servername.mysql.database.azure.com 

 


Again, if the server name is not resolving, a response similar to the following should appear: 


RaviShankar_1-1706945289308.png


Fig 2: nslookup to DNS request not returning IP 


 


If on the other hand the commands return the correct IP address of the server, then the DNS resolution is working properly. If the commands return an error or a different IP address, then there is a DNS issue. 


 


To verify the correct IP address of the server, you can check the Private DNS zone of the Azure Database for MySQL Flexible server. The Private DNS zone is a service that provides name resolution for private endpoints within a virtual network (vNet). You can find the Private DNS zone in the properties of the overview blade of the server, as shown in the following figure: 


 


RaviShankar_2-1706945289309.png


 


Fig 3: Checking the private DNS zone in the Properties of overview blade 


 


In the Private DNS zone, you can see the currently assigned IP address to the MySQL Flexible server, as shown in the following figure: 


 


RaviShankar_3-1706945289310.png


 


Fig 4: Private DNS Zone overview 
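If you prefer the command line, you can check the same record by querying the Private DNS zone with the Azure CLI; the resource group and zone name below are placeholders, so substitute the values shown on your server's overview page.

# List the A records in the server's Private DNS zone (names are placeholders).
az network private-dns record-set a list \
  --resource-group my-rg \
  --zone-name servername.private.mysql.database.azure.com \
  --output table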


 


Resolving DNS issues 


The solution to fix DNS issues depends on the source and the network configuration of the server. In this blog, I will cover two common scenarios: when the source is using the default (Azure-provided) DNS, and when the source is using a custom DNS. 


 


Scenario 1: Source is using the default (Azure-provided) DNS 


The default (Azure-provided) DNS can only be used by sources in Azure that have a private endpoint, vNet integration, or IPs assigned from a vNet. If you are using the default DNS and you are getting a DNS issue, you need to check the following:


 



  • vNet of the source: Check the vNet of the source (also check the NIC-level configuration in the case of an Azure VM) and make sure that it is set to Azure-provided DNS. You can check this on the vNet > DNS servers blade, as shown in the following figure: 


RaviShankar_4-1706945289311.png


 


Fig 5: DNS servers blade in virtual network 


 



  • Private DNS zone of the server: Go to the Private DNS zone of the MySQL Flexible server and add the vNet of the source to the Virtual Network Link blade, as shown in the following figure: 


 


RaviShankar_5-1706945289312.png


Fig 6: Adding virtual network link to private DNS zone 


 


After these steps, you should be able to ping and nslookup the server’s name from the source and get the correct IP address. 
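If you would rather script these checks, here is a minimal Azure CLI sketch of both steps; all resource names and IDs are placeholders.

# 1) Verify the source vNet uses the default (Azure-provided) DNS:
#    an empty result here means Azure-provided DNS is in effect.
az network vnet show \
  --resource-group source-rg \
  --name source-vnet \
  --query "dhcpOptions.dnsServers"

# 2) Link the source vNet to the server's Private DNS zone (zone and vNet names are placeholders).
az network private-dns link vnet create \
  --resource-group mysql-rg \
  --zone-name servername.private.mysql.database.azure.com \
  --name source-vnet-link \
  --virtual-network "/subscriptions/<subscription-id>/resourceGroups/source-rg/providers/Microsoft.Network/virtualNetworks/source-vnet" \
  --registration-enabled false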


 


Scenario 2: Source is using a custom DNS 


This is the scenario most commonly used by customers. This pattern can be used in a hub-and-spoke model and also for name resolution from on-premises servers. In this scenario, a custom DNS server is deployed in a hub vNet that is linked to the on-premises DNS server. It can also be deployed without on-premises connectivity, as shown in the following figure: 


 


RaviShankar_6-1706945289313.png


Fig 7: Network diagram showing access through custom DNS server in Hub and Spoke network. 


 


In this scenario, the MySQL Flexible Server is deployed in a delegated subnet in Spoke2. Spoke1, Spoke2, and Spoke3 are connected through the Hub vNet. Spoke1 and Spoke3 have a custom DNS server configured, which is deployed in the Hub vNet. Since both spoke vNets (1 and 3) are connected through the Hub vNet, clients can connect directly to the MySQL Flexible Server by IP address only; DNS name resolution will not work until it is configured. 


 


To fix this issue, perform the following steps (a quick verification sketch follows the list):



  • Conditional forwarder: Add a conditional forwarder on the custom DNS for mysql.database.azure.com domain. This conditional forwarder must point to the Azure DNS IP address: 168.63.129.16, as shown in the following figure: 


RaviShankar_7-1706945289315.png


 


Fig 8: Adding conditional forwarder for mysql.database.azure.com 


 



  • Virtual network link: You need to add a virtual network link in the Private DNS zone for the custom DNS server’s vNet, as described in the previous scenario. 

  • On-premises DNS: If you have clients on-premises that need to connect to the Flexible server FQDN, then you need to add a conditional forwarder in the on-premises DNS server pointing to the IP address of the custom DNS server in Azure for mysql.database.azure.com. Alternatively, you can use the same custom DNS IP in additional DNS servers on on-premises clients. 
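Once the conditional forwarder and the virtual network link are in place, a quick way to verify resolution is to query the custom DNS server directly from a spoke or on-premises client; the IP address below is a placeholder for your custom DNS server.

# Resolve the server through the custom DNS server (10.0.0.10 is a placeholder).
nslookup servername.mysql.database.azure.com 10.0.0.10

# From a VM inside Azure, you can also query the Azure-provided resolver directly.
nslookup servername.mysql.database.azure.com 168.63.129.16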


 


Conclusion 


In this blog, I have shown you how to solve DNS issues with Azure Database for MySQL using different DNS scenarios. I hope this helps you to enjoy the benefits of using Azure Database for MySQL for your applications. 


 


We are always interested in how you plan to use Flexible Server deployment options to drive innovation to your business and applications. Additional information on topics discussed above can be found in the following documents: 


 



 


If you have any questions about the detail provided above, please leave a comment below or email us at AskAzureDBforMySQL@service.microsoft.com. Thank you!


 

MVP’s Favorite Content: Windows, Generative AI, M365, Azure


This article is contributed. See the original author and article here.

In this blog series dedicated to Microsoft’s technical articles, we’ll highlight our MVPs’ favorite article along with their personal insights.


 


Maison da Silva, Windows and Devices MVP, Brazil


Maison da Silva.jpg


create partition primary | Microsoft Learn


Resize-Partition (Storage) | Microsoft Learn


“Help technical users who had problem installing update KB5034441 2024-01-09 “0x80070643 – ERROR_INSTALL_FAILURE” Workaround: It might be necessary to increase the size of the WinRE partition in order to avoid this issue and complete the installation.”


*Relevant Blog: Redimensionamento Manual da Partição para Instalação da Atualização WinRE: Um Guia Passo a Passo – Maison da Silva


 


Tso Kar Ho, Microsoft Azure MVP, Hong Kong SAR


MVP Ejoe Tso.jpg


18 Lessons, Get Started Building with Generative AI


“The content in this repository is highly valuable for beginners, as it not only introduces the concepts of generative AI but also provides hands-on examples and code snippets to help users understand and implement the techniques. The lessons cover a wide range of topics, including language models, semantic search, transformers, and more, giving learners a holistic understanding of generative AI.


Moreover, the repository is actively maintained and has received significant community engagement, with over 23.6k stars and 15k forks. This level of community activity demonstrates the value and popularity of the content. Additionally, the presence of 50 contributors indicates a collaborative environment where users can benefit from the expertise and insights of others in the field.


 


I have personally found this content to be highly informative and well-structured, making it an ideal resource for individuals looking to explore generative AI. I believe that this repository will greatly benefit those who are new to the field and provide them with a solid foundation to build upon.


 


As an MVP, I have written a blog post discussing the significance of generative AI and its potential applications in healthcare. In the blog post, I have highlighted the “Generative AI for Beginners” repository as an excellent starting point for individuals interested in learning more about this field. I have shared practical examples and insights from the repository to showcase the practicality and versatility of generative AI in the healthcare domain.


 


Additionally, I have organized a virtual event for the local developer community, where I conducted a workshop using the lessons from the “Generative AI for Beginners” repository. The event aimed to introduce beginners to the world of generative AI and provide them with hands-on experience in building their own models. The event received positive feedback, with participants expressing their appreciation for the comprehensive and beginner-friendly content provided by Microsoft.”


 


Tetsuro Takao, Developer Technologies MVP, Japan


Tetsuro Takao.jpg


Microsoft 365 guidance for security & compliance – Service Descriptions | Microsoft Learn


“This is a content that compiles all the installation methods for implementing Microsoft 365 security and compliance, and can be used as a reference when writing work or blog articles. It is also a very useful content for planning and proposals, as it serves as a guideline for design and includes license information, which is helpful when planning the order of implementation and construction.”


(In Japanese: Microsoft 365のセキュリティ、コンプライアンスの実装についてすべての設置方法がまとまっており、仕事やブログ記事を書く際にリファレンス的に参照することが可能なコンテンツ。企画、提案の際にも設計の指針になり、ライセンス情報も同コンテンツに記載があるため導入構築順などを計画する際にも役立つ非常に有用なコンテンツ。)


*Relevant Activity: .NETラボ – connpass


 


Sou Ishizaki, Microsoft Azure MVP, Japan


Sou Ishizaki.jpg


Connecting to Azure Services on the Microsoft Global Network – Microsoft Community Hub


“This is a valuable article that delves deeper into the Microsoft Global Network and provides an answer to the question, ‘Does this connection architecture connect to the internet or not?’”


(In Japanese: Microsoft Global Network について一歩踏み込んで「この接続アーキテクチャはインターネットに出るのか否か」に答えを与えてくれる、ありがたい記事です。)

Power Platform Partners – We want to hear from you!

This article is contributed. See the original author and article here.

Power Platform and Low Code are a fast-growing practice area for many partners, and we want to hear from you about how you are building and expanding this practice and the types of services you offer, so that we can share broad trends and insights back with the partner community.


 


We are currently conducting a survey to identify and define the value partners realize through their relationship with Microsoft and Power Platform. This research is being done by IDC’s Channels and Alliances Research Group on behalf of Microsoft. Please use this link to take the survey: https://aka.ms/PowerPartnerProfitabilitySurvey. It takes approximately 10-15 minutes and we recommend that it is completed by your Power Platform practice lead. The questions included are related to Microsoft and Power Platform revenue growth, profit across resale, services and/or software, investments in your Power Platform practice, and Microsoft activities/programs that drive success.


 


We’re interested in learning about your practice development and the profitability of your Power Platform business. The information you provide will be aggregated and used in an IDC eBook, and will help Microsoft improve its partner strategy and programs related to Power Platform.


 


The deadline to submit is February 29. Thank you!

Sustainability in Dynamics 365 Business Central: A New Way to Measure and Manage Your Environmental Impact


This article is contributed. See the original author and article here.

Learn how the upcoming 2024 release wave 1 will enable you to track and report your greenhouse gas emissions with ease and accuracy.

Sustainability is more than a buzzword. It’s a global imperative that demands urgent action from all sectors of society, including the business world. As regulations and expectations around environmental reporting evolve, organizations need reliable and efficient tools to measure and manage their impact on the planet.

That’s why we’re excited to announce that Dynamics 365 Business Central will soon offer new sustainability features that will help you comply with the latest standards and best practices, as well as drive positive changes for your business and the environment. Starting from the 2024 release wave 1, you’ll be able to track and report your greenhouse gas (GHG) emissions across three scopes defined by the ESG standard, using sustainability journals and built-in calculation methods. You’ll also be able to leverage the new Chart of Sustainability Accounts to organize and analyze your emission data with ease and transparency.

The new functionality is designed to help oversee and regulate an organization’s environmental footprint by tracking various greenhouse gas (GHG) emissions and facilitating proper insights. This functionality will be a multi-wave investment, and in this first release we delivered a basic framework as a foundation for future expansion. The first version focuses on GHG emissions, and accordingly the solution covers the three emission scopes defined by the ESG standard. This feature supports the basic process of collecting emission data via sustainability journals, allowing for manual entry of known data or the use of built-in methods for calculating emission footprints.

Chart of Sustainability Accounts

The Chart of Sustainability Accounts forms the foundational structured list used for recording all emissions data. It functions as a comprehensive framework that categorizes and organizes these accounts based on their attributes, such as scope or other groupings. Each account is typically assigned a unique code or number for easy reference and tracking, following the same structure as a traditional Chart of Accounts but customized specifically for monitoring sustainability-related data and metrics within an organization.

This chart typically encompasses categories such as energy consumption, greenhouse gas emissions, waste generation, and other pertinent sustainability metrics. Users have the flexibility to add Account Categories and Subcategories to define how the system behaves, selecting dedicated emissions for tracking, emission factors, formulas, and similar configurations.

In essence, the Chart of Sustainability Accounts serves as the backbone of the Sustainability feature, facilitating effective tracking, measurement, and management of sustainability-related data and metrics within the organization.


Chart of Sustainability Accounts

Sustainability Journals

Sustainability Journals are designed to track and record sustainability-related activities and metrics within an organization, using the same user experience as other journals in Business Central. Within the journal, users have the option to input emissions manually if they possess the necessary information. Alternatively, if they lack this data, they can utilize built-in formulas to accurately calculate emissions based on specific known parameters corresponding to various types of sources and accounts.

When posting with a Sustainability Journal, entries are generated on the Sustainability Ledger. Similar to other journal types, users have the flexibility to utilize various templates and batches with specific configurations. They can opt for standard journals or recurring journals to manage sustainability data efficiently.


Sustainability Journal

Sustainability Entries

The Sustainability Ledger organizes all emission data according to the Chart of Sustainability Accounts. When a user posts the Sustainability Journal, all crucial data is recorded within the Sustainability Entries. Posting to these entries can be regulated by specific rules configured in the Sustainability Setup, and users can use dimensions in a manner consistent with other entries throughout Business Central. All reports are generated based on Sustainability Entries. Presently, there are several existing reports available for analyzing and tracking emissions. However, in future releases, Business Central will introduce additional reports for printing or submission to relevant authorities.

Future Development

As mentioned in the introduction, Microsoft will continue to enhance this feature by adding more horizontal capabilities and improving its connection with other features in Business Central, without focusing on industry-specific aspects. The development of new features will depend on research of the key elements for such solutions, compliance with regulatory requirements, and useful feedback from partners. Also, it is expected that ISV partners will use this basic framework to create industry-specific sustainability solutions.

The post Sustainability in Dynamics 365 Business Central: A New Way to Measure and Manage Your Environmental Impact appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Azure API Center: The First Look


This article is contributed. See the original author and article here.

Let’s say you work for a company that has a lot of APIs. You might have a few questions:


 



  • How do you manage the lifecycle of those APIs?

  • What governance would you apply to these APIs?

  • What is the list of environments needed to manage these APIs?

  • What are the deployment strategies for those APIs?

  • How would you integrate those APIs with other services?


 


As your company’s number of APIs increases, so does the complexity of managing them. Azure API Center (APIC) is a central repository for your API lifecycle management, and it offers more efficient ways to manage those APIs. Throughout this post, I will take a first look at what Azure API Center is and what it offers.


 



You can find sample code in this GitHub repository.



 


Prerequisites


 


There are a few prerequisites to use Azure APIC effectively:


 



 


API Center instance provisioning


 


There are three ways to provision an APIC instance:


 



 


I’m not going to discuss how to provision an APIC instance in detail in this article, but here is a reference you can use to do it yourself through Bicep: Azure API Center Sample.
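Alternatively, if you have the Azure CLI apic extension installed, provisioning can be done with a couple of commands. Treat the following as a sketch only: the command shape has changed between preview versions of the extension, so confirm the exact syntax with az apic -h.

# Hedged sketch: provision an API Center instance with the Azure CLI apic extension.
# Flags and command names may differ between preview versions; verify before use.
az extension add --name apic-extension
az apic service create \
    -g "my-resource-group" \
    -s "my-api-center" \
    -l "eastus"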


 


Register APIs to APIC


 


The purpose of using APIC is to manage your company’s APIs in a centralised manner. From design to deployment, APIC tracks their entire history. To register your APIs to APIC, you can use either the Azure CLI or the Azure Portal.


 


Let’s say there is a weather forecast API you have designed and developed. You have an OpenAPI document for the API, but it is not implemented yet. Let’s register the API to APIC.


 


 

az apic api register \
    -g "my-resource-group" \
    -s "my-api-center" \
    --api-location ./weather-forecast.json

 


 


Registering API through Azure CLI


 


If you want to register another API through the Azure Portal, you can do it by following the official documentation.


 


Registering API through Azure Portal


 


Import APIs from API Management to APIC


 


If you already have working APIs in Azure API Management (APIM), you can import them into APIC through the Azure CLI, but it requires a few more steps.


 




  1. First of all, you need to activate a managed identity on the APIC instance. It can be either a system-assigned or a user-assigned identity, but I’m going to use the system-assigned identity for now.

    az apic service update \
        -g "my-resource-group" \
        -s "my-api-center" \
        --identity '{ "type": "SystemAssigned" }'
    



  2. Then, get the principal ID of the APIC instance.

    APIC_PRINCIPAL_ID=$(az apic service show \
        -g "my-resource-group" \
        -s "my-api-center" \
        --query "identity.principalId" -o tsv)
    



  3. Now, register the APIC instance with the APIM instance by granting it the API Management Service Reader role.

    # -n should be the name of your APIM instance (placeholder below).
    APIM_RESOURCE_ID=$(az apim show \
        -g "my-resource-group" \
        -n "my-api-management" \
        --query "id" -o tsv)
    
    az role assignment create \
        --role "API Management Service Reader Role" \
        --assignee-object-id $APIC_PRINCIPAL_ID \
        --assignee-principal-type ServicePrincipal \
        --scope $APIM_RESOURCE_ID
    



  4. And finally, import APIs from APIM to APIC.

    az apic service import-from-apim \
        -g "my-resource-group" \
        -s "my-api-center" \
        --source-resource-ids "$APIM_RESOURCE_ID/apis/*"
    


    Importing API from APIM




 


Now you have registered and imported APIs into APIC. But simply registering those APIs does not accomplish much by itself. What’s next, then? Let’s play around with those APIs in Visual Studio Code.


 


View APIs on Visual Studio Code – Swagger UI


 


So, what can you do with the APIs registered and imported to APIC? You can view the list of APIs on Visual Studio Code. First, you need to install the Azure API Center extension on Visual Studio Code.


 


Once you install the extension, you can see the list of APIs on the extension. Choose one of the APIs and right-click on it. Then, you can see the context menu. Click on the Open API Documentation menu item.


 



 


You will see the Swagger UI page, showing your API document. With this Swagger UI, you can test your API endpoints.


 


Swagger UI


 


Test APIs on Visual Studio Code – Rest Client


 


Although you can test your API endpoints on the Swagger UI, you can also test them in a different way. For this, you need to install the Rest Client extension on Visual Studio Code.


 


After you install the extension, choose one of the APIs and right-click on it. Then, you can see the context menu. Click on the Generate HTTP File menu item.


 



 


Within the HTTP file, you can actually test your API endpoints with different payloads.


 


HTTP file


 


Generate client SDK on Visual Studio Code – Kiota


 


You can write up the client SDK by yourself. But it’s time consuming and fragile because the API can change at any time. But what if somebody or a tool creates the client SDK on your behalf?


 


One of the greatest features the APIC extension offers is generating client SDKs. You can generate the client SDKs for your APIs in different languages. Although the API itself has no implementation yet, you can still work with the client SDK, because you know what you need to send and what you will receive in return through the SDK. For this, you need to install the Kiota extension on Visual Studio Code.


 


After you install the extension, choose one of the APIs and right-click on it. Then, you can see the context menu. Click on the Generate API Client menu item.


 


Generate API client


 


Because I have a Blazor web application, I’m going to generate a C# client SDK for the API. The Kiota extension finds all the API endpoints from APIC. You can choose them all or just a few of them. Click the play button, and it generates the client SDK for you.


 


Kiota Explorer


 


Add the necessary information, such as the class name and namespace of the client SDK, and the output folder. Finally, it asks which language to generate the client SDK in. There are currently nine languages available. I’m going to choose C#.


 


Kiota - choose language


 


The Kiota extension then generates the client SDK into the designated directory.


 


API client SDK generated


 


Consume the generated client SDK within an application


 


Now the client SDK has been generated by the Kiota extension from APIC into my Blazor application. Because it uses the Kiota libraries, I need to install the following Kiota NuGet packages in my Blazor web application.


 


 

dotnet add ./src/WebApp/ package Microsoft.Kiota.Http.HttpClientLibrary
dotnet add ./src/WebApp/ package Microsoft.Kiota.Serialization.Form
dotnet add ./src/WebApp/ package Microsoft.Kiota.Serialization.Json
dotnet add ./src/WebApp/ package Microsoft.Kiota.Serialization.Text
dotnet add ./src/WebApp/ package Microsoft.Kiota.Serialization.Multipart

 


 


Add dependencies to the Program.cs file and update the Home.razor file to consume the client SDK. Then you will be able to see the result.


 


Pet Store - available pets


 


Your web application as the API consumer works perfectly with the client SDK generated from APIC.


 




 


So far, I’ve walked through how Azure API Center can handle your organisation’s APIs as a central repository, and played around with the APIC extension in VS Code. This post has shown you how to provision the APIC instance, register and import APIs in various ways, test those APIs in VS Code, and generate the client SDKs directly from VS Code.


 


As I mentioned in the beginning, taking care of many APIs in one place becomes crucial as your organisation grows. You might think that you don’t need APIC if your organisation’s API structure is relatively simple. However, even if your organisation is small, APIC will give you a better overview of your APIs and how they are interconnected with each other.


 


More about Azure API Center?


 


If you want to learn more about APIC, the following links might be helpful.


 



 


This article was originally published on Dev Kimchi.

Enable Change Tracking service for Arc Onboarded machines (Windows and Linux)


This article is contributed. See the original author and article here.

Azure Arc is a multi-cloud and on-premises management platform that simplifies governance and management by delivering a consistent way to manage your entire environment, projecting your existing non-Azure and/or on-premises resources into Azure Resource Manager.


Azure Arc has benefited many customers by simplifying governance and management through a consistent multi-cloud and on-premises management platform, offering capabilities such as patch management using Azure Update Manager, security using Defender for Cloud, standardized role-based access control (RBAC), and change tracking for resource types hosted outside of Azure, such as servers, Kubernetes, and SQL Server. Today, we will discuss and enable the Change Tracking service for Arc-onboarded devices. To know more about Azure Arc benefits and the onboarding process, refer to the link here.


Let’s look at what the change tracking service does before we activate it.


The Change Tracking and Inventory service tracks changes to files, the registry, Windows software, Linux software (software inventory), and services and daemons. It also supports recursion, which allows you to specify wildcards to simplify tracking across directories.


Note: Earlier, this feature was enabled using the Log Analytics (MMA) agent and an Azure Automation account. Now this has been simplified with Azure Policy.


Let’s understand how to enable the Change Tracking and Inventory feature for an Arc-onboarded device.


Note: Please make sure that the Arc machines are registered and their status is shown as Connected before you turn on the feature, as seen below.


 


Anipriya_0-1708445940483.png


 


 


Go to Azure Policy, then Definitions, and filter the category by Change Tracking and Inventory. You need to assign all the built-in policies present in the Enable Change Tracking and Inventory for Arc-enabled virtual machines initiative, for Arc-enabled Windows and Linux devices respectively.


 


Anipriya_1-1708445940495.png


 



  1. Assign the Configure Windows Arc-enabled machines to install AMA for ChangeTracking and Inventory built-in policy (scope it to the subscription of the Arc-onboarded devices). Make sure you have unchecked the parameter checkbox, verify the Effect is set to DeployIfNotExists, and create a remediation task. This ensures existing resources can be updated via the remediation task after the policy is assigned. Similarly, assign the Configure Linux Arc-enabled machines to install AMA for ChangeTracking and Inventory built-in policy for Arc-onboarded Linux devices. Once configured using Azure Policy, the Arc machine will have the AMA agent deployed. (A hedged CLI sketch of this assignment-and-remediation flow appears after these steps.)


 



  2. Assign the Configure Change Tracking Extension for Windows Arc machines built-in policy (scope it to the subscription of the Arc-onboarded devices). Follow the same steps as mentioned in point 1. Similarly, assign the Configure Change Tracking Extension for Linux Arc machines built-in policy for Arc-onboarded Linux devices. Once configured using Azure Policy, the Arc machine will have the Change Tracking extension deployed.


 


Anipriya_2-1708445940498.png


 



  3. Create the data collection rule.
    a. Download the CtDcrCreation.json file. Go to the Azure portal and, in the search box, enter Deploy a custom template. On the Custom deployment page, under Select a template, select Build your own template in the editor. In the template editor, select Load file to upload the CtDcrCreation.json file (or simply copy and paste the JSON), and then select Save.


Anipriya_3-1708445940502.png


 


Anipriya_4-1708445940510.png


 


b. In the Custom deployment > Basics tab, provide the subscription and resource group where you want to deploy the Data Collection Rule. The Data Collection Rule Name is optional. Provide the Workspace Resource ID of the Log Analytics workspace (you will find the workspace resource ID on the overview page of the Log Analytics workspace).


Anipriya_5-1708445940516.png


 


c. Select Review + create > Create to initiate the deployment of CtDcrCreation. After the deployment is complete, select CtDcr-Deployment to see the DCR name. Go to the newly created data collection rule (DCR) named Microsoft Ct-DCR, click on the JSON view, and copy the Resource ID.


 


Anipriya_6-1708445940523.png


 


 


Anipriya_7-1708445940527.png


 


Anipriya_8-1708445940530.png


 


d. Go to Azure Policy and assign the [Preview]: Configure Windows Arc-enabled machines to be associated with a Data Collection Rule for ChangeTracking and Inventory built-in policy (scope it to the subscription of the Arc-onboarded devices). Make sure you have enabled the parameter, paste the Resource ID captured above, and create a remediation task. Similarly, assign the Configure Linux Arc-enabled machines to be associated with a Data Collection Rule for ChangeTracking and Inventory built-in policy for Arc-onboarded Linux devices. Once configured using Azure Policy, the Arc machine will be associated with the data collection rule.


 


Anipriya_9-1708445940533.png


After all the policies are configured and the deployments are complete, go to the Arc device, and you will be able to see that Change Tracking and Inventory is enabled.
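If you prefer to script the policy assignments and the DCR deployment instead of clicking through the portal, here is a hedged Azure CLI sketch. The assignment name, policy definition ID, scope, location, and template parameter name are all placeholders or assumptions; look up the built-in definition IDs in your tenant and inspect CtDcrCreation.json before running anything like this.

# Hedged sketch of the steps above; names, IDs, and paths are placeholders.

# Assign a built-in policy with a system-assigned managed identity so remediation can run.
az policy assignment create \
  --name "ct-ama-windows-arc" \
  --policy "<built-in-policy-definition-id>" \
  --scope "/subscriptions/<subscription-id>" \
  --mi-system-assigned \
  --location "eastus"

# Remediate existing Arc machines under that assignment.
az policy remediation create \
  --name "ct-ama-windows-arc-remediation" \
  --policy-assignment "ct-ama-windows-arc"

# Deploy the CtDcrCreation.json template to create the data collection rule
# (the parameter name is assumed; check the template for the actual name).
az deployment group create \
  --resource-group "my-resource-group" \
  --template-file ./CtDcrCreation.json \
  --parameters workspaceResourceId="<log-analytics-workspace-resource-id>"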


 


Anipriya_10-1708445940538.png


 


Anipriya_11-1708445940544.png


 

Use Microsoft Copilot for Service with your external CRM 


This article is contributed. See the original author and article here.

Last year, we released a revolutionary set of Copilot capabilities within Dynamics 365 Customer Service. With Copilot, agents can get quick answers, draft emails, and summarize cases and conversations using information that already exists in Dynamics 365, as well as your preferred data sources.  

Now there’s Copilot for Service, which lets you connect all your data sources without replacing your existing CRM. Just like Copilot in Dynamics 365 Customer Service, Copilot for Service enables agents to easily manage customer inquiries, track service requests, and provide personalized support. It integrates Microsoft 365 experiences with external systems, such as Salesforce, Zendesk, and ServiceNow. 

Why Copilot for Service is the right solution 

Today, it is possible to build your own copilots using Copilot Studio, formerly known as Power Virtual Agents. With Copilot Studio, you can create a bot that addresses a specific topic (for example, “get order status”) or you can use plugins to leverage copilots and bots you have already built. Plugins let you connect to external data sources, and you can use them across Copilots to ensure agents have a consistent experience anywhere they ask a service-related question. Agents don’t have to move between systems to find answers, which reduces the time to resolution. 

Customers can use Copilot Studio within Copilot for Service to customize and publish plugins for Copilot for Microsoft 365. For example, you can use Copilot Studio to add proprietary knowledge bases for the copilot’s real-time responses or set up authentication to ensure only authorized agents can use Copilot. Building and publishing standalone Copilot experiences for any internal or external channel requires a standalone Copilot Studio license. 

Copilot for Service removes the development overhead and will build specific copilot experiences tailored to your organization’s needs. Additionally, if you’re already using Copilot for Microsoft 365, it’s easy to extend it with role-based capabilities for agents through prebuilt connectors to other CRMs. 

Try Copilot for Service features 

This month, we’re releasing four new features in preview: 

  • Draft an email response in Outlook: Agents can select from the predefined intents for the email to be drafted and provide their own custom intent. Copilot uses the context of the email conversation and case details to produce personalized and contextual emails. 
  • Email summary in Outlook: Copilot provides an email summary capturing all the important information for agents to understand the context of the case they’re working on. Agents can save the generated summary to the connected CRM system. 
  • Case summary in Outlook: Agents can access case summaries as they work on emails from customers. They can also save the case summary to the CRM system. 
  • Meeting recap in Teams: Copilot provides a meeting summary and action items that integrate with the Teams recap, providing all the relevant follow-up items. Agents can also create CRM tasks right from the Teams recap section.

Get started with Copilot for Service 

First, create a copilot instance in your environment by navigating to https://servicecopilot.microsoft.com/ and following the prompts. You need to give it a name, a description, and a target system that you want to connect to. For example, you can create a copilot named Microsoft that can access information from Salesforce. The portal will then guide you through creating the connection and deploying the copilot. 

The second step is to test your copilot in the demo agent desktop that is automatically generated for you. You can ask your copilot questions related to the target system, such as “What products does Microsoft sell?” or “How do I create a case in Salesforce?” 

In summary, Microsoft Copilot for Service is a powerful tool that can help your agents provide better support to your customers. By leveraging the power of AI, it can provide quick and accurate answers to common questions, freeing up your agents to focus on more complex issues. To learn more, be sure to check out the partner workshop and try it out for yourself. 

Learn more

For pricing information, see Microsoft Copilot for Service—Contact Center AI | Microsoft AI  

Extend Copilot for Service with Copilot Studio: Extend your agent-facing copilot | Microsoft Learn 

Overview of Copilot for Microsoft 365: Copilot for Microsoft 365 – Microsoft Adoption 

If you are a Dynamics 365 Customer Service user, you can start using Copilot capabilities right away: Enable Copilot features in Customer Service | Microsoft Learn 

The post Use Microsoft Copilot for Service with your external CRM  appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.