Streamline your make-to-order supply chain

This article is contributed. See the original author and article here.

New make-to-order (MTO) automation capabilities available with the October 2022 release of Dynamics 365 Supply Chain Management streamline order-taking and related downstream processes. Supply chain planners can benefit from improved capable-to-promise (CTP) accuracy with plan-specific delay tolerance, keep supply available for last-minute orders, and automatically populate external order information during intercompany trade.

Impact of make-to-order

With MTO, production only starts after a customer places an order for a specific product. The main benefits of MTO are that you can accommodate customer-specific products and you don’t need to keep inventories of finished goods with the related risk of wastage. However, with MTO, your delivery time includes the production lead time and depends on the availability of resources and raw materials. This often leads to the need for CTP, close tracking of resource capacity, and flexibility for last-minute orders.

Until now, supply chain planners had to monitor and adjust supply levels and current demand manually. With the new automation capabilities in Supply Chain Management, planners can automate these tasks. The system takes informed actions based on parameters they set.

Benefits of make-to-order automation

Let’s take a closer look at the benefits of six improvements we’ve made for make-to-order scenarios.

Delay tolerance control

Delay tolerance represents the number of days beyond the lead time that you’re willing to wait before you order new replenishment when existing supply is already planned. It helps you avoid creating new supply orders if the existing supply can cover the demand after a short delay. For example, with a lead time of 5 days and a delay tolerance of 2 days, planning reuses existing supply that arrives up to 7 days out instead of creating a new order. With the new Negative days option for delay tolerance control, you can determine whether it makes sense to create a new supply order for a given demand. The ability to control delay tolerance at the master plan level gives you more flexibility between the static plan and the dynamic plan used for CTP calculations. You can keep CTP calculations from allowing delays while optimizing refill orders on the static plan to reuse existing orders, even if that causes a bit of delay.


Use latest possible supply

The Use latest possible supply option lets you keep products available for last-minute orders. It optimizes the use of existing supply by pegging the latest possible supply to a demand instead of using the first possible supply.


Single-level marking

Marking links supply to demand for the purpose of cost allocation. It resembles pegging, which indicates how master planning expects to cover demand. However, marking is more permanent than pegging because it’s respected by later planning runs. Now you can limit inventory marking to a single level when firming planned orders. That allows you to keep component assignments flexible for production orders after firming.


Order-specific fulfillment policy

You can already set a global default fulfillment policy and override it for specific customers. Now the sales order shows which default policy applies, and you can override it for individual orders. Previously, the order taker had to manually change the policy on the sales order; now the default is applied automatically, giving the order taker more control and enabling flexible order processing.


Line-controlled delivery

Delivery terms, mode of delivery, and external item numbers are critical information to track when one company receives a customer sales order, and another company ships the goods to the customer. Now purchase order lines are updated automatically to include this information from the intercompany sales order. This improvement enhances intercompany information exchange. It ensures that detailed demand information flows to the supplying company and that companies meet their customer commitments.


User-defined period on Capacity load page

We’ve added a field to the Capacity load page. The new Number of days field allows you to define a custom period over which to view the capacity load of a resource, enabling long-term evaluation.


Learn more about make-to-order automation

To learn more about MTO automation in Supply Chain Management, read the documentation: Make-to-order supply automation | Microsoft Learn.

For more information about the delay tolerance impact, read Delay tolerance (negative days) | Microsoft Learn.

For more information about the impact of marking when firming planned orders, read Inventory marking with Planning Optimization | Microsoft Learn.

The post Streamline your make-to-order supply chain appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Azure DevOps – Leveraging Pipeline Decorators for Custom Process Automation

This article is contributed. See the original author and article here.

Introduction


 


Background


In the recent pandemic, health institutions all across the world have been pushed to their limits on nearly every facet. Through this, many such institutions have begun to reprioritize their modernization efforts around cloud infrastructure to support increasing demands and hedge against uncertainty. As institutions migrate their existing workloads into the cloud, a common challenge they face is that many of their on-premises security processes and standards do not map one-to-one to the services they are migrating to. Given the sensitive nature of the healthcare industry, it is especially important to find feasible ways to ensure that security and validation are in place end to end.


In this blog post, we will look at how Azure DevOps pipeline decorators can be leveraged to bridge the gap between the cloud environment and the customer’s existing security processes on their on-premises IIS server.


 


What are Pipeline Decorators?


If you have ever run across jobs executing on your Azure Pipelines that you have not previously defined, there is a good chance you may have already run into decorators before!


Pipeline decorators allow you to program jobs to execute before or after every pipeline run across your entire Azure DevOps organization. Whether you want to run a virus scan before every pipeline job or add automated steps that assist with governance of your CI/CD processes, pipeline decorators grant you the ability to impose standards at any scale within Azure DevOps.


Read further on decorators on Microsoft Learn: Pipeline decorators – Azure DevOps | Microsoft Learn


In this blog post, I will walk through a sample process based on the customer scenario’s requirements and show how pipeline decorators can fit in to support their governance objectives.


 


Scenario


The customer’s Azure DevOps organization has grown to a considerable size, composed of numerous projects and applications with no clearly defined processes or standards to adhere to. All of these applications are hosted on an on-premises IIS server, where the application teams are trusted to provide manual inputs for deployment variables.


Azure DevOps has no out-of-the-box controls for validating IIS file path permissions against Azure Active Directory identities. This was an area of concern for the customer, because the deployed production applications effectively had no preventative measures against malicious actors or human error overwriting existing applications.


When looking at the deployment tasks to IIS servers from Azure DevOps, the two primary variables the customer was looking to control were:



  • virtualAppName – Name of an already existing virtual application on the target machines

  • websiteName – Name of an existing website on the machine group


Considering the RBAC strategy the customer has in mind with AAD, there will be a third variable to represent the ownership of the application via an AAD group.



  • groupId – AAD ID of the application owner’s group


In the next section, I will outline a high-level process proposal for onboarding applications based on these three variables.


 


 


Solutioning


 


High-Level Process Proposal for Onboarding New Applications


For this demo’s purposes, we will make the following assumptions to build out a process that illustrates how application teams can onboard and help the operations team manage the application environment within their on-premises IIS server.


 


Assumptions



  1. The Ops team only requires the following three parameters to help govern application deployments:

    • virtualAppName

    • groupId

    • websiteName



  2. Application teams only need flexibility while building applications within the CI/CD pipelines, and currently do not have many concerns about, or the expertise for, managing deployments.

  3. The Ops team also wishes to build security around these parameters so that only authorized actors can modify these values.


 


Onboarding New Applications




  1. Ops team provides a template (such as GitHub issues templates) for new application requests to the application teams, and captures the following IIS deployment-specific information:



    • virtualAppName

    • groupId

    • websiteName


    For this demo, I created a simple GitHub issue YAML form that the operations team can use to capture basic information from the application teams; it can also be tied to automation to further reduce operational overhead:




[Screenshot: GitHub issue YAML form for new application requests]
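To give a concrete idea of the form behind that screenshot, here is a minimal sketch of a GitHub issue form (a YAML file under .github/ISSUE_TEMPLATE/). The file name, title, and labels are illustrative placeholders, not the exact form from the demo; only the three field names come from the requirements above.

name: New Application Onboarding Request
description: Request a new application environment on the on-prem IIS server
title: "[App Onboarding]: "
labels: ["app-onboarding"]
body:
  - type: input
    id: virtualappname
    attributes:
      label: virtualAppName
      description: Name of the existing virtual application on the target machines
    validations:
      required: true
  - type: input
    id: websitename
    attributes:
      label: websiteName
      description: Name of the existing website on the machine group
    validations:
      required: true
  - type: input
    id: groupid
    attributes:
      label: groupId
      description: AAD ID of the application owner's group
    validations:
      required: true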



  2. Ops team is then notified of the request and, upon successful validation, provisions an application environment with the captured information.

    1. The application environment in this context involves the following components:

      1. Key Vault (per application)

      2. Service Connection to the application Key Vault with read permissions over secrets

      3. The application-team-provided, Ops-team-validated virtualAppName, groupId, and websiteName values placed in the Key Vault as secrets

      4. Service Connection details placed in the project variable group so the decorator can dynamically retrieve secrets for each project

      5. The application registered on the IIS server in a way that adheres to existing IIS server file management strategies





  3. Once the environment is ready for use, notify the application teams by updating the issue template. The application teams now only need to focus on building and publishing their artifact within their CI/CD pipelines.


 


Updating Existing Applications



  1. Ops team provides a template for change requests to the application teams, and captures the following information:

    • virtualAppName

    • groupId

    • websiteName

    • Change Justification/Description



  2. Core Ops team reviews and approves the change request

  3. Update the application environment accordingly

  4. Notify the application team



 


With the high-level process defined, we will now look at how to bring the relevant parameters into the decorators to impose validation logic.


 


 


Building the Demo


 


Setting up our Demo Application Environment


In this example, I created a key vault named kv-demolocaldev, and placed the virtualAppName, groupId, and websiteName so we may retrieve the values later as shown below:


 


[Screenshot: virtualAppName, groupId, and websiteName secrets in the kv-demolocaldev key vault]


 


Now, we must create the project and subsequently create the service connection to the key vault scoped to the project.


To do this, I created an Azure Resource Manager service connection using my demo identity, scoped to the resource group containing the key vault:


 


[Screenshot: Azure Resource Manager service connection scoped to the key vault's resource group]


 


 


Once the service connection is done provisioning, you can navigate to the AAD object by following the Manage Service Principal link, which will allow you to retrieve the Application ID to be used when adding the access policy.


 


[Screenshot: Manage Service Principal link on the service connection]


 


 


Selecting the Manage Service Principal link will take us to the AAD object, where we can find the Azure Application ID to add to our Key Vault access policy.


 


[Screenshot: AAD application object showing the Application ID]


 


 




 


The service connection will only need GET secret permissions on its access policy.


 


[Screenshot: Key Vault access policy granting the service connection Get permission on secrets]
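If you prefer to script this step rather than use the portal, a sketch using the Azure CLI could look like the following; the application ID placeholder is whatever you retrieved from the service principal in the previous step, and the key vault name comes from the demo environment.

az keyvault set-policy --name kv-demolocaldev --spn <service-connection-application-id> --secret-permissions get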


 


Afterwards, we capture the information about the service connection and key vault by creating a variable group named demo-connection-details on the application’s Azure DevOps project:


 


[Screenshot: demo-connection-details variable group]
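Based on the variable names the decorator YAML references later, this group holds the key vault name and the service connection name. As a hedged alternative to creating it in the portal, the Azure DevOps CLI extension could do the same; the organization, project, and service connection values below are placeholders.

az pipelines variable-group create --name demo-connection-details --variables decorator_keyvault_name=kv-demolocaldev decorator_keyvault_connection_name=<service-connection-name> --project <project-name> --organization https://dev.azure.com/<org-name>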


 


 


Additional steps are needed to provision the IIS server with these parameters as well, but for this demo’s purposes we will assume that provisioning has already been taken care of. With this in place, we can move on to building out our decorators.


 


Building the Decorators


On the pipeline side, the customer is looking to control both the pre-build stage, by validating the input variables, and the post-build stage, by placing guardrails around deployment configurations that use the validated parameters.


Both pre and post decorators will leverage the same key vault secrets, so we will start with integrating the key vault secrets into the YAML definition.


 


Pipeline decorators leverage the same YAML schema as the YAML build pipelines used within Azure DevOps, meaning we can take advantage of conditional logic on repo branches and dynamic variables, and pull in key vault secrets with service connections.


The high-level logic we are attempting to demonstrate for the pre and post decorators are the following:


 


Pre:



  1. Check for variables/conditions to bypass decorators

  2. Using pre-established variables, connect to application’s Azure Key vault and retrieve secret values

  3. For each of the deployment variables, process custom validation logic


Post:



  1. Deploy the application/artifact to the IIS server


 


You can find the demo files within the following repo: https://github.com/JLee794-Sandbox/ADO-Decorators-PoC


Pre-build decorator


To ensure users can opt-out of the process during development, we can leverage the same YAML schema as build pipelines to construct our conditionals.



  1. Check for variables/condition to bypass decorators


 


The pre-build decorator YAML definition (located in Build/Pre/input-parameter-decorator.yml) targets pipeline builds that run off the main branch, and also checks that a simple variable flag named testDecorator is set to true before the decorator executes.


 


steps:
- ${{ if and(eq(variables['Build.SourceBranchName'], 'main'), contains(variables['testDecorator'], 'true')) }}:

 


Immediately after, I retrieve websiteName, groupId, and virtualAppName using the connection details we placed in the demo-connection-details variable group, which are passed in by the build pipeline.


 


- task: AzureKeyVault@2
  displayName: '[PRE BUILD DECORATOR] Accessing Decorator Params from the key vault - $(decorator_keyvault_name), using $(decorator_keyvault_connection_name) connection.'
  inputs:
    azureSubscription: $(decorator_keyvault_connection_name) # Service Connection Name (scoped to RG)
    KeyVaultName: $(decorator_keyvault_name) # Key Vault Name
    SecretsFilter: 'websiteName,groupId,virtualAppName' # Secret names to retrieve from Key Vault
    RunAsPreJob: true

 


Now that the secrets have been pulled in, we can run our custom validation logic for each. For the purposes of this demo, we will just check that each variable exists and throw an error through a simple PowerShell script.


 


- task: PowerShell@2
  name: ValidateDeploymentVariables
  displayName: '[PRE BUILD DECORATOR] Validate Deployment Variables (Injected via Decorator)'
  inputs:
    targetType: 'inline'
    script: |
      $errorArr = @()

      try {
        Write-Host "VirtualAppName: $(virtualAppName)"
        # your input test cases go here
        # e.g. querying the remote machine to match the virtualAppName
      }
      catch {
        $errorArr += 'virtualAppName'
        Write-Host "##vso[task.logissue type=error]Input parameter 'virtualAppName' failed validation tests."
      }

      try {
        Write-Host "GroupID: $(groupId)"
        # your input test cases go here
        # e.g. querying the remote machine to match the groupId against the local file permissions
      }
      catch {
        Write-Host "##vso[task.logissue type=error]Input parameter 'groupId' failed validation tests."
        $errorArr += 'groupId'
      }

      try {
        Write-Host "WebSiteName: $(webSiteName)"
        # your input test cases go here
        # e.g. querying the web-site URL to see if the site already exists, etc.
      }
      catch {
        Write-Host "##vso[task.logissue type=error]Input parameter 'webSiteName' failed validation tests."
        $errorArr += 'webSiteName'
      }

      if ($errorArr.count -gt 0) {
        # Link to your team's documentation for further explanation
        Write-Warning -Message "Please provide valid parameters for the following variables: $($errorArr -join ', ')"
        Write-Warning -Message "See <https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch> for additional details"
        throw "Please provide valid values for $($errorArr -join ', ')."
      }


 


And we are done with the pre-build decorator! Of course, while developing, it is important to iteratively test your code. If you would like to publish your code now, skip to the Publishing the Extension section below.


 


Post-build decorator


For our post-build decorator, all we want to do is determine when the decorator should run, and simply invoke a deployment task such as the IISWebAppDeploymentOnMachineGroup task.


 


Of course, there are many more validation steps and tools you can place here to further control your deployment process, but for the sake of this demo we will just be outputting some placeholder messages:


 


steps:
- task: PowerShell@2
  name: DeployToIIS
  displayName: Deploy to IIS (Injected via Decorator)
  condition: |
    and
    (
      eq(variables['Build.SourceBranch'], 'refs/heads/main'),
      eq(variables.testDecorator, 'true')
    )
  inputs:
    targetType: 'inline'
    script: |
      # Validation steps to check if IIS
      # Validation steps to check if iOS or Android
      # > execute deployment accordingly

      Write-Host @"
      Your IIS Web Deploy Task can look like this:

      - task: IISWebAppDeploymentOnMachineGroup@0
        inputs:
          webSiteName: $(webSiteName)
          virtualApplication: $(virtualAppName)
          package: '$(System.DefaultWorkingDirectory)\**\*.zip' # Optionally, you can parameterize this as well.
          setParametersFile: # Optional
          removeAdditionalFilesFlag: false # Optional
          excludeFilesFromAppDataFlag: false # Optional
          takeAppOfflineFlag: false # Optional
          additionalArguments: # Optional
          xmlTransformation: # Optional
          xmlVariableSubstitution: # Optional
          jSONFiles: # Optional
      "@


 


Publishing the Extension to Share with our ADO Organization


First, we need to construct a manifest for the pipeline decorators to publish them to the private Visual Studio marketplace so that we may start using and testing the code.


In the demo directory, under Build, we have both Pre and Post directories, each containing a file named vss-extension.json. We won’t go into too much detail about the manifest file here, but it allows us to configure how the pipeline decorator executes and what it targets.


 


Read more on manifest files: Pipeline decorators – Azure DevOps | Microsoft Learn
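To make that concrete, here is a rough sketch of what a pre-build decorator manifest can look like, following the schema in the documentation linked above. This is not the exact vss-extension.json from the demo repo; the id, name, and publisher values are placeholders, and only the template file name comes from the demo.

{
  "manifestVersion": 1,
  "id": "sample-pre-build-decorator",
  "name": "Sample Pre-Build Decorator",
  "version": "1.0.0",
  "publisher": "your-publisher-id",
  "targets": [
    { "id": "Microsoft.VisualStudio.Services" }
  ],
  "contributions": [
    {
      "id": "sample-pre-build-decorator",
      "type": "ms.azure-pipelines.pipeline-decorator",
      "targets": [
        "ms.azure-pipelines-agent-job.pre-job-tasks"
      ],
      "properties": {
        "template": "input-parameter-decorator.yml"
      }
    }
  ],
  "files": [
    {
      "path": "input-parameter-decorator.yml",
      "addressable": true,
      "contentType": "text/plain"
    }
  ]
}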


 


With the manifest file configured, we can now publish to the marketplace and share it with our ADO organization:




  1. Create a publisher on the Marketplace management portal




  2. Install tfx command line tool


    npm install -g tfx-cli



  3. Navigate to the directory containing the vss-extension.json




  4. Generate the .vsix file through tfx extension create


    > tfx extension create --rev-version

    TFS Cross Platform Command Line Interface v0.11.0
    Copyright Microsoft Corporation

    === Completed operation: create extension ===
    - VSIX: /mnt/c/Users/jinle/Documents/Tools/ADO-Decorator-Demo/Build/Pre/Jinle-SandboxExtensions.jinlesampledecoratorspre-1.0.0.vsix
    - Extension ID: jinlesampledecoratorspre
    - Extension Version: 1.0.0
    - Publisher: Jinle-SandboxExtensions




  5. Upload the extension via the Marketplace management portal or through tfx extension publish (a sample command is shown after this list)




  6. Share your extension with your ADO Organization on the management portal


    [Screenshot: sharing the extension with the organization from the Marketplace management portal]


  7. Install the extension on your ADO Organization



    1. Organization Settings > Manage Extensions > Shared > Install

      [Screenshot: installing the shared extension from Organization Settings > Manage Extensions]
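As referenced in step 5 above, publishing and sharing can also be done from the command line. A sketch of the tfx command could look like the following; the organization name and personal access token are placeholders.

    tfx extension publish --manifest-globs vss-extension.json --share-with <your-ado-organization> --token <personal-access-token>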





Testing the Decorator


Now that your pipeline decorators are installed in your organization, any time you push an update to the Visual Studio marketplace to update your extensions, your organization will automatically get the latest changes.


 


To test your decorators, you can leverage the built-in GUI for Azure DevOps to validate your YAML syntax, as well as execute any build pipeline that meets the trigger conditions we configured previously.


 


In our demo application environment, I updated the out-of-the-box starter pipeline to include our connection variable group, as well as set the testDecorator flag to true:


variables:
  - name: testDecorator
    value: true
  - group: demo-connection-details
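For reference, a minimal starter pipeline that exercises the decorators could look something like the sketch below. The trigger, pool, and build step are illustrative placeholders; only the variables block above comes from the demo.

trigger:
  - main

pool:
  vmImage: 'windows-latest'

variables:
  - name: testDecorator
    value: true
  - group: demo-connection-details

steps:
  - task: PowerShell@2
    displayName: Build and publish the application artifact (placeholder)
    inputs:
      targetType: 'inline'
      script: |
        Write-Host "Application build and publish steps go here; the decorator tasks run before and after this job."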

Running the pipeline, I can now see the tasks I have defined execute as expected:


[Screenshot: pipeline run showing the decorator-injected pre and post tasks]


 


Once we verify that the pre and post tasks run as expected, with the conditional controls evaluating as intended, we can conclude this demo.




Conclusion


 


Now, with the decorator scaffolding in place, the customer can continue to take advantage of the flexibility provided by the Azure DevOps pipeline YAML schema to implement their existing security policies at the organization level.


 


I hope this post helped show how pipeline decorators can be leveraged to automate custom processes and bring governance layers into your ADO environment.


If you have any questions or concerns around this demo, or would like to continue the conversation around potential customer scenarios, please feel free to reach out any time.

Improve admin productivity with guided Customer Service channel setup and settings search

This article is contributed. See the original author and article here.

We recently announced the unified Customer Service admin center, which consolidates administration experiences across the Microsoft Dynamics 365 Customer Service suite. The unified admin center simplifies setup tasks with a step-by-step guided experience to help admins easily onboard customer service channels. A dedicated search for settings makes discovering and updating settings fast and easy.

Follow the wizard to easily set up customer service channels

The guided setup wizard helps you configure customer service channels such as email, case, chat, and voice. The wizard guides you through all the steps to configure users and permissions and set up queues and routing rules to help you start handling customer issues with minimal fuss.


After all the steps are complete, you’ll find instructions to validate that the channel is set up correctly. You can also go directly to a step to modify settings.


Search for admin settings

The search admin settings page helps you quickly discover the admin setting you want to manage. Along with top matches, the page lists settings for new features, so that you can evaluate whether they might be helpful for your business and start to adopt them.


With the new unified Customer Service admin center, it’s easier than ever to manage users, add channels, route and distribute workloads, and get valuable insights about all the activity across your digital contact center. Migrate to the new app to discover how you can streamline digital contact center operations, help your agents be more productive, and earn customers for life.

Learn more

To find out more about the new unified Customer Service admin center app, read the documentation: Customer Service admin center | Microsoft Learn

Not yet a Dynamics 365 customer? Take a guided tour and get a free trial.

The post Improve admin productivity with guided Customer Service channel setup and settings search appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Simplifying the cloud data migration journey for enterprises

This article is contributed. See the original author and article here.

In this guest blog post, Kajol Patel, Senior Content Marketing Specialist at Data Dynamics, discusses digital transformation strategies for enterprises and how to utilize StorageX and the Azure File Migration Program to overcome common data migration challenges.


 


Data is foundational to any digital transformation strategy, yet enterprises worldwide struggle to find reliable and cost-efficient solutions to manage, govern, and extract valuable insights from it. According to a recent report published in Statista, the total volume of enterprise data worldwide increased from 1 petabyte (PB) to 2.02 PB between 2020 and 2022. This sizeable jump in volume indicates a 42.2 percent average annual growth in data over the last two years. The report also highlights that a majority of that data is stored in internal datacenters. Data storage and processing is costly and energy-intensive for enterprises.


 


Additionally, the cost of software for collection, analysis, and management of terabytes and petabytes of data residing in multiple storage centers adds to the expenditure. Breaking down siloes to extract real-time insights often ends up costing the enterprise exorbitant amounts of IT resources and revenue.


 


As unstructured data sprawl continues to grow, enterprises are turning to the cloud and embracing data as a strategic and valuable asset. By extracting useful insights from data, businesses can accelerate their digital journey by making data-driven decisions in real time to meet peak demand, grow revenue, and minimize storage cost. Cloud providers such as Microsoft give clients access to subscription-based remote computing services, enabling them to adjust cloud consumption to meet changing needs. As a possible recession looms, organizations that rely on the cloud are more likely to experience cost reduction as they effectively manage risk and compliance.


 


However, most enterprises face numerous challenges while migrating to the cloud: proprietary vendor lock-in, lack of migration skills, a labor-intensive process, and inadequate knowledge of data estate.


 


Top 3 data migration challenges for enterprises:



  • Lift-and-shift blind spots: Lack of knowledge of enterprise unstructured data estate may result in post-migration complexities such as security malfunction and non-compliance.

  • Lack of visibility: No clarity about what, when, and where around data may result in lack of storage optimization and delayed migration timelines.

  • Complexity of scope and scale: The lack of an integrated approach, governance, and skills, decreased efficiency, a low time-to-effort ratio, and other redundancies can cause chaos.


 


In a webinar hosted by Data Dynamics, Karl Rautenstrauch, Principal Program Manager, Storage Partners at Microsoft, spoke about the top challenges faced by enterprise customers while migrating to the cloud: “Over nine years of working closely with partners and customers in the field of migrating datasets and applications to Azure, we see a consistent theme of every enterprise in every industry being a little overburdened today – too much to do, too little time, and too few people, hence most of these enterprises are seeking automation. They want to ensure that they can engage in complex activities like moving an application comprised of virtual machines, databases, and file repositories in the simplest way possible with the least risk possible.”


 


He further emphasized the most consistent requirement for all customers he has worked with, regardless of size, was to migrate large data sets securely, quickly, and with minimal risk and disruption to user productivity.


 


Migrating file data between disparate storage platforms is always a daunting process. Microsoft recently announced the Azure File Migration Program to make customer data migration much easier and more secure. It helps address the customer’s need to reduce the time, effort, and risk involved in complex file data migration.


 


[Screenshot: Data Dynamics StorageX central console]


 


Speaking at the webinar, Rautenstrauch emphasized the value of on-demand compute and modern cloud services: “We have built a platform of services called Azure Migrate, which is freely available, and it has cloud-driven capabilities. These services help customers move virtual machines easily, databases, and now even containerized applications in an automated, risk-free fashion. One area that is neglected is unstructured data, so what we are going to do is address it in the Azure File Migration Program.”


 


The Azure Migrate hub offers many effective tools and services to simplify database and server migration, but it doesn’t address the need for unstructured data migration. Hence, the Azure File Migration Program is becoming a new favorite among enterprises with unstructured data sprawl.


Jurgen Willis, VP of Azure Optimized Workloads and Storage, states in his blog, “Azure Migrate offers a very powerful set of no-cost (or low-cost) tools to help you migrate virtual machines, websites, databases, and virtual desktops for critical applications. You can modernize legacy applications by migrating them from servers to containers and build a cloud native environment.”


 


Data Dynamics transforms data assets into competitive advantage with Azure File Migration


With over a decade of domain experience and a robust clientele of 300+ organizations, including 28 of the Fortune 100, Data Dynamics is a partner of choice for unstructured file data migrations. StorageX is Data Dynamics’ award-winning solution for unstructured data management. The mobility feature of StorageX provides intelligence-driven, automated data migrations to meet the needs and scale of global enterprises. 


 


Having migrated over 400 PB of data encompassing hundreds of trillions of files, this feature is trusted and proven, delivering without the loss of a single byte of data. It provides policy-based, automated data migration with reduced human intervention and without vendor lock-in. StorageX has proven multi-threading capabilities and migrates at speeds that can move millions or billions of files in hours, making it one of the most scalable and risk-free data migration solutions.


 


It can easily identify workloads and migrate data based on characteristics such as the least-touched files, files owned by specific users or groups, or hundreds of other actionable insights. StorageX Migration is a powerful migration engine that moves large volumes of data across shares and exports with speed and accuracy.


 


Here’s a detailed comparative study of StorageX versus traditional migration tools.


 


Microsoft is sponsoring the use of Data Dynamics’ StorageX as a part of the Azure File Migration Program. Enterprises can leverage this product to migrate their unstructured files, Hadoop, and object storage data into Azure at zero additional cost to the customer and no separate migration licensing.


 


Learn more about the Azure File Migration Program or reach us at solutions@datdyn.com | (713)-491-4298 | +44-(20)-45520800

Discover how Microsoft 365 helps organizations do more with less

This article is contributed. See the original author and article here.

Now more than ever, IT leaders need to reduce costs while securing and empowering their workforce. Microsoft 365 combines the capabilities organizations need in one secure, integrated experience—powered by data and AI—to help people work better and smarter.

The post Discover how Microsoft 365 helps organizations do more with less appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Automate more, waste less: ESG initiatives with Dynamics 365 Finance

This article is contributed. See the original author and article here.

In a recent keynote at Microsoft Inspire, Microsoft Chairman and CEO Satya Nadella made this prediction: “The next 10 years are not going to be like the last 10,” and “Digital technology will be the only way to navigate the headwinds facing business leaders today.” Today, we face a world of perpetual change with ever-increasing economic, environmental, and social complexities. These factors combine to profoundly impact the health and success of individuals and corporations. The opportunity facing business leaders is to find ways to utilize technology to drive a positive impact on business performance and the well-being of society and the environment.

Reduce operating costs with ESG initiatives

It has become clear that ESG (environmental, social, and governance) initiatives are becoming a top business priority for many organizations. The Gartner annual CEO and Senior Business Executive Survey states, “In 2022, environmental sustainability became a top 10 business priority for the first time ever, with 9 percent of the respondents naming it as one of their top three.”1 This shift in priority is for good reason. Studies show that companies that execute effectively on ESG programs can reduce operating costs by up to 60 percent2 and that higher ESG scores correlate to lower costs of capital (6.16 percent compared to 6.55 percent for the lowest ESG scores).3

What is less clear is how businesses execute ESG initiatives in ways that do not require more effort or added expense; effectively, how to do more with less. In his Microsoft Inspire keynote, Satya stated, “Doing more with less doesn’t mean working harder or longer. That’s not going to scale. But it does mean applying technology to amplify what you’re able to do across an organization so you can differentiate and build resilience.” The expectation that we must do more with less is especially relevant when acting on ESG goals. The approach business leaders must consider is that anything that reduces energy costs or increases resource efficiency will be highly beneficial to CEOs, customers, employees, investors, and the environment.

Go paperless with Dynamics 365 Finance

There are many ways to make ESG impact, and perhaps most attractive to business leaders are opportunities that align business processes to ESG outcomes. One very accessible possibility exists in the automation of paper-intensive business processes. For example, Accounts Payable (AP) has historically been plagued by the manual effort and cost required to manage tens of thousands of paper invoices. AP invoicing is heavy with storage, printing, disposal, and document security costs that can easily be mitigated through digitization.

Though the digitization of office paper has been achievable for over a decade, organizations still struggle to phase out paper-laden business processes. Some estimates show that US offices use 12.1 trillion sheets of paper annually and that demand for paper is expected to double before 2030.4 To address this trend, companies should turn to robust and easy-to-use technology to help quickly reduce paper usage.

Invoice capture within Dynamics 365 Finance

Automate your AP process


If your organization is ready to embark on the journey to go paperless, Microsoft is here to help. We are excited to release the preview of Invoice capture within Microsoft Dynamics 365 Finance. Invoice capture will allow our customers to digitally transform the entire invoice-to-pay process within their AP department, delivering better spend control, faster cycle times, and paperless processing, leading to more automation and less waste. With the support of advanced technology, organizations can efficiently drive outcomes that benefit both business interests and the greater good.


Sources

1Gartner CEO and Senior Business Executive Survey.

GARTNER is a registered trademark and service of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

2Five Ways That ESG Creates Value, McKinsey.

3ESG and the Cost of Capital, MSCI.

4Paper Waste Facts, The World Counts.

The post Automate more, waste less: ESG initiatives with Dynamics 365 Finance appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.