Improve admin productivity with guided Customer Service channel setup and settings search

This article is contributed. See the original author and article here.

We recently announced the unified Customer Service admin center, which consolidates administration experiences across the Microsoft Dynamics 365 Customer Service suite. The unified admin center simplifies setup tasks with a step-by-step guided experience to help admins easily onboard customer service channels. A dedicated search for settings makes discovering and updating settings fast and easy.

Follow the wizard to easily set up customer service channels

The guided setup wizard helps you configure customer service channels such as email, case, chat, and voice. The wizard guides you through all the steps to configure users and permissions and set up queues and routing rules to help you start handling customer issues with minimal fuss.

After all the steps are complete, you’ll find instructions to validate that the channel is set up correctly. You can also go directly to a step to modify settings.

Search for admin settings

The search admin settings page helps you quickly discover the admin setting you want to manage. Along with top matches, the page lists settings for new features, so that you can evaluate whether they might be helpful for your business and start to adopt them.

With the new unified Customer Service admin center, it’s easier than ever to manage users, add channels, route and distribute workloads, and get valuable insights about all the activity across your digital contact center. Migrate to the new app to discover how you can streamline digital contact center operations, help your agents be more productive, and earn customers for life.

Learn more

To find out more about the new unified Customer Service admin center app, read the documentation: Customer Service admin center | Microsoft Learn

Not yet a Dynamics 365 customer? Take a guided tour and get a free trial.

The post Improve admin productivity with guided Customer Service channel setup and settings search appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Simplifying the cloud data migration journey for enterprises

In this guest blog post, Kajol Patel, Senior Content Marketing Specialist at Data Dynamics, discusses digital transformation strategies for enterprises and how to utilize StorageX and the Azure File Migration Program to overcome common data migration challenges.

Data is foundational to any digital transformation strategy, yet enterprises worldwide struggle to find reliable and cost-efficient solutions to manage, govern, and extract valuable insights from it. According to a recent report published by Statista, the total volume of enterprise data worldwide increased from 1 petabyte (PB) to 2.02 PB between 2020 and 2022. This sizeable jump indicates a 42.2 percent average annual growth in data over the last two years. The report also highlights that a majority of that data is stored in internal datacenters. Data storage and processing are costly and energy-intensive for enterprises.
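
As a quick, illustrative sanity check (not from the report itself), the average annual growth rate implied by those two figures can be recomputed in a few lines of Python:

```python
# Enterprise data grew from 1 PB (2020) to 2.02 PB (2022),
# i.e. two compounding years. Solve end = start * (1 + r)**years for r.
start_pb = 1.0
end_pb = 2.02
years = 2

annual_growth = (end_pb / start_pb) ** (1 / years) - 1
print(f"{annual_growth:.1%}")  # prints 42.1%, in line with the quoted 42.2 percent
```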

Additionally, the cost of software for the collection, analysis, and management of terabytes and petabytes of data residing in multiple storage centers adds to the expenditure. Breaking down silos to extract real-time insights often ends up costing the enterprise exorbitant amounts of IT resources and revenue.

As unstructured data sprawl continues to grow, enterprises are turning to the cloud and embracing data as a strategic and valuable asset. By extracting useful insights from data, businesses can accelerate their digital journey, making data-driven decisions in real time to meet peak demand, grow revenue, and minimize storage costs. Cloud providers such as Microsoft give clients access to subscription-based remote computing services, enabling them to adjust cloud consumption to meet changing needs. As a possible recession looms, organizations that rely on the cloud are more likely to see cost reductions as they effectively manage risk and compliance.

However, most enterprises face numerous challenges while migrating to the cloud: proprietary vendor lock-in, lack of migration skills, a labor-intensive process, and inadequate knowledge of data estate.

Top 3 data migration challenges for enterprises:



  • Lift-and-shift blind spots: Lack of knowledge of enterprise unstructured data estate may result in post-migration complexities such as security malfunction and non-compliance.

  • Lack of visibility: No clarity about the what, when, and where of data may result in missed storage optimization and delayed migration timelines.

  • Complexity of scope and scale: A lack of an integrated approach, governance, and skills, together with decreased efficiency, a low time-to-effort ratio, and other redundancies, can cause chaos.

In a webinar hosted by Data Dynamics, Karl Rautenstrauch, Principal Program Manager, Storage Partners at Microsoft, spoke about the top challenges faced by enterprise customers while migrating to the cloud: “Over nine years of working closely with partners and customers in the field of migrating datasets and applications to Azure, we see a consistent theme of every enterprise in every industry being a little overburdened today – too much to do, too little time, and too few people, hence most of these enterprises are seeking automation. They want to ensure that they can engage in complex activities like moving an application comprised of virtual machines, databases, and file repositories in the simplest way possible with the least risk possible.”

He further emphasized that the most consistent requirement across all the customers he has worked with, regardless of size, is to migrate large data sets securely, quickly, and with minimal risk and disruption to user productivity.

Migrating file data between disparate storage platforms is always a daunting process. Microsoft recently announced the Azure File Migration Program to make customer data migration much easier and more secure. It helps address the customer’s need to reduce the time, effort, and risk involved in complex file data migration.

(Screenshot: Data Dynamics central console)

Speaking at the webinar, Rautenstrauch emphasized the value of on-demand compute and modern cloud services: “We have built a platform of services called Azure Migrate, which is freely available, and it has cloud-driven capabilities. These services help customers move virtual machines easily, databases, and now even containerized applications in an automated, risk-free fashion. One area that is neglected is unstructured data, so what we are going to do is address it in the Azure File Migration Program.”

The Azure Migrate hub offers many effective tools and services to simplify database and server migration but doesn’t address the need for unstructured data migration. Hence, the Azure File Migration Program is becoming a new favorite among enterprises contending with unstructured data sprawl.


Jurgen Willis, VP of Azure Optimized Workloads and Storage, states in his blog, “Azure Migrate offers a very powerful set of no-cost (or low-cost) tools to help you migrate virtual machines, websites, databases, and virtual desktops for critical applications. You can modernize legacy applications by migrating them from servers to containers and build a cloud native environment.”


Data Dynamics transforms data assets into competitive advantage with Azure File Migration


With over a decade of domain experience and a robust clientele of 300+ organizations, including 28 of the Fortune 100, Data Dynamics is a partner of choice for unstructured file data migrations. StorageX is Data Dynamics’ award-winning solution for unstructured data management. The mobility feature of StorageX provides intelligence-driven, automated data migrations to meet the needs and scale of global enterprises. 

Having migrated over 400 PB of data encompassing hundreds of trillions of files, the feature is proven in production and delivers without losing a single byte of data. It provides policy-based, automated data migration with reduced human intervention and no vendor lock-in. StorageX migrations are multi-threaded, moving millions or even billions of files in hours, making it one of the most scalable and low-risk data migration solutions.

It can easily identify workloads and migrate data based on characteristics such as the least-touched files, files owned by specific users or groups, or hundreds of other actionable insights. StorageX Migration is a powerful migration engine that moves large volumes of data across shares and exports with speed and accuracy.

Here’s a detailed comparative study of StorageX versus traditional migration tools.

Microsoft is sponsoring the use of Data Dynamics’ StorageX as a part of the Azure File Migration Program. Enterprises can leverage this product to migrate their unstructured files, Hadoop, and object storage data into Azure at zero additional cost to the customer and no separate migration licensing.

Learn more about the Azure File Migration Program or reach us at solutions@datdyn.com | (713)-491-4298 | +44-(20)-45520800

Discover how Microsoft 365 helps organizations do more with less

Now more than ever, IT leaders need to reduce costs while securing and empowering their workforce. Microsoft 365 combines the capabilities organizations need in one secure, integrated experience—powered by data and AI—to help people work better and smarter.

The post Discover how Microsoft 365 helps organizations do more with less appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Automate more, waste less: ESG initiatives with Dynamics 365 Finance

In a recent keynote at Microsoft Inspire, Microsoft Chairman and CEO Satya Nadella made this prediction: “The next 10 years are not going to be like the last 10,” adding that “digital technology will be the only way to navigate the headwinds facing business leaders today.” Today, we face a world of perpetual change with ever-increasing economic, environmental, and social complexities, a culmination of factors that profoundly impacts the health and success of individuals and corporations. The opportunity facing business leaders is to find ways to use technology to drive a positive impact on business performance and the well-being of society and the environment.

Reduce operating costs with ESG initiatives

It has become clear that ESG (environmental, social, and governance) initiatives are becoming a top business priority for many organizations. The Gartner annual CEO and Senior Business Executive Survey states, “In 2022, environmental sustainability became a top 10 business priority for the first time ever, with 9 percent of the respondents naming it as one of their top three.”1 This shift in priority is for good reason. Studies show that companies that execute effectively on ESG programs can reduce operating costs by up to 60 percent2 and that higher ESG scores correlate with lower costs of capital (6.16 percent compared to 6.55 percent for the lowest ESG scores).3

What is less clear is how businesses execute ESG initiatives in ways that do not require more effort or added expense; in effect, how to do more with less. In his Microsoft Inspire keynote, Satya stated, “Doing more with less doesn’t mean working harder or longer. That’s not going to scale. But it does mean applying technology to amplify what you’re able to do across an organization so you can differentiate and build resilience.” The expectation that we must do more with less is especially relevant when acting on ESG goals. The approach business leaders must consider is that anything that reduces energy costs or increases resource efficiency will be highly beneficial to CEOs, customers, employees, investors, and the environment.

Go paperless with Dynamics 365 Finance

There are many ways to make ESG impact, and perhaps most attractive to business leaders are opportunities that align business processes to ESG outcomes. One very accessible possibility exists in the automation of paper-intensive business processes. For example, Accounts Payable (AP) has historically been plagued by the manual effort and cost required to manage tens of thousands of paper invoices. AP invoicing is heavy with storage, printing, disposal, and document security costs that can easily be mitigated through digitization.

Though the digitization of office paper has been achievable for over a decade, organizations still struggle to phase out paper-laden business processes. Some estimates show that US offices use 12.1 trillion sheets of paper annually and that demand for paper is expected to double before 2030.4 To address this trend, companies should turn to robust and easy-to-use technology to help quickly reduce paper usage.

Invoice capture within Dynamics 365 Finance

Automate your AP process

If your organization is ready to embark on the journey to go paperless, Microsoft is here to help. We are excited to release the preview of Invoice capture within Microsoft Dynamics 365 Finance. Invoice capture will allow our customers to digitally transform the entire invoice-to-pay process within their AP department, delivering better spend control, faster cycle times, and paperless processing, leading to more automation and less waste. With the support of advanced technology, organizations can efficiently drive outcomes that benefit both business interests and the greater good.


Sources

1Gartner CEO and Senior Business Executive Survey.

GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

2Five Ways That ESG Creates Value, McKinsey.

3ESG and the Cost of Capital, MSCI.

4Paper Waste Facts, The World Counts.

The post Automate more, waste less: ESG initiatives with Dynamics 365 Finance appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Part 2 – Observability for your azd-compatible app

In Part 1, I walked you through how to azdev-ify a simple Python app. In this post, we will:



  • add the Azure resources to enable the observability features in azd

  • add manual instrumentation code in the app 

  • create a launch.json file to run the app locally and make sure we can send data to Application Insights

  • deploy the app to Azure

Previously…


We azdev-ified a simple Python app, TheCatSaidNo, and deployed it to Azure. Don’t worry if you have already deleted everything. I have updated the code for part 1 because of the Bicep modules improvements we shipped in the azure-dev-cli_0.4.0-beta.1 release. You don’t need to update your code; just start from my GitHub repository (branch: part1):



  1. Make sure you have the prerequisites installed:


  2. In a new empty directory, run 

    azd up -t https://github.com/puicchan/theCatSaidNo -b part1

    If you run `azd monitor --overview` at this point, you will get an error: “Error: application does not contain an Application Insights dashboard.” That’s because we didn’t create any Azure Monitor resources in part 1.


Step 1 – add Application Insights


The Azure Developer CLI (azd) provides a monitor command to help you get insight into how your applications are performing so that you can proactively identify issues. We need to first add the Azure resources to the resource group created in part 1.



  1. Refer to a sample, e.g., ToDo Python Mongo. Copy the directory /infra/core/monitor to your /infra folder.

  2. In main.bicep: add the following parameters. If you want to override the default azd naming convention, provide your own values here. This is new since version 0.4.0-beta.1. 

    param applicationInsightsDashboardName string = ''
    param applicationInsightsName string = ''
    param logAnalyticsName string = ''​


  3. Add the call to monitoring.bicep in /core/monitor

    // Monitor application with Azure Monitor
    module monitoring './core/monitor/monitoring.bicep' = {
      name: 'monitoring'
      scope: rg
      params: {
        location: location
        tags: tags
        logAnalyticsName: !empty(logAnalyticsName) ? logAnalyticsName : '${abbrs.operationalInsightsWorkspaces}${resourceToken}'
        applicationInsightsName: !empty(applicationInsightsName) ? applicationInsightsName : '${abbrs.insightsComponents}${resourceToken}'
        applicationInsightsDashboardName: !empty(applicationInsightsDashboardName) ? applicationInsightsDashboardName : '${abbrs.portalDashboards}${resourceToken}'
      }
    }


  4. Pass the Application Insights name as a param to appservice.bicep in the web module: 

    applicationInsightsName: monitoring.outputs.applicationInsightsName


  5. Add output for the Application Insights connection string to make sure it’s stored in the .env file:

    output APPLICATIONINSIGHTS_CONNECTION_STRING string = monitoring.outputs.applicationInsightsConnectionString​


  6. Here’s the complete main.bicep

    targetScope = 'subscription'
    
    @minLength(1)
    @maxLength(64)
    @description('Name of the environment which is used to generate a short unique hash used in all resources.')
    param environmentName string
    
    @minLength(1)
    @description('Primary location for all resources')
    param location string
    
    // Optional parameters to override the default azd resource naming conventions. Update the main.parameters.json file to provide values. e.g.,:
    // "resourceGroupName": {
    //      "value": "myGroupName"
    // }
    param appServicePlanName string = ''
    param resourceGroupName string = ''
    param webServiceName string = ''
    param applicationInsightsDashboardName string = ''
    param applicationInsightsName string = ''
    param logAnalyticsName string = ''
    // serviceName is used as value for the tag (azd-service-name) azd uses to identify
    param serviceName string = 'web'
    
    @description('Id of the user or app to assign application roles')
    param principalId string = ''
    
    var abbrs = loadJsonContent('./abbreviations.json')
    var resourceToken = toLower(uniqueString(subscription().id, environmentName, location))
    var tags = { 'azd-env-name': environmentName }
    
    // Organize resources in a resource group
    resource rg 'Microsoft.Resources/resourceGroups@2021-04-01' = {
      name: !empty(resourceGroupName) ? resourceGroupName : '${abbrs.resourcesResourceGroups}${environmentName}'
      location: location
      tags: tags
    }
    
    // The application frontend
    module web './core/host/appservice.bicep' = {
      name: serviceName
      scope: rg
      params: {
        name: !empty(webServiceName) ? webServiceName : '${abbrs.webSitesAppService}web-${resourceToken}'
        location: location
        tags: union(tags, { 'azd-service-name': serviceName })
        applicationInsightsName: monitoring.outputs.applicationInsightsName
        appServicePlanId: appServicePlan.outputs.id
        runtimeName: 'python'
        runtimeVersion: '3.8'
        scmDoBuildDuringDeployment: true
      }
    }
    
    // Create an App Service Plan to group applications under the same payment plan and SKU
    module appServicePlan './core/host/appserviceplan.bicep' = {
      name: 'appserviceplan'
      scope: rg
      params: {
        name: !empty(appServicePlanName) ? appServicePlanName : '${abbrs.webServerFarms}${resourceToken}'
        location: location
        tags: tags
        sku: {
          name: 'B1'
        }
      }
    }
    
    // Monitor application with Azure Monitor
    module monitoring './core/monitor/monitoring.bicep' = {
      name: 'monitoring'
      scope: rg
      params: {
        location: location
        tags: tags
        logAnalyticsName: !empty(logAnalyticsName) ? logAnalyticsName : '${abbrs.operationalInsightsWorkspaces}${resourceToken}'
        applicationInsightsName: !empty(applicationInsightsName) ? applicationInsightsName : '${abbrs.insightsComponents}${resourceToken}'
        applicationInsightsDashboardName: !empty(applicationInsightsDashboardName) ? applicationInsightsDashboardName : '${abbrs.portalDashboards}${resourceToken}'
      }
    }
    
    // App outputs
    output AZURE_LOCATION string = location
    output AZURE_TENANT_ID string = tenant().tenantId
    output REACT_APP_WEB_BASE_URL string = web.outputs.uri
    output APPLICATIONINSIGHTS_CONNECTION_STRING string = monitoring.outputs.applicationInsightsConnectionString


  7. Run `azd provision` to provision the additional Azure resources

  8. Once provisioning is complete, run `azd monitor --overview` to open the Application Insights dashboard in the browser.

    The dashboard is not that exciting yet. Auto-instrumentation application monitoring is not yet available for Python apps. However, if you examine your code, you will see that:



    • APPLICATIONINSIGHTS_CONNECTION_STRING is added to the .env file for your current azd environment.

    • The same connection string is added to the application settings in the configuration of your web app in the Azure portal.
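
If you want to confirm the value landed in the environment file without opening the portal, a few lines of Python can read it back. This is a stdlib-only sketch: `parse_env` is a hypothetical helper (not part of azd), and the `.azure/myenv/.env` path assumes an azd environment named `myenv`.

```python
from pathlib import Path


def parse_env(path):
    """Parse simple KEY="value" lines from a .env file into a dict."""
    values = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        # Split on the first "=" only: connection strings contain "=" too.
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip().strip('"')
    return values


env_file = Path(".azure/myenv/.env")  # replace "myenv" with your azd environment name
if env_file.exists():
    env = parse_env(env_file)
    print("APPLICATIONINSIGHTS_CONNECTION_STRING" in env)
```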

Step 2 – manually instrumenting your app


Let’s track incoming requests with OpenCensus Python and instrument the application with the Flask middleware so that incoming requests sent to your app are tracked. (To learn more about what Azure Monitor supports, refer to setting up Azure Monitor for your Python app.)

For this step, I recommend using Visual Studio Code and the following extensions:



Get Started Tutorial for Python in Visual Studio Code is a good reference if you are not familiar with Visual Studio Code.

  1. Add to requirements.txt

    python-dotenv
    opencensus-ext-azure >= 1.0.2
    opencensus-ext-flask >= 0.7.3
    opencensus-ext-requests >= 0.7.3​


  2. Modify app.py to: 

    import os
    
    from dotenv import load_dotenv
    from flask import Flask, render_template, send_from_directory
    from opencensus.ext.azure.trace_exporter import AzureExporter
    from opencensus.ext.flask.flask_middleware import FlaskMiddleware
    from opencensus.trace.samplers import ProbabilitySampler
    
    INSTRUMENTATION_KEY = os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING")
    
    app = Flask(__name__)
    middleware = FlaskMiddleware(
        app,
        exporter=AzureExporter(connection_string=INSTRUMENTATION_KEY),
        sampler=ProbabilitySampler(rate=1.0),
    )
    
    
    @app.route("/favicon.ico")
    def favicon():
        return send_from_directory(
            os.path.join(app.root_path, "static"),
            "favicon.ico",
            mimetype="image/vnd.microsoft.icon",
        )
    
    
    @app.route("/")
    def home():
        return render_template("home.html")
    
    
    if __name__ == "__main__":
        app.run(debug=True)​


  3. To run locally, we need to read from the .env file to get the current azd environment context. The easiest way is to customize run and debug in Visual Studio Code by creating a launch.json file:

    • Press Ctrl+Shift+D or click “Run and Debug” in the sidebar

    • Click “create a launch.json file” to customize a launch.json file

    • Select “Flask: Launch and debug a Flask web application”

    • Modify the generated file to: 

      {
          // Use IntelliSense to learn about possible attributes.
          // Hover to view descriptions of existing attributes.
          // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
          "version": "0.2.0",
          "configurations": [
              {
                  "name": "Python: Flask",
                  "type": "python",
                  "request": "launch",
                  "module": "flask",
                  "env": {
                      "FLASK_APP": "app.py",
                      "FLASK_DEBUG": "1"
                  },
                  "args": [
                      "run",
                      "--no-debugger",
                      "--no-reload"
                  ],
                  "jinja": true,
                  "justMyCode": true,
                  "envFile": "${input:dotEnvFilePath}"
              }
          ],
          "inputs": [
              {
                  "id": "dotEnvFilePath",
                  "type": "command",
                  "command": "azure-dev.commands.getDotEnvFilePath"
              }
          ]
      }​




  4. Create and activate a new virtual environment. I am using Windows, so: 

    py -m venv .venv
    .venv\Scripts\activate
    pip3 install -r ./requirements.txt


  5. Click the Run view in the sidebar and hit the play button for “Python: Flask”:

    • Browse to http://localhost:5000 to launch the app.

    • Click the button a few times and/or reload the page to generate some traffic.


    Take a break; perhaps play with your cat or dog for real. The data will take a short while to show up in Application Insights.



  6. Run `azd monitor --overview` to open the dashboard and notice the change.

  7. Run `azd deploy` to deploy your app to Azure and start monitoring your app!
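
Before deploying, a quick smoke test with Flask’s built-in test client can confirm the routes respond. This sketch uses a trimmed-down app without the Application Insights middleware, so it runs even when no connection string is set; it is an illustration, not code from the TheCatSaidNo repository.

```python
from flask import Flask

app = Flask(__name__)


@app.route("/")
def home():
    # The real app renders templates/home.html; plain text keeps
    # this sketch self-contained.
    return "The cat said no"


# Flask's test client exercises routes without starting a server.
with app.test_client() as client:
    response = client.get("/")
    assert response.status_code == 200
    assert b"cat" in response.data
```

The same pattern extends to the full app: import `app` from app.py inside a test module, and the middleware will trace the test requests if the connection string is present in the environment.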

Get the code for this blog post here. Next, we will explore how you can use `azd pipeline config` to set up a GitHub Actions workflow that deploys updates on every code check-in.

Feel free to run `azd down` to clean up all the Azure resources. As you saw, it’s easy to get things up and running again. Just `azd up`!

We love your feedback! If you have any comments or ideas, feel free to add a comment or submit an issue to the Azure Developer CLI Repo.