Microsoft Outlook introduces SMS on Outlook Lite



Since its launch in 2022, Outlook Lite has provided a way to enjoy the key features of Outlook in a small download size for low-resource phones. We are continuously looking for ways to meet the communication needs of our core users. Now, we are excited to bring SMS on Outlook Lite to users worldwide. With SMS on Outlook Lite, you can enjoy the convenience and security of sending and receiving SMS messages from your Outlook Lite app. SMS is integrated with your email, calendar, and contacts, so you can stay in touch in one app.


 


SMS on Outlook Lite is now available in the latest version of the app, which you can download from the Google Play Store.


 


How to get started with SMS on Outlook Lite?


Getting started with SMS on Outlook Lite is easy and fast. Just follow these steps:


 


1. Download Outlook Lite from the Google Play Store. If you already have Outlook Lite, make sure you update to the latest version.


2. Open Outlook Lite and tap the bottom tab named “SMS”.


3. Grant the required permissions to activate SMS.


4. That’s it! You can now send and receive SMS messages from Outlook Lite.


 




 


What’s next for SMS on Outlook Lite? 


We are working on adding more features and improvements to SMS on Outlook Lite, such as: 



  • Tighter integration with email, calendar, and contacts

  • Cloud backup of messages

  • Enhanced security features


We would love to hear your feedback and suggestions on SMS on Outlook Lite. You can contact us through the app, or by leaving a comment on this blog post.


 


Thank you for using Outlook Lite! 


 

Challenges in Uploading Files Over 2GB via HTTP Protocol in IIS Web Server



Recently, I worked on a case where the customer’s ask was to upload a file larger than 2 GB to a web application hosted on IIS.


 


In my case, the customer had a standard ASP.NET Framework 4.6 application. When they tried to upload a zip file larger than 2 GB (the extension did not matter; we tested both .7z and .zip, and both failed), the browser showed a Bad Request page, and when I checked the HAR file, I could see a 400 status code.


 


Next, I checked the relevant application configuration settings:


 


 

system.webServer/security/requestFiltering/requestLimits/maxAllowedContentLength 

system.web/httpRuntime/executionTimeout

system.web/httpRuntime/maxRequestLength

 


 


However, these values were already configured with higher limits.
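
For reference, these limits live in web.config. Here is a minimal sketch (the values shown are illustrative; note that maxRequestLength is in kilobytes, while maxAllowedContentLength is in bytes):

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB (2097151 KB is roughly 2 GB);
         executionTimeout is in seconds -->
    <httpRuntime maxRequestLength="2097151" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is in bytes; 4294967295 is the 4 GB maximum -->
        <requestLimits maxAllowedContentLength="4294967295" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```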


 


Articles for reference:


  • HttpRuntime.ExecutionTimeout

  • MaxAllowedContentLength (Request Filtering)

  • HttpRuntime.MaxRequestLength


 


We also checked the application pool’s managed pipeline mode, and the interesting part was that we got a different error in each mode.


 


I was able to recreate this scenario on my lab machine with a sample ASP.NET Framework web application and captured FREB (Failed Request Tracing) logs for the site in both scenarios. These were the differences I noticed:


 



  • In both cases, I was able to upload files only up to 2 GB in size. 

  • In Integrated mode, when I tried to upload a 4 GB file, the FREB log showed this error: “400 Bad Request - ASP.NET detected invalid characters in the URL”. I increased the value of “maxAllowedContentLength” to the maximum supported value, which is 4 GB, but it still failed; it appears the web engine code (webengine4!MgdGetRequestBasics) doesn’t support a Content-Length of more than 2 GB.



  • Next, in Classic mode, when I tried to upload a 4 GB file, the FREB log showed this error: “413.1 - Request Entity Too Large”. In this case too, I increased the value of “maxAllowedContentLength” to 4 GB, but the upload failed. After that change we saw no error in the FREB log (the response even had a 200 status code), but the file did not upload.


So, the conclusion is that whether you keep the application pool pipeline mode as Integrated or Classic, you will only be able to upload a file of up to 2 GB to your web application hosted on IIS.


 


If you would like to perform a larger file upload, HTTP in a single request isn’t the right approach. You can switch to the WebDAV feature; use the FTP protocol, which is meant for file upload/download operations without this size limit; or keep using HTTP but send the data as small chunks from the client to the server and then, in your server-side code, reassemble all the chunks to complete the file upload.
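
To illustrate the chunked approach, here is a minimal client-side sketch in TypeScript. The /api/upload/chunk endpoint and its query parameters are hypothetical; your server-side code would need a matching handler that appends or reassembles the chunks into the final file:

```typescript
// Minimal chunked-upload sketch. Endpoint and parameter names are
// hypothetical; the server must reassemble the chunks in order.
const CHUNK_SIZE = 10 * 1024 * 1024; // 10 MB per request, well below the 2 GB limit

async function uploadInChunks(file: File): Promise<void> {
  const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  for (let index = 0; index < totalChunks; index++) {
    const start = index * CHUNK_SIZE;
    // slice() produces a view of the file without loading it all into memory
    const chunk = file.slice(start, Math.min(start + CHUNK_SIZE, file.size));
    const response = await fetch(
      `/api/upload/chunk?fileName=${encodeURIComponent(file.name)}` +
        `&index=${index}&total=${totalChunks}`,
      { method: "POST", body: chunk }
    );
    if (!response.ok) {
      throw new Error(`Chunk ${index} failed with status ${response.status}`);
    }
  }
}
```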


 


Also, note that moving to the latest Windows Server will not fix this. The behavior is the same for all supported IIS versions across all supported versions of Windows Server.



Hope this article helps everyone. 🙂

Explore Cloud Computing with the New “Azure for Students” Module!



I’m really excited to share a new module that I developed by working with the Microsoft Learn Team: “Introduction to Azure for students.” As students, we often encounter numerous tech challenges, from managing projects and assignments to exploring new fields like AI and data science. This module is designed to help you tackle these challenges head-on by diving into the fascinating world of cloud computing with Azure.


 




 


 


Why Cloud Computing?


 


Cloud computing might sound complex, but it’s simpler than you think. Imagine it as a giant library where you can borrow books whenever you need them, without the hassle of buying or storing them. Similarly, cloud computing lets you use computing resources whenever you need them, without owning or maintaining the hardware and software. It’s flexible, easy, and often more affordable.


 


 


What You’ll Learn


 


In this module, you’ll discover:


 



  • Core Concepts of Cloud Computing: Learn the basics and understand what cloud computing is all about.

  • Azure in Action: Explore real-world scenarios to see how Azure is used in various fields, from student projects to professional healthcare.

  • Getting Started with Azure: Find out about the tools and services that will help you begin your Azure journey.


 


Why Azure Cloud?


 


Azure is Microsoft’s cloud platform, offering over 200 products and services. Whether you’re into AI, app development, data science, or machine learning, Azure has it all. Here’s how Azure can help you as a student:


 



  • Access Powerful Tools: Utilize advanced tools and services without the need for expensive hardware.

  • Scalability: Easily scale your resources up or down based on your project needs.

  • Cost-Effective: Pay only for the resources you use and take advantage of free services and a $100 Azure credit with the Azure for Students offer.


 


How Azure Can Help with College Projects


 


Imagine you’re working on a complex college project that involves data analysis and collaboration with classmates. With Azure, you can set up a virtual environment where everyone can work together seamlessly, share resources, and analyze data in real-time. Need to develop a mobile app for your project? Azure provides all the necessary tools and platforms to build, test, and deploy your app efficiently.


 


Special Offer for Students!


 


Being a student usually means living on a budget. Good news—Microsoft’s got your back with Azure for Students! Get a $100 Azure credit and access a sea of free services. Learning has never been so cost-effective and fun! Learn more about the Azure for Students offer.


 




 


 


What are you waiting for? Start learning cloud computing today by going through the module, and get hands-on experience by claiming the Azure for Students offer. Whether you’re working on a school project, developing an app, or just curious about the cloud, this module will provide you with the foundation to succeed.


Ready to get started? Sign up on Microsoft Learn to save your progress and take the first step towards an exciting career in cloud computing!

Simplify Your Azure Kubernetes Service Connection Configuration with Service Connector



Workloads deployed on an Azure Kubernetes Service (AKS) cluster often need to access Azure backing resources, such as Azure Key Vault, databases, or AI services like Azure OpenAI Service. Users are required to manually configure Microsoft Entra Workload ID or Managed Identities so their AKS workloads can securely access these protected resources.


The Service Connector integration greatly simplifies the connection configuration experience for AKS workloads and Azure backing services. Service Connector takes care of authentication and network configurations securely and follows Azure best practices, so you can focus on your application code without worrying about your infrastructure connectivity.




Service Connector Action Breakdown


 


Previously, to connect from AKS pods to a private Azure backing service using workload identity, users needed to perform the following actions manually:



  1. Create a managed identity

  2. Retrieve the OIDC issuer URL

  3. Create Kubernetes service account

  4. Establish federated identity credential trust

  5. Grant permissions to access Azure Services

  6. Deploy the application


Now, Service Connector performs steps 2 through 5 automatically. Additionally, for Azure services without public access, Service Connector creates the required private connection components, such as private links, private endpoints, and DNS records.
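
For context, the manual route looks roughly like the following with the Azure CLI. This is a sketch only: all resource names are placeholders, and the service account additionally needs the azure.workload.identity/client-id annotation before deployment.

```
# 1. Create a managed identity (names are placeholders)
az identity create --resource-group myRG --name myWorkloadIdentity

# 2. Retrieve the cluster's OIDC issuer URL
az aks show --resource-group myRG --name myCluster \
  --query "oidcIssuerProfile.issuerUrl" -o tsv

# 3. Create a Kubernetes service account
#    (then annotate it with the identity's client ID)
kubectl create serviceaccount my-app-sa --namespace my-namespace

# 4. Establish the federated identity credential trust
az identity federated-credential create --name myFedCred \
  --identity-name myWorkloadIdentity --resource-group myRG \
  --issuer <oidc-issuer-url> \
  --subject system:serviceaccount:my-namespace:my-app-sa

# 5. Grant the identity access to the target Azure service, for example:
az role assignment create --assignee <identity-client-id> \
  --role "Key Vault Secrets User" --scope <key-vault-resource-id>
```

With Service Connector, steps like these are collapsed into a single connection setup.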


You can create a connection in the Service Connection blade within AKS.




 


Click Create and select the target service, authentication method, and networking rule. The connection will then be set up automatically. Here are a few helpful links for you to learn more about Service Connector.



 


 


 


 


 

Unlocking efficiency: Dynamics 365 Field Service integration for seamless operations 



In the dynamic world of service management, every action counts. From frontline workers in the field to back-office functions, the complexity of service delivery impacts the bottom line. Whether it’s a physical product consumed from inventory or a service provided, both have financial implications; when external customers are involved, pricing and profitability also come into play.

When a field service organization’s frontline operations run in isolation, consequences can be far-reaching: inaccurate costing, delayed invoicing, dissatisfied customers, and supply chain bottlenecks. To succeed in this complicated environment, organizations must integrate their systems to coordinate their services, finances, and supply chain processes. 

Recognizing this critical need, we recently announced the integration between Dynamics 365 Field Service and Business Central, and today we’re thrilled to announce the general availability of the integration between Dynamics 365 Field Service and Dynamics 365 Finance as well as Supply Chain Management. This powerful integration ensures that the work of frontline workers, service managers, and dispatchers is seamlessly synced with the financial and supply chain heart of your business. Let’s explore some of the details of this native integration.  

Bridging the gap: Dynamics 365 integration 

The challenges 

Even with robust systems like Dynamics 365 Field Service and a strong ERP system like Dynamics 365 Finance and Supply Chain Management, gaps can emerge when these systems aren’t fully integrated: 

  1. Limited financial insight: Without smooth integration, determining job costs and profitability requires switching between windows and consulting or updating data in multiple systems, which obscures financial status. 
  2. Supply-driven delays: Separate fieldwork and supply chain processes lead to inventory shortages and service delays. 
  3. Invoicing bottlenecks: Disparate systems and manual processes cause invoicing and payment delays, disrupting cash flow. 
  4. Inconsistent data: Discrepancies across systems create confusion, affecting accuracy of inventory, decision-making, pricing, and costing data. 
The solution

Our native integration addresses these challenges head-on: 

  1. Operational visibility: Real-time insights into finances and inventory empower informed decision-making across your organization. 
  2. Field-informed supply chain: Field Service work orders can drive estimated inventory demand, ensuring seamless supply chain coordination. 
  3. Interconnected financial operations: Finance’s automated, powerful billing and invoicing capabilities, informed directly by the services provided, speed up payment cycles, improve cash flow, minimize errors, boost profitability, and turn every work order into a growth opportunity. 
  4. Cost-effective integration: Our pre-built solution reduces implementation expenses and accelerates value realization. 
  5. Reduced risk, faster implementation: The native integration minimizes risk while improving implementation timelines. 
Essential features 

Organizations can create new opportunities to improve efficiency, customer satisfaction, and growth by integrating their Dynamics 365 Field Service and finance and operations applications. Key features of this native integration include:  

  • Data alignment: Dual-write and virtual entities ensure all applications operate from a cohesive set of primary tables. 
    • Primary tables alignment: Basic concepts such as currency, units of measure, products and their attributes (like styles, configurations, colors) are synced between applications to ensure a consistent source of truth. 
    • Legal entity alignment: The company concept, native to Finance and Supply Chain, is used to filter critical lookups to put guard-rails in the system, helping drive transactions along company lines. 
    • Projects and accounts: Work orders are seamlessly synced with projects and customer accounts from the finance and operations applications, allowing for precise project tracking and customer billing.  
    • Inventory: Virtual tables expose inventory from Supply Chain directly in Field Service while work order inventory transactions align with item journals, directly impacting inventory levels in the system of record. 
    • Resources: Using dual-write, resources can be aligned directly with workers ensuring field service work order transactions are automatically associated with the right workers and recorded in their respective hours journal and expense journal lines. 
  • Automated and precise invoicing: The integration automates the syncing of transactions, reducing manual work and mistakes. Organizations can decide when to sync the information and post project journals either as they use them or automatically when they finish the work order. 
  • Full insight and management: No financial system can afford to lose transactional data. Our integration gives organizations complete insight and management of data moving between the systems, making sure they can fix issues that stop data from flowing between applications and re-sync the transaction. 

Get started now 

Dynamics 365 Field Service and the Dynamics 365 finance and operations applications work together to unlock efficiencies. Organizations that use these solutions together can boost their productivity, revenue, and customer satisfaction. Grow your business with Dynamics 365 Field Service, Dynamics 365 Finance, and Dynamics 365 Supply Chain.  

Be on the lookout for a future post in June with more ways to take advantage of this powerful integration and make it work for any organization.

Ready to get started today? Learn more about the integration, set up your organization, and create your first integrated work order.  


Leveraging Copilot in Dynamics 365 Sales to prepare for sales meetings



In today’s rapidly evolving sales environment, staying ahead of the curve is more crucial than ever. The latest updates to Copilot in Dynamics 365 Sales, particularly its enhanced integration with Outlook, are transforming how sales professionals gear up for their meetings. Let’s dive into how these new functionalities not only streamline preparation but also enrich customer interactions. 

Streamlined Outlook integration for comprehensive sales meeting preparation

Connect Outlook/Exchange accounts to fetch meetings and related emails

Copilot in Dynamics 365 Sales expands its integration capabilities with Outlook, specifically accommodating users who have not enabled server-side sync. This pivotal update accelerates adoption, providing a unified platform where sales professionals can access and prepare for their Outlook-scheduled sales appointments directly within Dynamics 365. This coherence not only simplifies the logistical aspects of sales preparation but also enhances the overall efficiency and effectiveness of sales operations. 

Proactive meeting preparation

Copilot fetches meetings for today and the next seven days

Copilot now allows sales teams to fetch Outlook meetings for the upcoming week, enabling them to prepare proactively. The ability to view detailed agendas and prepare in advance transforms how sales teams interact with clients, paving the way for more successful outcomes.

Refined meeting summaries for enhanced client interactions

Enhanced summary helps the seller prepare for client interactions

The upgraded meeting preparation tool in Copilot for Dynamics 365 Sales now offers richer, more detailed summaries. This enhancement provides sales teams with critical insights and key talking points, tailored to each meeting’s context. Such targeted preparation boosts confidence and competence, enabling sales professionals to tailor their approaches to meet the specific needs and interests of each client, enhancing the effectiveness of their pitches. 

Harnessing innovations for sales excellence 

The recent updates to Copilot in D365 Sales are a testament to our commitment to enhance the user experience and functionality of our sales management tools. By leveraging these new features, sales teams can enhance their productivity, improve client interactions, and ultimately drive more successful outcomes. As the digital landscape evolves, tools like Copilot in D365 Sales are invaluable for staying competitive in the fast-paced world of sales. 

Next steps

Learn more about leveraging Copilot in D365 Sales to prepare for meetings: 
Stay ahead with Copilot | Microsoft Learn 

Not a Dynamics 365 Sales customer yet? Take a guided tour and sign up for a free trial at Dynamics 365 Sales overview.    


Introducing Rich Reporting and Troubleshooting for Microsoft Playwright Testing



Today, we’re excited to introduce rich reporting and easy troubleshooting for the Microsoft Playwright Testing service!


Microsoft Playwright Testing is a managed service built for running Playwright tests easily at scale. Playwright is a fast-growing, open-source framework that enables reliable end-to-end testing and automation for modern web apps. You can read more about the service here.


Now, with this new Reporting feature, users can publish test results and related artifacts and view them in the service portal for faster and easier troubleshooting.


 


Quickly Identify Failed and Flaky Tests


 


In the fast-paced world of web development, applications evolve rapidly, constantly reshaping user experiences. To keep up, testing needs to be just as swift. Playwright automates end-to-end tests and delivers essential reports for troubleshooting. The Reporting feature provides a streamlined dashboard that highlights failed and flaky tests, enabling you to identify and address issues quickly. This focused view helps maintain application quality while supporting rapid iteration.


 




Screenshot of test results filtered by failed and flaky tests


 


 


Troubleshoot Tests Easily using rich artifacts


 


As test suites grow and the frequency of test execution increases, managing generated artifacts becomes challenging. These artifacts are crucial for debugging failed tests and demonstrating quality signals for feature deployment, but they are often scattered across various sources.


The Reporting feature consolidates results and artifacts, such as screenshots, videos, and traces, into a unified web dashboard, simplifying the troubleshooting process. The Trace Viewer, a tool offered by Playwright, helps you explore traces, letting you navigate through each action of your test and visually observe what occurred during each step. It is hosted in the service portal alongside the test for which it was collected, eliminating the need to store and locate it separately for troubleshooting.


 




Screenshot of trace viewer hosted in the service portal


 


Seamless Integration with CI Pipelines


 


Continuous testing is essential for maintaining application quality, but collecting and maintaining execution reports and artifacts can be challenging. Microsoft Playwright Testing service can be easily configured to collect results and artifacts in CI pipelines. It also captures details about the CI agent running the tests and presents them in the service portal with the test run. This integration facilitates a smooth transition from the test results to the code repository where tests are written. Users can also access the history of test runs in the portal and gain valuable insights, leading to faster troubleshooting and reduced developer workload.


 




Screenshot of test result with CI information


 


Join the Private Preview


 


For current Playwright users, adding the Reporting feature with your existing setup is easy. It integrates with the Playwright test suite, requiring no changes to the existing test code. All you need to do is install a package that extends the Playwright open-source package, add it to your configuration, and you’re ready to go. This feature operates independently of the service’s cloud-hosted browsers, so you can use it without utilizing service-managed browsers.
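
For illustration, the wiring in playwright.config.ts looks roughly like the following. The reporter package name below is a placeholder for the one provided with the preview; the rest is standard Playwright configuration:

```typescript
// playwright.config.ts - a minimal sketch; the service reporter package
// name below is a placeholder for the one provided with the private preview.
import { defineConfig } from "@playwright/test";

export default defineConfig({
  reporter: [
    ["list"], // keep the usual console output
    ["@contoso/playwright-service-reporter"], // hypothetical package name
  ],
  use: {
    // collect traces so the portal's Trace Viewer has artifacts to show
    trace: "on-first-retry",
  },
});
```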


We invite teams interested in enhancing their end-to-end testing to join the private preview of the Reporting feature. This feature is available at no additional charge during the private preview period. However, usage of the cloud-hosted browsers feature will be billed according to Azure pricing.


Your feedback is invaluable for refining and enhancing this feature. By joining the private preview, you gain early access and direct communication with the product team, allowing you to share your experiences and help shape the future of the product.


 


Interested in trying out the reporting feature and giving us feedback? Sign up here.


 


Check out Microsoft Playwright Testing service here. If you are new to Microsoft Playwright Testing service, learn more about it.

Create a data maintenance strategy for Dynamics 365 finance and operations data (part two)



A well-defined data maintenance strategy improves the quality and performance of your database and reduces storage costs. In part one of this series, we covered the roles and responsibilities of your data strategy team, tools for reviewing storage usage, and data management features in Dynamics 365 finance and operations apps that your strategy should include. We recommended that you start your planning by decommissioning unneeded sandbox environments in your tenant. In this post, we focus on creating a data retention strategy for tables as part of your overall storage maintenance strategy.

Create a data retention strategy for tables

After sandbox environments, tables have the greatest impact on total storage volume. Your data maintenance strategy should include a plan for how long to retain the data in specific tables, especially the largest ones—but don’t overlook smaller, easily manageable tables.

Review table storage by data category

In the Power Platform admin center capacity report for the production environment, drill down to the table details.

The Finance and operations capacity report showing database usage by table

Identify the largest tables in your production environment. For each one, determine the members of your data strategy team who should be involved and an action based on the table’s data category. The following example analysis suggests a strategy and the team members to involve for each data category.

  • Log and temporary data with standard cleanup routines (examples: SALESPARMLINE, USERLOG, BATCHHISTORY, *STAGING)
    Strategy: This category of data is temporary by design unless it’s affected by a customization or used in a report. Run standard cleanup after testing in a sandbox. Note: if reports are built on temporary data, consider revisiting this design decision.
    Team members: system admin; customization partner or team if customized; BI and reporting team.

  • Log and temporary data with retention settings (examples: DOCUHISTORY, SYSEMAILHISTORY)
    Strategy: This data is temporary by design but has an automatically scheduled cleanup. Most automatic jobs have a retention setting. Review retention parameters and update them after testing in a sandbox.
    Team members: system admin; customization partner or team if customized.

  • Log data used for auditing purposes (example: SYSDATABASELOG)
    Strategy: Establish which department uses the log data and discuss acceptable retention parameters and cleanup routines.
    Team members: system admin; business users; controllers and auditors.

  • Workbook data with standard cleanup routines (examples: SALESLINE, LEDGERJOURNALTRANS)
    Strategy: This data isn’t temporary by design, but it is duplicated when posted as financial data. Discuss with the relevant department how long workbook data is required in the system, then consider cleaning up or archiving data in closed periods.
    Team members: system admin; business users related to the workbook module; BI and reporting team for operational and financial reports.

  • Columns with tokens or large data formats (example: CREDITCARDAUTHTRANS)
    Strategy: Some features have in-application compression routines to reduce the size of data. Review the compression documentation and determine what data is suitable for compression.
    Team members: system admin; business users.

  • Financial data in closed periods (example: GENERALJOURNALACCOUNTENTRY)
    Strategy: Eventually you can remove even financial data from the system. Confirm with the controlling team or auditors when data can be permanently purged or archived outside of Dynamics 365.
    Team members: system admin; controllers and auditors; financial business unit; BI and reporting team for financial reports.

  • Log or workbook data in ISV or custom tables (table names should start with the ISV’s three-letter moniker)
    Strategy: Discuss ISV or custom code tables with their developers.
    Team members: system admin; customization partner or team; ISV; BI and reporting team, depending on the customization.

Consider whether table data needs to be stored

For each large table, continue your analysis with the following considerations:

  • Current business use: Is the data used at all? For instance, was database logging turned on by accident or for a test that’s been completed?
  • Retention per environment: Evaluate how long data should be in Dynamics 365 per environment. For instance, your admin might use 30 days of batch history in the production environment to look for trends but would be content with 7 days in a sandbox.
  • Data life cycle after Dynamics 365: Can the data be purged? Should it be archived or moved to long-term storage?

With the results of your analysis, your data strategy team can determine a retention strategy for each table.

Implement your data retention strategy

With your data retention strategy in place, you can start implementing the actions you decided on—running standard cleanups, updating retention settings, configuring archive functions, or reaching out to your ISV or customization partner.

Keep in mind that implementing an effective strategy takes time. You need to test the effect of each action in a sandbox environment and coordinate with multiple stakeholders.

As you implement your strategy, here are some best practices to follow:

  • Delete or archive data only after all stakeholders have confirmed that it’s no longer required.
  • Consider the impact of the data life cycle on customizations, integrations, and reports.
  • Choose the date range or the amount of data to target in each cleanup or archive iteration based on the expected duration and performance of the cleanup or archiving routine, as determined by testing in a sandbox.

Need more help?

Creating a data maintenance strategy for Dynamics 365 finance and operations apps is a complex and ongoing task. It requires a thorough analysis and collaboration among different roles and departments. For help or guidance, contact your Microsoft representative for a Dynamics 365 finance and operations storage capacity assessment.

Learn more

Not yet a Dynamics 365 customer? Take a tour and start a free trial.


Microsoft Copilot in Azure – Unlock the benefits of Azure Database for MySQL with your AI companion



Microsoft Copilot in Azure (Public Preview) is an AI-powered tool to help you do more with Azure. Copilot in Azure extends capabilities to Azure Database for MySQL, allowing users to gain new insights, unlock untapped Azure functionality, and troubleshoot with ease. Copilot in Azure leverages Large Language Models (LLMs) and the Azure control plane, all within the framework of Azure’s steadfast commitment to safeguarding customers’ data security and privacy.


 


The experience now supports adding Azure Database for MySQL self-help skills into Copilot in Azure, empowering you with self-guided assistance and the ability to solve issues independently. 


 


You can access Copilot in Azure right from the top menu bar in the Azure portal. Throughout a conversation, Copilot in Azure answers questions, suggests follow-up prompts, and makes high-quality recommendations, all while respecting your organization’s policy and privacy.


 


For a short demo of this new capability, watch the following video!


 


 


 


Discover new Azure Database for MySQL features with Microsoft Copilot in Azure


 




 


 


Explore when to enable new features to supplement real-life scenarios


 




 


Learn from summarized tutorials to enable features on-the-go


 




 


Troubleshoot your Azure Database for MySQL issues and get expert tips


 




 



Join the preview



 


To enable access to Microsoft Copilot in Azure for your organization, complete the registration form. You only need to complete the application process one time per tenant. Check with your administrator if you have questions about joining the preview.


 


For more information about the preview, see Limited access. Also be sure to review our Responsible AI FAQ for Microsoft Copilot in Azure.


 


Thank you!

Create a data maintenance strategy for Dynamics 365 finance and operations data (part one)



Data maintenance—understanding what data needs to be stored where and for how long—can seem like an overwhelming task. Cleanup routines can help, but a good data maintenance strategy will make sure that you’re using your storage effectively and avoiding overages. Data management in Dynamics 365 isn’t a one-size-fits-all solution. Your strategy will depend on your organization’s implementation and unique data footprint. In this post, the first of a two-part series, we describe the tools and features that are available in Dynamics 365 finance and operations apps to help you create an effective storage maintenance plan. Part two focuses on implementing your plan.

Your data maintenance team

Data maintenance is often thought to be the sole responsibility of system admins. However, managing data throughout its life cycle requires collaboration from all stakeholders. Your data maintenance team should include the following roles:

  • Business users. It goes without saying that users need data for day-to-day operations. Involving them in your planning helps ensure that removing old business data doesn’t interfere with business processes.
  • BI and reporting team. This team comprehends reporting requirements. They can provide insights into what data is essential for operational reports and should be kept in live storage or can be exported to a data warehouse.
  • Customization team. Customizations might rely on data that’s targeted by an out-of-the-box cleanup routine. Your customization partner or ISV should test all customizations and integrations before you run a standard cleanup in the production environment.
  • Auditors and controllers. Even financial data doesn’t need to be kept indefinitely. The requirements for how long you need to keep posted data differ by region and industry. The controlling team or external auditors can determine when outdated data can be permanently purged.
  • Dynamics 365 system admins. Involving your admins in data maintenance planning allows them to schedule cleanup batch jobs during times when they’re least disruptive. They can also turn on and configure new features.
  • Microsoft 365 system admins. The finance and operations storage capacity report in the Power Platform admin center is helpful when you’re creating a data maintenance strategy, and these admins have access to it.

Tools for reviewing storage usage

After you assemble your team, the next step is to gather information about the size and footprint of your organization’s finance and operations data using the following tools:

  • The finance and operations storage capacity report shows the storage usage and capacity of your Dynamics 365 environments down to the table level.
  • Just-in-time database access allows you to access the database of a sandbox environment that has been recently refreshed from production. Depending on the storage actions you have set up or the time since the last database restore, the sandbox might not exactly match the production environment.

Features for managing storage

A comprehensive data maintenance strategy takes advantage of the data management features of Dynamics 365 finance and operations apps. The following features should be part of your plan.

Environment life cycle management is the process of creating, refreshing, and decommissioning sandbox environments according to your testing and development needs. Review your environments’ storage capacity and usage on the Finance and operations page of the capacity report.

The Finance and operations capacity report in the Power Platform admin center

Critically assess the environments and their usage and consider decommissioning sandboxes that you no longer need. For instance, if the system is post go-live, can you retire the training environment? Are performance tests less frequent and easier to run in the QA environment when users aren’t testing?

We highly recommend that you don’t skip the sandbox decommissioning discussion. Reducing the number of sandboxes has a far greater effect on total storage usage than any action that targets a specific table.

Cleanup routines are standard or custom functions that automatically delete temporary or obsolete data from the system.

Retention settings schedule automatic cleanup of certain data after a specified length of time. For example, document history includes a parameter that specifies the number of days to retain history. These cleanup routines might run as batch jobs or behind the scenes, invisible to admins.

Archiving functions move historical data to a separate storage location.

Compression routines reduce the size of data in storage. For example, the Compress payment tokens feature applies compression to stored payment property tokens.

Next step

In this post, we covered the roles and responsibilities of your data strategy team, tools for reviewing database storage, and data management features beyond cleanup routines. We suggested that you begin your planning process by reviewing your sandboxes. In part two, we discuss a strategy for specific tables and actions to take.

Learn more

Not yet a Dynamics 365 customer? Take a tour and start a free trial.
