Acceleration of Data-Driven App Development with Copilot


This article is contributed. See the original author and article here.

At Microsoft Build, 50 updates were announced, including several for Power Platform; today we will look at one of them. Learn how to use the Excel to App with Copilot feature, which lets Copilot help you clean and prepare your data before you even start building the app. Copilot can now ingest any Excel file, no matter how loosely structured, and create robust, structured tables with a variety of data types.


 


Find out more about Microsoft Build announcements on the Microsoft Build 2023 Book of News 
Earn a free certification voucher by completing at least one Cloud Skills Challenge from the Microsoft Build Cloud Skills Challenge


 


Excel to App


Students, rising developers, and pro developers can rapidly build solutions in PowerApps by simply dragging and dropping or linking to a data source such as Excel, and then building the UI on top of that data.


 


Things to consider



  • You can import your Excel file as is, and PowerApps will create a custom Dataverse table and a Canvas App for you.

  • You do not need to format your table within Excel (for example, you do not have to use Format as Table within your spreadsheet).

  • Your Canvas App and Dataverse table will be built based on the first sheet of your Excel spreadsheet.

  • The Excel file needs to be closed when importing it into PowerApps.


 


Short Demo


Animation showing a short demo of the Excel to App feature with Copilot in PowerApps


Practical steps to follow



  1. Prepare your Excel file, or create a new one.

  2. Go to PowerApps and Sign in with your account.

  3. On the homepage, click on Start with data

  4. On the Start with data wizard, click on Upload an Excel file.

  5. Click on the Select from device button, then choose your Excel file from your device.

     


     



  6. You will get a preview of the Dataverse custom table that will be created for you. You can edit the table by clicking on Edit Table Properties to change the table name.

     



  7. The columns of your table are assigned their appropriate data types, and you can change a column by clicking on the drop-down next to it and then clicking Edit Column. Once done, click on Create app.

  8. Once you have clicked on Create app, Copilot will build a Canvas App with a Standard template that you can modify based on your needs.

     




Let’s add more AI capabilities with Copilot



  1. Within your app, click on Settings, then choose Upcoming Features.

     



  2. Search for Copilot and toggle it on. Once enabled, close the dialog box.

     



  3. Once your Copilot Component is enabled, click on Insert and choose Copilot (preview).

     



  4. Once the Copilot Component is added on the screen, you will need to select your data source.

     



  5. Once you have chosen your data source, the full Copilot Component is added to your app. This allows your app users to use Copilot to understand and analyze their data using suggested prompts.

     



  6. Play the app and see Copilot in action. Choose one of the suggested prompts; in the example below, I want to know how many tickets are open.

     




Congratulations! You did it: you built a Canvas App from your Excel file using Copilot!




 

How to perform a REST API request in Azure using RBAC authentication with Postman


This article describes how to perform a REST API request in Azure using RBAC authentication with Postman. I will use the Get Blob (REST API) request as an example.


 


Please see below how to perform a REST API request in Azure using RBAC authentication:



  1. Open the Azure Portal and go to Azure Active Directory.

  2. On the left side, create a new app registration by clicking on App registrations (left sidebar) and then New registration. Fill in the Name and all the information required.

  3. Inside the new app:

    1. Click on Overview and collect the Application (client) ID value and the Directory (tenant) ID value.

    2. Click on Certificates & secrets and create a New Client Secret. Collect the client secret value.



  4. Open your storage account, go to Access Control (IAM), and assign to this app the RBAC role required to call the data access operation in Azure Storage. Note that the role assignment can take some time to take effect.

    1. For the example presented here (Get Blob request), we need to assign the app the "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read" permission. The Storage Blob Data Reader RBAC role is the least-privileged built-in role with this permission. This information can be found here: Get Blob (REST API) – Azure Storage



  5. Open Postman and:

    1. Create a new request.

    2. Select the Authorization tab in the request builder window and:

      1. In the “Type” dropdown, select “OAuth 2.0”

      2. On the right side, please fill in the following fields:

        1. Token Name: A name of your choosing

        2. Grant Type: Client Credentials

        3. Access Token URL: https://login.microsoftonline.com/<tenant>/oauth2/v2.0/token, where <tenant> is the Directory (tenant) ID value collected in step 3.1 above.

        4. Client ID: The Application (client) ID value collected on step 3.1 above.

        5. Client Secret: The client secret value collected on step 3.2 above.

        6. Scope: For storage, use https://storage.azure.com/.default

        7. Client Authentication: Send as Basic Auth Header.



      3. Click on “Get New Access Token” and collect the access token.





  6. To execute the Get Blob request, configure the request in the request builder window:


    1. Select the GET method request.

    2. Add the GET method request URI (https://myaccount.blob.core.windows.net/mycontainer/myblob)

    3. In the Headers tab, add the header “Authorization” with the value “Bearer <token>”, where <token> is the token generated in step 5.2.3 above.

    4. Add at least the two required headers x-ms-date and x-ms-version.

    5. Execute the request.
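The exchange Postman performs can also be sketched in Python. The tenant and client IDs below are placeholder values, and the x-ms-version shown is one known service version (an assumption; use any version your account supports):

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

# Placeholder values -- substitute your own app registration details.
TENANT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_ID = "11111111-1111-1111-1111-111111111111"
CLIENT_SECRET = "<client-secret>"

def build_token_request():
    """Build the OAuth 2.0 client-credentials token request from step 5.2."""
    url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://storage.azure.com/.default",
    })
    return url, body

def build_get_blob_headers(token):
    """Build the headers for the Get Blob request from step 6."""
    return {
        "Authorization": f"Bearer {token}",
        # x-ms-date must be the current UTC time in RFC 1123 format.
        "x-ms-date": datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT"),
        "x-ms-version": "2021-08-06",  # assumed version; pick a supported one
    }
```

POSTing the token request body (with Content-Type application/x-www-form-urlencoded) returns a JSON payload whose access_token field is the Bearer token used in the second function.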




 


Disclaimer:



  • These steps are provided for the purpose of illustration only. 

  • These steps and any related information are provided “as is” without warranty of any kind, either expressed or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.

  • We grant You a nonexclusive, royalty-free right to use and modify the Steps and to reproduce and distribute the steps, provided that You agree:


    • to not use Our name, logo, or trademarks to market Your software product in which the steps are embedded;

    • to include a valid copyright notice on Your software product in which the steps are embedded; and

    • to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of steps.



Lesson Learned #356: Transaction log full in Azure SQL due to CDC job.


Today, we faced a service request where our customer got the following error: Msg 9002, Level 17, State 2, Line 8
The transaction log for database ‘2d7c3f5a-XXXX-XZY-ZZZ-XXX’ is full due to ‘REPLICATION’ and the holdup lsn is (194XXX:24X:1). I would like to share with you the lesson learned here.


 


We need to pay attention to the phrase “is full due to”; in this case the cause is REPLICATION, which means the issue could be related to transactional replication or Change Data Capture (CDC).


 


To determine the cause, if we are not using transactional replication, check whether CDC is enabled by running the following query: select name, recovery_model, log_reuse_wait, log_reuse_wait_desc, is_cdc_enabled from sys.databases where database_id = db_id() – sys.databases (Transact-SQL) – SQL Server | Microsoft Learn


 


If the value of the is_cdc_enabled column is 1 and you are not using CDC, use the command sys.sp_cdc_disable_db to disable the CDC job. sys.sp_cdc_disable_db (Transact-SQL) – SQL Server | Microsoft Learn


 


During the troubleshooting process, executing sys.sp_cdc_disable_db returned another error: Msg 22831, Level 16, State 1, Procedure sys.sp_cdc_disable_db_internal, Line 338 [Batch Start Line 6]
Could not update the metadata that indicates database XYZ is not enabled for Change Data Capture. The failure occurred when executing the command ‘(null)’. The error returned was 9002: ‘The transaction log for database ‘xxx-XXX-43bffef44d0c’ is full due to ‘REPLICATION’ and the holdup lsn is (51XYZ:219:1).’. Use the action and error to determine the cause of the failure and resubmit the request.


 


In this situation, we need to add more space to the transaction log file, because it is not possible to record the CDC disable operation in the transaction log.


 


Once we had more space in our transaction log, we were able to disable CDC, and after disabling CDC, Azure SQL Database was able to back up and truncate the transaction log.


 


Finally, to speed up truncation of the transaction log, we executed the DBCC SHRINKFILE command several times – DBCC SHRINKFILE (Transact-SQL) – SQL Server | Microsoft Learn – and we were able to reduce the size of the transaction log file.


 


Also, during the troubleshooting we used the following queries to see how many VLFs we have and the space usage: sys.dm_db_log_info (Transact-SQL) – SQL Server | Microsoft Learn and sys.database_recovery_status (Transact-SQL) – SQL Server | Microsoft Learn


 

SELECT * FROM sys.dm_db_log_info(DB_ID()) AS l;
SELECT * FROM sys.database_recovery_status WHERE database_id = DB_ID();
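Putting the pieces together, the full remediation sequence could be sketched as follows. The DBCC SHRINKFILE argument below is an assumption: file id 2 is usually the transaction log, but verify it against sys.database_files first.

```sql
-- Check whether CDC is enabled and why log truncation is blocked.
SELECT name, recovery_model_desc, log_reuse_wait_desc, is_cdc_enabled
FROM sys.databases
WHERE database_id = DB_ID();

-- If is_cdc_enabled = 1 and CDC is not in use, disable it.
-- (If this fails with error 9002, grow the log file first.)
EXEC sys.sp_cdc_disable_db;

-- Once the log has been truncated, shrink the log file.
-- File id 2 is typically the transaction log; verify with sys.database_files.
DBCC SHRINKFILE (2);
```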

 


 

Improve Labelling processes with new enhanced capabilities



Introduction

Effective labelling processes and configuration play a crucial role in optimizing warehouse operations. There are several reasons why accurate labelling and configuration are important.

Firstly, proper labelling and configuration enhance efficiency in a warehouse. When items are labelled and organized accurately, warehouse staff can quickly locate and identify products, reducing the time spent searching for items and ultimately boosting productivity.

Furthermore, clear and accurate labelling also reduces the likelihood of picking or shipping errors, which can lead to improved customer satisfaction and decreased costs associated with returns and corrections.

Lastly, proper labelling and configuration contribute to safety and compliance in a warehouse. By adhering to regulations and ensuring that hazardous materials or items with specific storage requirements are handled and stored correctly, the risk of accidents can be reduced.

As technology continues to advance, so do the tools available to improve labelling and configuration processes in warehouses. In Wave 1 2023, Microsoft Dynamics 365 SCM released several enhancements to support more advanced scenarios and bring extra capabilities to the labelling process.

License plate label layout

In 10.0.31 of Microsoft Dynamics 365 SCM, a new License plate label layout was introduced for designing license plate labels. This feature lets you build more advanced license plate label layouts: layouts can now have repeating structures and include header, body, and footer elements (for example, to print item labels from receiving or shipping work, similar to how wave labels currently work). You can set up custom data sources with joined tables to print information from related tables, and define custom date, time, and number formats. This capability provides more flexibility in designing labels and removes some of the customization work needed to add data to the labels.

Custom label layouts

In 10.0.33 of Microsoft Dynamics 365 SCM, a new Custom label layout feature was released.

This feature introduces a new Custom label layout type that allows you to build layouts from any data source. A new Print button is displayed automatically when a layout exists for the corresponding source. Users can print labels for any data, including but not limited to product labels, location labels, and customer labels.

It gives you the tools you need to create your own labels based on your business requirements, as well as to configure and print labels from any source.

Print labels using an external service

In 10.0.34, Microsoft Dynamics 365 SCM provides a quick and simple method for linking Dynamics 365 to many of the most popular enterprise labeling platforms. Its seamless integration and flexible configuration options make for a pain-free, rapid implementation, creating a smooth flow of communication and transactions that optimizes your printing workflow.

It allows you to configure the HTTP(S) request that is sent, enabling integration with cloud-native and on-premises (if the firewall is opened or an Azure API is created) label-printing services, including Zebra’s cloud printing service (https://developer.zebra.com/apis/sendfiletoprinter-model), Loftware NiceLabel Cloud, or Seagull Scientific BarTender configured with REST APIs.
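As an illustration of the kind of HTTP(S) request you might configure, here is a sketch that posts a ZPL payload to an entirely hypothetical label service endpoint. The URL and JSON shape are invented for illustration; consult your label service's REST documentation for the real contract:

```python
import json

def build_label_request(printer_id, zpl):
    """Sketch of the HTTP(S) request Dynamics 365 SCM could be configured
    to send to an external label-printing service.

    The endpoint and payload shape are hypothetical placeholders."""
    url = f"https://labels.example.com/printers/{printer_id}/jobs"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({"format": "zpl", "data": zpl})
    return url, headers, body
```

Real services differ in authentication, payload shape, and label format (ZPL, BarTender templates, NiceLabel formats), which is exactly what the configurable HTTP(S) request is designed to absorb.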

Conclusion

In conclusion, the continued evolution of technology is providing ever more sophisticated tools for improving labelling processes and configuration in warehouses. The enhancements released in Wave 1 2023 are just the latest example of how Microsoft Dynamics 365 SCM is staying at the forefront of this evolution and providing users with the tools they need to optimize their warehouse operations.


Would you like to learn more?

Print labels using an external service – Supply Chain Management | Dynamics 365 | Microsoft Learn

Print labels using the Loftware NiceLabel label service solution – Supply Chain Management | Dynamics 365 | Microsoft Learn

Print labels using the Seagull Scientific BarTender® label service solution – Supply Chain Management | Dynamics 365 | Microsoft Learn

License plate label layouts and printing – Supply Chain Management | Dynamics 365 | Microsoft Learn

Custom label layouts and printing – Supply Chain Management | Dynamics 365 | Microsoft Learn


Not yet a Supply Chain Management customer? 

Take a guided tour.

The post Improve Labelling processes with new enhanced capabilities appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Empowering Accessibility: Language and Audio Document Translation Made Simple with Low-Code/No-Code



This solution architecture proposal outlines how to effectively utilize OpenAI’s language model alongside Azure Cognitive Services to create a user-friendly and inclusive solution for document translation. By leveraging OpenAI’s advanced language capabilities and integrating them with Azure Cognitive Services, we can accommodate diverse language preferences and provide audio translations, thereby meeting accessibility standards and reaching a global audience. This solution aims to enhance accessibility, ensure inclusivity, and gain valuable insights through the combined power of OpenAI, Azure Cognitive Services, and Power Platform.



Dataflow


Here is the process:




  1. Ingest: PDF documents, text files, and images can be ingested from multiple sources, such as Azure Blob storage, Outlook, OneDrive, SharePoint, or a 3rd party vendor.




  2. Move: Power Automate triggers and moves the file to Azure Blob storage. Blob triggers then get the original file and call an Azure Function.




  3. Extract Text and Translate: The Azure Function calls the Azure Computer Vision Read API to read multiple pages of a PDF document in natural formatting order, extract text from images, and generate the text with lines and spaces, which is then stored in Azure Blob storage. Azure Translator then translates the file and stores it in a blob container. Azure Speech generates a WAV or MP3 file from the original-language and translated-language text files, which is also stored in a blob container.




  4. Notify: Power Automate triggers and moves the file to the original source location, and notifies users in Outlook and Microsoft Teams with an output audio file.
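A minimal sketch of the two service calls at the heart of step 3, using the public REST shapes of the Computer Vision Read 3.2 and Translator v3 APIs. The Vision endpoint host is a placeholder, and authentication keys and polling of the Operation-Location header are omitted for brevity:

```python
import json

# Placeholder endpoint -- substitute your own Cognitive Services resource.
VISION_ENDPOINT = "https://example.cognitiveservices.azure.com"
TRANSLATOR_ENDPOINT = "https://api.cognitive.microsofttranslator.com"

def build_read_request(document_url):
    """Computer Vision Read 3.2 call that extracts text from a PDF or image.
    The service responds 202 Accepted with an Operation-Location header
    that must be polled for the extracted text."""
    url = f"{VISION_ENDPOINT}/vision/v3.2/read/analyze"
    body = json.dumps({"url": document_url})
    return url, body

def build_translate_request(text, to_language):
    """Translator v3 call that translates the extracted text."""
    url = f"{TRANSLATOR_ENDPOINT}/translate?api-version=3.0&to={to_language}"
    body = json.dumps([{"Text": text}])
    return url, body
```

Both requests also need an Ocp-Apim-Subscription-Key header (and, for Translator regional resources, an Ocp-Apim-Subscription-Region header).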




Without OpenAI


 With OpenAI




For more on OpenAI, refer to: Transform your business with automated insights & optimized workflows using Azure OpenAI GPT-3 – Microsoft Community Hub


Alternatives


The Azure architecture utilizes Azure Blob storage as the default option for file storage during the entire process. However, it is also possible to use alternative storage solutions such as SharePoint, ADLS, or third-party storage options. For processing a high volume of documents, consider using Azure Logic Apps as an alternative to Power Automate. Azure Logic Apps can prevent you from exceeding consumption limits within your tenant and is a more cost-effective solution. To learn more, refer to the Azure Logic Apps documentation.


 


Components


These are the key technologies used for this technical content review and research:



Scenario details


This solution uses multiple Cognitive Services from Azure to automate the business process of translating PDF documents and creating audio files in wav/mp3 audio format for accessibility and global audience. It’s a great way to streamline the translation process and make content more accessible to people who may speak different languages or have different accessibility needs.


Potential use cases


By leveraging this cloud-based solution idea that can provide comprehensive translation services on demand, organizations can easily reach out to a wider audience without worrying about language barriers. This can help to break down communication barriers and ensure that services are easily accessible for people of all cultures, languages, locations, and abilities.


In addition, by embracing digital transformation, organizations can improve their efficiency, reduce costs, and enhance the overall customer experience. Digital transformation involves adopting new technologies and processes to streamline operations and provide a more seamless experience for customers.


It is particularly relevant to industries that have a large customer base or client base, such as e-commerce, tourism, hospitality, healthcare, and government services.