MidDay Cafe Episode 36 – Human AI Partnership



In this episode of MidDay Café, hosts Tyrelle Barnes and Michael Gannotti discuss human/AI partnership. Many organizations are trying to figure out their AI strategy but seem to be taking a technology- and product-first approach. Tyrelle and Michael discuss how to anchor on people and employees first, with AI and technology in a supporting role.


Listen to the audio podcast version:



Resources:



Thanks for Visiting!


Tyrelle Barnes: LinkedIn | Michael Gannotti: LinkedIn | Twitter




 

Bring your own data to Azure OpenAI chat models



Introduction


Azure OpenAI models provide a secure and robust solution for tasks like creating content, summarizing information, and other applications that involve working with human language. Now you can use these models in the context of your own data. Try Azure OpenAI Studio today to interact naturally with your data and publish it as an app from within the studio.


 


Getting Started


Follow this quickstart tutorial for prerequisites and for setting up your Azure OpenAI environment.


 


To try the capabilities of the Azure OpenAI model on private data, I am uploading an e-book to the Azure OpenAI chat model: “Serverless Apps: Architecture, patterns and Azure Implementation” by Jeremy Likness and Cecil Phillip. You can download the e-book here.


 


Before uploading your own data


Before the e-book is uploaded, the model responds to a question on serverless design patterns as shown below. While this response is relevant, let’s examine whether the model picks up content from the e-book in the next iteration.


 


pre-training.png


 


After uploading your own data


This e-book has a dedicated section that discusses different design patterns in detail, such as scheduling, CQRS, and event-based processing.


 


ebook.png


After adding this PDF as a data source for the model, I asked a few questions, and the following responses were nearly accurate. I also restricted the model to answer only from the uploaded content. Here’s what I found.


 


post-training.png


 


Now when I asked about the contributors to this e-book, it listed everyone correctly.


 


post-training-1.png


 


Read more


Because enterprise data can run to very large volumes, it is not practical to supply it all in the context of a prompt to these models. Instead, the setup leverages Azure services to create a repository for your knowledge base and uses Azure OpenAI models to interact with it naturally.


 


Azure OpenAI Service on your data uses the Azure Cognitive Search service in the background to index and rank your custom data, and it uses a storage account to host your content (.txt, .md, .html, .pdf, .docx, .pptx). Your data source is used to ground the model with specific data. You can select an existing Azure Cognitive Search index or Azure Storage container, or upload local files, as the source from which the grounding data is built. Your data is stored securely in your Azure subscription.
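
As a rough sketch of what this grounding flow looks like programmatically, the Python snippet below posts a question to a chat deployment through the on-your-data extensions endpoint and points it at an Azure Cognitive Search index. The api-version, payload fields (dataSources, inScope), and all resource names are illustrative assumptions based on the preview API at the time of writing, so verify them against the current documentation.

# Minimal sketch: ask a grounded question through the "on your data" extensions endpoint.
# The api-version, payload shape, and resource names below are illustrative placeholders.
import os
import requests

endpoint = os.environ["AOAI_ENDPOINT"]        # e.g. https://my-resource.openai.azure.com
deployment = "gpt-35-turbo"                   # your chat model deployment name

url = (f"{endpoint}/openai/deployments/{deployment}"
       "/extensions/chat/completions?api-version=2023-06-01-preview")

payload = {
    "messages": [
        {"role": "user",
         "content": "What serverless design patterns does the e-book describe?"}
    ],
    # dataSources tells the service which Cognitive Search index to ground answers on.
    "dataSources": [{
        "type": "AzureCognitiveSearch",
        "parameters": {
            "endpoint": os.environ["SEARCH_ENDPOINT"],
            "key": os.environ["SEARCH_KEY"],
            "indexName": "serverless-ebook-index",
            "inScope": True,   # restrict answers to the uploaded content only
        },
    }],
}

resp = requests.post(url, json=payload, headers={"api-key": os.environ["AOAI_KEY"]})
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])

Azure OpenAI Studio makes the equivalent call for you when you chat in the playground or publish the experience as an app; a snippet like this is only needed if you want to integrate the grounded model into your own application.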


 


We also have another Enterprise GPT demo that lets you piece together all the Azure building blocks yourself. An in-depth blog post written by Pablo Castro details the steps here.


 


Getting started directly from Azure OpenAI Studio allows you to iterate on your ideas quickly. At the time of writing, the completions playground offers 23 sample use cases that take advantage of the different models available under Azure OpenAI; a minimal example of calling a deployment for one of these use cases follows the list below.


 



  1. Summarize issue resolution from conversation

  2. Summarize key points from a financial report (extractive)

  3. Summarize an article (abstractive)

  4. Generate product name ideas

  5. Generate an email

  6. Generate a product description (bullet points)

  7. Generate a listicle-style blog

  8. Generate a job description

  9. Generate a quiz

  10. Classify Text

  11. Classify and detect intent

  12. Cluster into undefined categories

  13. Analyze sentiment with aspects

  14. Extract entities from text

  15. Parse unstructured data

  16. Translate text

  17. Natural Language to SQL

  18. Natural language to Python

  19. Explain a SQL query

  20. Question answering

  21. Generate insights

  22. Chain of thought reasoning

  23. Chatbot
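
To make one of these use cases concrete, here is a minimal Python sketch that calls a chat model deployment to summarize an article. The deployment name and api-version are placeholders to adjust for your own Azure OpenAI resource.

# Minimal sketch: calling a chat model deployment for the "summarize an article" use case.
# The deployment name and api-version are placeholders for your own Azure OpenAI resource.
import os
import requests

endpoint = os.environ["AOAI_ENDPOINT"]        # e.g. https://my-resource.openai.azure.com
deployment = "gpt-35-turbo"

url = (f"{endpoint}/openai/deployments/{deployment}"
       "/chat/completions?api-version=2023-05-15")

article_text = "..."  # paste the article you want summarized

payload = {
    "messages": [
        {"role": "system", "content": "Summarize the user's article in three bullet points."},
        {"role": "user", "content": article_text},
    ],
    "temperature": 0.2,
}

resp = requests.post(url, json=payload, headers={"api-key": os.environ["AOAI_KEY"]})
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])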


Resources


There are different resources to get you started on Azure OpenAI. Here are a few:



 

Copy data to Azure Data Services at scale with Microsoft Fabric



Introduction


Did you know that you can use Microsoft Fabric to copy data at scale from on-premises SQL Server to Azure SQL Database or Azure SQL Managed Instance within minutes?


 


You often need to copy data from on-premises sources to Azure SQL Database, Azure SQL Managed Instance, or another data store for analytics, or you may simply want to migrate data from on-premises data sources to Azure database services. In most cases you want to perform this data movement at scale, with minimal coding and complexity, using a simple, automated approach.


 


In the following example, I copy two tables from an on-premises SQL Server 2019 database to Azure SQL Database using Microsoft Fabric. The entire migration is driven by a metadata table that holds information about the tables to copy from the source, which keeps the copy pipeline simple and easy to deploy. We have used this approach to copy hundreds of tables from one database to another efficiently. The monitoring UI makes it easy to track progress and to rerun the data migration if anything fails.


 


Architecture diagram


This architecture diagram shows the components of the solution, from on-premises SQL Server to Microsoft Fabric.


addy_0-1687194528376.png


 


Steps


Install data gateway:


To connect to an on-premises data source from Microsoft Fabric, a data gateway must be installed. Use this link: Install an on-premises data gateway | Microsoft Learn.


 


Create a table to hold metadata information:


First, let us create this table in the target Azure SQL Database.

-- Metadata table that drives the copy pipeline: one row per table to copy.
-- IsEnabled lets you include or exclude individual tables from a run.
CREATE TABLE [dbo].[Metadata](
      [Id] [int] IDENTITY(1,1) NOT NULL,
      [DataSet] [nvarchar](255) NULL,
      [SourceSchemaName] [nvarchar](255) NULL,
      [SourceTableName] [nvarchar](255) NULL,
      [TargetSchemaName] [nvarchar](255) NULL,
      [TargetTableName] [nvarchar](255) NULL,
      [IsEnabled] [bit] NULL 
)

I intend to copy two tables – Customer and Sales – from the source to the target. Let us insert these entries into the metadata table. Insert one row per table.

INSERT [dbo].[Metadata] ([DataSet], [SourceSchemaName], [SourceTableName], [TargetSchemaName], [TargetTableName], [IsEnabled]) VALUES (N'Customer', N'dbo', N'Customer', N'dbo', N'Customer', 1);

INSERT [dbo].[Metadata] ([DataSet], [SourceSchemaName], [SourceTableName], [TargetSchemaName], [TargetTableName], [IsEnabled]) VALUES (N'Sales', N'dbo', N'Sales', N'dbo', N'Sales', 1);

Ensure that the table is populated. The data pipelines will use this table to drive the migration.


addy_1-1687194528380.png


 


Create Data Pipelines:


Open Microsoft Fabric and click the Create button to see the items you can create with Microsoft Fabric.


addy_0-1687203365811.png


 


Click on “Data pipeline” to start creating a new data pipeline.


addy_2-1687194559507.png


 


Let us name the pipeline “Copy_Multiple_Tables”.


addy_3-1687194559511.png


 


Click on “Add pipeline activity” to add a new activity.


addy_4-1687194559514.png


 


Choose Azure SQL Database from the list. This is the target database that holds the metadata table we created earlier.


addy_5-1687194559516.png


 


Ensure that the settings are as shown in the screenshot.


addy_6-1687194559518.png


 


Click the preview data button and check if you can view the data from the table.


addy_7-1687194559518.png


 


Let us now create a new connection to the source. From the list of available connections, choose SQL Server, as we intend to copy data from SQL Server 2019 on-premises. Ensure that the gateway cluster and connection are already configured and available. 


addy_8-1687194559524.png


 


Add a ForEach activity and set the batch count to copy tables in parallel.


addy_9-1687194559528.png


 


We now need to set the ForEach activity's Items property, which is populated dynamically at runtime. To set it, click the button shown in the screenshot and set the value to:

@activity('Get_Table_List').output.value

addy_10-1687194559530.png


 


addy_11-1687194559534.png


 


Add a copy activity to the activity container.


addy_12-1687194559536.png


 


Set the source table attributes in the Copy activity as shown in the screenshot. Click the edit button and then click the “Add dynamic content” button. Make sure you paste the text only after clicking “Add dynamic content”; otherwise, the expression will not be evaluated dynamically at runtime.


 


Set the Table schema name to:

@item().SourceSchemaName

Set the Table name to:

@item().SourceTableName

addy_13-1687194559541.png


 


Click on the Destination tab and set the destination attributes as shown in the screenshot.


Set the Table schema name to:

@item().TargetSchemaName

Set the Table name to:

@item().TargetTableName

addy_14-1687194559547.png


 


The pipeline is now configured. Click Save to publish the pipeline.


addy_15-1687194559551.png


 


Run pipeline:


Click the Run button from the top menu to execute the pipeline. Ensure the pipeline runs successfully. This will copy both tables from source to target.


addy_16-1687194559556.png
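
If you want an extra check outside of Fabric after the run completes, a small script can compare row counts between source and target for every table listed in the metadata table. The following Python sketch assumes the pyodbc package and placeholder connection strings; it is an optional add-on and not part of the pipeline itself.

# Hypothetical validation sketch: compare row counts for every enabled table in
# dbo.Metadata between the on-premises source and the Azure SQL Database target.
# Connection strings are placeholders; adjust them for your own servers.
import pyodbc

source_cs = ("Driver={ODBC Driver 18 for SQL Server};Server=onprem-sql;"
             "Database=SalesDB;Trusted_Connection=yes;")
target_cs = ("Driver={ODBC Driver 18 for SQL Server};Server=myserver.database.windows.net;"
             "Database=SalesDB;Uid=sqladmin;Pwd=...;Encrypt=yes;")

with pyodbc.connect(source_cs) as src, pyodbc.connect(target_cs) as tgt:
    # Read the same metadata table that drives the pipeline.
    meta = tgt.cursor().execute(
        "SELECT SourceSchemaName, SourceTableName, TargetSchemaName, TargetTableName "
        "FROM dbo.Metadata WHERE IsEnabled = 1").fetchall()

    for s_schema, s_table, t_schema, t_table in meta:
        s_count = src.cursor().execute(
            f"SELECT COUNT(*) FROM [{s_schema}].[{s_table}]").fetchone()[0]
        t_count = tgt.cursor().execute(
            f"SELECT COUNT(*) FROM [{t_schema}].[{t_table}]").fetchone()[0]
        status = "OK" if s_count == t_count else "MISMATCH"
        print(f"{s_schema}.{s_table}: source={s_count}, target={t_count} -> {status}")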


 


Summary:


In the above example, we used Microsoft Fabric pipelines to copy data from an on-premises SQL Server 2019 database to Azure SQL Database. You can modify the sink/destination in this pipeline to copy to other destinations such as Azure SQL Managed Instance or Azure Database for PostgreSQL. If you are interested in copying data from a mainframe z/OS database, you will also find this blog post from our team helpful.


 


Feedback and suggestions 


If you have feedback or suggestions for improving this data migration asset, please contact the Azure Databases SQL Customer Success Engineering Team. Thanks for your support!


 


Note: For additional information about migrating various source databases to Azure, see the Azure Database Migration Guide.


 

Terraform on Azure June Update



stevenjma_0-1687281126864.png


 


Welcome to our June Terraform on Azure bimonthly update! We hope the first update was helpful in giving you insight into what the product team has been working on. This is our first bimonthly update produced in collaboration between Microsoft and HashiCorp. We are aiming for the next update in August!


 


AzureRM provider


 


The resources exposed by the AzureRM provider are what most customers think of and include in their configurations when managing Azure infrastructure with Terraform. Azure is always adding new features and services, so we work hard to ensure that you can manage them as soon as they are generally available (GA).


 


Latest Updates


 


A new version of the provider is released weekly, including bug fixes, enhancements, and net-new resources and data sources. Here are some notable updates since our previous blog post:



  • Auth v2 support for web apps (#20449)

  • Key Vault keys support auto rotation (#19113)

  • AKS cluster default node pools can now be resized (#20628)


For a full list of updates to the AzureRM provider, check out terraform-provider-azurerm/CHANGELOG.md at main · hashicorp/terraform-provider-azurerm (github.com).


 


Export Tool


 


Azure Export for Terraform is a tool that eases the translation between Terraform and Azure concepts. Whether it’s exporting your code into a new environment or creating repeatable code from an existing environment, we believe the tool simplifies otherwise tough processes.


 


Latest Updates


 


The team has published comprehensive documentation for a variety of Azure Export for Terraform scenarios. We’re excited for you to test the tool and provide feedback, both on the product and on our documentation for it. Read the overview of the tool here: https://aka.ms/tf/exportdocs


 


We’ve also recently merged a PR that supports import blocks for Terraform 1.5 onward: https://github.com/Azure/aztfexport/pull/398. To read up on import blocks, check out the HashiCorp documentation here, and if you’re curious about the difference between Azure Export for Terraform and import blocks, we also have a pinned issue detailing this: https://github.com/Azure/aztfexport/issues/406


 


Last, but certainly not least, we’ve released a video for Azure Export for Terraform! Make sure to give it a watch, as it covers benefits, scenarios, and demos.


 


Verified Modules


 


Have you ever encountered any of the following problems with modules?



  • Modules are out of date, not actively supported, and no longer functional

  • Cannot override some module logic without modifying the source code

  • Get confused when you see multiple modules with similar functions

  • When calling various modules, inconsistencies exist that cause instability to existing infrastructure

  • …and so on


To help tackle these problems and more, the Azure Terraform team has established a verified module testing pipeline; only modules that have passed this pipeline are marked as “verified”. This pipeline ensures consistency and best practices across multiple verified modules, reduces breaking changes, and avoids duplication in keeping with the “DRY” principle.


 


Latest Updates


 


We have now released nine Azure verified modules, prioritized based on customer research and telemetry analysis. Meanwhile, we have continuously updated our verified modules with bug fixes and feature enhancements. For instance, in the AKS verified module we have added support for the linux_os_config block in default_node_pool and for node taints on the default node pool. For a full list of updates to each module, please refer to its changelog: Azure/terraform-azurerm-aks: Terraform Module for deploying an AKS cluster (github.com).


 


Next, we are planning to release modules for hub networking, firewalls, and key vaults, in close collaboration with the broader developer community. We hope you become one of the proactive contributors to the Azure Terraform verified modules community as well!


 


Community


 


The Terraform on Azure community is a key investment for our team: it brings you the latest product updates, connects you with other Terraform on Azure users, and lets you give ongoing feedback as we work to improve your Terraform experience on Azure. This section will regularly cover community-related feedback and engagements. As always, register to join the community at https://aka.ms/AzureTerraform!


 


Community Calls


 


Our latest community call was on April 6th! The recording of the event is at https://youtu.be/Zrr-GXN6snQ and we hope you give it a watch. Ned Bellavance talks in depth about Azure Active Directory and OIDC authentication, and we spend some time exploring GitHub Copilot with Terraform.


 


We also announced our new Slack channel, which you can join at https://aka.ms/joinaztfslack. Not only will you get access to fellow Azure Terraform community members, but also to the product team.


 


Our next community call is June 22nd at 9 am PT. Make sure to register here. It’ll be a time of open discussion with the team on Terraform, Azure, and the future of AI. Come with your thoughts and opinions!


 


We are also taking applications to co-present with us at our community calls! Our only prerequisite is that you are a member of the community. If you are interested, fill out our form at https://aka.ms/aztfccspeakers and we will reach out if we like your topic! Don’t worry if you don’t get picked for the next one; we will keep your talk on file and may reach out later.


 


Docs


 


It’s been a busy couple of months in Azure Terraform documentation!


 


A key goal we’re making progress on is bringing the Terraform Azure-service documentation into parity with ARM templates and Bicep. The objective is to make it easier to find and compare Azure infrastructure-provisioning solutions across the various IaC options.


 


To that end, we’ve published 15 new Terraform articles covering many different Azure-service topics.



 


Terraform at Scale


 


This ongoing section, previously called Solution Accelerators, covers announcements that help you use Terraform at enterprise scale.


 


First, an article was published on deploying securely into Azure architecture with Terraform Cloud and HCP Vault. Read this article to learn how to use Microsoft Defender and incorporate an HCP Vault cluster!


 


Second, Terraform Cloud has announced dynamic provider credentials, which enables OIDC with Azure in TFC. If you want a video explaining the benefits of dynamic credentials, check out a great presentation here.


 


Upcoming Events


 


Make sure to sign up for the Terraform on Azure community call on June 22nd at 9 am PT here! We’ll hold an open panel discussion with the team about the future of Terraform on Azure, especially regarding the latest developments in AI.


We’ll aim for our next blog post in August. See you then!

REGISTER for Microsoft Operations: CSP Partner Community Q&A calls happening next week!


Microsoft invites you to our Microsoft Operations: Community Q&A calls for CSP partners. These sessions are dedicated to assisting CSP Direct Bill partners and Indirect Providers with questions related to CSP launches and upcoming changes. Our goal is to help drive a smoother business relationship with Microsoft. We offer sessions in English, Chinese, Japanese, and Korean.


 


Register today to join a live webinar with subject matter experts, or listen back to past sessions.