How to query Azure SQL Metrics using PowerShell

This article is contributed. See the original author and article here.

One way you can quickly search and query metrics data is the Azure portal, where you have chart data. But maybe you want to get the raw data and query it yourself. Below is a PowerShell sample to get this data.


 


Below is a sample I built, based on one from https://docs.microsoft.com/en-us/azure/azure-sql/database/scripts/monitor-and-scale-database-powershell


 


You can find the other possible metric names, to pass in as a parameter, in this document: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/metrics-supported#microsoftsqlserversdatabases


 


Find the full sample at https://github.com/FonsecaSergio/ScriptCollection/blob/master/Powershell/AzureSQL%20-%20Read%20Azure%20SQL%20Metrics.ps1


 


The main idea is to use the Get-AzMetric PowerShell cmdlet. You will get the results as a table, which you can save in whatever format you want, or save to a database.


 


 

# Splat the parameters for Get-AzMetric. Note: $DaysToLook should be a
# negative number (for example -1) so that StartTime falls in the past.
$MonitorParameters = @{
    ResourceId = "/subscriptions/$($SubscriptionID)/resourceGroups/$($ResourceGroup)/providers/Microsoft.Sql/servers/$($ServerName)/databases/$($DBName)"
    TimeGrain  = $TimeGrain
    MetricName = $MetricName
    StartTime  = (Get-Date).AddDays($DaysToLook)
}
$Metrics = Get-AzMetric @MonitorParameters -DetailedOutput
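
The result is a nested object: one entry per metric, each with a Data collection of time-series points. To flatten it into rows and save them, something like the sketch below works (this is not part of the original article’s script; the full sample at the GitHub link above does the equivalent), and it yields rows like the sample output that follows.

# Flatten each metric's time series into one row per data point.
$Rows = foreach ($Metric in $Metrics) {
    foreach ($Point in $Metric.Data) {
        [pscustomobject]@{
            TimeStamp = $Point.TimeStamp
            Average   = $Point.Average
            Metric    = $Metric.Name.Value
        }
    }
}

# Save in whatever format you want, for example CSV.
$Rows | Export-Csv -Path .\AzureSqlMetrics.csv -NoTypeInformation
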
TimeStamp           Average Metric                 
---------           ------- ------                 
07/10/2020 11:07:00       0 dtu_consumption_percent
07/10/2020 11:07:00      10 dtu_limit              
07/10/2020 11:07:00       0 dtu_used               
07/10/2020 11:12:00       0 dtu_consumption_percent
07/10/2020 11:12:00      10 dtu_limit              
07/10/2020 11:12:00       0 dtu_used               
07/10/2020 11:17:00    19,6 dtu_consumption_percent
07/10/2020 11:17:00      10 dtu_limit              
07/10/2020 11:17:00    1,96 dtu_used               
07/10/2020 11:22:00   34,85 dtu_consumption_percent
07/10/2020 11:22:00      10 dtu_limit              
07/10/2020 11:22:00   3,485 dtu_used               
07/10/2020 11:27:00    30,1 dtu_consumption_percent
07/10/2020 11:27:00      10 dtu_limit              
07/10/2020 11:27:00    3,01 dtu_used               
07/10/2020 11:32:00    27,7 dtu_consumption_percent
07/10/2020 11:32:00      10 dtu_limit              
07/10/2020 11:32:00    2,77 dtu_used               
07/10/2020 11:37:00       0 dtu_consumption_percent
07/10/2020 11:37:00      10 dtu_limit              
07/10/2020 11:37:00       0 dtu_used               
07/10/2020 11:42:00       0 dtu_consumption_percent
07/10/2020 11:42:00      10 dtu_limit              
07/10/2020 11:42:00       0 dtu_used               

 

Azure SQL Capacity Planning: Scenarios | Data Exposed

This article is contributed. See the original author and article here.

Capacity planning plays a critical role in migrating an existing application or designing a new one. In the final part of this three-part series with Silvano Coriani, we’ll review various Azure SQL capacity planning scenarios, along with best practices and recommendations on what to use when.


 


For an overview of Azure SQL Capacity planning, watch part one.


To learn about the differences between DTU and vCore, watch part two.


 


Watch on Data Exposed


 


Resources:
Choose between the vCore and DTU purchasing models
vCore model overview
Service tiers in the DTU-based purchase model
Migrate Azure SQL Database from the DTU-based model to the vCore-based model
Query Performance Insight for Azure SQL Database
Troubleshoot with Intelligent Insights


 


View/share our latest episodes on Channel 9 and YouTube!

Microsoft Teams chat history and controlling presenters!

This article is contributed. See the original author and article here.

Over the past few months, having meetings with our colleagues and even friends and families via Microsoft Teams is something we’ve all had to adjust to. I’ve been a home-based worker for over 2 years now and have become adept at using Teams for meetings and figuring out the best way to do things, which has meant that I’ve become the source of knowledge for a lot of friends and family who are new to using Teams. In this article, I want to share some of the questions I’ve been asked by them and, of course, the answers.


 


Looking back at meeting Chat


 


One question I was asked: can I go back and check the chat from a meeting? Yes, you can; it’s totally possible. There are a couple of ways to do this.


 


If you find the meeting within your calendar in Teams, you can right-click on it and select “Chat with participants”. This will allow you to see the meeting chat, and hopefully you’ll find the reference you are looking for.


 


Chat with participants


 


 


The other option is to open up the meeting event within Microsoft Teams and then click on Chat. This will take you back to that chat and also allow you to see details such as the participants invited to the meeting.


 


Configure who can present


 


Microsoft Teams has an option that allows attendees to present, which is a great facility if you want to present a PowerPoint presentation or even just your screen. However, in some meeting scenarios you might want to restrict who has access to do that during the meeting. I know a lot of school teachers have this very scenario and are encountering some issues.


 


You can, however, configure your Teams meeting to restrict who has presentation rights during the meeting. There are three points at which you can configure this.


 


Setting up the meeting


 


When you set up a Teams meeting from within Outlook you can configure the meeting settings.


Within Outlook, click on the New Teams Meeting button. A new window will open allowing you to set the meeting time, invite the people, etc. Along the top banner you will see a button called “Meeting Options”; this is the button you need. (If you don’t see this button, there will be a URL link to the meeting options within the body of the email instead.)


 


Microsoft Teams Meeting Options in Outlook


 


 


After the meeting has been set up


 


If you’ve set the meeting up and have to change who is presenting, you can still do that: just tweak the settings to suit what is going to happen during your meeting.


 


Just find the meeting either in your Outlook calendar or within your Teams calendar and open it up. Within there you will see the Meeting Options URL link and you can tweak the settings there.


 


Microsoft Teams meeting options


 


 


During the meeting


 


Now, if during the meeting you decide that others need to be able to present, or you’ve set up the meeting with everyone able to present and you don’t want that, all is not lost: you can change this during the meeting.


 


Click on the participants list and when it comes up you will see a list of everyone in the meeting. Depending on how you’ve set your meeting up, and who has accepted or not turned up, you may see several sections displaying presenters, attendees, and invited attendees.


Find the person that you want to give presenter rights to and click on the ellipsis (three dots) beside their name, then click “Make Presenter”. A dialog box will pop up confirming this is the action you want to take, and once you confirm, the attendee will have the rights to present to the meeting.


 


Call to Action


What questions have you been fielding or what questions do you still need an answer to? Leave a comment below! 


 

Experiencing Data Latency issue in Azure Portal for Many Data Types – 10/08 – Investigating

This article is contributed. See the original author and article here.

Initial Update: Thursday, 08 October 2020 02:05 UTC

We are aware of issues within Log Analytics based Application Insights and are actively investigating. Some customers may experience data latency, gaps, and inaccurate alerting in North Europe.
  • Work Around: NA
  • Next Update: Before 10/08 06:30 UTC
We are working hard to resolve this issue and apologize for any inconvenience.
-Subhash

How to Automate a PKI Configuration for an Existing Azure VM in Microsoft Azure

This article is contributed. See the original author and article here.

 


Hi Cloud automation friends. This is Preston K. Parsard again, and in this post we’ll cover configuring PKI on an existing Active Directory domain-joined Windows virtual machine in Microsoft Azure.


 


Now before being scared by the term PKI, like zombies shuffling towards you at twilight, it’s really not that bad this time. In fact, I’ve provided a link in the references section of this post to a short video demonstration as well. Feel free to skip to it now if you prefer, but the rest of this post does provide some background and sets the context for this type of configuration. 


 


DISCLAIMER: These instructions are primarily meant for an informal dev/test or training lab environment intended to experiment or learn about technologies which may rely on a basic certificate authority service. For a more formal and extensive reference for Windows PKI, please see the Windows PKI Documentation Reference and Library link in the references section at the end of this article. Also my esteemed colleague Daniel Metzger wrote Building the Totally Network Isolated Root Certificate Authority, which is a great article for production environments and is also referenced at the end. 


 


By configuring an enterprise certificate authority server in your test and development or lab environment, you can reproduce scenarios that require certificates for web servers, code signing, or document encryption. One example may include building a simulated on-premises Desired State Configuration (DSC) pull server and auto-enrolling the virtual machines in the domain for document encryption certificates.


 


Now I realize that it’s more likely that if you have workloads in Azure and you need to leverage desired state configuration, you already have the option to use the native Azure Automation state configuration feature for this purpose. So you wouldn’t really need to build a traditional on-premises simulated DSC pull server after all, and consequently would not require a certificate server either.


 


The idea behind our scenario however is simply to provide the added flexibility to experiment with setting up a small PKI infrastructure to evaluate, train or prepare for any certificate services related technologies. Most medium and large enterprise customers I’ve worked with have separate IT operations and PKI teams, and IT ops folks are not usually exposed to the mechanics and experience of the PKI integration, relying fully on the PKI teams instead for these services. This solution aims to empower the IT ops staff to learn and gain better insights and appreciation for any PKI related technologies their projects may rely on. Who knows? It may even encourage greater dialogue and collaboration between both teams: ops can more explicitly elaborate their requirements to the PKI team, and the PKI folks can see exactly how and why ops will be using these services if they need such information for approving certificate requests.


 


Adatum Consulting Overview 


 


First, let’s visit our fictitious company, Adatum Consulting, which is a global provider of cloud architecture and automation consulting services for its manufacturing customers. Adatum has recently created a consolidated development environment where they can develop, test, reproduce and prototype solutions for their clients. 


 

 

Opportunity 


 


To accommodate this requirement, Jason, the infrastructure team lead, has asked Jessica to deploy certificate services on an existing Azure VM to act as the enterprise certificate authority for this environment. By implementing a simple single-tier certificate services PKI infrastructure, the team can quickly issue certificates for these secure web applications or reproduce configuration management solutions. In fact, to configure the certificate services server, Jessica will actually use the Azure Automation state configuration capability; the really cool part about this option is that Azure-based DSC does not require first configuring a certificate authority itself, because certificate services for this feature are already built into Azure!


 


Jessica


 


Target State Diagram 


 


Here is the target state diagram, which is based on the existing Project 0026 solution in GitHub. If you want to use this solution to automatically provision the dev.adatum.com environment so you can follow along, please use these links below and return to this article when you’re ready to continue. 


 



  1. Blog 

  2. Code 


 tsd06.png


 


Requirements 


 


So let’s review the sequence of steps that Jessica will take to configure certificate services on a VM in their existing dev.adatum.com domain, right after we outline the requirements and assumptions below.


 



  1. An Azure subscription is needed, which will include at least a domain controller and a new installation of a domain-joined Windows Server 2019 Virtual Machine in the dev.adatum.com domain. If you want to follow along, you will just need to use your own domain with an available Windows Server 2019 image. This subscription must have an automation account since the state configuration feature will be used later to apply the configuration on the target virtual machine. 

  2. We’ll assume that an internet connection from Adatum’s enterprise network or Jessica’s home office to her company’s Azure subscription is available.

  3. Windows PowerShell version 5.1 or greater. Note that Jessica may be able to use PowerShell (version 7+), but some refactoring of the code may be required. This solution has only been tested on Windows PowerShell 5.1 specifically. 

  4. The account that Jessica will use to run the PowerShell script must be a member of the local administrators group on the machine from which the script is executed, because the Az module installations, if required, will need this permission. (A quick check for this and for requirement 3 is sketched just after this list.)

  5. During the script execution, Jessica will be prompted to upgrade from the AzureRM modules, in case she still uses these legacy cmdlets, to the newer Az modules from the PowerShell gallery repository. 

  6. A final prompt will appear when the script runs, asking Jessica to supply credentials for an account that will be used to configure the Active Directory Certificate Services server. This account must be a member of both the Enterprise Admins and the root domain’s Domain Admins groups, for dev.adatum.com in our scenario.
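
A quick way to check requirements 3 and 4 before running the script (a sketch, not part of the solution):

# Show the Windows PowerShell version; expect 5.1 or later.
$PSVersionTable.PSVersion

# Returns True if the current session runs as a local administrator.
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
$principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)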


 


Workflow 


 


1. Download Script 


 


In a PowerShell console opened in the context of administrator, Jessica first creates a target folder named C:\Project0067 on her machine to download the script from a public GitHub repository.


 


New-Item -Path C:\Project0067 -ItemType Directory -Verbose


Start-Process -FilePath https://github.com/autocloudarc/0067-ConfigurePKI 


 

Next, she downloads the zip file, extracts only the Configure-PKIonVMinAzure.ps1 script, and copies it to the new directory named C:\Project0067. She then unblocks it, assuming that her PowerShell execution policy is set to RemoteSigned, so it can be executed locally.


 


Set-Location -Path C:\Project0067 -Verbose


Unblock-File -Path .\Configure-PKIonVMinAzure.ps1 -Verbose


 


2. Execute Script 


 


Next, Jessica executes the C:\Project0067\Configure-PKIonVMinAzure.ps1 script from her existing Windows PowerShell 5.1 session. She uses the command below to specify the automation account name and the resource group that contains the dev.adatum.com environment and the target VM to be configured.


 


.\Configure-PKIonVMinAzure.ps1 -aaaName <AutomationAccountName> -rgpName <ResourceGroupName> -Verbose


 


 ss02.png


 

A prompt will appear to upgrade from the legacy AzureRM modules to the newer Az PowerShell modules so that the most up-to-date Azure PowerShell cmdlets can be used for this configuration. Jessica will enter y or yes (not case sensitive) to proceed with the upgrade if it is required.


 


ss03.png 


 

3. Authenticate to Subscription 


 


 The script then presents a prompt for Jessica to enter her Azure subscription credentials. Since Jessica has multiple subscriptions, a secondary prompt also asks her to specify which subscription associated with her credentials she wants to use. She chooses the subscription in which the dev.adatum.com target PKI virtual machine resides. 


 


ss04.png 


 

4. Select Virtual Machine 


 


ss05.png


 


Based on the resource group used in the -rgpName parameter for this script shown in step 2 above, which was rg10, the virtual machines in that resource group are then listed so that Jessica can select the appropriate target VM that she wants to configure. She chooses AZRPKI1001. The naming convention used here is: AZR = three-letter cloud service provider code ([A][Z]u[R]e), PKI = virtual machine function code ([P]ublic [K]ey [I]nfrastructure), and the remaining characters are 1001, where 10 represents the resource group identifier in rg10 and 01 is the series number.


 


5. Provide Credentials 


 


 ss06.png


 


Jessica enters the username for this domain as adm.infra.user@dev.adatum.com, which is a member of the Enterprise Admins and Domain Admins in the root (and only) domain for this forest. This is a requirement to install and configure Active Directory Certificate Services. 


 


6. Download Configuration 


 


The script will now automatically download both the pkiConfig.ps1 configuration script and the configuration data file pkiConfigData.psd1 as artifacts from the public GitHub project.


 


ss07.png


 


The pkiConfigData.psd1 file contains the set of DSC-related parameter values associated with the pkiConfig.ps1 configuration script. When the configuration is imported and compiled, these configuration data parameters are used to specify properties of the PKI server; examples include the CACommonName and the cryptographic provider, hash algorithm, and key length for the CA root certificate.
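
For readers unfamiliar with DSC configuration data, a .psd1 file of this kind is simply a PowerShell data hashtable. The sketch below only illustrates the general shape; the property names and values are assumptions for illustration, not the actual contents of pkiConfigData.psd1 (which you can inspect in the GitHub project).

@{
    AllNodes = @(
        @{
            # Hypothetical values for illustration only.
            NodeName       = 'localhost'
            CACommonName   = 'Adatum Dev Root CA'
            CryptoProvider = 'RSA#Microsoft Software Key Storage Provider'
            HashAlgorithm  = 'SHA256'
            KeyLength      = 4096
        }
    )
}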


 


7. Import Modules 


 


Before the configuration script can be imported and compiled, the DSC resource modules it requires must first be imported from the PowerShell Gallery into the Azure Automation account. The script also does this automatically, as shown in the image below.


 


ss08.png
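
For illustration, importing a DSC resource module from the PowerShell Gallery into an Automation account maps to a cmdlet call like the one below; this is a sketch, and the module name and URI are assumptions rather than values taken from the script.

# Import a module from the PowerShell Gallery into the Automation account.
New-AzAutomationModule -ResourceGroupName $rgpName `
    -AutomationAccountName $aaaName `
    -Name 'ActiveDirectoryCSDsc' `
    -ContentLinkUri 'https://www.powershellgallery.com/api/v2/package/ActiveDirectoryCSDsc'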


 


8. Import and Compile Configuration 


 


The script will also import and compile the configuration together with its associated configuration data. Both the import and compilation steps are initiated from a single command in PowerShell, which is shown below in line 404.


 


ss09.png
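
Under the hood, the import and compilation map to Az.Automation cmdlets like these; the parameter values are illustrative, and the script’s actual call may differ.

# Import and publish the configuration in the Automation account.
Import-AzAutomationDscConfiguration -ResourceGroupName $rgpName `
    -AutomationAccountName $aaaName `
    -SourcePath .\pkiConfig.ps1 -Published -Force

# Compile it together with the configuration data file.
$configData = Import-PowerShellDataFile -Path .\pkiConfigData.psd1
Start-AzAutomationDscCompilationJob -ResourceGroupName $rgpName `
    -AutomationAccountName $aaaName `
    -ConfigurationName 'PkiConfig' -ConfigurationData $configData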


 


The output in the PowerShell console shows that the PkiConfig compilation has started. 


 


ss10.png


 


The automation account will also confirm that the configuration was compiled. 


 


ss11.png


 


9. Register Node 


 


Since the script has now completed the compilation of the configuration, it proceeds to register, or onboard, the target virtual machine that will be configured. In this example the process took less than 5 minutes, but other factors may make it take longer. As with most of the previous steps, no manual intervention is required here either. Recall from step 4 that this is the AZRPKI1001.dev.adatum.com virtual machine. This is the last step in the configuration process, where the configuration is actually applied to the PKI server, which is then restarted to finalize that configuration.


 


ss12.png
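
Registering an Azure VM as a DSC node can be expressed with a single cmdlet; in this sketch the VM name comes from step 4, and the other values are illustrative.

# Onboard the target VM and assign the compiled node configuration.
Register-AzAutomationDscNode -ResourceGroupName $rgpName `
    -AutomationAccountName $aaaName `
    -AzureVMName 'AZRPKI1001' `
    -NodeConfigurationName 'PkiConfig.localhost' `
    -RebootNodeIfNeeded $true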


 


10. Get Results 


 


The script then provides a set of instructions to perform the final verification and prompts Jessica to open the transcript. This transcript provides all the details of the script output that was shown in the console, and Jessica decides to open it to check for any errors she may have missed. The transcript opens in Notepad in a new window and the script completes.


 


ss13.png



From the Azure portal, Jessica confirms that the target node applied the PkiConfig.localhost node configuration and that the node itself is compliant.  


 


ss14.png
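
The same compliance check can be made from PowerShell (a sketch):

# List the DSC nodes in the Automation account with their status.
Get-AzAutomationDscNode -ResourceGroupName $rgpName -AutomationAccountName $aaaName |
    Select-Object Name, Status, NodeConfigurationName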


 


11. Verify Configuration 


 


For the final verification that PKI was properly set up on this target VM, Jessica will log into the VM with RDP over HTTPS using the Azure Bastion service.


 


She then starts a PowerShell session and types certsrv.msc to open the Certification Authority MMC console, and validates that the certificate services feature was installed and configured correctly.


 


ss15.png
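
If you prefer to verify from PowerShell instead of the MMC console, a quick check on the server could look like this (a sketch):

# Confirm the certificate authority role is installed.
Get-WindowsFeature -Name ADCS-Cert-Authority

# Confirm the CA service exists, is running, and responds.
Get-Service -Name CertSvc
certutil -ping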



Summary 


 


So, to quickly set up a simple PKI server, we can just download and execute a script from a public GitHub repository. Once executed, this script will automatically retrieve and import artifacts and modules, compile a configuration, and register a target domain-joined virtual machine in Azure to configure it as a certificate authority server. I hope this information was useful; feel free to leave a comment. Thanks for reading and happy automating!


 


References 


 



  1. Video for this article. 

  2. Code for this article. 

  3. How to build an Azure Automation Lab: Article 

  4. How to build an Azure Automation Lab: Code 

  5. Check out my other blog articles at https://aka.ms/AutoCloudArc 


 


Other Links 


 



  1. Windows PKI Documentation Reference and Library 

  2. Building the Totally Network Isolated Root Certificate Authority 

  3. Azure Automation:  https://docs.microsoft.com/en-us/azure/ 

  4. Desired State Configuration:  https://docs.microsoft.com/en-us/powershell/dsc/overview 

Important changes for Azure Sphere 20.10 Retail Evaluation

This article is contributed. See the original author and article here.

The Azure Sphere 20.10 OS is scheduled for release to Retail Evaluation in mid-October. For this release, we are making important changes to the Retail Evaluation period:



  • The Retail Evaluation period will last three weeks instead of the usual 14 days.

  • We will provide a special evaluation version of the SDK for use during this period. This SDK will be replaced by the final version of the 20.10 SDK when the OS is released to the retail feed.

  • We ask you not only to validate your current application binaries (built with the 20.07 or earlier SDK) but also to rebuild your applications with the evaluation SDK and validate those binaries as well.


Why is 20.10 Retail Evaluation different?


The Azure Sphere 20.10 OS and SDK incorporate some fundamental changes that may affect your applications. For 20.10, we upgraded to the Yocto Project 3.1 LTS (Dunfell) release, which includes updated tools and libraries that may result in a change to the amount of memory used by applications.


 


To help you verify that your applications continue to work as intended, we will provide an evaluation version of the 20.10 SDK along with the Azure Sphere OS, so that you can rebuild your applications. You might see new or different GCC warnings when you compile with the new Azure Sphere SDK.


We are providing early notification of the evaluation release and extending the overall evaluation period by a week so that you have additional time to validate your existing applications and ensure that you can rebuild and run them without issues.


 


What should you do?


First, set up a device group for Retail Evaluation if you haven’t already done so. Devices in this group will receive the 20.10 Retail Evaluation OS when we release it.


When we release the 20.10 OS to Retail Evaluation:



  1. Test your existing application binaries with the new OS to make sure that they continue to work as you expect.

  2. Recompile your applications using the evaluation SDK and test them with the new OS. We recommend that you test high-memory use scenarios.

  3. If you encounter problems or discover errors, notify Microsoft immediately.


When the Retail Evaluation OS is released, we will provide details about installation and use of the evaluation SDK, recommended testing scenarios, and how to notify us about problems. See 20.10 Retail Evaluation for more information.

Experiencing Data Latency issue in Azure Portal for Many Data Types – 10/07 – Resolved

This article is contributed. See the original author and article here.

Final Update: Wednesday, 07 October 2020 20:09 UTC

We’ve confirmed that all systems are back to normal with no customer impact as of 10/7, 19:00 UTC. Our logs show the incident started on 10/7, at approximately 18:30 UTC and that during the 30 minutes that it took to resolve the issue most Application Insights and Log Analytics customers experienced outages with various services.
Root Cause: The failure was due to a back-end networking issue that caused problems with a large number of Azure services.
Incident Timeline: 0 Hours & 30 minutes – 10/7, 18:30 UTC through 10/7, 19:00 UTC
We understand that customers rely on Application Insights and Log Analytics as critical services and apologize for any impact this incident caused.

-Jack Cantwell

Physicians Predictive Model Development for Individuals and Patient Specific Interventions – Webcast

This article is contributed. See the original author and article here.

Thanks to the COVID-19 pandemic, healthcare is at a crossroads where there is no going back to using age-old technologies. We’ve digitized data and changed the incentives for physicians and hospitals; this change comes at a time when consumers are expecting more from their healthcare experience. Healthcare organizations are struggling to follow their patients and their needs across various care settings, in part because they lack the unified patient records they could leverage to better understand their patients’ medical course and view their network in the most comprehensible manner possible. The combination of increasing patient expectations, the move to value-based care, and evolving regulatory changes means that healthcare is changing faster than ever. Yet providers and payers are still struggling to leverage data with the right technology.


In this webcast (10/28 1:30pm to 2:30pm EST) we’ll explore how cloud-based technology powered by Artificial Intelligence (AI) and Machine Learning assists physicians to develop predictive models for individual patients and then implement patient-specific interventions with experts from Microsoft Partner Innovaccer.


Questions that will be addressed during this webcast include:



  • What is the current state of healthcare in terms of managing the widely distributed data across multiple facilities? Why do we still have distributed information and a complex system of disconnected silos of information? Why is a unified patient record accessible across the continuum of care so important?

  • What is the concept of “unified patient records?” How can a connected care framework help to manage the complexity within healthcare and help us to create the culture of “healthcare with no address?” What predictions do you have for where value-based care will be in five years? What should organizations be doing now to get ready for these changes?

  • With the new CMS interoperability regulations in place, what will be the future of data connectivity and sharing in the post-COVID world?

  • How can advanced technologies such as Artificial Intelligence (AI) and Machine Learning (ML) assist organizations in leveraging the data and efficiency they’ll need to embrace the rapidly evolving healthcare ecosystem and achieve better patient outcomes?

  •  How are health systems changing the way they conduct their administrative, clinical, and financial operations?


You can add the webcast to your calendar by grabbing the .ics file below or access the event directly by clicking the direct link at event time.



About this session’s guest partner, Innovaccer:


Innovaccer is a leading healthcare technology company pioneering the Data Activation Platform that’s helping the healthcare industry realize the promise of value-based care. – About Innovaccer


Thanks for visiting – Michael Gannotti   LinkedIn | Twitter


Michael Gannotti

The Azure Data Team Gears Up for PASS Virtual Summit 2020

This article is contributed. See the original author and article here.

This year’s PASS Summit has gone virtual and is taking place November 10-13. PASS Summit is the largest gathering of data professionals focused on the Microsoft platform, and this year, it will be more accessible than ever, giving attendees the opportunity to learn from data professionals around the world right from their very own home.


Not only will the Azure SQL and SQL Server teams be delivering 25+ sessions, but there will also be opportunities to connect directly with the product groups through:



  • Microsoft Azure Data Clinic: Attendees will be able to access the virtual clinic booth to schedule 1:1 time with Microsoft engineers to get their questions answered.


  • Focus Groups: The week following PASS Summit (November 16 -20), attendees will be able to meet with the product teams to provide feedback on topics related to the technical roadmap for Microsoft data platforms, trends/drivers facing your organizations or industry, and priorities to move your business forward. To sign up for one of our focus groups, click here.



As a bonus, the first 1,500 registrants for PASS Summit will receive the new eBook from Bob Ward, Principal Architect on the Azure Data team: Azure SQL Revealed: A Guide to the Cloud for SQL Server Professionals.


More details about the events are being updated directly on the PASS Summit website, but in the meantime you can start planning which Azure Data sessions you would like to attend with my quick reference list below:

Speakers | Title

PRE-CONFERENCE SESSION

Bob Ward, Anna Hoffman | The Azure SQL Workshop

CONFERENCE SESSIONS

Ajay Jagannathan, Anna Hoffman | Azure SQL: What to use when and what’s new
Anna Hoffman, Alain Dormehl, Emily Lisa | Azure SQL Built-in High Availability
Bob Ward | Inside Waits, Latches, and Spinlocks Returns
Bob Ward, Denzil Ribeiro | Inside Azure SQL Database Hyperscale
Borko Novakovic, Srdan Bozovic | Modernize your SQL applications with the recently enhanced version of Azure SQL Managed Instance
Buck Woody | Machine Learning with SQL Server – From the Edge to the Cloud
Buck Woody | SQL Server Big Data Clusters Architecture
Davide Mauri, Silvano Coriani | Azure SQL is the best choice for building back-end REST API: discuss!
Kevin Farlee | HA/DR in SQL Server, Putting the puzzle pieces together and planning for the future
Michelle Wallig | The future of ML services on hybrid
Morgan Oslake | Maximize performance for less cost using Azure SQL Database serverless
Mukesh Kumar, Raj Pochiraju | Database modernization best practices and lessons learned through customer engagements
Pam Lahoud, David Pless | Best Practices for deployment of SQL Server on Azure Virtual Machines
Pedro Lopes, Joe Sack | Azure SQL: Path to an Intelligent Database
Pedro Lopes, Pam Lahoud | Practical Guidance to Make Your Tier-1 SQL Server Roar
Raj Pochiraju, Amit Banerjee | Lift and shift your SQL Servers to Azure SQL Virtual Machines
Raj Pochiraju, Mukesh Kumar | App Modernization and Migration from End to end, using data migration tools and Azure SQL
Ron Matchoro, Rohit Nayak, Mirek Sztajno, Jakub Szymaszek | What’s New in Security & Networking for Azure SQL
Sanjay Mishra, Abdul Sathar Sait | Elevation to the cloud – How our customers are using Azure SQL
Tejas Shah, Amit Khandelwal | Deployment, High Availability and Performance guidance for SQL Server on Linux in Azure IaaS ecosystem
Umachandar Jayachandran | Taking your data hybrid with Azure Arc enabled data services
Vasiya Krishnan | Extend SQL applications to the edge with Azure SQL Edge
Vicky Harp, Ken Van Hyning | What’s New in the SQL Server Tools?
Vin Yu, Amit Khandelwal | SQL Server Containers in Action

 


We’re looking forward to seeing you online!  Tweet us at @AzureSQL for sessions you are most excited about.