The Total Economic Impact™ of Microsoft cloud solutions for CMMC Compliance

This article is contributed. See the original author and article here.

 


Complying with the new Cybersecurity Maturity Model Certification (CMMC) from the Department of Defense (DoD) can be a challenge for customers and partners in the defense ecosystem.


 


The broad range of suppliers in the Defense Industrial Base (DIB) providing goods and services to the DoD need to navigate evolving compliance requirements. Microsoft cloud solutions are here to support DIB actors during this transition. Join Forrester and Microsoft on July 25 at 11 AM PT to learn how to navigate the DoD’s CMMC compliance requirements, discover the benefits of Microsoft cloud services such as reduced audit-related effort, and get a look into Forrester’s Total Economic Impact™ of Microsoft cloud solutions. Register today.

2022 release wave 2 plans for Dynamics 365 and Power Platform now available 

This article is contributed. See the original author and article here.

On July 12, 2022, we published the 2022 release wave 2 plans for Microsoft Dynamics 365 and Microsoft Power Platform, a compilation of new capabilities that are planned to be released between October 2022 and March 2023. This second release wave of the year offers hundreds of new features and enhancements, demonstrating our continued investment to power digital transformation for our customers and partners. 

Highlights from Dynamics 365 

  • Dynamics 365 Marketing brings real-time customer journey orchestration to enable business-to-business (B2B) brands to hyper-personalize experiences across the entire buying journey and confidently grow their marketing and customer experience programs to target up to 100 million customers with up to 300 million messages and interactions per month. Intuitive lead-capture forms, hands-off lead-nurturing automation, and a new analytics dashboard enable alignment between sales and marketing teams like never before. Organizations can reach new levels of marketing maturity with AI-powered next-best-content selection and increased support for business units.  
  • Dynamics 365 Sales continues to optimize the seller experience using data and AI to help sellers prioritize their work, blending business and productivity tools to meet sellers where they are and driving in-the-moment collaboration experiences so that every seller can engage with their colleagues and customers efficiently, reclaiming their time and being more productive.  
  • Dynamics 365 Customer Service is focused on delivering the capabilities that help run contact centers optimally by providing enhancements in unified routing with features such as percentage-based routing, preferred agent routing, and longest idle routing. Customer support swarming in Microsoft Teams will help agents resolve complex cases through collaboration. Organizations can empower their customers with options to leave voicemail, callback, and dial agents directly in the voice channel. The agent experience is modernized with an enhanced conversation timeline, horizontal multisession navigation, and AI-powered conversation summary. Supervisors can view Microsoft Power Virtual Agents analytics within their omnichannel analytics dashboards.
  • Dynamics 365 Field Service brings new capabilities that enable organizations to better orchestrate service operations for workers. Organizations can now build and maintain locations and assets for large facilities, keep costs at bay by configuring “not to exceed” limits, and group similar incident types under “trade” for ease of management. We are also bringing optimization improvements to booking lock constraints and introducing a myriad of user experience improvements to the mobile app to continue empowering frontline workers.  
  • Dynamics 365 Finance is launching the general availability of vendor invoice optical character recognition (OCR), which automates the reading and recognition of vendor invoices, and continues to add capabilities for subscription billing use cases. We will integrate the tax calculation service with Dynamics 365 Project Operations (preview) and extend the electronic invoicing service to support upcoming e-invoice legislation in France, Poland, and Saudi Arabia. 
  • Dynamics 365 Supply Chain Management continues to invest in capabilities that drive agility and resilience across the supply chain. New analytics and support for multiple vendors in planning optimization help organizations optimize their sourcing strategies. Inventory visibility lets organizations track real-time consumption within allocated quantities in support of promotions, special events, and new product introductions. Guided warehouse implementation and configuration experiences enable rapid reconfiguration of supply chains, and manufacturers in the process industry can use Planning Optimization to shorten their planning cycles. 
  • Dynamics 365 Intelligent Order Management has continued to expand its ecosystem of providers and built on the continued success of FedEx. We now have 14 providers that span the supply chain lifecycle from order ingestion to last-mile delivery. In the upcoming release, we are adding support for various order types: back orders, subscription orders, manual orders, and purchase orders. We will also provide the ability to simulate fulfillment so that our customers can model and understand the impact of choosing various fulfillment strategies. Finally, we also have contextual collaboration features where an order can be shared with multiple stakeholders using embedded Microsoft Teams. 
  • Dynamics 365 Project Operations is continuing to invest in capabilities to empower project managers and project teams in this release wave with project budgeting and time-phased forecasting, baselines and snapshots, and in modernizing application experiences on the web and mobile form factors. For the project accountants and back-office personas, we are lighting up advanced subcontracting and subscription billing capabilities. In addition, across-the-board investments to ease the complexity of interaction patterns and uptake of modern and fluent controls are also targeted for this release wave.
  • Dynamics 365 Guides will continue investing in capabilities to improve collaboration experiences for authors and operators on Microsoft HoloLens 2. The application will also be updated to provide more advanced content authoring workflows, versioning, and publishing of guides in the coming wave. 
  • Dynamics 365 Human Resources will bring improved efficiency by enabling human resource business partners to tailor experiences and automatically complete processes where manual decisions and tasks are needed today. Improved efficiency will also be available to managers and employees by providing notifications outside of the application for benefits processes and tasks. We’ll also be providing better experiences across Dynamics 365 applications by integrating employee skill, compensation, and leave information for resource managers in Dynamics 365 Project Operations.  
  • Dynamics 365 Commerce enables new and updated B2B experiences, including sales agreements across channels and customer-specific catalogs. Omnichannel media management features streamline workflows. Key point of sale investments include Store Commerce app availability for iOS and Android devices. Apple Pay and Google Pay digital wallet integration, as well as new customer support options through virtual agent and live agent integration, will be available for e-commerce.
  • Dynamics 365 Fraud Protection will be offering a new transaction acceptance booster (TAB) that allows merchants to increase their bank approval rates without having to rip and replace their incumbent fraud provider solution. Ripping and replacing a merchant’s incumbent fraud solution is costly and time-consuming; TAB enables the merchant to benefit from Dynamics 365 Fraud Protection capabilities with minimal disruption to their business.
  • Dynamics 365 Business Central continues to improve the reporting capabilities for customers, including new report datasets for Excel and improvements to the Microsoft Power BI reports, which will now support dimensions. The Microsoft Power Apps and Microsoft Power Automate integrations also continue to offer new capabilities for low-code development. The application will get several improvements, such as helping users reverse entries in the payment reconciliation journal, along with several improvements to the supply chain functionality. We are taking steps forward in scaling the productivity of our partners via more efficient and performant tooling for development and administration.
  • Dynamics 365 Customer Insights continues to invest in accelerating customer understanding by enhancing time to value with quicker out-of-the-box insights, predictions, segments, and measures with limitless extensibility across technology ecosystems. New features will allow you to power personalized experiences with real-time insights, analytics, and activations to deliver industry-leading personalization and moments-based marketing. New features also enable ubiquitous insights that allow an integrated data flow across Microsoft Dataverse, Dynamics 365, and Microsoft Power Platform for seamless workflows.
  • Dynamics 365 Connected Spaces now supports alerts and notifications via Teams or Outlook when business AI skills detect actionable patterns within a physical space. Customers can now use Dynamics 365 Connected Spaces in Germany (in addition to the US and UK) and connect up to 10 cameras to each Azure Stack Edge device, maximizing their existing investments as they expand Dynamics 365 Connected Spaces across their physical footprint. Customers can also leverage the Azure Stack Edge Pro 2 device for configuring Dynamics 365 Connected Spaces at the edge, in addition to the existing Pro 1 devices.

Highlights from Microsoft Power Platform

  • Power BI continues to invest in empowering every individual, team, and organization to drive a data culture. The creation experience is improved by aligning our experiences with Office and enabling dataset authoring on the web. By bringing the Power Query diagram view into Power BI Desktop, creators can use a no-code experience to perform extract, transform, load (ETL) on their data. For teams, we are bringing enhancements to metrics focused on enterprise needs and integration with Microsoft Viva Goals. In addition, the big data experience is enhanced through automatic aggregations, query scale-out, data protection capabilities via data loss prevention (DLP), and improved visibility into activity for admins. 
  • Power Apps will expand governance capabilities to allow organizations to enable, manage, and support citizen development across the entire organization. Makers and developers of all skill levels will be more productive over Dataverse in a unified studio, with modern experiences to build and manage data and logic, as well as infused intelligence to support development, enrich data, and optimize end-user experiences. In addition to ensuring trust and the ability to leverage rich data experiences, both makers and end-users will benefit from out-of-the-box collaboration capabilities to enable users to be more productive when working together.
  • Power Pages continues to invest in bringing more out-of-the-box capabilities to support both low-code/no-code development as well as professional developers. Some of the salient capabilities in this release give makers additional ways to work with forms and lists using the design studio and get started quickly using additional solution templates. There are enhancements for professional developers to do more with their sites using the Microsoft Power Platform command line interface (CLI) and Visual Studio Code (VS Code), as well as for administrators to better administer and govern their Power Pages sites. 
  • Power Automate is more accessible than ever before with new experiences to help users of every skill level build out their cloud and desktop flows. Organizations need to automate their deployments of Power Automate, so there are additional enhancements for application lifecycle management (ALM). And, with increased usage of robotic process automation (RPA), we are adding features to make it easier to manage machines in Azure and the credentials of your users and accounts.
  • Power Virtual Agents brings improvements in the authoring experience with commenting, Power Pages integration, data loss prevention options, proactive bot update messaging in Teams, and more. Creating a bot is typically a complex and time-intensive process, requiring long content update cycles and a team of experts. Power Virtual Agents gives anyone in your organization the ability to create powerful custom bots using an easy, code-free graphical interface, without the need for AI experts, data scientists, or teams of developers. A bot can interact with users, ask for clarifying information, and ultimately answer a user’s questions. 
  • AI Builder continues to enable citizen developers to use and customize AI capabilities to build more intelligent apps and workflows. Lifecycle and governance of AI Builder models will be improved with enhanced versioning, deployment, and monitoring capabilities. Makers will also benefit from new features for intelligent document and text processing, like the ability to manage human-in-the-loop validation, easier integration of large data sets for automated e-mail processing, and the ability to process contracts and multi-page tables in documents.

For a complete list of new capabilities, please check out the Dynamics 365 and Microsoft Power Platform 2022 release wave 2 plans. 

Early access period 

Starting August 1, 2022, customers and partners will be able to validate the latest features in a non-production environment. These features include user experience enhancements that will be automatically enabled for users in production environments during October 2022. Take advantage of the early access period, try out the latest updates in a non-production environment, and get ready to roll out updates to your users with confidence. To see the early access features, check out the Dynamics 365 and Microsoft Power Platform pages. For questions, please visit the Early Access FAQ page.

We’ve done this work to help you, our partners, customers, and users, drive the digital transformation of your business on your terms. Get ready and learn more about the latest Dynamics 365 and Microsoft Power Platform product updates and product roadmaps, and share your feedback in the community forum for Dynamics 365 or Microsoft Power Platform.

The post 2022 release wave 2 plans for Dynamics 365 and Power Platform now available appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Enable File Sharing with the Azure Communication Services UI Library and Azure Blob Storage

This article is contributed. See the original author and article here.

Azure Communication Services allows you to add communications to your applications to help you connect with your customers and across your teams. Available capabilities include voice, video, chat, SMS, and more. Frequently, you need to share media, such as a Word document, an image, or a video, as part of your communication experience. During a meeting, users want to share, open, or download the media directly. This content can be referenced throughout the conversation for visibility and feedback – whether it is a doctor sending a patient a note in a PDF, a retailer sending detailed images of their product, or a customer sharing a scanned financial document with their advisor.


 


[Image: chat-file-sharing.png]



 


As part of the Azure Family, Azure Communication Services works together with Azure Blob Storage to share media between communication participants. Azure Blob Storage provides you with globally redundant, scalable, encrypted storage for your content, and Azure Communication Services allows you to deliver that content.


 


Using the Azure Communication Services chat SDK and the UI Library, developers can easily enable experiences that incorporate chat communications and media sharing into their existing applications. Check out the recently published tutorial and reference implementation. You can find the completed sample on GitHub.


 


This tutorial covers how to upload media to Azure Blob Storage and link it to your Azure Communication Services chat messages. Going one step further, the guide shows you how to use the Azure Communication Services UI Library to create a beautiful chat user experience that includes these file-sharing capabilities. You can even stylize the UI components using the UI Library’s simple interfaces to match your existing app.  
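
To give a flavor of that flow, here is a minimal TypeScript sketch of the two core steps: uploading a file to Blob Storage, then attaching the file reference to a chat message. This is not the tutorial’s exact code; the container SAS URL, endpoint, token, thread ID, and the “fileinfo” metadata key are illustrative placeholders (the tutorial defines its own metadata shape), and message metadata requires a recent @azure/communication-chat package.

import { ContainerClient } from "@azure/storage-blob";
import { ChatClient } from "@azure/communication-chat";
import { AzureCommunicationTokenCredential } from "@azure/communication-common";

// Illustrative placeholders; supply your own values.
const containerSasUrl = "https://<account>.blob.core.windows.net/<container>?<sas>";
const acsEndpoint = "https://<acs-resource>.communication.azure.com";
const userToken = "<acs-user-access-token>";
const threadId = "<chat-thread-id>";

async function shareFile(file: File): Promise<void> {
  // 1. Upload the file to Azure Blob Storage using a container SAS URL.
  const container = new ContainerClient(containerSasUrl);
  const blob = container.getBlockBlobClient(file.name);
  await blob.uploadData(file);

  // 2. Send a chat message whose metadata carries the file reference.
  //    "fileinfo" is an illustrative key, not a fixed contract.
  const chatClient = new ChatClient(acsEndpoint, new AzureCommunicationTokenCredential(userToken));
  const thread = chatClient.getChatThreadClient(threadId);
  await thread.sendMessage(
    { content: `Shared a file: ${file.name}` },
    { metadata: { fileinfo: JSON.stringify({ name: file.name, url: blob.url }) } }
  );
}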


 


[Image: filesharing-typical-flow (1).png]


 


The tutorial yields a sample of how file-sharing capability can be enabled. You should ensure that the file system used and the process of uploading and downloading files are compliant with your requirements related to privacy and security. 


 


We hope you check out the tutorial to learn how you can bring interactive communication and media sharing experiences to your application using Azure Communication Services.

Bicep for Terraform Engineers

This article is contributed. See the original author and article here.

Introduction


Hi folks! My name is Felipe Binotto, Cloud Solution Architect, based in Australia.


 


The purpose of this article is to compare how you can do something with Bicep versus how you can do the same thing with Terraform. My intention here is not to provide an extensive comparison or to dive deep into what each language can do, but to compare the basics.


 


I worked with Terraform for a long time before I started working with Bicep, and I can tell Terraform engineers that the learning curve for Bicep should be an easy one if you already have good experience with Terraform.


Before we get to the differences when writing the code, let me provide you with a quick overview of why someone would choose one over the other.


 


Terraform’s main differentiators are being multi-cloud and the nice UI it provides if you leverage Terraform Cloud to store the state. I like the way you can visualize plans and deployments.


 


Bicep, on the other hand, is for Azure only, but it provides deep integration that unlocks what some call “day zero support” for all resource types and API versions. This means that as soon as a new feature or resource is released, even in preview, it is immediately available to be used with Bicep. If you have been using Terraform for a while, you know that it can take a long time until a new Azure release is also available in Terraform.


 


Terraform stores a state of your deployment, which is a map between your deployed resources and the configuration in your code. Based on my field experience, this Terraform state causes more problems than it provides benefits. Bicep doesn’t rely on state but on incremental deployments.


 


Code


Both Terraform and Bicep are declarative languages. Terraform files have the .tf extension and Bicep files have the .bicep extension.


 


The main difference is that with Terraform you can have as many .tf files as you want in the same folder, and they will be interpreted as a single file, which is not true for Bicep.
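
As a trivial illustration (the file names are arbitrary), these two files in the same folder behave exactly as if their contents lived in a single .tf file:

# variables.tf
variable "env" {}

# main.tf - can freely reference anything declared in variables.tf
locals {
  name_suffix = var.env
}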


 


Throughout this article, you will also notice that Bicep uses single quotes while Terraform uses double quotes.


 


Variables, Parameters & Outputs


 


Variables


In Bicep, variables can be used to simplify complex expressions; they are the equivalent of Terraform “local values”.


 


The example below depicts how you can concatenate parameter values in a variable to make up a resource name.


 


 

param env string
param location string
param name string

var resourceName = '${location}-${env}-${name}'

 


 


The same can be achieved in Terraform as follows.


 


 

variable "env" {}
variable "name" {}
variable "location" {}

locals {
  resourceName = "${var.location}-${var.env}-${var.name}"
}

 


 


Parameters


In Bicep, parameters can be used to pass inputs to your code and make it reusable which is the equivalent to “input variables” in Terraform.


 


Parameters in Bicep are made of the keyword “param”, followed by the parameter name and the parameter type (in the example below, a string).


 


 

param env string

 


 


A default value can also be provided.


 


 

param env string = 'prd'

 


 


Parameters in Bicep can also use decorators, which are a way to provide constraints or metadata. For example, we can constrain the “env” parameter to be exactly three characters.


 


 

@minLength(3)
@maxLength(3)
param env string = 'prd'

 


 


Parameter values can be provided from the command line or passed in a JSON file.
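
For illustration, a parameters file for the “env” parameter above could look like the following (the file name main.parameters.json is only a convention) and be passed with something like “az deployment group create --resource-group myRG --template-file main.bicep --parameters @main.parameters.json”:

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "env": { "value": "prd" }
  }
}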


 


In Terraform, input variables can be declared as simple as the following.


 


 

variable "env" {}

 


 


A default value can also be provided.


 


 

variable "env" {
  default = "prd"
}

 


 


In Terraform, a validation block is the equivalent of the Bicep parameter decorators.


 


 

variable "env" {
  default = "prd"
  validation {
  	condition     = length(var.env) == 3
  	error_message = "The length must be 3."
  }
}

 


 


Parameter values can be provided from the command line or passed in a TFVARS file.
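
For illustration, a prd.tfvars file for the variable above could contain a single assignment and be passed with “terraform plan -var-file prd.tfvars”:

env = "prd"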


 


Outputs


Outputs are used when a value needs to be returned from the deployed resources.


 


In Bicep, an output is represented by the keyword “output”, followed by the output name, its type, and the value to be returned.


 


In the example below, the hostname is returned which is the FQDN property of a public IP address object.


 


 

output hostname string = publicIP.properties.dnsSettings.fqdn

 


 


In Terraform, the same can be done as follows.


 


 

output "hostname" {
   value = azurerm_public_ip.vm.fqdn
}

 


 


Resources


Resources are the most important element in both Bicep and Terraform. They represent the resources which will be deployed to the target infrastructure.


 


In Bicep, resources are represented by the keyword “resource” followed by a symbolic name, followed by the resource type and API version.


 


The following represents a condensed version of an Azure VM definition.


 


 

resource vm 'Microsoft.Compute/virtualMachines@2020-06-01' = {
  name: vmName
  location: location
  …
}

 


 


The following is how you can reference an existing Azure VM.


 


 

resource vm 'Microsoft.Compute/virtualMachines@2020-06-01' existing = {
  name: vmName
}

 


 


The same resource can be represented in Terraform as follows.


 


 

resource "azurerm_windows_virtual_machine" "vm" {
  name                  = var.vmName
  location              = azurerm_resource_group.resourceGroup.location
  …
}

 


 


However, to reference an existing resource in Terraform, you must use a data block.


 


 

data "azurerm_virtual_machine" "vm" {
  name                = vmName
  resource_group_name = rgName
}

 


 


The main differences in the examples above are the following:


 



  • Resource Type

    • For Bicep, the resource type version is provided in the resource definition.

    • For Terraform, the version will depend on the plugin versions downloaded during “terraform init”, which depends on what has been defined in the “required_providers” block. We will talk about providers in a later section.





  • Scope

    • For Bicep, the default scope is the resource group unless another scope is specified, so resources don’t have a resource group property that needs to be set.

    • For Terraform, the resource group has to be specified as part of the resource definition.





  • Referencing existing resources

    • For Bicep, you can use the same construct using the “existing” keyword.

    • For Terraform, you must use a data block.




Modules


Modules have the same purpose for both Bicep and Terraform. Modules can be packaged and reused on other deployments. It also improves the readability of your files.


 


Modules in Bicep are made of the keyword “module”, followed by a symbolic name and the module path, which can be a local file path or a remote registry.


The code below provides a real-world example of a very simple Bicep module reference.


 


 

module vmModule '../virtualMachine.bicep' = {
  name: 'vmDeploy'
  params: {
    name: 'myVM'
  }
}

 


 


One important distinction of Bicep modules is the ability to provide a scope. As an example, you could have your main deployment file using subscription as the default scope and a resource group as the module scope, as depicted below.


 


 

module vmModule '../virtualMachine.bicep' = {
  name: 'vmDeploy'
  scope: resourceGroup(otherRG)
  params: {
    name: 'myVM'
  }
}

 


 


The same can be achieved with Terraform as follows.


 


 

module "vmModule" {
  source   = "../virtualMachine"
  name     = "myVM"
}

 


 


Providers & Scopes


Terraform uses providers to interact with cloud providers. You must declare at least one azurerm provider block in your Terraform configuration to be able to interact with Azure, as displayed below.


 


 

provider "azurerm" {
  features {}
}
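
The provider version itself is pinned in the “required_providers” block mentioned earlier. A minimal sketch (the version constraint is only an example):

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}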

 


 


To reference multiple subscriptions, you can use an alias for the providers. In the example below we reference two distinct subscriptions.


 


 

provider "azurerm" {
  alias             = "dev"
  subscription_id   = "DEV_SUB_ID"
  tenant_id         = "TENANTD_ID"
  client_id         = "CLIENT_ID"
  client_secret     = "CLIENT_SECRET"
  features {}
}
 
provider "azurerm" {
  alias             = "prd"
  subscription_id   = "PRD_SUB_ID"
  tenant_id         = "TENANTD_ID"
  client_id         = "CLIENT_ID"
  client_secret     = "CLIENT_SECRET"
  features {}
}

 


 


Bicep uses scopes to target different resource groups, subscriptions, management groups or tenants.


 


For example, to deploy a resource to a different resource group, you can add the scope property and use the “resourceGroup” function, as in the module example below.


 


 

module vmModule '../virtualMachine.bicep' = {
  name: 'vmDeploy'
  scope: resourceGroup(otherRG)
  params: {
    name: 'myVM'
  }
}

 


 


To deploy the resource to a resource group in a different subscription, you can also include the subscription id as per the example below.


 


 

module vmModule '../virtualMachine.bicep' = {
  name: 'vmDeploy'
  scope: resourceGroup(otherSubscriptionID, otherRG)
  params: {
    name: 'myVM'
  }
}

 


 


Deployment


There are many Bicep and Terraform commands and variations that can be used for deployment or to get to a point where a deployment can be performed, but in this section, I will just compare “terraform plan” and “terraform apply” with Bicep’s equivalent commands.


 


“terraform plan” is the command used to preview changes before they actually happen. Running it from the command line will output the resources that would be added, modified, or deleted in plain text. Running the plan from Terraform Cloud, you can see the same information in a nice visual way. Parameters can be passed as variables or variable files, as per below.


 


 

terraform plan -var 'vmName=myVM'

terraform plan -var-file prd.tfvars

 


 


“terraform apply” deploys the resources according to what was previewed in the plan.


 


In Bicep, the “terraform plan” command is equivalent to the “az deployment group what-if” Azure CLI command or the “New-AzResourceGroupDeployment -WhatIf” PowerShell command.


 


Running it from the command line will also output the resources which would be added, modified, or deleted in plain text. However, Bicep still doesn’t provide a user interface for the what-if visualization.


 


The “terraform apply” command is equivalent to the “az deployment group create” Azure CLI command or the “New-AzResourceGroupDeployment -Confirm” PowerShell command.
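
For example, a resource group scoped preview and deployment could look like this (the resource group, file, and parameter names are hypothetical):

az deployment group what-if --resource-group myRG --template-file main.bicep --parameters env=prd

az deployment group create --resource-group myRG --template-file main.bicep --parameters env=prd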


 


Note that these Bicep commands are for resource group deployments. There are similar commands for subscription, management group and tenant deployments.


 


Conclusion


Terraform still has its place in companies that are multi-cloud or that use it for on-premises deployments. I’m Terraform certified and have always loved Terraform. However, I must say that when considering Azure by itself, Bicep has the upper hand. Even for multi-cloud companies, if you wish to enjoy deep integration and be able to use all new features as soon as they are released, Bicep is the way to go.


 


I hope this was informative to you, and thanks for reading! Add your experiences or questions in the comments section.


 


 


Disclaimer


The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

Lesson Learned #220: Hands-On-Labs: Activity Monitor in my Elastic Database Pool

This article is contributed. See the original author and article here.

To be honest, this post is one of my favorites and one I had been looking forward to writing, due to the many questions we get from our customers about how to monitor an elastic database pool. Many customers have a dense elastic database pool and need a clear picture of what is happening in it. I hope you enjoy it as much as I enjoyed running these tests. 


 


In this article and video, we are going to monitor the elastic database pool, and we are going to share a query to obtain all the current processes that your elastic database pool is running. 


 


The first thing is to know the main characteristics of an elastic database pool. 


 



  • Databases run on a single SQL instance.

  • Configuration is per database.


 


The second is to know the options that we have to monitor an elastic database pool:


 



  • Azure Portal, Azure Monitor, Log Analytics, and SQL Auditing

  • Select * from sys.dm_db_resource_stats (see the example query after this list)

  • Select * from sys.dm_exec_requests in combination with other DMVs

  • Query Store

  • Use the queries provided in the demo
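
As an example of the sys.dm_db_resource_stats option, here is a minimal sketch (trim or extend the column list as needed) that shows the most recent resource snapshots for the database you are connected to:

SELECT TOP (20)
       end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       avg_memory_usage_percent
FROM sys.dm_db_resource_stats -- roughly one hour of history in 15-second intervals
ORDER BY end_time DESC;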


 


Finally, the best practices:


 



 


Demo


 


In this demo I have the following configuration:


 



  • Elastic Database Pool Name: Jmjuradotest

  • Elastic Database Pool Configuration:

    • General Purpose 2 vCores

    • Storage Size: 316 GB

    • Per Database Setting: Unlimited per Database.



  • Databases that are part of this Elastic Database Pool:

    • Jmjuradotestdb1

    • Jmjuradotestdb2

    • Jmjuradotestdb3




How to monitor queries that are running in my Elastic Database Pool.


 


This is the query that I used to monitor the activity:


 


 


 

-- Current requests across the pool, joined to session details and statement text.
-- [master_data] is an external table (defined later in this article) that resolves database names.
SELECT
 substring(REPLACE(REPLACE(SUBSTRING(ST.text, (req.statement_start_offset/2) + 1, (
(CASE req.statement_end_offset WHEN -1 THEN DATALENGTH(ST.text) ELSE req.statement_end_offset END
- req.statement_start_offset)/2) + 1) , CHAR(10), ' '), CHAR(13), ' '), 1, 512) AS statement_text
,dbs.name
,program_name
,req.session_id
, req.cpu_time 'cpu_time_ms'
, req.status
, wait_time
, wait_resource
, wait_type
, last_wait_type
, req.total_elapsed_time
, total_scheduled_time
, req.row_count as [Row Count]
, command
, scheduler_id
, memory_usage
, req.writes
, req.reads
, req.logical_reads
FROM sys.dm_exec_requests AS req
inner join sys.dm_exec_sessions as sess on sess.session_id = req.session_id
left join [dbo].[master_data] as dbs on dbs.database_id = sess.database_id
CROSS APPLY sys.dm_exec_sql_text(req.sql_handle) as ST
where req.session_id <> @@SPID -- exclude the monitoring session itself
order by dbs.name

 


 


 


If you run this query connected to any database that belongs to your elastic database pool, you will find some useful information:


 


[Image: Jose_Manuel_Jurado_0-1657360726230.png (sample query output)]


 


As you can see, this query uses a special table called master_data; basically, it is an external table that connects to the master database to obtain the name of each database. Unfortunately, in Azure SQL Database it is not possible to connect to other databases once you are connected to one. If you don’t want to create an external table, simply remove the reference, as I posted below.


 


 


 

-- Same query without the external table; the database_id is returned instead of the name.
SELECT
 substring(REPLACE(REPLACE(SUBSTRING(ST.text, (req.statement_start_offset/2) + 1, (
(CASE req.statement_end_offset WHEN -1 THEN DATALENGTH(ST.text) ELSE req.statement_end_offset END
- req.statement_start_offset)/2) + 1) , CHAR(10), ' '), CHAR(13), ' '), 1, 512) AS statement_text
--,dbs.name
,req.database_id
,program_name
,req.session_id
, req.cpu_time 'cpu_time_ms'
, req.status
, wait_time
, wait_resource
, wait_type
, last_wait_type
, req.total_elapsed_time
, total_scheduled_time
, req.row_count as [Row Count]
, command
, scheduler_id
, memory_usage
, req.writes
, req.reads
, req.logical_reads, blocking_session_id
FROM sys.dm_exec_requests AS req
inner join sys.dm_exec_sessions as sess on sess.session_id = req.session_id
--left join [dbo].[master_data] as dbs on dbs.database_id = sess.database_id
CROSS APPLY sys.dm_exec_sql_text(req.sql_handle) as ST
where req.session_id <> @@SPID
--order by dbs.name

 


 


 


Definition of the external table


 


 


 

-- Note: a database master key must exist in the database before a scoped credential can be created.
CREATE DATABASE SCOPED CREDENTIAL CredentialJM WITH IDENTITY = 'username', SECRET = 'Password'

CREATE EXTERNAL DATA SOURCE [RemoteDataJM] WITH (TYPE = RDBMS, LOCATION = N'servername.database.windows.net', CREDENTIAL = [CredentialJM], DATABASE_NAME = N'master')
GO

CREATE EXTERNAL TABLE [dbo].[master_data](
name varchar(120), database_id bigint
)
WITH
(
  DATA_SOURCE = [RemoteDataJM],
  SCHEMA_NAME = 'sys', --schema name of remote table
  OBJECT_NAME = 'databases' --table name of remote table
);

 


 


 


 


In the following video, you can see how, given a specific workload (queries consuming high CPU, bulk inserts, and tempdb operations), I monitor my elastic database pool, identify which queries are running, and determine which database is consuming the most resources.


 


 


Enjoy!