by Contributed | Feb 17, 2023 | Technology
This article is contributed. See the original author and article here.
Introduction
Hi folks! My name is Felipe Binotto, Cloud Solution Architect, based in Australia.
We all know how frustrating it can be to receive a call about a storage account not replicating or being unable to fail over. To help prevent this from happening, I am going to show you how to monitor the replication of your storage accounts. Keep in mind that replication logs are not available as part of the storage account’s diagnostic settings.
Prerequisites
Before we begin, please ensure that you have the following prerequisites in place:
- Azure Subscription
- Automation Account
- Log Analytics workspace
High-Level Steps
The process for monitoring storage account replication can be broken down into several high-level steps, which we will go through in the following order:
- Clone the repository that contains the runbook.
- Create a user-assigned managed identity.
- Provide the identity with the necessary access.
- Assign the identity to the Automation Account.
- Import the runbook to the Automation Account.
- Provide values for the Runbook variables.
- Create a new Automation Account variable.
- Run the runbook to retrieve the storage account replication information.
Getting Started
Clone the repo by running the following command:
git clone https://github.com/fbinotto/storagereplication.git
Create a new user-assigned managed identity.
$id = (New-AzUserAssignedIdentity -Name storagereplication `
-ResourceGroupName REPLACE_WITH_YOUR_RG -Location australiaeast)
Assign the identity Storage Account Contributor rights in your subscription(s) so it can retrieve the replication information from Storage Accounts, and Log Analytics Contributor rights so it can write the logs to your workspace.
New-AzRoleAssignment -ObjectId $id.PrincipalId `
-RoleDefinitionName 'Storage Account Contributor' `
-Scope "/subscriptions/REPLACE_WITH_YOUR_SUBSCRIPTION_ID"
New-AzRoleAssignment -ObjectId $id.PrincipalId `
-RoleDefinitionName 'Log Analytics Contributor' `
-Scope "/subscriptions/REPLACE_WITH_YOUR_SUBSCRIPTION_ID"
Now assign the identity to the Automation Account.
# Get the automation account
$automationAccount = Get-AzAutomationAccount -ResourceGroupName REPLACE_WITH_YOUR_RG -Name REPLACE_WITH_YOUR_AA
# Assign the user-assigned identity to the automation account
# (-AssignUserIdentity expects the identity's resource ID)
Set-AzAutomationAccount -ResourceGroupName $automationAccount.ResourceGroupName `
-Name $automationAccount.AutomationAccountName `
-AssignUserIdentity $id.Id
Import the runbook into your Automation Account. Make sure you run the next command from the folder into which the repository was cloned.
Import-AzAutomationRunbook -Path ".\storagereplication.ps1" `
-Name StorageReplication -Published:$true `
-ResourceGroupName REPLACE_WITH_YOUR_RG `
-AutomationAccountName REPLACE_WITH_YOUR_AA -Type PowerShell
Open the script in VS Code, or edit it directly in your Automation Account. Now I will highlight some of the important sections so you have a clear understanding of what is going on.
The script collects replication logs for your Azure Storage Accounts and sends them to a Log Analytics workspace. With that data in place, you can monitor the replication of your Storage Accounts, be alerted to any issues, and act before they become a problem.
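Under the hood, the collection step amounts to reading each account's geo-replication stats. Here is a minimal sketch of that idea, assuming the collection relies on Get-AzStorageAccount with its -IncludeGeoReplicationStats switch (the cloned script may structure this differently):

# Sketch: gather geo-replication status for each storage account in the subscription.
# Note: GeoReplicationStats is only populated for geo-redundant (GRS/RA-GRS) accounts.
$results = foreach ($account in Get-AzStorageAccount) {
    $stats = Get-AzStorageAccount -ResourceGroupName $account.ResourceGroupName `
        -Name $account.StorageAccountName -IncludeGeoReplicationStats
    [PSCustomObject]@{
        Name         = $account.StorageAccountName
        Status       = $stats.GeoReplicationStats.Status       # e.g. Live, Bootstrap, Unavailable
        LastSyncTime = $stats.GeoReplicationStats.LastSyncTime
    }
}
$body = $results | ConvertTo-Json    # becomes the body sent to Log Analytics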
The script starts by setting some variables, including the ID of the Log Analytics workspace, the primary key for authentication, and the name of the record type that will be created.
The primary key is retrieved from an Automation Account variable, so we don’t expose it in clear text. Run the following command to create the variable.
# Create the encrypted variable
New-AzAutomationVariable -AutomationAccountName REPLACE_WITH_YOUR_AA `
-ResourceGroupName REPLACE_WITH_YOUR_RG -Name SharedKey `
-Value "REPLACE_WITH_YOUR_LOG_ANALYTICS_PRIMARY_KEY" -Encrypted $true
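Inside the runbook itself, an encrypted Automation variable can only be decrypted at runtime with the internal Get-AutomationVariable cmdlet, along these lines:

# Runs inside the runbook: retrieves and decrypts the encrypted variable
$SharedKey = Get-AutomationVariable -Name 'SharedKey'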
The script then defines two functions: Build-Signature and Post-LogAnalyticsData.
The Build-Signature function creates an authorization signature that will be used to authenticate the request to the Log Analytics API. The function takes in several parameters, including the ID of the Log Analytics workspace, the primary key, the date, the content length, the method, the content type, and the resource.
The Post-LogAnalyticsData function creates and sends the request to the Log Analytics API. This function takes in the ID of the Log Analytics workspace, the primary key, the body of the request (which contains the replication logs), the log type, and the timestamp field.
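For reference, here is a minimal sketch of what these two functions typically look like, based on the documented Log Analytics HTTP Data Collector API pattern; the functions in the cloned runbook may differ in detail:

Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    # String-to-sign per the Data Collector API specification
    $xHeaders     = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource

    # HMAC-SHA256 over the string, keyed with the workspace primary key
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes    = [Convert]::FromBase64String($sharedKey)
    $sha256      = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key  = $keyBytes
    $encodedHash = [Convert]::ToBase64String($sha256.ComputeHash($bytesToHash))
    return 'SharedKey {0}:{1}' -f $customerId, $encodedHash
}

Function Post-LogAnalyticsData ($customerId, $sharedKey, $body, $logType, $timeStampField)
{
    $method        = "POST"
    $contentType   = "application/json"
    $resource      = "/api/logs"
    $rfc1123date   = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature     = Build-Signature $customerId $sharedKey $rfc1123date $contentLength $method $contentType $resource

    $uri     = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $headers = @{
        "Authorization"        = $signature
        "Log-Type"             = $logType          # becomes the _CL table name in Log Analytics
        "x-ms-date"            = $rfc1123date
        "time-generated-field" = $timeStampField
    }
    (Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType `
        -Headers $headers -Body $body -UseBasicParsing).StatusCode
}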
The script also includes a call to Disable-AzContextAutosave, which ensures that the runbook does not inherit a cached AzContext, something that can cause issues when the script runs.
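Put together, the opening of such a runbook typically looks like this sketch (assuming the script signs in with Connect-AzAccount and the identity's client ID; check the cloned script for its exact authentication logic):

# Don't inherit a cached Azure context from the Automation sandbox
Disable-AzContextAutosave -Scope Process | Out-Null

# Sign in as the user-assigned managed identity.
# REPLACE_WITH_IDENTITY_CLIENT_ID is a placeholder for the identity's client ID.
Connect-AzAccount -Identity -AccountId REPLACE_WITH_IDENTITY_CLIENT_ID | Out-Null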
Finally, the script calls the Post-LogAnalyticsData function, sending the replication logs to the Log Analytics workspace.
At this point you can run the Runbook. Once the logs have been sent, you can create Azure Alerts based on KQL queries to notify you of any issues with the replication.
For example, the following KQL query returns Storage Accounts that have not replicated in the last 8 hours.
StorageReplicationHealth_CL
| where todatetime(Storage_LastSyncTime_s) < ago(8h)
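To keep the data fresh without manual runs, you could link the runbook to a recurring schedule. A sketch that runs it every hour (the schedule name is illustrative):

# Create an hourly schedule and link the runbook to it
$schedule = New-AzAutomationSchedule -ResourceGroupName REPLACE_WITH_YOUR_RG `
    -AutomationAccountName REPLACE_WITH_YOUR_AA -Name "HourlyReplicationCheck" `
    -StartTime (Get-Date).AddMinutes(10) -HourInterval 1

Register-AzAutomationScheduledRunbook -ResourceGroupName REPLACE_WITH_YOUR_RG `
    -AutomationAccountName REPLACE_WITH_YOUR_AA -RunbookName StorageReplication `
    -ScheduleName $schedule.Name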
In Part 2 of this post, I will demonstrate how you can leverage Logic Apps to send out customized emails when your Storage Account is not replicating.
Conclusion
In conclusion, monitoring the replication of your Azure Storage Accounts is crucial to ensure the availability and reliability of your data. In this blog post, we have shown you how to set up monitoring for your Storage Accounts using Log Analytics and Azure Automation. By following the steps outlined in this post and using the provided script, you will be able to monitor the replication status of your Storage Accounts and receive alerts if there are any issues. This will allow you to act quickly and prevent any disruptions to your services. With this solution in place, you can have peace of mind knowing that your data is safe and available.
I hope this was informative to you and thanks for reading!
Disclaimer
The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.
by Contributed | Feb 17, 2023 | Technology
This article is contributed. See the original author and article here.
Odds are, if you are impacted by the Cybersecurity Maturity Model Certification (CMMC) mandates, you already know it. Odds are, if you are reading this post, you are doing research because you are impacted by the mandates. If you are impacted by the mandates, this post is for you. This post is to give you ideas that [we hope] help you on your compliance journey.
The open question is likely "How do I become compliant?" Ultimately, there are two options. But before we get to those options, we first need to address the scope of what needs to become compliant.
What about scope?
There are thousands of other published pages on the scope of CMMC, and that’s not the point of this post. The point here is to state the following:
- Today, you have N applications in your Portfolio
- A subset (maybe 100%, and maybe a smaller percentage) of those applications and their data must be compliant with CMMC by certain dates depending on your contracts and business requirements
- Every business that is beholden to the mandates needs to make a list of the applications (and data) that are in-scope. Many companies will do that rapid assessment on their own. Other companies will enlist the help of partners/vendors to help them move faster and more confidently. Either way is fine.

Once you have the list of apps (and data) that are in-scope for you, then what? Then, it is time to choose an option.
Option 1: Work on the running engine
The challenge with working on a running engine is the increased risk of losing a finger 😊. Honestly, if you had to spend time quantifying your Portfolio, then it stands to reason that there may be things you missed in that assessment. But leaving that point aside, there is always the option to assess every app, every piece of data, every server, every switch, etc., to become compliant. That is a very difficult journey because of years of technical debt. Can you really clean out all shared-credential service accounts in your environment without breaking something critical?

Option 2: Build a new engine and rapidly move to it
Surely there are exceptions, but we have yet to see one. The best answer [is usually] to build a new engine. And not only is the right answer to build a new engine, the right answer is to build that new engine in the cloud.
Why the cloud?
- They are already compliant (e.g. Microsoft Azure Government [MAG] and Government Community Cloud High [GCC High])
- You will not invest more in cybersecurity and compliance than Microsoft does, so the Microsoft Cloud is, and will be, more secure than you can be on your own
- If you leverage the cloud, you then only have to worry about securing the pieces and parts that are unique to YOUR business: your enclave(s) and tenant(s), your application(s), your data.

Executing on Option 2 (New, Cloud Engine)
Step A: Rapidly Establish Cloud Enclave
- M365: Commercial and/or GCC and/or GCC-High and/or GCC-DOD
- Which one(s) do you need?
- How do you rapidly set them up and harden them?
- How do you continuously monitor (and automatically respond) to anomalies that would take you out of compliance?
- How do you give the auditor a real-time dashboard to speed up the audit(s)?
- Azure: Commercial Azure, Azure Government as IL2, Azure Government as IL4, Azure Government as IL5, or a combination
- Which one(s) do you need?
- How do you rapidly set them up and harden them?
- How do you continuously monitor (and automatically respond) to anomalies that would take you out of compliance?
- How do you give the auditor a real-time dashboard to speed up the audit(s)?
- For every enclave and/or tenant, how will it be managed on Day 1? Day N? (often, the goal is to “manage it myself” on Day N, but folks are unclear and aren’t ready to manage it on Day 1)
Step B: Move Applications (and Data)
- How do you prioritize your applications based on timelines and resourcing?
- For each application, should it
- Lift and Shift?
- Have slight tweaks? (e.g. converted to PaaS? Converted to hardened containers per DevSecOps Reference Architecture and DoD Standards? Other?)
- Rewrite?
- Other?
- For every application (and data), how will it be managed on Day 1? Day N? (Often, the goal is to “manage it myself” on Day N, but folks are unclear and aren’t ready to manage it on Day 1)
Step C: What about Client Devices?
- Are your laptops and desktops managed in such a way that they are compliant?
- What about mobile devices?
- Can you detect and minimize spillage?
- Do you understand your Data Loss posture?
Step D: What about Policies?
- For example, is your Data Loss Prevention Policy where it needs to be for CMMC?
- Are the written policies tactically implemented for the Enclaves, Tenants, Apps and Data defined as you establish the enclaves and move the applications?
Step E: What about Auditability?
- When the auditor shows up, will you spend days and weeks with them, or will you show them your real-time dashboards?
When the auditor shows up, will you do tabletop exercises with them? Will you introduce an out-of-compliance server and watch the automation turn off the server? Will automation also create a security incident in parallel? Is it true that the only way to end up with an errant server in this new, pristine engine is that someone went around the process as defined by the policy?
Surely, you will choose Option 2.
Insource, Outsource or Hybrid?
Now, the only remaining question is whether you will figure it all out on your own or bring in someone to help you. Given the impact of getting it wrong, and given the timeline, most companies will bring in someone to help them.
Which Partner?
There are two courses of action:
- (A) Pay someone to “consult” with you while doing the work yourself
- (B) Pay someone to do it for you, including Day 1 through Day N management
Most companies prefer B, but they assume that no such unicorn exists. And if they do believe the unicorn exists, they fear that they cannot afford it.
The ideal partner will help you in the following ways:
- Rapidly define the in-scope apps and data
- Ask a series of repeatable business questions
- Rapidly establish the enclave(s) and tenant(s)….ideally by using automation to save you time and money
- Rapidly move applications and data to the new enclave(s) and tenant(s) while making the necessary application tweaks (and being willing to take accountability for full application re-writes as necessary)….ideally using automation to refactor and/or re-write the apps
- Manage the clients and mobile devices and/or work through and with your existing client/mobile team to take accountability for the client and mobile posture….ideally using automation
- Manage the enclave(s), tenant(s), applications and data to keep them current and compliant….ideally using automation
- Work through and with your Policy team(s) to update Policies as necessary to match the actual implementation
- Stand at the ready to host your auditors when they show up….ideally using automation
Partner Requirements
- Already doing this same work in DoD IL5/CUI environments
- Already doing this work in Commercial environments including for Defense Industrial Base
- Already doing this work for small customers (e.g. 5 seats) through huge customers (e.g. 150k seats)
- Willing to take the risk to do the work as Firm-Fixed-Fee on a committed timeline
- Willing to commit to pricing of operations and maintenance pricing for years 2 through 5 (and beyond) on day 1
- Willing to provide significant multi-year discounts
Call to action:
- Quantify the applications (and data) that will fall within your CMMC scope
- Leverage Microsoft Azure Government and GCC High to meet the requirements
- Leverage an experienced partner to help you skip the learning curve
About the Author:
Carroll Moon is the CTO and Co-Founder of CloudFit Software. Prior to CloudFit, Carroll spent almost 18 years at Microsoft helping to build and run Microsoft’s Clouds. CloudFit Software aims to securely run every mission critical workload in the universe. CloudFit is a DoD company that also intentionally serves commercial companies. Commercial customers (including Microsoft’s Product Groups) keep CloudFit on the cutting edge of cloud and cloud apps—that makes CloudFit attractive to DoD customers. DoD customers require that CloudFit be a leader in cybersecurity—that makes CloudFit attractive to commercial customers. This intersection of DoD and Commercial uniquely positions CloudFit Software to help customers comply with cybersecurity mandates like CMMC, and the build-and-run-the-hyperscale-cloud pedigree of CloudFit’s executive team means that CloudFit is executing on their charter with software and automation rather than with people. CloudFit Software’s patented platform enables increased repeatability, decreased costs, increased availability and increased security in all areas from establishing hardened cloud enclaves to migrating (and re-factoring) workloads to operating securely in the cloud. Beyond the IT/Cloud charter, CloudFit Software exists to fund two 501c3 charities: KidFit (providing hope and opportunities to youth using sports as the enabler) and JobFit (providing hope and opportunities to adults and young adults using IT training and paid internships as the enablers). Carroll lives in Lynchburg, VA with his wife and two children.
by Contributed | Feb 16, 2023 | Technology
This article is contributed. See the original author and article here.
Microsoft Learn offers a huge library of written content, including technical documentation and learning paths. But what if you need something a little more visual and demonstrative while learning new skills? Enter Microsoft Learn’s vast collection of video content.
Whether you’re searching for a walk-through of Azure or wanting to know the newest trends within the tech world, Microsoft Learn offers a wide variety of unique video content. Produced as both stand-alone how-tos and episodic shows on Microsoft Learn, videos will help you attain new skills and knowledge while keeping up with the latest Microsoft technology.
Although Microsoft Learn offers content that fits learners at every stage of their journey, these seven videos can help new users take the first step towards achieving their learning goals.
1.

2. Exam Readiness Zone

If you need to prepare for a Microsoft Certification exam and you don’t know where to begin, check out this show that offers study tips, content overviews, and sample questions and answers for each featured exam.
Watch now
3. FastTrack for Azure Learn Live Series

Interact with Microsoft Azure engineers in real time via livestreams. Geared towards helping you migrate or initiate new workloads in Azure, this series will give you added confidence when preparing for highly technical implementations.
Watch now
4. The Low Code Revolution

Learn how to develop and optimize applications and processes with Microsoft Power Platform direct from industry experts. Focused on low code solutions, this series is a great resource for developers of all backgrounds.
Watch now
5.
Catch up on the latest trends and news snippets within the developer community in this engaging and informative series. Watch highlights of interesting projects and discover tips and tricks for developers of all backgrounds and skillsets.
Watch now
6. Microsoft Graph Fundamentals

This multi-part series introduces Microsoft Graph basics. Best of all, it features interactive exercises that showcase how to use Microsoft Graph for connecting Microsoft 365 data with app development platforms.
Watch now
7. The AI Show

Learn about what’s new in artificial intelligence in this Friday evening series. Watch as host Seth Juarez works on machine learning and AI projects while offering tips for getting started on your own.
Watch now
Whether you’re looking for a live demonstration of complex skills or a last-minute knowledge check before a certification exam, videos on Microsoft Learn are here to help. Check out what’s available today!
Explore shows on Microsoft Learn
Watch on-demand events
by Contributed | Feb 16, 2023 | Business, Microsoft 365, Technology
This article is contributed. See the original author and article here.
Despite living in a connected world, the way we use our apps can often feel distinctly disconnected. Today, we’re announcing a new way to help you stay focused with help from two Microsoft apps that many people use daily and consistently together—the classic Microsoft Outlook app on Windows and the Microsoft Edge web browser.
The post Discover new ways to multitask with Microsoft 365 and Edge appeared first on Microsoft 365 Blog.
Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.
by Contributed | Feb 15, 2023 | Technology
This article is contributed. See the original author and article here.
It is common for IT personnel tasked with monitoring the health and performance of database systems to be given very high privileges such as SQL sysadmin. This enables them to do their job but comes with significant risks. Those privileges enable them to read or modify the data that other users in the organization store in those databases. That data is commonly referred to as “user data”. Sometimes user data can be very sensitive, for example, the consolidated financial information of a public company prior to being disclosed in an earnings report, a technological achievement that gives the company a competitive edge, or customer or employee information that must be protected to comply with privacy regulations. Sensitive data may be leaked or tampered with because of malicious intentions or simply poor security practices. When that happens, the company usually suffers financial damage and litigation against its officers.
Microsoft Purview DevOps policies support the Principle of Least Privilege (PoLP), which simply states that people should be given only the minimum access they need to be able to perform their job and no more. DevOps policies address the scenario of IT personnel tasked with monitoring the health and performance of database systems. This article showcases the experience for Azure SQL Managed Instance, the newest source supported for DevOps policies (soon to enter private preview). Azure SQL Database and SQL Server 2022 are already supported, and the configuration steps are linked at the end.
First, register the Azure SQL MI in Microsoft Purview and enable Data use management. This means consenting to Microsoft Purview granting users access to the Azure SQL MI.

Second, navigate to the Data Policy app in Microsoft Purview and then to DevOps policies. Create a policy, selecting the Azure SQL MI data source registered in the prior step. Once you do that, the Data resource path will show <subscription name> > <resource group name> > <data source name>. Next, select one of the two role definitions, “SQL Performance Monitor” or “SQL Security Auditor”. Finally, select Add/remove subjects to specify the Azure AD user(s) or group(s) that should be granted access:

Once you save the policy, Microsoft Purview will communicate it to the Azure SQL MI. It may take up to 5 minutes to be enforced.
To test, you can use SSMS. Connect as one of the Azure AD users that was granted access and execute a query against system metadata (DMVs and DMFs). For example, SQL Performance Monitor grants access to see the virtual file stats or the wait times, while SQL Security Auditor grants access to see database encryption keys. An IT user granted the SQL Performance Monitor role should be able to perform operations like the ones sketched below:
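These standard DMV queries are illustrative examples (not taken from the original post’s screenshots); the first two should succeed for a SQL Performance Monitor, the last for a SQL Security Auditor:

-- Should succeed for SQL Performance Monitor: I/O and wait statistics
SELECT TOP 10 * FROM sys.dm_io_virtual_file_stats(NULL, NULL);
SELECT TOP 10 * FROM sys.dm_os_wait_stats ORDER BY wait_time_ms DESC;

-- Should succeed for SQL Security Auditor: database encryption keys
SELECT * FROM sys.dm_database_encryption_keys;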

Next, try accessing a table in one of the databases. The IT user is denied, which means the crown jewels are protected.

Recommended steps:
- DevOps policies for Azure SQL MI (Private Preview): Click here to test-drive this new experience. Note, your Microsoft Purview account and Azure SQL MI will be allow-listed after you enroll.
- DevOps policies for Azure SQL Database (Public Preview) and SQL Server 2022 (GA):