by Contributed | Nov 3, 2023 | Technology
This article is contributed. See the original author and article here.
We are happy to announce the general availability of the new User Interface (UI) for the Azure Virtual Desktop Web Client. The new UI offers a cleaner, more modern look and feel. With this update, you can:
- Switch between Light and Dark Mode
- View your resources in a grid or list format
- Reset web client settings to their defaults
How to access it
The new client is toggled on by default on the web client, and the “preview” caption has now been removed from the toggle.
For additional information on the new UI, see What’s new in the Remote Desktop Web client for Azure Virtual Desktop | Microsoft Learn and New User Interface.
Note: We recommend using the new client as the original version will be deprecated soon. We will share more information on that shortly!
by Contributed | Nov 2, 2023 | Technology
Just a decade ago, few people seemingly knew or cared about firmware. But with the increasing interconnectedness of devices and the rise of cybersecurity threats, there’s a growing awareness of firmware as the foundational software that powers everything from smartphones to smart TVs.
Traditionally developed in the C language, firmware is essential for setting up a device's basic functions. As a globally recognized standard, UEFI (Unified Extensible Firmware Interface) enables devices to boot with fundamental security features that contribute to the security posture of modern operating systems.
Call for greater firmware security
As the security of our device operating systems gets more sophisticated, firmware needs to keep up. Security is paramount, but it shouldn't compromise speed or user-friendliness. The goal is clear: firmware that's both fast and secure.
What does this modern approach look like? Let’s start by looking at the key challenges:
- Evolving threat landscape: As operating systems become more secure, attackers are shifting their focus to other system software, and firmware is a prime target. Firmware operates at a very foundational level in a device, and a compromise here can grant an attacker deep control over a system.
- Memory safety in firmware: Many firmware systems have been historically written in languages like C, which, while powerful, do not inherently protect against common programming mistakes related to memory safety. These mistakes can lead to vulnerabilities such as buffer overflows, which attackers can exploit.
- Balance of speed and security: Firmware needs to execute quickly. However, increasing security might introduce execution latency, which isn’t ideal for firmware operations.
Rust in the world of firmware
When it comes to modern PC firmware, Rust stands out as a versatile programming language. It offers flexibility, top-notch performance, and most importantly, safety. While C has been a go-to choice for many, it has its pitfalls, especially when it comes to errors that might lead to memory issues. Considering how crucial firmware is to device safety and operation, any such vulnerabilities can be a goldmine for attackers, allowing them to take over systems.[1] That’s where Rust shines. It’s designed with memory safety in mind, without the need for garbage collection, and has strict rules around data types and parallel operations. This minimizes the probability of errors that expose vulnerabilities, making Rust a strong choice for future UEFI firmware development.
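To make the memory-safety contrast concrete, here is a small generic sketch (ordinary Rust, not firmware code): an out-of-bounds read that would be undefined behavior in C is either rejected at compile time or caught safely at runtime in Rust.

```rust
fn main() {
    let buf: [u8; 4] = [0xDE, 0xAD, 0xBE, 0xEF];

    // Bounds-checked access: `get` returns an Option instead of ever
    // reading past the end of the buffer.
    match buf.get(7) {
        Some(b) => println!("byte: {b:#04x}"),
        None => println!("index 7 is out of bounds"),
    }

    // Direct indexing (buf[7]) would panic deterministically at runtime
    // rather than silently read adjacent memory, and errors such as
    // use-after-move are rejected outright at compile time.
}
```

In C, the equivalent `buf[7]` compiles cleanly and reads whatever happens to sit past the array, which is exactly the class of bug attackers exploit in firmware.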
Unlocking new possibilities with Rust
Rust is not just another programming language; it’s a gateway to a wealth of resources and features that many firmware developers might have missed out on in the past. For starters, Rust embraces a mix of object-oriented, procedural, and functional programming approaches and offers flexible features like generics and traits, making it easier to work with different data types and coding methods. Many complex data structures that must be hand-coded in C are available “for free” as part of the Rust language. But it’s not just about versatility and efficiency. Rust’s tools are user-friendly, offering clear feedback during code compilation and comprehensive documentation for developers. Plus, with its official package management system, developers get access to tools that streamline coding and highlight important changes. One of those features is Rust’s use of ‘crates’ – these are like ready-to-use code packages that speed up development and foster collaboration among the Rust community.
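As a small illustration of the generics and traits mentioned above (the names here are invented for this sketch, not Project Mu APIs), a single trait can be implemented for several types and consumed by one generic function:

```rust
// A trait abstracting over anything that can report a version,
// loosely analogous to firmware components exposing metadata.
trait Versioned {
    fn version(&self) -> u32;
}

#[derive(Debug)]
struct BootService {
    ver: u32,
}

#[derive(Debug)]
struct RuntimeService {
    ver: u32,
}

impl Versioned for BootService {
    fn version(&self) -> u32 {
        self.ver
    }
}

impl Versioned for RuntimeService {
    fn version(&self) -> u32 {
        self.ver
    }
}

// One generic function serves every Versioned type; in C this would
// need hand-written dispatch (macros or function-pointer tables).
fn newest<T: Versioned>(items: &[T]) -> Option<&T> {
    items.iter().max_by_key(|i| i.version())
}

fn main() {
    let boots = [BootService { ver: 1 }, BootService { ver: 3 }];
    let runtimes = [RuntimeService { ver: 7 }];
    println!("{:?}", newest(&boots)); // Some(BootService { ver: 3 })
    println!("{:?}", newest(&runtimes)); // Some(RuntimeService { ver: 7 })
}
```

The compiler generates the specialized code for each concrete type, so the "for free" data structures and algorithms mentioned above stay both type-safe and fast.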
Making the move from C to Rust
Rust stands out for its emphasis on safety, meaning developers often don’t need as many external tools like static analyzers, which are commonly used with C. But Rust isn’t rigid; if needed, it allows for exceptions with its “unsafe code” feature, giving developers some flexibility. One of Rust’s advantages is how well it interacts with C. This means teams can start using Rust incrementally, without having to abandon their existing C code. So, while Rust offers modern advantages, it’s also mindful of the unique requirements of software running directly on hardware — without relying on the OS or other abstraction layers. Plus, it offers compatibility with C’s data structures and development patterns.
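A minimal sketch of the "unsafe code" escape hatch (illustrative only): operations such as dereferencing a raw pointer, common when interoperating with C or touching hardware, must be wrapped in an `unsafe` block, which keeps the audited surface small and explicit while the rest of the program stays fully checked.

```rust
fn main() {
    // Stand-in for a value obtained through C interop or a
    // memory-mapped register (here just a local, for illustration).
    let reg: u32 = 0x8000_0001;
    let p: *const u32 = &reg;

    // Dereferencing a raw pointer is only permitted inside `unsafe`.
    let value = unsafe { *p };

    assert_eq!(value & 1, 1); // check the low "enable" bit
    println!("register = {value:#010x}");
}
```

Reviewers can grep a codebase for `unsafe` and concentrate scrutiny there, a property C codebases cannot offer.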
The Trio: Surface, Project Mu and Rust
Surface with Windows pioneered the implementation of Project Mu in 2018 as an open-source UEFI core to increase scalability, maintainability, and reusability across Microsoft products and partners. The idea was simple but revolutionary, fostering a more collaborative approach to reduce costs and elevate quality. It also offers a solution to the intricate business and legal hurdles many partners face, allowing teams to manage their code in a way that respects legal and business boundaries. A major win from this collaboration is enhanced security; by removing unnecessary legacy code, vulnerabilities are reduced. From its inception, Surface has been an active contributor, helping Project Mu drive innovation and improve the ecosystem.
Pioneering Rust adoption through Project Mu and Surface
Surface and Project Mu are working together to drive adoption of Rust into the UEFI ecosystem. Project Mu has implemented the necessary changes to the UEFI build environment to allow seamless integration of Rust modules into UEFI codebases. Surface is leveraging that support to build Rust modules in Surface platform firmware. With Rust in Project Mu, Microsoft’s ecosystem benefits from improved security transparency while reducing the attack surface of Microsoft devices due to Rust’s memory safety benefits. Also, by contributing firmware written in Rust to open-sourced Project Mu, Surface participates in an industry shift to collaboration with lower costs and a higher security bar. With this adoption, Surface is protecting and leading the Microsoft ecosystem more than ever.
Building together: Surface’s commitment to the Rust community
Surface and Project Mu plan to participate in the open Rust development community by leveraging and contributing to popular crates and publishing new ones that may be useful to other projects. A general design strategy is to solve common problems in a generic crate that can be shared and integrated into the firmware. Community crates, such as r-efi for UEFI, have already been helpful during early Rust development.
Getting Started
Project Mu has made it easier for developers to work with Rust by introducing a dedicated container in the Project Mu Developer Operations repository (DevOps repo). This container is equipped with everything needed to kickstart Rust development. As more Rust code finds its way into Project Mu’s repositories, it will seamlessly integrate with the standard Rust infrastructure in Project Mu, and the dedicated container provides an easy way to immediately take advantage of it.
The Project Mu Rust Build readme details how to begin developing with Rust and Project Mu. Getting started requires installing the Rust toolchain and cargo-make as a build runner to quickly build Rust packages. Refer to the readme for guidance on setting up the necessary build and configuration files and creating a Rust module.
Demonstrating Functionality
QEMU is an open-source virtual machine emulator. Project Mu implements open-source firmware for the QEMU Q35 platform in its Mu Tiano Platforms repository. This open virtual platform is an easily accessible demonstration vehicle for Project Mu features. In this case, UEFI (DXE) Rust modules are already included in the platform firmware to demonstrate their functionality (and test it in CI).
Looking ahead
With the expansion of firmware code written in Rust, Surface looks forward to leveraging the Project Mu community to help make our firmware even more secure. To get involved with Project Mu, review the documentation and check out the Github repo. Regularly pull updates from the main repo, keep an eye on the project’s roadmap, and stay engaged with the community to remain informed about changes and new directions.
Footnotes
1. See Trends, challenges, and shifts in software vulnerability mitigation
by Contributed | Nov 1, 2023 | Technology
Microsoft Learn offers you the latest resources to ensure you have what you need to prepare for exams and reach your skilling goals. Here we share some important updates about Security content, prep videos, certifications, and more.
Exam Readiness Zone: preparing for Exams SC-100, SC-200, and SC-300
Now you can use the Exam Readiness Zone, our free exam prep resource on Microsoft Learn, to prepare for your next Security certification! Watch our expert-led exam prep videos to identify the key knowledge and skills measured on the exams and learn how to allocate your study time. Each video segment corresponds to a major topic area on the exam.
In these videos, trainers point out objectives that many test takers find difficult and walk through example questions and answers with explanations.
For technical skilling, we now have videos available for the following topics.
For Exam SC-100 (Microsoft Cybersecurity Architect):
- Design solutions that align with security best practices and priorities
- Design security operations, identity, and compliance capabilities
- Design security solutions for infrastructure
- Design security solutions for applications and data
Review the exam prep videos.
For Exam SC-200 (Microsoft Security Operations Analyst):
- Mitigate threats using Microsoft 365 Defender
- Mitigate threats using Microsoft Defender for Cloud
- Mitigate threats using Microsoft Sentinel
Review the exam prep videos and take a free practice assessment.
For Exam SC-300 (Microsoft Identity and Access Administrator):
- Implement identities in Azure AD
- Implement authentication and access management
- Implement access management for applications
- Plan and implement identity governance in Azure AD
Review the exam prep videos and take a free practice assessment.
Visit the Exam Readiness Zone to leverage tips, tricks, and strategies for preparing for your next Microsoft Certification exam.
Newly added Security Cloud Skills Challenge on 30 Days to Learn It
We recently released the new Security Operations Analyst Cloud Skills Challenge on 30 Days to Learn It. Build your skills and prepare for Exam SC-200: Microsoft Security Operations Analyst, required to earn your Microsoft Certified: Security Operations Analyst Associate certification.
Are you thinking of adopting the upcoming Security Copilot? This challenge will help you prepare, as it includes the security operations analyst skills required to tune up your platform and get it ready for Security Copilot.
Complete the challenge within 30 days and you may be eligible for a 50% discount on the certification exam.
Start the 30 Days to Learn It challenge today!
New name: Information Protection and Compliance Administrator Associate certification
As we announced a couple of months ago, we have expanded this certification and exam to include compliance features. The certification is now named Microsoft Certified: Information Protection and Compliance Administrator Associate, and the exam is now Exam SC-400: Administering Information Protection and Compliance in Microsoft 365.
Exam SC-400 evaluates your proficiency in the following technical tasks: implementing information protection, implementing data loss prevention (DLP), implementing data lifecycle and records management, monitoring and investigating data and activities through Microsoft Purview, and managing insider and privacy risks in Microsoft 365.
Prepare for the exam with the SC-400 study guide and with our free practice assessment.
Security Learning Rooms
The Microsoft Learn Community offers a variety of ways to connect and engage with each other and with technical experts. One of its core components is the learning rooms, spaces where you can connect with experts and peers.
There are four Microsoft Security Learning Rooms to choose from, spanning security topics end to end:
- Cloud Security Study Group
- Compliance Learning Room
- Cybersecurity from Beginner to Expert
- Microsoft Entra
Whether you choose one path or all of them, the Microsoft Learn Community experiences are ready to support your learning journey.
To explore additional security technical guidance, please visit the refreshed Security documentation hub on Microsoft Learn.
by Contributed | Oct 31, 2023 | Technology
Creating and deploying Docker containers to Azure resources manually can be a complicated and time-consuming process. This tutorial outlines a streamlined process for developing a Linux Docker container on your Windows PC and deploying it to Azure resources.
This tutorial emphasizes using the user interface to complete most of the steps, making the process more reliable and understandable. While a few steps require the command line, the majority of tasks can be completed through the UI, which keeps the process approachable and user-friendly.
In this tutorial, we will use a Python Flask application as an example, but the steps should be similar for other languages such as Node.js.
Prerequisites:
Before you begin, you’ll need to have the following prerequisites set up:
WSL provides a great way to develop your Linux application on a Windows machine without worrying about compatibility issues when running in a Linux environment. We recommend installing WSL 2, as it has better support for Docker. To install WSL 2, open PowerShell or Windows Command Prompt in administrator mode and enter the command below:
wsl --install
Then restart your machine.
You’ll also need to install the WSL extension in your Visual Studio Code.

Run "wsl" in your command prompt, then run the following commands to install Python 3.10 (if you use Python 3.5 or lower, you may need to install venv yourself):
sudo apt-get update
sudo apt-get upgrade
sudo apt install python3.10
You'll need to install Docker in your Linux environment. For Ubuntu, refer to the official documentation below:
https://docs.docker.com/engine/install/ubuntu/
To create an image for your application in WSL, you'll need Docker Desktop for Windows. Download the installer from the Docker website below and run the downloaded file to install it.
https://www.docker.com/products/docker-desktop/
Steps for Development and Deployment
1. Connect Visual Studio Code to WSL
To develop your project in Visual Studio Code in WSL, click the blue button in the bottom-left corner:

Then select “Connect to WSL” or “Connect to WSL using Distro”:

2. Install some extensions for Visual Studio Code
The following two extensions should be installed after you connect Visual Studio Code to WSL.
The Docker extension can create a Dockerfile for you automatically and highlights Dockerfile syntax. Search for it and install it from the Visual Studio Code Extensions view.

To deploy your container to Azure from Visual Studio Code, you also need the Azure Tools extension installed.

3. Create your project folder
Click "Terminal" in the menu, then click "New Terminal":

Then you should see a terminal for your WSL.
This tutorial uses a simple Flask application as an example, so I run the command below to clone its Git repository:
git clone https://github.com/Azure-Samples/msdocs-python-flask-webapp-quickstart
4. Python Environment setup (optional)
After you install Python 3 and create the project folder, it is recommended to create a dedicated Python virtual environment for the project. This makes your runtime and modules easier to manage.
To set up the environment, run the commands below in the terminal:
cd msdocs-python-flask-webapp-quickstart
python3 -m venv .venv
Then, after you open the folder, you will see that a .venv folder has been created in your project:

If you then open the app.py file, you can see that Visual Studio Code uses the newly created virtual environment as the Python interpreter:

If you open a new terminal, the prompt also shows that you are now in the new Python environment:

Then run the command below to install the modules listed in requirements.txt:
pip install -r requirements.txt
5. Generate a Dockerfile for your application
To build a Docker image, you need a Dockerfile for your application.
The Docker extension can create the Dockerfile for you automatically. Press Ctrl+Shift+P in Visual Studio Code and search for "Dockerfile", then select "Docker: Add Docker Files to Workspace".

You will be asked to select your programming language and framework (other languages such as Node.js and Java are also supported). I select "Python: Flask".
First, you will be asked to select the entry point file. I select app.py for my project.
Second, you will be asked which port your application listens on. I select 80.
Finally, you will be asked whether to include a Docker Compose file. I select "No", since this is not a multi-container application.
A Dockerfile like the one below is generated:
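For reference, a Dockerfile generated for a Flask application listening on port 80 typically looks like the following (the exact content varies by extension version; treat this as an illustrative sketch):

```dockerfile
# Illustrative example of the Docker extension's output for Flask.
FROM python:3.10-slim

EXPOSE 80

# Keep Python from writing .pyc files and buffering stdout.
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Install pip requirements first so the layer is cached.
COPY requirements.txt .
RUN python -m pip install -r requirements.txt

WORKDIR /app
COPY . /app

# Run as a non-root user for defense in depth.
RUN adduser -u 5678 --disabled-password --gecos "" appuser && chown -R appuser /app
USER appuser

# Gunicorn serves the Flask app object defined in app.py.
CMD ["gunicorn", "--bind", "0.0.0.0:80", "app:app"]
```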

Note:
If you do not have a requirements.txt file in the project, the Docker extension will create one for you. However, it DOES NOT contain all the modules you installed for this project. It is therefore recommended to have a requirements.txt file in place before you create the Dockerfile. You can run the command below in the terminal to create it:
pip freeze > requirements.txt
After the file is generated, add "gunicorn" to requirements.txt if it is not already present, because the generated Dockerfile uses Gunicorn to launch Flask applications.
Review the generated Dockerfile to see whether anything needs to be modified.
You will also find that a .dockerignore file has been generated. It lists the files and folders to be excluded from the image. Check it as well to see whether it meets your requirements.
6. Build the Docker Image
You can build the image from the Docker command line. However, you can also right-click anywhere in the Dockerfile and select "Build Image":

Make sure that Docker Desktop is running on your Windows machine.
Then you should see the Docker image, named after the project and tagged "latest", in the Docker extension.

7. Push the Image to Azure Container Registry
Click "Run" on the Docker image you created and check that it works as expected.

Then, you can push it to the Azure Container Registry (ACR). Click “Push” and select “Azure”.

You may need to create a new registry if there isn’t one. Answer the questions that Visual Studio Code asks you, such as subscription and ACR name, and then push the image to the ACR.
8. Deploy the image to Azure Resources
Follow the instructions in the following documents to deploy the image to the corresponding Azure resource:
Azure App Service or Azure Container App: Deploy a containerized app to Azure (visualstudio.com)
Container Instance: Deploy container image from Azure Container Registry using a service principal – Azure Container Instances | Microsoft Learn
by Contributed | Oct 30, 2023 | Technology
In this technical article, we will delve into an interesting case where a customer encountered problems related to isolation levels in Azure SQL Managed Instance. Isolation levels play a crucial role in managing the concurrency of database transactions and ensuring data consistency. We will start by explaining isolation levels and providing examples of their usage. Then, we will summarize and describe the customer’s problem in detail. Finally, we will go through the analysis of the issue.
Isolation Level
Isolation level is a property of a transaction that determines how data is accessed and modified by concurrent transactions. Different isolation levels provide different guarantees about the consistency and concurrency of the data. SQL Server and Azure SQL Managed Instance support five isolation levels: read uncommitted, read committed, repeatable read, snapshot, and serializable. The default isolation level for both platforms is read committed.
- Read uncommitted allows a transaction to read data that has been modified by another transaction but not yet committed. This can lead to dirty reads, non-repeatable reads, and phantom reads.
- Read committed prevents dirty reads by only allowing a transaction to read data that has been committed by another transaction. However, it does not prevent non-repeatable reads or phantom reads.
- Repeatable read prevents non-repeatable reads by locking the data that has been read by a transaction until the transaction ends. However, it does not prevent phantom reads.
- Snapshot prevents both non-repeatable reads and phantom reads by using row versioning to provide a consistent view of the data as it existed at the start of the transaction.
- Serializable prevents all concurrency anomalies by locking the entire range of data affected by a transaction until the transaction ends.
The isolation level can be set for each connection using the SET TRANSACTION ISOLATION LEVEL statement or using the IsolationLevel property of the .NET TransactionScope class. The isolation level can also be overridden for individual statements using table hints such as (NOLOCK) or (READCOMMITTED).
Problem Description
The customer reported that they observed unexpected transaction isolation level changes when running distributed transactions using .NET TransactionScope on Azure SQL Managed Instance, while the same application behaved differently against on-premises SQL Server.
The customer was opening two connections to the same database under one transaction scope, one at a time, and observed that the transaction isolation level got reset after the second connection was opened. For example, if they set the isolation level to repeatable read for the first connection, it would be changed to read committed for the second connection. This caused inconsistency and concurrency issues in their application.
The following code snippet illustrates the scenario:
TransactionOptions transactionOptions = new TransactionOptions
{
    IsolationLevel = System.Transactions.IsolationLevel.ReadUncommitted
};
string connectionStr = "Data Source=testwest.com;Initial Catalog=test;User id=sa;Password=;Connection Timeout=0";
using (TransactionScope ts = new TransactionScope(TransactionScopeOption.Required, transactionOptions))
{
    using (SqlConnection connection1 = new SqlConnection(connectionStr))
    {
        SqlCommand cmd = new SqlCommand("SELECT transaction_isolation_level FROM sys.dm_exec_sessions WHERE session_id = @@SPID", connection1);
        connection1.Open();
        SqlDataReader rs = cmd.ExecuteReader();
        rs.Read();
        Console.WriteLine(rs.GetInt16(0));
        connection1.Close();
    }
    using (SqlConnection connection2 = new SqlConnection(connectionStr))
    {
        SqlCommand cmd = new SqlCommand("SELECT transaction_isolation_level FROM sys.dm_exec_sessions WHERE session_id = @@SPID", connection2);
        connection2.Open();
        SqlDataReader rs = cmd.ExecuteReader();
        rs.Read();
        Console.WriteLine(rs.GetInt16(0));
        connection2.Close();
    }
    ts.Complete();
}
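For reference when reading the snippet's output: sys.dm_exec_sessions reports the isolation level as a number. Per the SQL Server documentation, the values map as follows:

```sql
-- transaction_isolation_level in sys.dm_exec_sessions:
-- 0 = Unspecified, 1 = ReadUncommitted, 2 = ReadCommitted,
-- 3 = RepeatableRead, 4 = Serializable, 5 = Snapshot
SELECT session_id, transaction_isolation_level
FROM sys.dm_exec_sessions
WHERE session_id = @@SPID;
```

In the customer's scenario, the first connection printed 1 (ReadUncommitted) and the second printed 2 (ReadCommitted).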
The customer stated that they are not using the “Pooling” parameter in their connection string, which means that connection pooling is enabled by default.
Problem Analysis
We investigated the issue and found that the root cause was related to how connection reset works on Azure SQL Managed Instance, and in the cloud in general, compared to on-premises SQL Server.
Connection reset is a mechanism that restores the connection state to its default values before reusing it from the connection pool. Connection reset can be triggered by various events, such as closing the connection, opening a new connection with a different database name or user ID, or executing sp_reset_connection stored procedure.
One of the connection state attributes affected by connection reset is the transaction isolation level. Resetting the connection on Azure SQL Managed Instance will always reset the transaction isolation level to the default, which is read committed. This is not true for on-premises SQL Server, where resetting the connection preserves the transaction isolation level that was set by the application.
This difference in behavior is due to how Azure SQL Managed Instance implements distributed transactions using MSDTC (Microsoft Distributed Transaction Coordinator). MSDTC requires that all connections participating in a distributed transaction have the same transaction isolation level. To ensure this requirement, Azure SQL Managed Instance resets the transaction isolation level to read committed for every connection that joins a distributed transaction.
Since the customer is opening and closing the connection to the same database twice, only one physical connection will be created. The driver will use the same connection for both query executions, but the connection will be reset before being reused. The first connection reset will happen when the first connection is closed, and the second connection reset will happen when the second connection is opened under the same transaction scope. The second connection reset will override the isolation level that was set by the application for the first connection.
This explains why the customer observed unexpected transaction isolation level changes when running distributed transactions using .NET Transaction Scope on Azure SQL Managed Instance.
Conclusion
First and foremost, it is beneficial to emphasize that this is an expected behavior from a design perspective. The customer is advised to either disable connection pooling or explicitly set the transaction isolation level for every opened connection.
To disable connection pooling, they can add “Pooling=false” to their connection string. This will create a new physical connection for every logical connection, and avoid the connection reset issue. However, this will also increase the overhead of opening and closing connections, and reduce the scalability and performance of the application.
To explicitly set the transaction isolation level for every opened connection, they can use the SET TRANSACTION ISOLATION LEVEL statement or the IsolationLevel property of the .NET TransactionScope class. This will ensure that the isolation level is consistent across all connections participating in a distributed transaction, regardless of the connection reset behavior. For example, they can modify their code snippet as follows:
using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Required, new TransactionOptions { IsolationLevel = IsolationLevel.RepeatableRead }))
{
    using (SqlConnection conn1 = new SqlConnection(connectionString))
    {
        conn1.Open();
        // Set the isolation level explicitly
        SqlCommand cmd1 = new SqlCommand("SET TRANSACTION ISOLATION LEVEL REPEATABLE READ", conn1);
        cmd1.ExecuteNonQuery();
        // Execute some queries on conn1
    }
    using (SqlConnection conn2 = new SqlConnection(connectionString))
    {
        conn2.Open();
        // Set the isolation level explicitly
        SqlCommand cmd2 = new SqlCommand("SET TRANSACTION ISOLATION LEVEL REPEATABLE READ", conn2);
        cmd2.ExecuteNonQuery();
        // Execute some queries on conn2
    }
    scope.Complete();
}
For additional information about database isolation settings, you can review the documents below.
SET TRANSACTION ISOLATION LEVEL (Transact-SQL) – SQL Server | Microsoft Learn
Transaction locking and row versioning guide – SQL Server | Microsoft Learn
System stored procedures (Transact-SQL) – SQL Server | Microsoft Learn
SQL Server Connection Pooling – ADO.NET | Microsoft Learn
I hope this article was helpful for you, please feel free to share your feedback in the comments section.
Disclaimer
Please note that products and options presented in this article are subject to change. This article reflects isolation level behavior for Azure SQL Managed Instance as of October 2023.