by Contributed | Jan 10, 2022 | Technology
This article is contributed. See the original author and article here.
Howdy folks,
We’re thrilled to announce the General Availability (GA) of Continuous Access Evaluation (CAE) as part of the overall Azure AD Zero Trust Session Management portfolio!
CAE introduces real-time enforcement of account lifecycle events and policies, including:
- Account revocation
- Account disablement/deletion
- Password change
- User location change
- User risk increase
On receiving such events, app sessions are immediately interrupted and users are redirected back to Azure AD to reauthenticate or reevaluate policy. With CAE, we have introduced a new concept of Zero Trust authentication session management, built on the foundational Zero Trust principles of Verify Explicitly and Assume Breach. With this approach, the lifespan of an authentication session now depends on session integrity rather than on a predefined duration. This work is consistent with an industry effort called Shared Signals and Events, and we’re proud to be the first company in the group with a generally available implementation of continuous access!
In fact, we’re so excited about CAE that we auto-enabled it for all tenants. Azure AD Premium P1 customers can make configuration changes or disable CAE in the session blade of Conditional Access.
Session blade of CAE for customizing configurations
With this GA, you’ll be more secure and resilient, because real-time enforcement of policies makes it safe to extend session duration. In the event of an Azure AD outage, users with CAE sessions can ride it out without ever noticing.
“With CAE, gone are the days where we are waiting for the session to be revoked or the user to be reauthenticated for critical services like Exchange Online and SharePoint Online. If we ever had a security incident pop with a user identity, knowing that the token can be revoked instantly, is confidence inspiring. Further, the long default session lifetime with CAE is another benefit we welcome, particularly from the perspective of additional resilience to potential outages.”
— BRIDGEWATER
CAE has been one of our most popular preview features and has already been deployed successfully by thousands of customers across millions of users. You can learn more about CAE here, including a full list of apps that support CAE today.
As always, we’d love to hear any feedback or suggestions you have. Let us know what you think in the comments below or on the Azure AD feedback forum.
Best regards,
Alex Simons (Twitter: @alex_a_simons)
Corporate Vice President Program Management
Microsoft Identity Division
by Contributed | Jan 10, 2022 | Dynamics 365, Microsoft 365, Technology
Today we are excited to announce the general availability of finance insights, a set of AI-powered capabilities that help customers of Microsoft Dynamics 365 Finance improve the efficiency and quality of financial processes by leveraging intelligent automation.
Reshaping corporate financial management
A core set of financial management processes and systems support the work of every organization. Yet, while digital transformation has significantly impacted business solutions in workloads like commerce, manufacturing, and warehousing, not much has changed for financial processes, at least by comparison.
At Microsoft, we believe AI and intelligent automation will transform financial workloads. Just as Microsoft Excel transformed finance departments following its release over thirty years ago, AI and machine learning are positioned to reshape corporate financial management, making routine and error-prone tasks more efficient, focused, and accurate.
Learn more in our blog: Improve efficiency and quality with AI-infused finance processes or in the webinar, Reshape the Future of Financials Through AI.
Applying AI in finance
Finance professionals have heard a lot about the promise of AI, machine learning, and robotic process automation (RPA) in recent years. What has been missing from the discussion, however, are concrete examples of where to apply these emerging technologies to drive higher-value outcomes and improve business and financial processes. Finance insights bridges this gap with customer payment insights, cash flow forecasting, and intelligent budget proposals.
“Our Global Risk Officer at SOLEVO wanted to have more insights on customer payments, especially to take automatic actions based on the historical analysis for each customer. We hope that finance insights can help us setup a more proactive process and help him getting a better overview.”
— Martin Sengel, CIO of SOLEVO Group
Customer payment insights
Finance insights customer SOLEVO has been a leading distributor of chemicals and inputs for specific industrial and agricultural segments in Africa for over seventy years. SOLEVO became interested in finance insights as means of providing management with an improved understanding of customer payments. Specifically, they wanted to automate actions based on historical customer data analysis and use these insights to set up a more proactive collections process.
Traditionally, it has been a challenge for businesses to predict when customers will pay their invoices. This lack of insight leads to less accurate cash flow forecasts, collection processes that start too late, and orders released to customers who may default on the payment.
But what if organizations were able to predict if a customer would pay their invoice on time? Or if they will pay it late?
Customer payment insights empower organizations to predict when an invoice will be paid by applying machine learning to financial data. This new capability enables Dynamics 365 Finance to learn from historical invoices, payments, and customer data and is nothing short of transformative for the collection process.
Customer payments insights effectively shift the collection process from reactive to proactive by allowing companies to act on data-driven, AI-derived predictions to proactively automate the task of following up with at-risk customers.
Cash flow forecasting
Cash flow is critical to every business. Even profitable companies can become insolvent by failing to maintain the cash flow required to fund operations. The cash flow forecasting capability in finance insights helps companies avoid this undesirable outcome by more effectively monitoring and managing their cash balances.
Intelligent cash flow forecasting enables automatic integration of data from external systems and reports, and uses machine learning to help businesses forecast cash flows more accurately than was previously possible. With more accurate forecasts, managers make more effective decisions, optimizing opportunities within the context of their current cash position.
Time-series cash flow forecasting, combined with customer payment predictions, improves overall forecast accuracy. The ability to save cash flow forecasts and compare them against actual financial results further enables organizations to measure forecast performance and prepare more accurate forecasts in the future.
Intelligent budget proposals
Organizations spend a significant amount of time and resources preparing budgets. Much of this time is spent scrubbing historical data to produce an accurate view of history, a repetitive, low-value-added process.
Budget proposals help alleviate much of this work by providing a streamlined mechanism to prepare multiple years of history, analyze the data, and predict what the budget or forecast should look like based on what has happened over time.
Learn more in our recent blog: How finance leaders can leverage intelligent automation to unleash innovation.
Transformational capabilities
Finance insights brings AI and automation to Dynamics 365 Finance, redefining transactional finance and enabling data- and outcome-driven work. It accomplishes this by applying AI, machine learning, and intelligent automation in an interrelated set of three new financial management tools: customer payment insights, cash flow forecasting, and budget proposals.
These tools will allow finance organizations to drive better business decision-making by providing new AI-driven business insights that are clearer and faster while also improving operational efficiency by utilizing intelligent automation. At Microsoft, we are truly excited to make these transformational capabilities available and look forward to adding additional features to finance insights soon.
What’s next?
Dynamics 365 Finance is a leading provider of financial management solutions in the product-centric, cloud-based enterprise resource planning market. Our goal is to make Dynamics 365 Finance the most intelligently automated financial management solution available, and with finance insights, we have moved one step closer.
If you are already a Dynamics 365 Finance user, you can learn more about getting started with finance insights by checking out the finance insights support documentation or by watching the webinar, Reshape the Future of Financials Through AI.
The post Finance insights is now generally available in Dynamics 365 Finance appeared first on Microsoft Dynamics 365 Blog.
Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.
by Contributed | Jan 9, 2022 | Technology
In this article I want to share an example of SQL injection: how it can be used to access sensitive data and harm the database, and what recommendations and steps you can follow to protect your application or website from being vulnerable to it.
I created a simple website with a few lines of code:
- Add the System.Data and System.Data.SqlClient namespaces.
- Create connection, command, and data adapter objects to execute a SQL command and fill a data table.
- The command is a SELECT query on one of the database tables, with the result set filtered by the email address the user enters before clicking the search button.
- The result is shown in a GridView control on the page.
The Web Page code:
using System;
using System.Data;
using System.Data.SqlClient;
using System.Web.UI;

namespace SqlInjection
{
    public partial class _Default : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
        }

        protected void BtnSearch_Click(object sender, EventArgs e)
        {
            string connectionString = @"Data Source=xxx.database.windows.net;Initial Catalog=xxx;User ID=xxx;Password=xxx";
            using (SqlConnection cnn = new SqlConnection(connectionString))
            {
                cnn.Open();

                // VULNERABLE: the user's input is concatenated directly into the SQL text.
                SqlCommand command = new SqlCommand(
                    "SELECT customerid as ID, Firstname + ' ' + lastname as Name, companyname as Company, " +
                    "emailaddress as Email, phone FROM saleslt.customer WHERE EmailAddress = '" + txtEmail.Text + "'",
                    cnn);

                SqlDataAdapter myDataAdapter = new SqlDataAdapter(command);
                DataTable dataTbl = new DataTable();
                myDataAdapter.Fill(dataTbl);

                GridView.DataSource = dataTbl;
                GridView.DataBind();
            }
        }
    }
}
The application works fine and retrieves data from the database, as shown in the screenshot below:

But if I enter the value ' or 1=1 or 1=' instead of an email address, I get all the data in the customer table, as shown in the screenshot below:
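To see why this works, substitute the input into the concatenated statement from the code above. The quotes in the input close and reopen the string literal, leaving a condition that is always true:

```sql
-- The payload ' or 1=1 or 1=' turns the WHERE clause into a tautology:
SELECT customerid as ID, Firstname + ' ' + lastname as Name, companyname as Company,
       emailaddress as Email, phone
FROM saleslt.customer
WHERE EmailAddress = '' or 1=1 or 1=''
-- 1=1 is always true, so the filter matches every row in the table.
```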

If I try something else, for example searching for: ' or 1=2 union select object_id, name, schema_name(schema_id), name, name from sys.tables; select 0 where 1= '
This time I did not get the customer data. Instead, the UNION statement I added retrieves the database table names from the sys.tables system view, and I got the following result:

Now I can simply list all the database tables and view any table I want, using the same SQL injection technique.
I also tried to insert the value ' or 1=2; truncate table dbo.product; select 0 where 1= ' and was able to truncate the product table.
The queries that were executed on the database are:
(@0 nvarchar(110))SELECT customerid as ID,Firstname + ' ' + lastname as Name,companyname as Company, emailaddress as Email,phone FROM saleslt.customer WHERE EmailAddress = '' or 1=2 union select object_id,name,schema_name(schema_id), name , name from sys.tables; select 0 where 1= ''
(@0 nvarchar(58))SELECT customerid as ID,Firstname + ' ' + lastname as Name,companyname as Company, emailaddress as Email,phone FROM saleslt.customer WHERE EmailAddress = '' or 1=2; truncate table dbo.product; select 0 where 1= ''
How to avoid SQL Injection:
Use Parameters:
I modified my C# code and added the required parameter to the SqlCommand as follows:
protected void BtnSearch_Click(object sender, EventArgs e)
{
    string connectionString = @"Data Source=xxx.database.windows.net;Initial Catalog=xxx;User ID=xxx;Password=xxxxx";
    using (SqlConnection cnn = new SqlConnection(connectionString))
    {
        cnn.Open();

        // SAFE: the user's input is passed as a parameter and never becomes part of the SQL text.
        SqlCommand command = new SqlCommand(
            "SELECT customerid as ID, Firstname + ' ' + lastname as Name, companyname as Company, " +
            "emailaddress as Email, phone FROM saleslt.customer WHERE EmailAddress = @0",
            cnn);
        command.Parameters.AddWithValue("@0", txtEmail.Text);

        SqlDataAdapter myDataAdapter = new SqlDataAdapter(command);
        DataTable dataTbl = new DataTable();
        myDataAdapter.Fill(dataTbl);

        GridView.DataSource = dataTbl;
        GridView.DataBind();
    }
}
Now, if I try the SQL injection, it no longer works; it returns no results at all:

Whatever value I enter in the email text box, the query executed on the database is always the following:
(@0 nvarchar(26))SELECT customerid as ID,Firstname + ' ' + lastname as Name,companyname as Company, emailaddress as Email,phone FROM saleslt.customer WHERE EmailAddress = @0
Microsoft Defender:
Microsoft Defender for Cloud – an introduction | Microsoft Docs
Microsoft Defender for Cloud (formerly Azure Security Center) can detect such attacks and notify the customer. I received the following email alert:
An application generated a faulty SQL statement on database ‘xxxx’. This may indicate that the application is vulnerable to SQL injection.
Activity details:
- Severity: Medium
- Subscription ID: xxx
- Subscription name: xxx
- Server: xx
- Database: xx
- IP address: 81.xx.xx.xx
- Principal name: tr*****
- Application: .Net SqlClient Data Provider
- Date: November 28, 2021 14:50 UTC
- Threat ID: 2
- Potential causes: Defect in application code constructing faulty SQL statements; application code doesn’t sanitize user input and may be exploited to inject malicious SQL statements.
- Investigation steps: For details, view the alert in the Azure Security Center. To investigate further, analyze your audit log.
- Remediation steps: Read more about SQL Injection threats, as well as best practices for writing safe application code. Please refer to Security Reference: SQL Injection.
Give the Application the minimum required permissions:
In the example I shared, the attacker was able to read any data he wanted, including table names, and even to truncate or drop tables. It may be easier to grant sysadmin or db_owner permissions in one step, but it is recommended to grant only the permissions the application actually requires (execute permission, for example) and only on the specific objects the application uses.
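As an illustrative T-SQL sketch (the user name and password here are placeholders, not taken from the article), the application's database user can be limited to reading only the single table the page queries:

```sql
-- Hypothetical minimal-permission setup for the web application's database user.
CREATE USER WebAppUser WITH PASSWORD = 'xxx';            -- contained database user (Azure SQL)
GRANT SELECT ON OBJECT::saleslt.customer TO WebAppUser;  -- read-only access to one table
-- TRUNCATE TABLE requires ALTER permission, so the injected
-- "truncate table dbo.product" above would fail with a permission error.
```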
Use the application to validate data:
On my web page, the user searches by email address, so the input should match an email-address pattern: a valid address cannot contain spaces or fragments like 1=1.
I added a “RegularExpressionValidator” control to the page and linked it to the text box used for the email address.
Below is the validation expression for the email address:
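The screenshot of the expression is not reproduced here. As a sketch, a commonly used ASP.NET pattern (the stock Visual Studio email expression; the exact expression in the original screenshot may differ) looks like this:

```aspx
<%-- Hypothetical markup; the validator ID and expression are illustrative. --%>
<asp:RegularExpressionValidator ID="EmailValidator" runat="server"
    ControlToValidate="txtEmail"
    ValidationExpression="\w+([-+.']\w+)*@\w+([-.]\w+)*\.\w+([-.]\w+)*"
    ErrorMessage="Please enter a valid email address." />
```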

Now I can no longer run the SQL injection; I get a validation error instead:

by Contributed | Jan 7, 2022 | Technology
We are thrilled to welcome Barracuda Virtual Reactor® on Azure. With this technology, engineers can simulate fluid, particulate-solid, thermal and chemically reacting behavior in industrial fluid-particle processes while gaining the on-demand and virtually unlimited computing capacity of Azure.
Virtual Reactor™ is the industry-standard tool for many applications in refining, petrochemicals, cement manufacture, power generation and other energy-intensive industries. Today, the same award-winning technology is enabling, and reducing the time to market for, multiple sustainability technologies that are changing our planet for the better. Examples include advanced recycling of plastics, waste-to-energy/fuels/chemicals, renewable fuels, hydrogen production and other decarbonization applications.
Fluid-particle systems in these industries tend to operate 24/7 for years on end, and hence, even small improvements in reliability and performance lead to a tremendous economic impact. Barracuda® reduces the risk of making changes to existing processes by identifying the root cause(s) of underperformance, performing virtual testing of changes and identifying additional optimization opportunities. Similarly, the software empowers those developing new technologies to economically explore a wide range of possibilities during R&D while compressing development, scale-up and commercialization timeframes.
We recently tested Virtual Reactor on Azure’s state-of-the-art NDv4 virtual machines (VMs) to showcase performance scalability across a variety of models. We highlighted the Virtual Reactor results on our ND96asr_v4 VM; the performance is shown below across eight different model sizes.

Smaller simulations, which in our tests had fewer than 25 million computational particles, achieved maximum speed-up when running in multi-GPU mode using two NVIDIA A100 Tensor Core GPUs. Larger simulations achieved maximum speed-up when using four GPUs. The speed-up scaling¹ from one to four GPUs is shown below for benchmark case 8².

The ND A100 v4 series starts with a single virtual machine and eight NVIDIA A100 Tensor Core GPUs. ND A100 v4-based deployments can scale up to thousands of GPUs with 1.6 Tb/s of interconnect bandwidth per VM. Each GPU within the VM is provided with its own dedicated, topology-agnostic NVIDIA Quantum 200Gb/s InfiniBand networking. These connections are automatically configured between VMs occupying the same virtual machine scale set, and support GPUDirect RDMA.
Each GPU features third-generation NVIDIA NVLink connectivity for communication within the VM, and the instance is also backed by 96 physical second-generation AMD Epyc™ CPU cores.
These instances provide excellent performance for many AI, ML, and analytics tools that support GPU acceleration ‘out of the box,’ such as TensorFlow, PyTorch, Caffe, RAPIDS, and other frameworks. Additionally, the scale-out InfiniBand interconnect is supported by a large set of existing AI and HPC tools built on NVIDIA’s NCCL2 communication libraries for seamless clustering of GPUs.
“Azure’s multi-GPU virtual machines powered by NVIDIA A100 Tensor Core GPUs provide the global Barracuda Virtual Reactor user community with instant access to the latest high performance computing resources without the overhead of purchasing and maintaining on-premise hardware. The observed speed-ups of over 200x, combined with the convenience of the Azure Platform, provide our clients with virtually unlimited, on-demand, compute bandwidth as they tackle some of our planet’s toughest engineering, energy and sustainability challenges.”
Peter Blaser, Vice President of Operations, CPFD Software
“We welcome Barracuda Virtual Reactor to Azure and are excited to showcase this exciting technology to customers in process industries who will benefit immensely from our purpose-built NVIDIA GPU hosts that are designed to deliver superior cost performance for this workload. Azure and CPFD have joined forces to offer customers a compelling range of options to explore Virtual Reactor on Azure and pick the best VM sizing suited to their use case requirements.”
Kurt Niebuhr, Azure Compute HPC | AI Workload Incubation & Ecosystem Team Lead
About CPFD
CPFD Software is advancing multiphase simulation and technology. Our flagship product, Barracuda Virtual Reactor, is a physics-based engineering software package capable of predicting fluid, particulate-solid, thermal and chemically reacting behavior in fluidized bed reactors and other fluid-particle systems reducing the risk associated with design, scale-up, commercialization, and trouble-shooting of industrial processes. The Virtual Reactor technology is accessible through software licensing, consulting, or engineering services.
NOTES:
¹ For additional GPU scaling questions, please contact CPFD.
² Case 8 is a simulation benchmark case containing 55M particles.
by Scott Muniz | Jan 7, 2022 | Security, Technology
WordPress versions between 3.7 and 5.8 are affected by multiple vulnerabilities. Exploitation of some of these vulnerabilities could cause a denial of service condition.
CISA encourages users and administrators to review the WordPress Security Release and upgrade to WordPress 5.8.3.
by Contributed | Jan 7, 2022 | Technology
What’s New?
Since our last update in September 2021, we have published new training content to support the features and functionality added to Microsoft Defender for Cloud Apps during the previous quarter. The new materials are included in our Microsoft Defender for Cloud Apps | December 2021 blog post. If you previously completed the Defender for Cloud Apps Ninja Training and want to view only updated content, we have highlighted and linked to the new material for your convenience.
Updated modules (ordered by competency level):
- Microsoft Cloud Apps for Security – Fundamental Level, Module 2: Microsoft Defender for Cloud Apps Introduction
- Microsoft Cloud Apps for Security – Fundamental Level, Module 3: Initial Setting
- Microsoft Cloud Apps for Security – Intermediate Level, Module 3: Information Protection and Real-Time Control
- Microsoft Cloud Apps for Security – Intermediate Level, Module 4: Threat Detection
by Contributed | Jan 6, 2022 | Technology
In this episode of Data Exposed, Anna Hoffman and Lior Kamrat discuss why you should care about Azure Arc-enabled data services connectivity options and which one is best for your organization.
Resources:
Demystifying Azure Arc-enabled data services: https://www.youtube.com/watch?v=7Z6WZgaupIk
Connectivity modes and requirements: https://docs.microsoft.com/en-us/azure/azure-arc/data/connectivity
Azure Arc-enabled data services: https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_data/
by Scott Muniz | Jan 5, 2022 | Security, Technology
Google has released Chrome version 97.0.4692.71 for Windows, Mac, and Linux. This version addresses vulnerabilities that an attacker could exploit to take control of an affected system.
CISA encourages users and administrators to review the Chrome Release Note and apply the necessary updates as soon as possible.
by Scott Muniz | Jan 5, 2022 | Security, Technology
VMware has released a security advisory to address a vulnerability in Workstation, Fusion, and ESXi. An attacker could exploit this vulnerability to take control of an affected system.
CISA encourages users and administrators to review VMware Security Advisory VMSA-2022-0001 and apply the necessary updates and workarounds.
by Contributed | Jan 5, 2022 | Technology
We are excited to announce the preview release of auto-failover groups for the Azure SQL Hyperscale tier. This preview includes support for forced and planned failover for Azure SQL Hyperscale databases that use active geo-replication and auto-failover groups. Key benefits of auto-failover groups include:
- Simplified management of a group of geo-replicated databases, including the ability to fail over the entire group.
- The ability for applications to keep the same read/write and read-only endpoints after failover.
- Recovery from the loss of an entire region through geo-failover, which can be initiated manually or through an automatic failover policy.
- Readable online secondaries that can serve read-only workloads through read-only listener endpoints, which remain unchanged during geo-failovers.
The Hyperscale service tier supports databases of up to 100 TB, rapid scale (out and up), and nearly instantaneous database backups, removing the limits traditionally seen in cloud databases.
How auto-failover groups work for Hyperscale
Auto-failover groups are created between servers in two regions and can include all or some of the databases on those servers. If a Hyperscale database is part of a failover group, it fails over together with the rest of the group as a unit. The following diagram illustrates a typical configuration of a geo-redundant cloud application using multiple databases and an auto-failover group.

Available regions
Auto-failover groups for Hyperscale will be supported in all regions where Azure SQL Hyperscale is supported.
Quick start
a. Create an Auto-failover group using Portal.
- Failover groups can be configured at the server level. Select the name of the server under Server name to open the settings for the server.

- Select Failover groups under the Settings pane, and then select Add group to create a new failover group.

- On the Failover Group page, enter or select your desired values for your failover group.

- Add your Hyperscale database to the failover group then select Create.

b. Create an Auto-failover group using PowerShell.
c. Create an Auto-failover group using CLI.
d. Create an Auto-failover group using REST API.
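As a sketch of the PowerShell route (the resource group, server, group, and database names below are placeholders, not from this article), the Az.Sql cmdlets can create the group and then add the Hyperscale database to it:

```powershell
# Hypothetical names; substitute your own resource group, servers, group, and database.
New-AzSqlDatabaseFailoverGroup -ResourceGroupName "MyResourceGroup" `
    -ServerName "primary-server" -PartnerServerName "secondary-server" `
    -FailoverGroupName "my-failover-group" -FailoverPolicy Automatic

# Add the Hyperscale database to the newly created failover group.
Get-AzSqlDatabase -ResourceGroupName "MyResourceGroup" -ServerName "primary-server" `
        -DatabaseName "MyHyperscaleDb" |
    Add-AzSqlDatabaseToFailoverGroup -ResourceGroupName "MyResourceGroup" `
        -ServerName "primary-server" -FailoverGroupName "my-failover-group"
```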
Geo-failover examples
Example 1: Planned failover for an auto-failover group
a. Execute a failover of an auto-failover group in Portal.
1. Select your failover group.

2. Select Failover to initiate failover for your auto-failover group. Once failover is completed you should see that your primary and secondary servers have swapped roles.

b. Execute a failover of an auto-failover group using Switch-AzSqlDatabaseFailoverGroup in PowerShell.
c. Execute a failover of an auto-failover group using az sql failover-group set-primary in CLI.
d. Execute a failover of an auto-failover group using REST API.
Example 2: Forced failover with active geo-replication
a. Execute a failover using Portal.
1. Select the Replicas tab. In the list of geo replicas click the ellipsis for the secondary you would like to become the new primary. Then select Forced failover.

2. You should now see that the primary and secondary have swapped roles.
b. Execute a failover using Set-AzSqlDatabaseSecondary in PowerShell with the -AllowDataLoss parameter specified.
c. Execute a failover using az sql db replica set-primary in CLI with the --allow-data-loss parameter specified.
d. Execute a failover using REST API.
Learn more
https://docs.microsoft.com/azure/azure-sql/database/auto-failover-group-overview
https://aka.ms/activegeoreplication
https://docs.microsoft.com/en-us/azure/azure-sql/database/service-tier-hyperscale