Azure SQL Database – GEO Replication across subscription with private endpoints


This article is contributed. See the original author and article here.

Active geo-replication is an Azure SQL Database feature that allows you to create readable secondary databases of individual databases on a server in the same or different data center (region).


 


We have received a few cases where customers would like to have this setup across subscriptions with private endpoints. This article describes how to set up geo-replication between two Azure SQL servers across subscriptions using private endpoints while public access is disallowed.


 


Before starting this setup, make sure the following are available in your environment.


 



  • Two subscriptions for primary and secondary environments:

    1. Primary Environment: Azure SQL Server, Azure SQL database, and Virtual Network.

    2. Secondary Environment: Azure SQL Server, Azure SQL database, and Virtual Network.




               Note: Use a paired region for this setup. You can find more information about paired regions by accessing this link.



  • Public access must be enabled while configuring geo-replication.

  • The subnets of your two virtual networks must not have overlapping IP address ranges. You can refer to this blog for more information.


 


For this article, the primary and secondary environments are as follows:


 


Primary Environment


 


Subscription ID: Primary-Subscription


Server Name: primaryservertest.database.windows.net


Database Name: DBprim


Region: West Europe


Virtual Network: VnetPrimary


Subnet: PrimarySubnet – 10.0.0.0/24


 


Secondary Environment


 


Subscription ID: Secondary-Subscription


Server Name: secservertest1.database.windows.net


Region: North Europe


Virtual Network: VnetSec


Subnet: SecondarySubnet – 10.2.0.0/24


 


Limitations



  • Creating a geo-replica on a logical server in a different Azure tenant is not supported.

  • Cross-subscription geo-replication operations, including setup and failover, are only supported through T-SQL commands.

  • Creating a geo-replica on a logical server in a different Azure tenant is not supported when Azure Active Directory only authentication for Azure SQL is active (enabled) on either primary or secondary logical server.


 


GEO Replication Configuration


Follow the steps below to configure geo-replication (make sure public access is enabled while executing them):


1) Create a privileged login/user on both the primary and secondary servers to be used for this setup:


    a. Connect to your primary Azure SQL server and create a login and a user in the master database using the script below:


 

--Primary master database
create login GeoReplicationUser with password = 'P@$$word123';
create user GeoReplicationUser for login GeoReplicationUser;
alter role dbmanager add member GeoReplicationUser;

 


Get the SID of the created login and save it:


 

select sid from sys.sql_logins where name = 'GeoReplicationUser'

 


    b. On the primary database, create the required user as below:


 

-- Primary user database
create user GeoReplicationUser for login GeoReplicationUser;
alter role db_owner add member GeoReplicationUser;

 


    c. Connect to your secondary server and create the same login and user, using the SID you obtained in step a:


 

--Secondary master database
create login GeoReplicationUser with password = 'P@$$word123', sid = 0x010600000000006400000000000000001C98F52B95D9C84BBBA8578FACE37C3E;
create user GeoReplicationUser for login GeoReplicationUser;
alter role dbmanager add member GeoReplicationUser;

 


 


2) Make sure the firewall rules on both the primary and secondary Azure SQL servers allow your connection (for example, the IP address of the host running SQL Server Management Studio).
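If you prefer T-SQL over the portal for this step, a minimal sketch using the built-in sp_set_firewall_rule procedure could look like the following, run on each server’s master database (the IP address below is a placeholder for your client’s address):

```sql
-- Run on the master database of each server.
-- Replace the placeholder IPs with your management client's address.
EXECUTE sp_set_firewall_rule
    @name = N'AllowManagementClient',
    @start_ip_address = '203.0.113.5',
    @end_ip_address   = '203.0.113.5';
```

Remember to remove this rule once the setup is complete and public access is disallowed again.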


 


3) Log in to your primary Azure SQL server with the created user to add the secondary server and configure geo-replication, by running the script below on the primary master database:


 


 

-- Primary Master database
alter database DBprim add secondary on server [secservertest1]

 


4) To verify the setup, open the Azure portal, go to your primary Azure SQL database, and open the Replicas blade as below:


 


Sabrin_Alsahsah_0-1633699119251.png


 


You will notice that the secondary database has been added and configured.


 


Note: before moving to the next step, make sure your replica has completed seeding and is marked as “readable” under replica status (as highlighted below):


 


Sabrin_Alsahsah_1-1633699162115.png


 


Configuring private endpoints for both servers


 


Now, we will start preparing the private endpoints setup for both primary and secondary servers.


 1) From the Azure portal > open the primary server > Private endpoint connections blade > add a new private endpoint as below:


Sabrin_Alsahsah_2-1633699265960.png


 


We will select the primary subscription to host the primary server’s private endpoint.


 


Sabrin_Alsahsah_3-1633699299875.png


 


Sabrin_Alsahsah_4-1633699307177.png


 


Next, the primary private endpoint will be linked to the primary virtual network. Make sure the private DNS zone is linked to the primary subscription, as below:


 


Sabrin_Alsahsah_5-1633699331465.png


 


2) Create the secondary server’s private endpoint: from the Azure portal > open the secondary server > Private endpoint connections blade > add a new private endpoint as below:


 


In the steps below, we will select the secondary server’s virtual network and subscription.


 


Sabrin_Alsahsah_6-1633699357253.png


 


Sabrin_Alsahsah_7-1633699365699.png


 


In the next step, we will link the secondary server’s private endpoint with the primary private DNS zone, as both primary and secondary private endpoints should be linked to the same private DNS zone (as below):


 


Sabrin_Alsahsah_9-1633699441044.png


 


3) Once both private endpoints are created, make sure they are accepted, as mentioned in this document.


Sabrin_Alsahsah_10-1633699460960.png


 


4) Open your private DNS zone in the Azure portal and verify that both private endpoints are linked to the same zone. This can be checked by going to the private DNS zone, selecting your primary subscription, and reviewing it as below:


Sabrin_Alsahsah_11-1633699487012.png


 


Sabrin_Alsahsah_12-1633699531802.png


 


Sabrin_Alsahsah_13-1633699540265.png


Note: this step has been discussed in detail in this blog article.


 


Virtual Network setup


You need to make sure your Azure virtual networks have VNet peering between primary and secondary, in order to allow communication once public access is disabled. For more information, you can access this document.
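As a sketch, this peering could also be created with the Azure CLI along the following lines. The resource group names and the remote VNet resource ID are placeholders for this article’s environment, and a matching peering must also be created from the secondary side:

```shell
# Hypothetical resource names; cross-subscription peering requires a
# peering object created in BOTH virtual networks.
az network vnet peering create \
  --name VnetPrimary-to-VnetSec \
  --resource-group rg-primary \
  --vnet-name VnetPrimary \
  --remote-vnet "/subscriptions/<Secondary-Subscription>/resourceGroups/rg-secondary/providers/Microsoft.Network/virtualNetworks/VnetSec" \
  --allow-vnet-access
```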


 


Disabling public access 


Once the setup is ready, you can disallow public access on your Azure SQL servers:


Sabrin_Alsahsah_14-1633701316344.png
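If you prefer scripting this step, a hedged Azure CLI sketch (the server and resource group names are placeholders from this article’s environment; repeat for the secondary server) could be:

```shell
# Disable public network access on the logical server (placeholder names).
az sql server update \
  --name primaryservertest \
  --resource-group rg-primary \
  --enable-public-network false
```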


 


Next


Once public access is disabled, geo-replication will run over private endpoints between your Azure SQL servers across subscriptions.


 


Troubleshooting


1- You may encounter the below error when adding the secondary using T-SQL:


alter database DBprim add secondary on server [secservertest1]


 


Msg 42019, Level 16, State 1, Line 1

ALTER DATABASE SECONDARY/FAILOVER operation failed. Operation timed out.

 

Possible solution: Set “Deny public access” to off while setting up geo-replication via the T-SQL commands. Once geo-replication is configured, “Deny public access” can be turned back on, and the secondary will still be able to sync and get data from the primary; public access only needs to be on while setting up geo-replication.

 


2- You may also encounter the below error when adding the secondary using T-SQL:


alter database DBprim add secondary on server [secservertest1]


 


Msg 40647, Level 16, State 1, Line 1
Subscription 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx' does not have the server 'secservertest1'.


 


Possible solution: Make sure that both private links use the same private DNS zone that was used for the primary. Refer to this blog for more information.


 


References


Active geo-replication – Azure SQL Database | Microsoft Docs


Using Failover Groups with Private Link for Azure SQL Database – Microsoft Tech Community


 


Disclaimer

Please note that products and options presented in this article are subject to change. This article reflects the geo-replication across different subscriptions with private endpoints option available for Azure SQL Database as of October 2021.

Closing remarks
 


I hope this article was helpful for you. Please like and share this page through social media, and feel free to share your feedback in the comments section below.

Azure Functions Auth via OpenAPI in 6 Ways


The Azure security baseline for Azure Functions describes the general security considerations for developing an Azure Functions application. In addition, Azure Functions offers a built-in authentication method through the function keys. If you use the OpenAPI extension for Azure Functions, you can define the endpoint authentication and authorisation for each API endpoint in various ways. You can even try them through the Swagger UI page. Throughout this post, I’m going to discuss six different approaches for access control to Azure Functions API endpoints using the OpenAPI extension.


 



This GitHub repository contains the sample app used in this post.



 


OpenAPI Spec for Authentication


 


It could be a good idea to take a look at the authentication spec defined in OpenAPI before going further.


 



  • type: defines which authentication method will be used. Currently, it accepts API Key, HTTP, OAuth2, and OpenID Connect. However, the OpenAPI v2 spec doesn’t support OpenID Connect.

  • name: declares the auth key name. It’s required for API Key.

  • in: defines the location of the auth key. It’s required for API Key and accepts query, header, or cookie.

  • scheme: declares the auth scheme. It’s required for HTTP auth and accepts either Basic or Bearer.

  • bearerFormat: uses JWT in most cases when using the Bearer token through the HTTP auth.

  • flows: is required for the OAuth2 auth. Its value can be implicit, password, clientCredentials, or authorizationCode.

  • openIdConnectUrl: is necessary for the OpenID Connect auth. However, it is advised to use either OAuth2 or Bearer auth for the OpenAPI v2 spec.


 


Based on the understandings above, let’s apply the different auth approach to Azure Function endpoints through the OpenAPI extension.
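For reference, the bullet points above map onto an OpenAPI v3 document roughly like the following sketch (the scheme names here are arbitrary labels, chosen to mirror the ones used later in this post):

```yaml
# Illustrative OpenAPI v3 fragment; scheme names are arbitrary labels.
components:
  securitySchemes:
    apikey_auth:
      type: apiKey
      in: query            # or header / cookie
      name: code
    basic_auth:
      type: http
      scheme: basic
    bearer_auth:
      type: http
      scheme: bearer
      bearerFormat: JWT
    oidc_auth:
      type: openIdConnect
      openIdConnectUrl: https://login.microsoftonline.com/{tenant_id}/v2.0/.well-known/openid-configuration
```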


 


API Key in Querystring


 


This is the built-in feature of Azure Functions. Let’s take a look at the code below. If you have installed the OpenAPI extension, you can add the decorators. Note the OpenApiSecurityAttribute(…) decorator, which sets the values (line #6-9).


 



  • Type: SecuritySchemeType.ApiKey

  • In: OpenApiSecurityLocationType.Query

  • Name: code


 


public static class ApiKeyInQueryAuthFlowHttpTrigger
{
    [FunctionName(nameof(ApiKeyInQueryAuthFlowHttpTrigger))]
    [OpenApiOperation(operationId: "apikey.query", tags: new[] { "apikey" }, Summary = "API Key authentication code flow via querystring", Description = "This shows the API Key authentication code flow via querystring", Visibility = OpenApiVisibilityType.Important)]

    [OpenApiSecurity("apikeyquery_auth",
        SecuritySchemeType.ApiKey,
        In = OpenApiSecurityLocationType.Query,
        Name = "code")]

    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(Dictionary<string, string>), Summary = "successful operation", Description = "successful operation")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "GET", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        var queries = req.Query.ToDictionary(q => q.Key, q => (string) q.Value);
        var result = new OkObjectResult(queries);

        return await Task.FromResult(result).ConfigureAwait(false);
    }
}


 


Run the function app, and you will see the Swagger UI page.


 


Swagger UI - Query


 


Click the lock button on the right-hand side to enter the API key value. This value will be appended to the querystring parameter.


 


Swagger UI - Query - API Key


 


The result screen shows the API key passed through the querystring parameter, code.


 


Swagger UI - Query - Result


 


API Key in Request Header


 


This is also a built-in Azure Functions feature. This time, set the values of the OpenApiSecurityAttribute(…) decorator as below (line #6-9).


 



  • Type: SecuritySchemeType.ApiKey

  • In: OpenApiSecurityLocationType.Header

  • Name: x-functions-key


 


public static class ApiKeyInHeaderAuthFlowHttpTrigger
{
    [FunctionName(nameof(ApiKeyInHeaderAuthFlowHttpTrigger))]
    [OpenApiOperation(operationId: "apikey.header", tags: new[] { "apikey" }, Summary = "API Key authentication code flow via header", Description = "This shows the API Key authentication code flow via header", Visibility = OpenApiVisibilityType.Important)]

    [OpenApiSecurity("apikeyheader_auth",
        SecuritySchemeType.ApiKey,
        In = OpenApiSecurityLocationType.Header,
        Name = "x-functions-key")]

    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(Dictionary<string, string>), Summary = "successful operation", Description = "successful operation")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "GET", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        var headers = req.Headers.ToDictionary(q => q.Key, q => (string) q.Value);
        var result = new OkObjectResult(headers);

        return await Task.FromResult(result).ConfigureAwait(false);
    }
}


 


Run the function app and see the Swagger UI page.


 


Swagger UI - Header


 


If you want to authenticate the endpoint, enter the API key value to the field, labelled as x-functions-key.


 


Swagger UI - Header - API Key


 


As a result, the API key was sent through the request header, x-functions-key.


 


Swagger UI - Header - Result


 


Basic Auth Token


 


Let’s use the Basic auth token this time. Set the property values of OpenApiSecurityAttribute(…) (line #6-8).


 



  • Type: SecuritySchemeType.Http

  • Scheme: OpenApiSecuritySchemeType.Basic


 


As this is not a built-in feature, you can use this approach for additional auth methods or to replace the built-in one. If you don’t want to use the built-in API key, set the auth level of the HttpTrigger binding to AuthorizationLevel.Anonymous (line #12).


 


public static class HttpBasicAuthFlowHttpTrigger
{
    [FunctionName(nameof(HttpBasicAuthFlowHttpTrigger))]
    [OpenApiOperation(operationId: "http.basic", tags: new[] { "http" }, Summary = "Basic authentication token flow via header", Description = "This shows the basic authentication token flow via header", Visibility = OpenApiVisibilityType.Important)]

    [OpenApiSecurity("basic_auth",
        SecuritySchemeType.Http,
        Scheme = OpenApiSecuritySchemeType.Basic)]

    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(Dictionary<string, string>), Summary = "successful operation", Description = "successful operation")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "GET", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        var headers = req.Headers.ToDictionary(q => q.Key, q => (string) q.Value);
        var result = new OkObjectResult(headers);

        return await Task.FromResult(result).ConfigureAwait(false);
    }
}


 


Run the app to see the Swagger UI like below.


 


Swagger UI - Basic Auth


 


To authenticate your endpoint, you should enter the Username and Password, added to the Authorization header.


 


Swagger UI - Basic Auth - Details


 


The result screen shows the request header of Authorization with the base64 encoded value.


 


Swagger UI - Basis Auth - Result


 


Then, you should validate the auth details with your custom logic.
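As an illustrative sketch of that custom logic (not part of the original sample; TryAuthenticateBasic and ValidateUser are hypothetical names, and error handling is minimal), parsing the Basic auth header inside the function body could look something like this:

```csharp
// Illustrative only: parse the "Authorization: Basic <base64>" header.
// ValidateUser is a hypothetical method standing in for your own credential check.
private static bool TryAuthenticateBasic(HttpRequest req, out string username)
{
    username = null;
    var header = req.Headers["Authorization"].FirstOrDefault();
    if (header == null || !header.StartsWith("Basic ", StringComparison.OrdinalIgnoreCase))
    {
        return false;
    }

    // The payload is base64("username:password").
    var decoded = Encoding.UTF8.GetString(
        Convert.FromBase64String(header.Substring("Basic ".Length)));
    var parts = decoded.Split(':', 2);
    if (parts.Length != 2)
    {
        return false;
    }

    username = parts[0];
    return ValidateUser(parts[0], parts[1]);   // your custom logic
}
```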


 


Bearer Auth Token


 


Similarly, this time, let’s use the Bearer auth token. Set the property values of OpenApiSecurityAttribute(…) (line #5).


 



  • Type: SecuritySchemeType.Http

  • Scheme: OpenApiSecuritySchemeType.Bearer

  • BearerFormat: JWT


 


As before, set the auth level of the HttpTrigger binding to AuthorizationLevel.Anonymous (line #13).


 


public static class HttpBearerAuthFlowHttpTrigger
{
    [FunctionName(nameof(HttpBearerAuthFlowHttpTrigger))]
    [OpenApiOperation(operationId: "http.bearer", tags: new[] { "http" }, Summary = "Bearer authentication token flow via header", Description = "This shows the bearer authentication token flow via header", Visibility = OpenApiVisibilityType.Important)]

    [OpenApiSecurity("bearer_auth",
        SecuritySchemeType.Http,
        Scheme = OpenApiSecuritySchemeType.Bearer,
        BearerFormat = "JWT")]

    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(Dictionary<string, string>), Summary = "successful operation", Description = "successful operation")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "GET", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        var headers = req.Headers.ToDictionary(q => q.Key, q => (string) q.Value);
        var handler = new JwtSecurityTokenHandler();
        var token = handler.ReadJwtToken(headers["Authorization"].Split(' ').Last());
        var claims = token.Claims.Select(p => p.ToString());
        var content = new { headers = headers, claims = claims };

        var result = new OkObjectResult(content);

        return await Task.FromResult(result).ConfigureAwait(false);
    }
}


 


Run the function app and see the Swagger UI page.


 


Swagger UI - Bearer Auth


 


During authentication, you are asked to enter the Bearer token value, which will be added to the Authorization header.


 


Swagger UI - Bearer Auth - Details


 


The result screen shows the JWT value in the Authorization header.


 


Swagger UI - Bearer Auth - Result


 


You should decode the JWT and find the appropriate claims and validate them for further processing.
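For example, a hedged sketch of that validation using JwtSecurityTokenHandler and Microsoft.IdentityModel.Tokens could look like the following. The tenantId, clientId, and signingKeys values are placeholders you would supply for your own tenant, not part of the original sample:

```csharp
// Illustrative only: validate the token instead of just reading it.
// tenantId, clientId, and signingKeys are placeholders for your environment;
// signing keys are typically fetched from the OpenID Connect metadata endpoint.
var handler = new JwtSecurityTokenHandler();
var parameters = new TokenValidationParameters
{
    ValidIssuer = $"https://login.microsoftonline.com/{tenantId}/v2.0",
    ValidAudience = clientId,
    IssuerSigningKeys = signingKeys
};

// Throws an exception if the signature, issuer, audience, or lifetime is invalid.
var principal = handler.ValidateToken(jwt, parameters, out SecurityToken validated);
```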


 


OAuth2 Implicit Auth Flow


 


Although there are many ways in the OAuth2 authentication flow, I’m going to use the Implicit flow for this time. Set the properties of OpenApiSecurityAttribute(…) (line #6-8).


 



  • Type: SecuritySchemeType.OAuth2

  • Flows: ImplicitAuthFlow


 


Auth level is also set to Anonymous (line #12).


 


public static class OAuthImplicitAuthFlowHttpTrigger
{
    [FunctionName(nameof(OAuthImplicitAuthFlowHttpTrigger))]
    [OpenApiOperation(operationId: "oauth.flows.implicit", tags: new[] { "oauth" }, Summary = "OAuth implicit flows", Description = "This shows the OAuth implicit flows", Visibility = OpenApiVisibilityType.Important)]

    [OpenApiSecurity("implicit_auth",
        SecuritySchemeType.OAuth2,
        Flows = typeof(ImplicitAuthFlow))]

    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(IEnumerable), Summary = "successful operation", Description = "successful operation")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "GET", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        var headers = req.Headers.ToDictionary(p => p.Key, p => (string) p.Value);
        var handler = new JwtSecurityTokenHandler();
        var token = handler.ReadJwtToken(headers["Authorization"].Split(' ').Last());
        var claims = token.Claims.Select(p => p.ToString());

        var result = new OkObjectResult(claims);

        return await Task.FromResult(result).ConfigureAwait(false);
    }
}


 


You can see ImplicitAuthFlow as the flow type. Since it uses Azure Active Directory, it sets the AuthorizationUrl, RefreshUrl, and Scopes values. It also assumes a single-tenant app, which requires the tenant ID (line #3-6, 10, 14-15). Scopes has a default value (line #17).


 


public class ImplicitAuthFlow : OpenApiOAuthSecurityFlows
{
    private const string AuthorisationUrl =
        "https://login.microsoftonline.com/{0}/oauth2/v2.0/authorize";
    private const string RefreshUrl =
        "https://login.microsoftonline.com/{0}/oauth2/v2.0/token";

    public ImplicitAuthFlow()
    {
        var tenantId = Environment.GetEnvironmentVariable("OpenApi__Auth__TenantId");

        this.Implicit = new OpenApiOAuthFlow()
        {
            AuthorizationUrl = new Uri(string.Format(AuthorisationUrl, tenantId)),
            RefreshUrl = new Uri(string.Format(RefreshUrl, tenantId)),

            Scopes = { { "https://graph.microsoft.com/.default", "Default scope defined in the app" } }
        };
    }
}


 


Run the function app and check the Swagger UI page.


 


Swagger UI - OAuth2 Implicit Auth


 


When you click the lock button, it asks you to enter the client ID value, redirecting you to sign in to Azure Active Directory. Then, you will get the access token.


 


Swagger UI - OAuth2 Implicit Auth - Details


 


The result shows the Authorization header with the access token in the JWT format.


 


Swagger UI - OAuth2 Implicit Auth - Result


 


That JWT is now decoded and verified for further processing.


 


OpenID Connect Auth Flow


 


Finally, let’s use the OpenID Connect auth flow. OpenApiSecurityAttribute(…) contains the following definitions (line #6-9).

  • Type: SecuritySchemeType.OpenIdConnect

  • OpenIdConnectUrl: https://login.microsoftonline.com/{tenant_id}/v2.0/.well-known/openid-configuration

  • OpenIdConnectScopes: openid,profile


 



 


The {tenant_id} value, of course, should be replaced with the real tenant ID. With this OpenID Connect URL, it automatically discovers the OAuth2 auth flows. Then, set the auth level to Anonymous (line #12).


 


public static class OpenIDConnectAuthFlowHttpTrigger
{
    [FunctionName(nameof(OpenIDConnectAuthFlowHttpTrigger))]
    [OpenApiOperation(operationId: "openidconnect", tags: new[] { "oidc" }, Summary = "OpenID Connect auth flows", Description = "This shows the OpenID Connect auth flows", Visibility = OpenApiVisibilityType.Important)]

    [OpenApiSecurity("oidc_auth",
        SecuritySchemeType.OpenIdConnect,
        OpenIdConnectUrl = "https://login.microsoftonline.com/{tenant_id}/v2.0/.well-known/openid-configuration",
        OpenIdConnectScopes = "openid,profile")]

    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(IEnumerable), Summary = "successful operation", Description = "successful operation")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "GET", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        var headers = req.Headers.ToDictionary(p => p.Key, p => (string) p.Value);
        var handler = new JwtSecurityTokenHandler();
        var token = handler.ReadJwtToken(headers["Authorization"].Split(' ').Last());
        var claims = token.Claims.Select(p => p.ToString());
        var content = new { headers = headers, claims = claims };

        var result = new OkObjectResult(content);

        return await Task.FromResult(result).ConfigureAwait(false);
    }
}


 


Run the function app and find the Swagger UI page.


 


Swagger UI - OpenID Connect Auth


 


Unlike the other auth flows, the OpenID Connect auth flow shows two methods. The first is the authorization code flow, and the other is the implicit flow. Let’s use the second one and enter the client ID value. It will redirect you to Azure Active Directory to sign in and give you the access token.


 


Swagger UI - OpenID Connect Auth - Details


 


Once you execute the endpoint, the access token is passed through the Authorization header in the JWT format.


 


Swagger UI - OpenID Connect Auth - Result


 


Decode and validate the token for further processing.


 




 


So far, we’ve covered six different ways to authenticate the HTTP trigger endpoints with the OpenAPI extension. These are the most commonly used approaches, so pick whichever one fits your scenario and implement it.


 


This article was originally published on Dev Kimchi.

Meet a recent Microsoft Learn Student Ambassador graduate: Vivekkumar Parmar


This series highlights Microsoft Learn Student Ambassadors who achieved the Gold milestone and have recently graduated from university. Each blog features a different student and highlights their accomplishments, their experience with the Student Ambassadors community, and what they’re up to now.


 


Today we’d like to introduce Vivekkumar Parmar who is from India and recently graduated from Dharmsinh Desai University.


 


Gold Student Ambassador: Vivekkumar Parmar


 


Responses have been edited for clarity and length.


 


When you first joined the Student Ambassadors community in 2018, did you have specific goals you wanted to reach, such as a particular skill or quality?  What were they?  Did you achieve them? How has the community impacted you in general? 

When I joined, I didn’t have any prior experience hosting events and giving seminars on technical topics, but I was very excited to share my knowledge with fellow students. What I saw and experienced at my campus was that students were building amazing projects as part of a curriculum, for self-learning, or as a hobby, but once development was completed, the code would sit on a code repository hosting service like GitHub with no practical usage. If it had been deployed on a cloud service platform and made available to anyone on the internet, that same project could have been useful for solving real-life problems. So after I became a Student Ambassador, I decided to make the cloud my primary area of focus and conducted various workshops and seminars on Azure. In order to host events on Azure, I had to first learn the concepts, so I explored this domain in depth and gained a thorough understanding, which has helped me in my career too.


 


What is an accomplishment that you’re the proudest of and why?


 


I’m proudest of the very first event that I hosted in February 2019 at my university, “Introduction to Cloud Services”. Since I didn’t have any prior experience in event management and delivering speeches, with the help of my classmates (shout out to them – Kaushal, Hardik, Sameer, Utsav & Virat), we planned and successfully organized the event which was attended by 80+ students.


 


Now why is that event very special to me? That event planted the seeds of cloud at my university. At that time, most of the students on my campus were not familiar with cloud technologies. There were a few technical clubs, but I don’t remember any event being hosted around the cloud domain. With my event, Azure was introduced in our university. It was a half day event.  We started with introducing the cloud, Azure, and its services and also had a hands-on workshop at the end. [Thanks to Microsoft for those Azure student credits and Subway meal support!!] Students were really amazed to see various services offered by Azure and how they could implement it in their college projects. After attending that event, a few students took a deep dive into Azure.  I used to receive messages from them regarding their queries, and I felt very proud to see them using various Azure services in their projects.  


 


What are you doing now that you’ve graduated?



I’m passionate about cloud computing and community building. Currently, I’m mentoring a few tech communities of my university and am planning to take a few Azure certification exams in the coming months.  I would like to start my career as a DevOps engineer and later become a developer advocate/evangelist.  I would love to continue to speak occasionally on Azure, Cloud, DevOps, etc. to share my knowledge.
 


If you could redo your time with the Student Ambassadors community, is there anything you would have done differently?



Yes.  I wasn’t able to interact much with Student Ambassadors from other regions. I would have loved to collaborate with Student Ambassadors from all over the world and host global hackathons and initiate projects with them.


 


If you were to describe the community to a student who is interested in joining, what would you say about it to convince him or her to join?  

Rather than a program, I would describe Microsoft Learn Student Ambassadors as a family, a family of like-minded, amazing people from all around the world who are passionate about technology and always excited to share their knowledge and learn something new.

Being part of this community will give you an immense opportunity to represent yourself, your local community, and your university at a global level. It’ll help you grow and strengthen your skill set. And most importantly, you’ll create valuable memories that will stay with you for a very long period!  


 


What advice would you give to new Student Ambassadors?


 


If you’ve recently joined this community, then first of all, congratulations and welcome! Here are few words of advice from my own experience:



  • Don’t be shy. Interact as much as you can with fellow students at your college, other Student Ambassadors from across the globe, and the program team. You’ll learn many things just by communicating with them.

  • “Every expert was once a beginner.” It’s never too late to start something new. You may face challenges and obstacles in the beginning, but don’t be afraid. Have faith in yourself. Members of our community are very helpful. You can reach out to anyone for help, and I’m sure that you’ll definitely find the solution.

  • Participate in various events and activities going on in Teams [Editor’s note: this is the platform through which the Student Ambassadors and the program team communicate and collaborate] and hackathons, etc. as per your interests. Acquire knowledge and do some mini-projects to get hands-on experience and later share it with others by hosting events, writing blogs, or creating video tutorials. Microsoft Learn is one of the best platforms on which to gain a deep understanding of Microsoft technologies.

  • Don’t just host events for the sake of doing it or achieving milestones. Enjoy the experience and live the moment, even if only one attendee out of all the participants does something extra after attending your event, like exploring a particular technology in depth, building a project around it, or winning hackathons using any of those technologies. That feeling of happiness and pride will put you on cloud nine. Trust me!


 


 


Do you have a motto in life, a guiding principle that drives you?

“Learning is a lifelong process”, so keep exploring something new every day.


 


What is one random fact few people know about you?

Since my early teen years, I wanted to have martial arts training because it helps you build focus, discipline, and mental and physical strength. But unfortunately, I never got an opportunity. Hopefully someday if I get the time and chance to pursue it, then I’m up for it!


 


Best of luck to you in the future, Vivekkumar!


 

Get started with Azure Purview in less than 5 minutes


Getting started with Azure Purview for data governance is quick and easy. First, if you don’t already have an Azure account, get instant access and $200 of credit to try Azure Purview by signing up for a free account.


 


 


CindyNa_1-1633118649326.png


 


After you create an Azure account, sign into the Azure portal and search for Purview accounts.



Then select Create to start creating an Azure Purview account. Note that you can create only one Azure Purview account at a time.



In the Basics tab, select an existing Resource group or create a new one.



Now, enter a name for your Azure Purview account. Note that spaces and symbols are not allowed.



Next, choose your Location.



Finally, select the Review + Create button, then the Create button. Your Azure Purview account will be ready in a few minutes!
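The portal steps above can also be scripted. Here is a minimal sketch using the Azure CLI, assuming you are already signed in with `az login` and have selected a subscription; the resource group name, account name, and region below are placeholder values, not names from this walkthrough:

```shell
# Sketch only: rg-purview-demo, purviewdemo123, and westus2 are example values.
# Remember that Purview account names cannot contain spaces or symbols.
az group create --name rg-purview-demo --location westus2

# Create the Azure Purview account; the 'purview' CLI extension
# is installed on first use if it is not already present.
az purview account create \
  --resource-group rg-purview-demo \
  --name purviewdemo123 \
  --location westus2
```

As in the portal, provisioning takes a few minutes; `az purview account show --resource-group rg-purview-demo --name purviewdemo123` reports the provisioning state.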



Once you’ve launched an Azure Purview account, be sure to first visit the Knowledge center in the Azure Purview Studio.


The Knowledge center can be accessed via the home page.



Here, you can watch videos to learn more about the capabilities of Azure Purview, read blog posts about the latest product announcements, and do tutorials to get started with registering and scanning new data sources. Learn how to set up a business glossary and take a tour of the Azure Purview Studio to familiarize yourself with key features.



Try Azure Purview today

Integration guidance helps partners deliver Zero Trust solutions

This article is contributed. See the original author and article here.

Building a great product means listening to what our customers need, and we’ve heard loud and clear from our customers that Zero Trust adoption is more important than ever. In the 2021 Zero Trust Adoption Report, we learned that 96% of security decision-makers state that Zero Trust is critical to their organization’s success, and 76% of organizations have at least started implementing a Zero Trust strategy. Over the next couple of years, Zero Trust strategy is expected to remain the top security priority, and organizations anticipate increasing their investment.


 


Zero Trust adoption has been accelerated by the U.S. government as well. In May 2021, the White House signed an executive order calling for improvement to the nation’s cybersecurity, including advancing towards a Zero Trust architecture. More recently, the Office of Management and Budget released a draft federal strategy for moving towards Zero Trust architecture, with key goals to be achieved by 2024. Microsoft has published customer guidance and resources for meeting Executive Order objectives.


 


These government and industry imperatives create a huge opportunity for Microsoft and our partners to enhance support for our customers as they move towards an end-to-end Zero Trust security posture. At Microsoft, we strive to make it easy for partners, such as independent software vendors, to integrate with us so customers can easily adopt the most comprehensive security solutions. We recognize that customers take varied paths on their journey to Zero Trust and have multiple security solutions in their environment. When we work together to meet these needs, we build stronger protections for our companies and nations.   


 


To support partner integration and Zero Trust readiness, we recently released partner integration guidance at our Zero Trust Guidance Center. This guidance is organized across the pillars of Zero Trust and supports integrations across a wide variety of products and partners.



 


We applaud those who are embracing a Zero Trust approach to security. We will close with two examples of how ISV partners F5 and Yubico have benefited from this integration guidance in the Zero Trust Guidance Center.


 


F5 and Microsoft rescue a county from malware 


 



Many companies rely on line-of-business applications that were developed before the adoption of the latest authentication protocols, such as SAML and OIDC. This means organizations must manage multiple ways to authenticate users, which complicates the user experience and increases costs.


BIG-IP Access Policy Manager (APM) is F5’s access management proxy solution that centralizes access to apps, APIs, and data. BIG-IP APM integrates with Microsoft Azure AD to provide conditional access to the BIG-IP APM user interface.


 


Last year, Durham County enhanced security across a hybrid environment with Azure AD and F5 BIG-IP APM in the wake of a serious cybersecurity incident. F5 BIG-IP APM gave employees the unified solution they needed to access legacy on-premises apps. F5 used Azure AD to apply security controls to all their apps, enforce multifactor authentication, and apply fine-tuned policies based on factors such as employee sign-in location. In addition, self-service password reset powered by the solution reduced help desk calls for passwords by 80%.


 


 



Government of Nunavut turns to Yubico and Microsoft to build phishing resistance following ransomware attack


 



In 2019, the Government of Nunavut in Canada experienced a spear-phishing attack that took down critical IT resources for the territory. In the wake of the attack, protecting identities and applications became a top priority.


 


Together, Azure AD and YubiKey offered a solution that upgraded the Government of Nunavut’s security and fit its unique needs. The Government of Nunavut wanted to implement a phishing-resistant authentication solution. In addition, its agencies used a variety of Windows-based systems and, because of their remote locations, had inconsistent network access. To address these needs, they adopted YubiKeys: hardware security keys that support multifactor authentication with no network connection, power source, or client software required. You can read the full story from Yubico and learn more from the video below.


 


 


Learn more


We are incredibly proud of the work our partners are doing to provide customers with critical cybersecurity solutions using the principles of Zero Trust. Check out our newly published partner integration guidance for Zero Trust readiness to learn more about opportunities.

Learn more about Microsoft identity: