Introducing the Surface API Management Service


Our goal is to support our customers’ needs, wherever they might be. To this end, we’re introducing the Surface API Management Service, aimed at simplifying how you access information about your devices, coverage and insights.


 


Extend Surface Management Portal


This service builds on the familiar Surface Management Portal, enabling IT admins to directly access Microsoft coverage and entitlement information for their devices via API endpoints. It’s a practical enhancement for those already using the portal, designed to make device management more straightforward.


 


The service at a glance


The Surface API Management Service gives you access to APIs that return information about your Surface devices.


 


These APIs give customers a direct way to streamline their asset management processes. By connecting to the API, IT admins get immediate access to critical device and warranty information, essential for effective asset oversight. Built specifically for Surface customers, the integration removes the need for convoluted data-gathering methods: organizations can quickly retrieve the details they need to make informed decisions and optimize their asset management strategies.


 


Our first release lets current Surface Management Portal users retrieve Microsoft coverage and entitlement information for their devices directly through API endpoints. In the portal, you can also review the performance and usage of the APIs across the globe with Surface API Management Service reports.


 




 


Get access to the Surface API Management Service


To get access, you need an active Surface Management Portal account and a completed customer validation check. (If you can create service requests within the Surface Management Portal, you have already been approved.)


 









Email your request to surfaceapimanagement@microsoft.com


Subject: “Requesting Access to Surface API Management Service”


Include the following info:



  • Company Name

  • Tenant ID

  • Tenant primary domain (e.g. contoso.onmicrosoft.com)

  • Application (client) ID*

  • Estimated quantity of Intune-registered Surface devices in your organization



 


Get started


To get started using the APIs, see the Readme on GitHub.
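
For illustration only (the real endpoints, scopes, and response shapes are defined in the GitHub Readme, and every angle-bracketed value below is a placeholder), a client-credentials call from .NET might look roughly like this, using the Application (client) ID you provide during onboarding:

// Illustrative sketch only: acquire a token with MSAL and call a placeholder endpoint.
// The authority, scope, and URL below are placeholders; use the values in the Readme.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using Microsoft.Identity.Client;

var app = ConfidentialClientApplicationBuilder
    .Create("<application-client-id>")                              // from your app registration
    .WithClientSecret("<client-secret>")
    .WithAuthority("https://login.microsoftonline.com/<tenant-id>")
    .Build();

// Placeholder scope; use the scope documented for the service.
var token = await app.AcquireTokenForClient(new[] { "<surface-api-scope>/.default" })
                     .ExecuteAsync();

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", token.AccessToken);

// Placeholder endpoint illustrating a coverage/entitlement lookup for one device.
var response = await http.GetAsync("https://<service-host>/devices/<serial-number>/coverage");
Console.WriteLine(await response.Content.ReadAsStringAsync());

Assuming the service uses Microsoft Entra authentication (which the Tenant ID and Application (client) ID requirements above suggest), the same pattern applies from any language or tool that can obtain a token and send it as a bearer header.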

Securing Azure AD B2C API Connector (Function App) without Error


I was recently working with a customer who uses an Azure AD B2C API connector to enrich tokens with claims from external sources, with an Azure Function App as the external source. As this setup demands, they had exposed the Function App over a public IP so that B2C could reach it. But due to an enterprise security policy, they had to remove the public endpoint from the Function App and use private endpoints into their VNet.
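
For context, the API connector's external source is just an HTTPS endpoint that returns additional claims as JSON. A minimal sketch of such a function (in-process model; this is not the customer's actual code, and the returned claim is a placeholder) could look like this:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using System.Threading.Tasks;

public static class EnrichClaims
{
    // B2C calls this endpoint during the user flow; the JSON returned here is
    // merged into the token. The returned claim is a placeholder.
    [FunctionName("EnrichClaims")]
    public static Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        var result = new
        {
            version = "1.0.0",
            action = "Continue",
            postalCode = "12345"   // placeholder enriched claim
        };
        return Task.FromResult<IActionResult>(new OkObjectResult(result));
    }
}

When the response uses the Continue action, B2C merges the returned claims into the token issued by the user flow.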


 


They considered two options to expose the Function App securely over the internet: an Azure API Management (APIM) instance deployed into a virtual network in external mode, or Azure Application Gateway. But in both cases, the B2C authentication process errors out after the API connector is added to the user flow.


 




 




 


Initially I investigated the error messages collected at the B2C end and at the APIM or Azure Application Gateway end. But I later realized the main source of the problem lies somewhere else: the ASP.NET Core framework used to build the Function App.


 


We need to modify the default ForwardedHeaders middleware settings. Otherwise, the middleware ignores the X-Forwarded-* headers sent by APIM or Application Gateway, because the sender isn't in the list of KnownProxies and KnownNetworks. The ASP.NET Core documentation on the Forwarded Headers middleware explains the concept in more detail.


 



 


So I made the following changes:


 


1. Added the ASPNETCORE_FORWARDEDHEADERS_ENABLED application setting (set to true) to my Function App configuration.


 




 


2. Added a Startup.cs file to my function app code:


 


using System.Collections.Generic;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(TestAPIFunctionApp.Startup))]

namespace TestAPIFunctionApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.Configure<ForwardedHeadersOptions>(options =>
            {
                options.ForwardedHeaders = ForwardedHeaders.XForwardedFor |
                                           ForwardedHeaders.XForwardedProto |
                                           ForwardedHeaders.XForwardedHost;

                // Clear the defaults so forwarded headers are not rejected;
                // restrict who is trusted via AllowedHosts below instead.
                options.KnownNetworks.Clear();
                options.KnownProxies.Clear();

                // Put your Front Door, Application Gateway, APIM, or b2clogin FQDN here,
                // and any other hosts that will send headers you want respected.
                options.AllowedHosts = new List<string> { ".azurewebsites.net", ".b2clogin.com", ".azure-api.net" };
            });
        }
    }
}


 


That solves our problem. We can now see the "augmented claims" returned by the API connector in the user flow.


 



General availability: Elastic Jobs in Azure SQL Database


We are excited to announce the general availability (GA) of Elastic Jobs for Azure SQL Database.


 


Elastic Jobs is a fully integrated Azure SQL Database service that allows you to automate and manage administrative tasks across multiple SQL databases in a secure, scalable way. It can run one or more T-SQL job scripts in parallel using the Azure portal, PowerShell, REST, or T-SQL APIs. Jobs can run on a schedule or on demand, targeting any tier of Azure SQL Database. Job targets can include all databases in a server or in an elastic pool, databases across multiple servers, and even databases across different subscriptions and geographic regions on Azure. Servers and pools are dynamically enumerated at runtime, so jobs run against all databases that exist in the target group at the time of execution.


 




Where can you use Elastic Jobs?


Any database administration or management task that can be scripted in T-SQL is a good candidate for Elastic Jobs. Example scenarios include:



  • Automate management tasks such as deploying schema changes, rebuilding indexes, and collecting performance/telemetry data.

  • Run queries and collect results across a collection of databases on a recurring basis.

  • Aggregate and collect data for processing and reporting.

  • Data movement and ETL processing to extract, process, and insert data between tables in a database.

  • …to name a few.


What are some significant capabilities of Elastic Jobs?


Elastic Jobs makes it easy and secure to manage a large number of SQL databases across Azure. Significant security and management capabilities of Elastic Jobs include:



  • Microsoft Entra ID (formerly Azure Active Directory) support for central administration of authentication and permissions

  • Service-managed Private Link support to securely connect to target databases.

  • Integration with Azure Alerts for job execution status notification. 

  • Easily scale the job agent's tier to connect to hundreds of target databases concurrently across Azure.

  • Dynamic enumeration of target databases in target servers and elastic pools.

  • Jobs can be composed of multiple steps to customize the execution sequence.

  • All functionality can be accessed through the portal, PowerShell, T-SQL, and REST APIs.


How do you set up and use Elastic Jobs?


Setting up and using Elastic Jobs is simple, as described here.


 



  • Job Agent and Job database creation


Creating a job agent and its associated job database in the portal is similar to creating a SQL database. As part of job agent creation, you can choose its service tier and add a user-assigned managed identity for Entra authentication. Once the job agent is created, its portal page gives easy access to all of its capabilities.


 




 



  • Defining jobs and their target groups, and monitoring them, through the job agent's portal page.


Jobs and job steps can be defined, edited, and executed through the portal page. Jobs can also be scheduled to run at regular intervals, and their execution can be monitored.




 



  • Advanced security functionality, alert notifications, and scaling are also easily accessed through the job agent's portal page.


The job agent's Entra identity can be changed, and private links to target databases can be established, easily through the portal page. Azure Alerts can be defined to receive notifications on job execution status. Scaling the job agent's compute tier so that it can connect to hundreds of target databases concurrently is also easy through the portal page links.


 




 


The steps described above for creating, configuring, and managing elastic jobs can also be accomplished using the PowerShell, REST, and T-SQL APIs.
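
As a minimal sketch of the T-SQL route (not from the article; it assumes the job agent uses Microsoft Entra authentication with a user-assigned managed identity, and all server, database, group, and job names are placeholders), the following .NET snippet runs the documented jobs.sp_* stored procedures against the job database to create a target group, define an index-maintenance job, and start it:

// Minimal sketch: create a target group and a job using the Elastic Jobs T-SQL
// stored procedures, executed from .NET against the job database.
using Microsoft.Data.SqlClient;

const string jobDbConnectionString =
    "Server=tcp:<job-agent-server>.database.windows.net;Database=<job-database>;Authentication=Active Directory Default;";

const string setupScript = @"
EXEC jobs.sp_add_target_group @target_group_name = 'DemoServerGroup';

EXEC jobs.sp_add_target_group_member
     @target_group_name = 'DemoServerGroup',
     @target_type = 'SqlServer',
     @server_name = '<target-server>.database.windows.net';

EXEC jobs.sp_add_job
     @job_name = 'RebuildIndexes',
     @description = 'Index maintenance across all databases in the target group';

EXEC jobs.sp_add_jobstep
     @job_name = 'RebuildIndexes',
     @command = N'ALTER INDEX ALL ON dbo.Orders REBUILD;',
     @target_group_name = 'DemoServerGroup';

EXEC jobs.sp_start_job @job_name = 'RebuildIndexes';";

using var connection = new SqlConnection(jobDbConnectionString);
connection.Open();
using var command = new SqlCommand(setupScript, connection);
command.ExecuteNonQuery();

The same EXEC statements can, of course, be run directly in the job database from any T-SQL client, and equivalent commands exist in the PowerShell and REST APIs.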


 


Pricing


Billing for job agents will start at GA, on April 11, 2024. Billing cards in the Azure portal will show the estimated cost based on the provisioned job agent tier.


  *Billing for job agents in national clouds is expected to start a little later, depending on the billing pipeline deployment in those regions.

 


Regional availability 


Elastic Jobs is available in all regions where Azure SQL Database is available.


 


Resources 


To get started, see Elastic Database Jobs in our documentation and follow the tutorials.


 

SharePoint at the Microsoft 365 Community Conference


If you don't Share, what's the Point! The community motto, "Sharing is caring," is in full swing, and there's a ton of sharing in preparation. We, the SharePoint team, want to invite you.


 


Join us in Orlando, Florida, for the biggest Microsoft 365 Community Conference to date | April 30 – May 2, 2024. Microsoft is sending over 175 product makers to share and discuss innovation and real-world solutions across keynotes, sessions, and pre/post-event deep-dive workshops to build your expertise.


 


Register today | Note: Use the MSCMTY discount code to save $100 USD.


 


Below is a subset of the event content, so you know what to expect to hear and see from the SharePoint team. Expect clarity on what SharePoint is today AND the direction of content management and communications for the future. To see all that Microsoft is planning for the event, please review our Microsoft 365 Community Conference event guide.


 


Join in: The Microsoft 365 Community Conference in Orlando, FL | April 30 – May 2, 2024 | aka.ms/M365Conf24 – Sponsored by Microsoft.


 


SharePoint content at the Microsoft 365 Community Conference


Join us to learn how AI-powered content management in Microsoft 365 enables content intelligence, optimizes critical business processes, improves governance, and prepares your content for Copilot. Below is a subset of content related to SharePoint:  


 



  • Opening keynote | “The Age of Copilots” with Jeff Teper (President of Collaboration Apps and Platforms) | Tuesday, April 30th, 8:00am – 9:30am EDT





 


Register today | Note: Use the MSCMTY discount code to save $100 USD.


 


In addition to our main sessions and content, expect a lot of community time and networking with executives and product makers in the Expo Hall. Microsoft will have a booth with a stage for lightning talks and meet-and-greets throughout the week, plus day and evening activities, including the main attendee party at Universal Islands of Adventure, and a variety of roundtable discussions with our researchers and product team members to listen and help shape future product/feature direction.


 


We asked three Microsoft technology and event experts, @Sharon Weaver, @Sean Bugler, and Derek Cash Peterson, to share their in-person tips and tricks so you can best prepare for an awesome Microsoft 365 Community Conference experience:


 


 


Join in! The Microsoft 365 Community Conference in Orlando, FL | April 30 – May 2, 2024 | aka.ms/M365Conf24 – Sponsored by Microsoft.


 



  • What: Microsoft 365 Community Conference 2024


    • Register today | Note: Use the MSCMTY discount code to save $100 USD.


  • Content: Microsoft keynote + 1 AMA | 150+ overall sessions, 88 of them Microsoft-led (see all below in product-area buckets) | 18 full-day workshops (pre-day and post-day), 4 supported by Microsoft


  • When & where: April 30 – May 2, 2024


    • In-person: Orlando, Florida – Swan & Dolphin Resort – Disneyworld

    • Workshops: April 28, 29, and May 3, 2024


  • Twitter & hashtag: @M365CONF | #M365Con

  • Cost: $1,899 – full conference (includes 3 continental breakfasts, 3 lunches, a T-shirt, and a backpack; additional cost for full-day workshops)


 


Thank you, Mark Kashman, Senior product manager – Microsoft


 



Is Azure the right place to run Red Hat Enterprise Linux workloads?


Ensure peak performance, security and compatibility with Azure for Red Hat Enterprise Linux. Leverage Azure Migrate to transition on-prem Linux VMs to Azure, for cloud-native or hybrid deployment. Deploy and orchestrate infrastructure with Azure Resource Manager templates, Terraform, and Ansible playbooks. Uncover cost-saving opportunities and performance optimization tools, and benefit from license portability, commitment-based discounts, and diverse compute options, including Azure Confidential Computing VMs, for enhanced scalability and efficiency. Experience flexibility with Azure, enabling RHEL workloads to run across global regions and edge locations, with Azure Arc providing centralized management and security for hybrid environments.


 




 


Join Azure expert, Matt McSpirit, as he shares why Azure is the right place to run your Red Hat Enterprise Linux workloads.


 


 


Transition on-prem Linux VMs to Azure


 




Leverage Azure Migrate for cloud-native or hybrid deployment. Check out Azure’s seamless integration for your Red Hat Enterprise Linux workloads.


 


 


Optimize spend with RHEL workloads on Azure.


 




Take advantage of the latest cloud tech, license portability, and commitment-based discounts. Click to watch.


 


 


Azure provides flexibility for RHEL workloads.


 




Deploy RHEL workloads across global regions, edge locations, and hybrid environments, ensuring consistency and integration with other Azure services. See it here.


 


 


Watch our video here: 


 


 







QUICK LINKS:


00:00 — Why run Red Hat Enterprise Linux workloads on Azure?
01:10 — Integration
01:41 — Automated scripting or code-based options
02:09 — Beyond provisioning
02:31 — Customer support
03:07 — Efficiency: optimize your spend
04:28 — Increase performance and scalability
05:41 — Flexibility
06:26 — Update management
06:40 — Wrap Up


 


 


Link References:


See the Forrester Consulting study at https://aka.ms/RHELTEI


For additional information check out https://aka.ms/RedHatAzure


 


 


Unfamiliar with Microsoft Mechanics?


Microsoft Mechanics is Microsoft's official video series for IT. You can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.



 


 


To keep getting this insider knowledge, join us on social:











Video Transcript:


-Is Azure the right place to run your Red Hat Enterprise Linux workloads? Well, let’s start by addressing the question at the heart of it all, and that’s how well does Azure support Linux as an operating system overall? Especially considering Microsoft’s own strength with Windows Server workloads? 


 


-Well, it may surprise you to know that the majority, more than 60% of customer cores in Azure, run Linux workloads. Azure’s service platform and its hypervisor layer, in fact, is fully optimized for Linux, to ensure peak performance, security and compatibility. 


 


-Linux is a first class operating system in Azure, with many of Microsoft’s most popular cloud services, like the Azure OpenAI service, Azure Kubernetes Service, App Service, Cosmos DB, Postgres and more, all running on it. And if we look specifically at Red Hat Enterprise Linux workloads, there’s even more synergy with Azure. 95% of the Fortune 500 companies use Azure. And 90% of the Fortune Global 500 rely on Red Hat. Meaning the majority of the world’s largest companies run on both. Azure, in fact, is deeply integrated with Red Hat technologies and services. 


 


-Including tools for the migration, management, and modernization of your RHEL workloads. And today, we’ll focus on the three areas that make Azure a great choice to run them: Integration, efficiency, and flexibility. Starting with integration. If you’re not in the cloud yet, Azure Migrate tooling can help you bring your running on-premises Linux VMs fully configured to Azure to run natively in the cloud, or run hybrid together with your on-premises or edge site infrastructure. 


 


-Then, when it comes to provisioning RHEL virtual machines, Azure has a huge gallery of pre-configured images, which are maintained to always include the latest updates. And beyond the portal experience, you also have the choice of automated scripting or code-based options. You can leverage Azure Resource Manager templates to define your infrastructure using JSON code. Use Terraform to scale your RHEL workloads. Take advantage of the numerous Ansible playbooks on the Ansible Automation Platform for the orchestration and automation of your workloads. 


 


-Or choose from a selection of fully automated marketplace solutions, with all the components needed to build complex, multi-tiered apps in minutes. That said, the integration with RHEL goes beyond provisioning. Azure is also integrated with other Red Hat tools and solutions that you may already be using today, to orchestrate, run, update, and monitor your RHEL workloads on Azure. In addition to the Ansible Automation Platform, the list includes Red Hat OpenShift, and the JBoss Enterprise Application Platform. 


 


-And these are fully managed services. Then, importantly, if you have an issue, as you work with your support team, both Azure and Red Hat support engineers can work together with your approval. In fact, these engineers are often co-located, and their customer service systems are integrated to triage issues in the least amount of time by enabling them to collaborate efficiently on your case, as they troubleshoot the problem and work towards a solution. Importantly, they’ll coordinate the response back to you. 


 


-And will work closely with you to close the joint ticket once it’s been resolved. Azure is unique in that it’s the only cloud with this level of integrated support and tooling with Red Hat. So now let’s move on to the second area, efficiency. Here, Red Hat recently commissioned a study by Forrester Consulting on the total economic impact of Red Hat Enterprise Linux running on Azure. Customers that moved their on-premises RHEL workloads onto Azure achieved 192% return on investment over three years. 


 


-You can find the full study at aka.ms/RHELTEI. RHEL workloads on Azure can also take advantage of the latest cloud tech, license portability, and commitment-based discounts to optimize your spend. For example, for workloads that can handle interruptions, such as batch processing and DevTest, you can take advantage of spare Azure compute capacity by deploying Azure Spot VMs for RHEL, as they become available. And save up to 90% compared to pay-as-you-go rates. 


 


-Then, for more critical or stateful applications, as you create virtual machines, you can either use Red Hat subscription services from Azure, or bring your own subscription at no additional cost by selecting the Azure hybrid benefit for RHEL. And that will remove the Azure RHEL subscription. At that point, you can apply your own Red Hat subscription without requiring any downtime or a reboot. And to save more, you can use commitment-based discounts as well. Azure reserved instances can save you up to 76% in costs with a three year term. 


 


-Now running your RHEL workloads in Azure not only can save costs, but can also increase their performance and scalability. There are an extensive range of compute options for RHEL workloads, from general purpose to memory optimized, compute optimized, storage optimized, and GPU enabled VMs. Azure Confidential Computing VMs work with Red Hat Enterprise Linux. And these use a trusted execution environment to extend encryption protections to your sensitive data while it’s in use. And we’re working with Red Hat engineers to bring this to containers as well. 


 


-Beyond virtual machines, you can also run RHEL workloads on physical hardware with Azure Large Instances, which can use dedicated servers. These are ideal for workloads that need high performance, isolation, or special compliance requirements, like SAP HANA, Oracle, or IBM DB2 workloads. That said, to remove management overhead, you can also run RHEL with the JBoss EAP. And you’ll find multiple options to do that at almost any scale in the marketplace. 


 


-And using JBoss with the Azure App Service Instance delivers a fully managed platform for web and API based applications. With its underlying Docker containers and built-in capabilities, including autoscaling, security controls, and more, for operating your RHEL workloads. Which brings us onto our third area for operating your RHEL workloads, flexibility. 


 


-With Azure, you can run RHEL workloads where you want, how you want, and when you want, without any lock-in or limitations. RHEL workloads can run on Azure across dozens of global regions, zones, and edge locations. If you run RHEL workloads on-prem in your data center, on edge sites, or in any other cloud, Azure Arc lets you manage and secure them from any of these locations directly in Azure. 


 


-In fact, for your hybrid workloads, Azure Arc gives you a single control plane for policy enforcement, security, update management, monitoring and more. And has been thoroughly tested and validated with Red Hat. And it can include your other Linux Distros as well as Windows Server. And they’ll integrate with other Azure native services.


 


-Next, to help with update management, you have full control over how RHEL updates are managed using your existing or preferred tools. Like the Ansible Automation Platform, which can initiate via the command line. Or you can use Azure Update Manager to control updates from the Azure portal. And with that, I hope I’ve shared enough to show you why Azure is a great place to run your Red Hat Enterprise Linux workloads with the security, performance, and compatibility that you need. 


 


-Whether you want to migrate your existing Linux workloads to Azure, or build new cloud native apps from scratch, and if you prefer, run them hybrid, Azure offers unparalleled support, and opens up a breadth of opportunities to do more. For additional information, check out aka.ms/RedHatAzure. Keep checking back to Microsoft Mechanics for the latest tech updates. And thanks for watching.