Leveling Up Static Web Apps With the CLI


With the Azure Static Web Apps GA, there was a sneaky little project that my colleague Wassim Chegham dropped: the Static Web Apps CLI.


The SWA CLI is a tool he’s been building for a while with the aim of making local development easier, especially if you want to build an authenticated experience. I’ve been helping out on making sure it works on Windows and for Blazor/.NET apps.


It works by running as a proxy server in front of the web and API components, giving you a single endpoint through which you access the site, much like when it’s deployed to Azure. It will also inject a mock auth token if you want to create an authenticated experience, and it enforces the routing rules defined in the staticwebapp.config.json file. By default, it’ll want to serve static content from a folder, but my preference is to proxy the dev server from create-react-app, so I can get hot reloading and the like working. Let’s take a look at how we can do that.


Using the CLI with VS Code


With VS Code being my editor of choice, I wanted to work out the best way to work with it and the SWA CLI, so I can run a task and have everything started. But as I prefer to use the CLI as a proxy, this really requires me to run three tasks: one for the web app, one for the API, and one for the CLI.


So, let’s start creating a tasks.json file:


 

{
    "version": "2.0.0",
    "tasks": [
        {
            "type": "npm",
            "script": "start",
            "label": "npm: start",
            "detail": "react-scripts start",
            "isBackground": true
        },
        {
            "type": "npm",
            "script": "start",
            "path": "api/",
            "label": "npm: start - api",
            "detail": "npm-run-all --parallel start:host watch",
            "isBackground": true
        },
        {
            "type": "shell",
            "command": "swa start http://localhost:3000 --api http://localhost:7071",
            "dependsOn": ["npm: start", "npm: start - api"],
            "label": "swa start",
            "problemMatcher": [],
            "dependsOrder": "parallel"
        }
    ]
}

 


The first two tasks will run npm start against the respective parts of the app, and you can see from the detail field what they are running. Both of these will run in the background of the shell (we don’t need them popping up to the foreground), but there’s a catch: they are persistent commands, commands that don’t end, and that’s a problem.


When we want to run swa start, it’ll kick off the two other tasks but using dependent tasks in VS Code means it will wait until the task(s) in the dependsOn are completed. Now, this is fine if you run a task that has an end (like tsc), but if you’ve got a watch going (tsc -w), well, it’s not ending and the parent task can’t start.


Unblocking blocking processes


We need to run two blocking processes but trick VS Code into thinking they are completed so we can run the CLI. It turns out we can do that by customising the problemMatcher part of our task with a background section. The important part here is defining some endsPattern regexes. Let’s start with the web app, which in this case is going to be using create-react-app, and the last message it prints once the server is up and running is:



To create a production build, use npm run build.



Great, we’ll look for that in the output, and if it’s found, treat the command as done.


The API is a little trickier though, as it’s running two commands, func start and tsc -w, and it’s doing that in parallel, making our output stream a bit messy. We’re mostly interested in when the Azure Functions host has started up, and if we look at the output, the easiest message to match with a regex is probably:



For detailed output, run func with --verbose flag.



It’s not the last thing that’s output, but it’s close to the end and appears after the Functions are running, so that’ll do.


Now that we know what to look for, let’s configure the problem matcher.


Updating our problem matchers


To do this, we’re going to need to add a problemMatcher section to the task, and it’ll need to implement a full problemMatcher. Here’s the updated task for the web app:


 

{
    "type": "npm",
    "script": "start",
    "problemMatcher": {
        "owner": "custom",
        "pattern": {
            "regexp": "^([^s].*)((d+|d+,d+|d+,d+,d+,d+)):s+(error|warning|info)s+(TSd+)s*:s*(.*)$",
            "file": 1,
            "location": 2,
            "severity": 3,
            "code": 4,
            "message": 5
        },
        "fileLocation": "relative",
        "background": {
            "activeOnStart": true,
            "beginsPattern": "^.*",
            "endsPattern": "^.*To create a production build, use npm run build."
        }
    },
    "label": "npm: start",
    "detail": "react-scripts start",
    "isBackground": true
}

 


Since create-react-app doesn’t have a standard problemMatcher in VS Code (as far as I can tell, anyway), we’re going to set the owner as custom and then use the TypeScript pattern (which I shamelessly stole from the docs :rolling_on_the_floor_laughing:). You might need to tweak the regex to get the VS Code problems list to work properly, but this will do for now. With our basic problemMatcher defined, we can add a background section to it and specify the endsPattern to match the string we’re looking for. You’ll also have to provide a beginsPattern, for which I’ve been lazy and just match on anything.


Let’s do a similar thing for the API task:


 

{
    "type": "npm",
    "script": "start",
    "path": "api/",
    "problemMatcher": {
        "owner": "typescript",
        "pattern": {
            "regexp": "^([^s].*)((d+|d+,d+|d+,d+,d+,d+)):s+(error|warning|info)s+(TSd+)s*:s*(.*)$",
            "file": 1,
            "location": 2,
            "severity": 3,
            "code": 4,
            "message": 5
        },
        "background": {
            "activeOnStart": true,
            "beginsPattern": "^.*",
            "endsPattern": ".*For detailed output, run func with --verbose flag..*"
        }
    },
    "label": "npm: start - api",
    "detail": "npm-run-all --parallel start:host watch",
    "isBackground": true
}

 


Now, we can run the swa start task and everything will launch for us!


 

(Animated GIF: running the swa start task launches the web app, API, and SWA CLI together.)


Conclusion


Azure Static Web Apps just keeps getting better and better. With the CLI, it’s super easy to run a local environment and not have to worry about things like CORS, making it closer to how the deployed app operates. And combining it with these VS Code tasks means that with a few key presses you can get it up and running.


I’ve added these tasks to the GitHub repo of my Auth0 demo app from the post on using Auth0 with Static Web Apps.

Use Windows Update native experience with Configuration Manager Technical Preview 2105.2


Update 2105.2 for the Technical Preview Branch of Microsoft Endpoint Configuration Manager has been released. When installing software updates from Configuration Manager, you can now choose to use the native Windows Update interface and restart experience. The client’s Windows Update Settings page will display the updates like they appear when using Windows Update for scanning. Restarts from software updates will also behave as though you’re using Windows Update.


 


Screenshots of the Windows native software update experience


To use this feature, client devices must be running Windows Insider build 21277 or later. To enable the Windows Update native experience:



  1. From the Administration workspace, select Client Settings in Configuration Manager.

  2. Select the Computer Restart group in Client Settings

  3. For the Select the restart experience to be shown to end users setting, choose the Windows option.

  4. If needed, you can change the number of days the device is allowed to be pending a restart before it’s forced using the following setting:  Specify a deadline, the time (in days) from when a device is pending reboot until the device is forced to restart.


For more information, see Windows Update native experience for software updates. 


 


This preview release also includes:


 


Send product feedback from error windows – Previously, if the Configuration Manager console reported an error in a separate window, you had to go back to the main console window to send feedback. In some cases, this action isn’t possible with other console windows open.


 


Starting in this release, error messages include a link to Report error to Microsoft. This action opens the standard “send a frown” window to provide feedback. It automatically includes details about the user interface and the error to better help Microsoft engineers diagnose the error. Aside from making it easier to send a frown, it also lets you include the full context of the error message when you share a screenshot.


 


Custom properties for devices – Many customers have other data that’s external to Configuration Manager but useful for deployment targeting, collection building, and reporting. This data is typically non-technical in nature, not discoverable on the client, and comes from a single external source. For example, a central IT Infrastructure Library (ITIL) system or asset database, which has some of the following device attributes:



  • Physical location

  • Organizational priority

  • Category

  • Cost center

  • Department


Starting in this release, you can use the administration service to set this data on devices. You can then use the custom properties in Configuration Manager for reporting or to create collections.


 


Hardware inventory for client log settings – You can now inventory client log file settings such as log levels and size by enabling the new inventory class, Client Diagnostics (CCM_ClientDiagnostics). This behavior allows you to track settings that you change by the Client Diagnostics actions. This new inventory class isn’t enabled by default.


 


Simplified CMPivot permissions requirements – We’ve simplified the CMPivot permissions requirements. The new permissions are applicable for CMPivot standalone and CMPivot in the on-premises console. The following changes have been made:



  • CMPivot no longer requires SMS Scripts read permission

    • The SMS Provider still requires this permission if the administration service falls back to it due to a 503 (Service Unavailable) error, as seen in the CMPivot.log.



  • The default scope permission isn’t required.


Hierarchy approved console extensions don’t require signing – Starting in this technical preview, you can choose to allow unsigned hierarchy approved console extensions. You may need to allow unsigned console extensions due to an unsigned internally developed extension, or for testing your own custom extension in a lab.


 


Improvements to CMPivot – We’ve made the following improvements to CMPivot:



  • Added a Key value to the Registry entity

  • Added a new RegistryKey entity that returns all registry keys matching the given expression

  • Added maxif and minif aggregators that can be used with the summarize operator

  • Improvements to query autocomplete suggestions in the query editor


PowerShell release notes preview – These release notes summarize changes to the Configuration Manager PowerShell cmdlets in technical preview version 2105.2.


 


For more details and to view the full list of new features in this update, check out our Features in Configuration Manager technical preview version 2105.2 documentation. 


 


Update 2105.2 for Technical Preview Branch is available in the Microsoft Endpoint Configuration Manager Technical Preview console. For new installations, the 2103 baseline version of Microsoft Endpoint Configuration Manager Technical Preview Branch is available on the Microsoft Evaluation Center. Technical Preview Branch releases give you an opportunity to try out new Configuration Manager features in a test environment before they are made generally available.


 


We would love to hear your thoughts about the latest Technical Preview!  Send us feedback about product issues directly from the console and continue to share and vote on ideas about new features in Configuration Manager.


 


Thanks,


The Configuration Manager team


 


 


Configuration Manager Resources:


Documentation for Configuration Manager Technical Previews


Try the Configuration Manager Technical Preview Branch


Documentation for Configuration Manager


Configuration Manager Forums


Configuration Manager Support


 

A new, more powerful, and customizable Microsoft Bookings is here


Last March, at Ignite, we gave you a preview of the new powerful and customizable Bookings experience and starting today, we are rolling it out to everyone!


 


To turn on the new experience, toggle the switch on the top right corner of the Bookings home page.




Look for the toggle on the top right corner of the Bookings web experience


 

Once you’re in the new Bookings view, you’ll see the new experience which introduces many new capabilities in Microsoft Bookings.


 

Image showing the welcome to Bookings popup


The new Bookings has more options to customize and better controls on each staff member’s role


 


If you need to go back to the classic version, you can flip the toggle back and forth as often as you need.


Now that the preview is live, we wanted to highlight some of the key features of Bookings and what new experiences you’ll be seeing.


Compliance, Privacy and Tighter Controls


We understand that each organization is different and has varied needs for managing appointments. Bookings now has stricter administrative controls, and each user within Bookings has varied levels of control over how calendars are created, edited, and shared as well as how appointments can be booked.


Microsoft 365 Admins


Admins can now control who has access to Bookings, whether external users can book appointments, and if staff details can be shared with customers.


Admins can also control the privacy of the customers’ booking appointments and can restrict what information can be collected when making a booking, like phone numbers, email, or contact address. Additionally, they can prevent staff members from requesting customer information by blocking the creation of custom fields.


 

Image showing the new tenant administrator controls
Admins can control the information required in a booking and even block custom fields


Bookings Admins


Bookings admins have controls to ensure their organization’s compliance and privacy standards. They can restrict appointments to users within the organization and can also restrict search engine listings. Admins can also configure data usage consent with their own custom message, privacy policy link, as well as add terms and conditions information on the Bookings page.


New Roles


To ensure that the correct staff members have adequate access to Bookings’ pages, two new roles have been created.



  • Team Member – this role allows a staff member to view and edit their own calendar but not anyone else’s.

  • Scheduler – this role allows staff members to schedule appointments without being able to modify services or settings. In addition to ensuring tighter access control, these roles unburden the Bookings admin from day-to-day operations.


 

Image showing all the roles including the newly introduced Scheduler & Team member
New roles for staff members


Customization & Branding


Bookings allows organizations to customize their Bookings page with their own logo. A color theme that best suits the organization can be chosen as well. Confirmations, cancellations, and reminders can be customized using a rich text editor.


 

Image showing the new theming options
Choose a color and add your logo for your Bookings page


 

Image showing the new Create Services popup
Set your services’ details


Simpler Scheduling


We’ve strived to make appointment scheduling as simple as possible. Admins can add multiple staff members and get a unified view across all their calendars and availability. Switching between multiple calendars is made easier with an option to filter by staff members and services. There is also an option to pin a specific calendar for easier tracking.


 

Image showing the new filters experience
Unified calendar view across staff members


 


Admins can navigate to a staff member profile directly from the calendar and get a comprehensive view of their scheduled meetings, contact information, and services offered.


 


Image showing the new views for Staff & Services


See a comprehensive view of staff’s details


 


Custom availability can be set for each staff member with multiple slots in a day and certain days marked as non-available. This is synced with the staff’s Outlook calendar to avoid double bookings. Additionally, appropriate lead time can be configured for each service to ensure that staff members are well prepared before an appointment. There’s also an option to add buffer time before and after an appointment to provide sufficient breathing time.


 


Coming soon


Today, Microsoft Bookings is used by thousands of organizations globally to manage their appointments inside and outside their organization. It is used across various industries to enable different scenarios like virtual classrooms, financial consulting, and tele-health.


 


To read more on how customers are using Bookings for these scenarios, please click here.


 


As Bookings continues to grow and evolve, we are committed to building new features and capabilities which can further improve the Bookings experience and empower organizations to manage their calendars and appointments better.


Admin Toolkit


We want organizations to have more control over how Bookings is used by their staff members. The admin toolkit will provide admins with granular control over the naming policy, logo, business hours, staff availability and other aspects of Bookings within their organization.


Scalability


We talked about scaling Bookings for large demand in an earlier post here. We are working hard on improving Bookings to handle more scale so that Bookings continues to work well across various scenarios like virtual meetings, consultations, and other types of appointments.


Customized Scheduling


Availability of staff members keeps changing based on personal and business needs. We want to provide granular access to admins and staff members to customize the staff availability for appointments well into the future.


Richer APIs


While we continue to improve Bookings, we also want to allow organizations to build on top of the Bookings platform and develop custom solutions which are more suited to their needs. Bookings APIs are currently available in preview as part of Microsoft Graph APIs and will soon be generally available.


 


If you want to learn more about Bookings, how to set it up, and how to start creating your own booking pages, click here.


 


As always, we welcome your feedback. Let us know if you have any scenarios you’d like to see us support in the future.


 


Thanks!


 


Teja


 


 


 

What’s new: Detect credential leaks using built-in Azure Sentinel notebooks!


Special thanks to @ZhipengZhao, @JulianGonzalez, @Lars_Mohr, and the Microsoft CredScanCore team for making these notebooks happen.


Thanks to @Tiander Turpijn for reviewing this blog and for the great feedback.


 


In this blog post, I’m going to walk you through three cool and easy-to-use Azure Sentinel notebooks that can scan logs across your entire Azure Sentinel workspace, Azure Blob Storage, and Azure Data Explorer environment to detect credential leaks (which can save you from some serious potential cyberattacks!). These are built-in templates that you can use instantly without writing a single line of code!


 


Why is there a need?


According to Verizon’s 2020 Data Breach Investigations Report, the use of credentials in cyberattacks has been on a meteoric rise. Over 80% of hacking-related breaches involve the use of stolen or lost credentials.


 


It’s common sense to protect sensitive data such as passwords, API keys, and database credentials by storing them properly. Unfortunately, storing data safely is not an easy task, and human error will continue to happen. This makes credential leaks a high risk to many organizations. For that reason, it’s crucial to perform regular log scans to catch potentially leaked credentials and take action before they get into the wrong hands.


 


In the Azure Sentinel context, collected log messages are stored in a Log Analytics workspace. Many organizations also store their data in Azure Blob Storage or Azure Data Explorer, especially for long-term retention purpose. You might have an Azure Storage account Shared Access Signature used in a KQL query or an Azure Active Directory client access token used to authorize an application that has been logged and saved in a storage location. The storage becomes a gold mine for bad actors waiting to readily access, excavate, and exploit your organizations’ assets.


 


To help solve this problem, we’ve recently released three new Azure Sentinel notebooks that can scan across these environments – your Azure Sentinel workspace, Azure Blob Storage, and Azure Data Explorer – to uncover credential leaks in your data!


 


How do the notebooks work?


Each notebook scans logs in each respective environment.



  • The Credential Scan on Azure Log Analytics notebook enables you to pick any Azure Sentinel log table in your Log Analytics workspace and scan all or one specific column in the selected table.

  • The Credential Scan on Azure Data Explorer (ADX) notebook enables you to pick and scan a table in a database from a specific ADX cluster.

  • The Credential Scan on Azure Blob Storage notebook enables you to pick and scan a file in a blob container from your Blob storage account.


If any sensitive credentials are found, the results will be exported into a csv file. This file is saved in the same location as your notebook, where you can access and view the details of the leaked credentials. The file can also be downloaded and shared with relevant members in your team to validate the findings and apply appropriate remediation actions.


 


If no leaked credentials are found, no csv file is generated.


 


What types of credentials can the notebooks detect? The notebooks use regular expression patterns to identify the most common types of credentials, including passwords, Azure SQL connection strings, and more.
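
To make the idea concrete, here is a minimal sketch of the kind of regular-expression scan the notebooks perform over rows of log data. The patterns, sample rows, and output file name below are illustrative assumptions for this post, not the exact CredScan rules or output format the notebooks ship with:

import csv
import re

# Illustrative patterns only -- the real notebooks use a much larger CredScan rule set.
PATTERNS = {
    "Azure Storage SAS": re.compile(r"sig=[0-9a-zA-Z%+/=]{20,}"),
    "Storage account key": re.compile(r"AccountKey=[0-9a-zA-Z+/=]{40,}"),
    "Generic password": re.compile(r"(?i)password\s*=\s*\S+"),
}

def scan_rows(rows):
    """Scan (row_id, text) tuples and yield a finding for every pattern match."""
    for row_id, text in rows:
        for category, pattern in PATTERNS.items():
            if pattern.search(text):
                yield {"row": row_id, "category": category}

# Hypothetical rows, e.g. pulled from a Log Analytics table or a blob file.
sample_rows = [
    (1, "query used sig=abcdEFGH1234ijklMNOP5678qrstUVWX90"),
    (2, "nothing sensitive here"),
]

findings = list(scan_rows(sample_rows))
if findings:  # like the notebooks, only produce a CSV when something is found
    with open("credscan_findings.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["row", "category"])
        writer.writeheader()
        writer.writerows(findings)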


For reference, here is the full list of credential categories the notebooks cover:


  1. User login credentials

  2. Azure SQL connection string

  3. Azure IoT Shared Access Key

  4. Azure Storage Account Shared Access Signature

  5. Azure Storage Account Shared Access Signature for High-Risk Resources

  6. Azure Active Directory access token

  7. Amazon S3 client secret access key

  8. Azure Service Bus Shared Access Signature

  9. Azure Redis Cache Connection String Password

  10. Azure COSMOS DB Account Access Key

  11. Azure App Service Deployment Password

  12. Azure DevOps Personal Access Token

  13. Azure Function Master-API Key

  14. Azure Shared Access Key

  15. Azure AD Client Access Token

  16. X.509 Certificate Private Key

  17. ASP.NET Machine Key

  18. General Password

  19. Http Authorization Header

  20. Client Secret – API Key

  21. General Symmetric Key

  22. Ansible Vault

  23. Moniker Agent Storage Account Key

  24. Legacy Geneva Resource Access Key

  25. Domain User Password


Let’s look at a notebook example in detail. While these notebooks are simple to use, there are some basic pre-requisites and important instructions for them to work.


 


Pre-requisites



  • Data in at least one of these locations: Azure Sentinel workspace, Azure Blob storage, Azure Data Explorer.

  • An Azure Sentinel workspace to use Azure Sentinel notebooks.

  • An Azure Machine Learning (AML) workspace. Create an Azure Machine Learning workspace if you don’t already have one. You can set a default AML workspace from the Azure Sentinel portal if you have more than one AML workspace. Make sure you have at least Contributor permissions on the AML workspace to launch and execute the notebooks.

  • At least an Azure Sentinel Reader, Azure Data Explorer Reader, or Azure Blob Storage Reader role (as appropriate) to query logs in these data locations.


 


Deployment



  1. From the Azure Sentinel portal, navigate to the Threat Management section and open the Notebooks blade.

  2. Go to the Templates tab.

  3. Search for Credential Scan and you should see three notebooks in the result.

  4. Select one notebook. For this example, I’m going to use Credential Scan on Azure Data Explorer.

  5. On the right panel, select Save notebook. You can rename the selected notebook or keep the default name and save it to a default AML workspace. Then select OK.

  6. The notebook is now accessible to your AML workspace. From the same panel, select Launch notebook. Then you are prompted to log into the AML workspace.

  7. In the AML workspace, notice that a Credential Scan on Azure Data Explorer.ipynb file and a config.json file have been automatically generated from step 6 above.

    • The Credential Scan on Azure Data Explorer.ipynb file has the main content of the notebook.

    • The config.json file has configuration information about your Azure Sentinel environment where your notebook was launched from. It contains tenant_id, subscription_id, resource_group, workspace_id, and workspace_name, which are used for Azure Authentication (see step 9 below).



  8. Select a compute instance for your notebook server. If you don’t have a compute instance, create one by following step 5 in Launch a notebook using your Azure ML workspace.


 

Create compute


 



  9. There are three main sections in this specific notebook: Warm-up, Azure Authentication, and Azure Data Explorer Queries. Each notebook cell contains instructions and/or sample code using the Azure SDK for Python and KQL. To avoid common errors, it’s important not to skip these instructions.

    • The Warm-up cells load Python libraries, parameters, and functions that will be used in the notebook.

    • The Azure Authentication section allows the notebook access to your Azure Resource Group where your ADX environment is located.

    • The Azure Data Explorer Queries section enables you to choose an ADX cluster, a database from the cluster, then a table that you want to scan.



  10. After you finish running the notebook, if you don’t see the results in a newly created csv file, refresh the notebooks file explorer on the left navigation panel. Note that the name of the csv file is different for each notebook, and no csv file is created if no credentials are found.


CSV file


 


11. If you need to download the csv file, select the three dots next to the file name, right-click, and select the Download option.


12. All scripts and output files are stored in a default storage account of your AML workspace.



  1. Go to the storage account.

  2. Open File Shares -> Users.

  3. Select your user’s folder.


Important notes:



  • It’s crucial to execute the cells sequentially instead of running all cells at once. Each code cell depends on the output of its previous cells.

  • Depending on your data volume, some cell execution may take a few minutes, so please be patient.

  • If you run into an issue, follow these preliminary steps:

    1. Sign off from AML workspace and sign in again.

    2. Restart the kernel.




Restart kernel


3. Rerun the notebook.


4. If that still doesn’t work, send me a direct message and make sure to tag me. Or you can always create a Support ticket and our team will assist you.


 


Check out the video below for a live demo!


 


 


Summary


I hope you find these notebooks useful. Give them a try and let us know what you think!


Got more scenarios where you would like to use a notebook for? We’d love to hear! You can reach us by sending me a direct message, or posting a comment below, or posting your feedback on Azure Sentinel feedback forums.


 


 


 

New Data Flow Connector: SQL Server as Source and Sink


Mapping Data Flows is the visual data transformation service in Azure Data Factory and Azure Synapse Analytics that enables powerful scale-out ETL capabilities with a low-code user interface. The ADF team is excited to announce that we are opening up on-prem and VM-based SQL Server as a source and sink to data flows in ADF and Azure Synapse Analytics. You will see SQL Server now as a connector option in both shared integration datasets as well as inline in your source & sink.


 


(Screenshots: SQL Server as a dataset connector and as an inline source and sink.)


Instructions on how to set up the network configuration to use the Azure IR VNET and private link for accessing your SQL Server from data flows can be found here. Data flows in ADF & Synapse currently support only the Azure IR, not the self-hosted IR (SH-IR).

Use Power Automate to Notify of Upcoming Azure AD App Client Secrets and Certificate Expirations


 


Disclaimer
The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.


 


 


Are you constantly challenged with keeping up with all your Azure Active Directory Enterprise Application client secrets and certificates and their associated expiration dates? 


I sometimes work with customers who have a thousand or more Azure AD applications to manage. Unfortunately, trying to keep up with all the client secrets and certificates expiring on each one of these apps can be a futile experience.   


 


I came up with a solution to this problem by using Power Automate to proactively notify the Azure Active Directory administrators of upcoming client secret and certificate expirations. This solution was a big help for customers with thousands of AAD apps to keep track of.  I owe a huge thanks to a friend and peer of mine, Norman Drews, for his CSS & HTML expertise. 


 


Here’s how I solved it using Power Automate.  If you’d like to download the Power Automate flow and import it into your environment, click here to download it from my Github repository.


 



  1. Create (or use an existing) Azure AD app registration that has ONE of the following Application Permissions (listed from the least to the most privileged option) – Application.Read.All, Application.ReadWrite.All, Directory.Read.All, or Directory.AccessAsUser.All.

  2. Create a Scheduled Flow to run daily or weekly depending on how often you want to be alerted. 
     

Variable definitions in the Flow


     

  3. Initialize variable (String) – appId – this is the appID of the application. 

  4. Initialize variable (String) – displayName – this will be used to identify the display name of the application. 

  5. Initialize variable (String) – clientSecret – this needs to be set with the client secret of the Azure AD application created or chosen in step 1. 

  6. Initialize variable (String) – clientId – this needs to be set with the application (client) ID of the Azure AD application created or chosen in step 1. 

  7. Initialize variable (String) – tenantId – this needs to be set with the tenant ID of the Azure AD application created or chosen in step 1. 

  8. Initialize variable (Array) – passwordCredentials – this variable will be used to populate the client secrets of each Azure AD application. 

  9. Initialize variable (Array) – keyCredentials – this variable will be used to populate the certificate properties of each Azure AD application. 

  10. Initialize variable (String) – styles – this is some CSS styling to highlight Azure AD app secrets and expirations that are going to expire in 30 days (yellow) vs 15 days (red).  You can adjust these values accordingly to meet your needs.


 


Content of this step: 


 


 


 

{
  "tableStyle": "style=\"border-collapse: collapse;\"",
  "headerStyle": "style=\"font-family: Helvetica; padding: 5px; border: 1px solid black;\"",
  "cellStyle": "style=\"font-family: Calibri; padding: 5px; border: 1px solid black;\"",
  "redStyle": "style=\"background-color:red; font-family: Calibri; padding: 5px; border: 1px solid black;\"",
  "yellowStyle": "style=\"background-color:yellow; font-family: Calibri; padding: 5px; border: 1px solid black;\""
}

 


 


 


 


 


       11.  Initialize variable (String) – html – this creates the table headings and rows that will be populated with each of the Azure AD applications and associated expiration info. 



Content of this step:


 


 


 

<table @{variables('styles').tableStyle}><thead><th @{variables('styles').headerStyle}>Application ID</th><th @{variables('styles').headerStyle}>Display Name</th><th @{variables('styles').headerStyle}>Days until Expiration</th><th @{variables('styles').headerStyle}>Type</th><th @{variables('styles').headerStyle}>Expiration Date</th></thead><tbody> 

 


 


 


 



  12. Initialize variable (Float) – daysTilExpiration – this is the number of days before a client secret or certificate expires for it to be included in the report.

  13. We need to request an authentication token using our tenantId, clientId, and clientSecret variables (a minimal sketch of the equivalent token request is shown below).
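
For anyone who wants to sanity-check the app registration outside of Power Automate, the flow’s HTTP action is performing the standard OAuth 2.0 client credentials request against Azure AD. Here is a rough Python equivalent; the placeholder values are assumptions, so substitute the clientSecret, clientId, and tenantId from steps 5 through 7:

import requests

# Placeholder values -- substitute the clientSecret, clientId, and tenantId from steps 5-7.
tenant_id = "<your-tenant-id>"
client_id = "<your-client-id>"
client_secret = "<your-client-secret>"

# OAuth 2.0 client credentials grant against Azure AD for the Microsoft Graph scope.
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
response = requests.post(token_url, data={
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    "scope": "https://graph.microsoft.com/.default",
})
response.raise_for_status()
access_token = response.json()["access_token"]  # the same property the Parse JSON step extracts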


 

Request token step of the Flow


 



  14. The Parse JSON step will parse all the properties in the returned token request.


The JSON schema to use is as follows: 


 


 


 

{ 
    "type": "object", 
    "properties": { 
        "token_type": { 
            "type": "string" 
        }, 
        "expires_in": { 
            "type": "integer" 
        }, 
        "ext_expires_in": { 
            "type": "integer" 
        }, 
        "access_token": { 
            "type": "string" 
        } 
    } 
}

 


 


 


 


 


 


Retrieve token info JSON schema


 



  15. Initialize variable (String) – NextLink – This is the Graph API URI to request the list of Azure AD applications.  The $select only returns the appId, displayName, passwordCredentials, and keyCredentials, and since Graph API calls are limited to 100 rows at a time, I bumped my $top up to 999 so it would use fewer API requests (1 per 1,000 apps vs 10 per 1,000 apps).


 


https://graph.microsoft.com/v1.0/applications?$select=appId,displayName,passwordCredentials,keyCredentials&$top=999 


 



  16. Next, we enter the Do until loop. It will perform the loop until the NextLink variable is empty.  The NextLink variable will hold the @odata.nextlink property returned by the API call. When the API call has retrieved all the applications, there is no @odata.nextlink property.  If there are more applications to retrieve, the @odata.nextlink property will store a URL containing the link to the next page of applications to retrieve. The way to accomplish this is to click “Edit in advanced mode” and paste @empty(variables('NextLink')). A minimal sketch of this paging pattern is shown below.
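
Outside of Power Automate, the same paging behaviour can be expressed in a few lines of Python. This sketch assumes you already have an access_token (for example, from the request shown earlier) and simply follows @odata.nextLink until it is no longer returned:

import requests

def get_all_applications(access_token):
    """Page through the Graph /applications endpoint, following @odata.nextLink."""
    url = ("https://graph.microsoft.com/v1.0/applications"
           "?$select=appId,displayName,passwordCredentials,keyCredentials&$top=999")
    headers = {"Authorization": f"Bearer {access_token}"}
    apps = []
    while url:  # keep looping until there is no nextLink, just like the Do until loop
        page = requests.get(url, headers=headers).json()
        apps.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # absent on the last page
    return apps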


 

Do Until loop


 



  17. The next step in the Do until loop uses the HTTP action to retrieve the Azure AD applications list.  The first call will use the URL we populated this variable with in step 15.

  18. A Parse JSON step is added to parse the properties from the body returned by the API call.


 


The content of this Parse JSON step is as follows:


 


 


 

{
    "type": "object",
    "properties": {
        "@@odata.context": {
            "type": "string"
        },
        "value": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "appId": {
                        "type": "string"
                    },
                    "displayName": {
                        "type": "string"
                    },
                    "passwordCredentials": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "customKeyIdentifier": {},
                                "displayName": {},
                                "endDateTime": {},
                                "hint": {},
                                "keyId": {},
                                "secretText": {},
                                "startDateTime": {}
                            },
                            "required": []
                        }
                    },
                    "keyCredentials": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "customKeyIdentifier": {},
                                "displayName": {},
                                "endDateTime": {},
                                "key": {},
                                "keyId": {},
                                "startDateTime": {},
                                "type": {},
                                "usage": {}
                            },
                            "required": []
                        }
                    }
                },
                "required": []
            }
        },
        "@@odata.nextLink": {
            "type": "string"
        }
    }
}

 


 


 


 



  19. The Get future time action will get a date in the future based on the number of days prior to expiration of the client secrets and certificates at which you’d like to start receiving notifications.

  20. Next, a foreach – apps loop will use the value array returned from the Parse JSON step of the API call to take several actions on each Azure AD application.


 

ForEach apps loop


 



  21. Set variable (String) – appId – uses the appId variable we initialized in step 3 to populate it with the application ID of the current application being processed.

  22. Set variable (String) – displayName – uses the displayName variable we initialized in step 4 to populate it with the displayName of the application being processed.

  23. Set variable (Array) – passwordCredentials – uses the passwordCredentials variable we initialized in step 8 to populate it with the client secrets of the application being processed.

  24. Set variable (Array) – keyCredentials – uses the keyCredentials variable we initialized in step 9 to populate it with the certificates of the application being processed.

  25. A foreach will be used to loop through each of the client secrets within the current Azure AD application being processed.


 

ForEach passwordCreds loop


 



  26. The output from the previous steps to use for the foreach input is the passwordCredentials variable.

  27. A condition step is used to determine if the Future time from the Get future time action (step 19) is greater than the endDateTime value from the current application being evaluated.

  28. If the future time isn’t greater than the endDateTime, we leave this foreach and go to the next item.

  29. If the future time is greater than the endDateTime, we first convert the endDateTime to ticks. A tick is a 100-nanosecond interval counted from January 1, 0001 12:00 AM midnight in the Gregorian calendar up to the date value passed in as a string. This makes it easy to compare two dates, which is accomplished using the expression ticks(item()?['endDateTime']).

  30. Next, use a Compose step to convert the current time (startDateTime) to ticks, which equates to ticks(utcnow()).

  31. Next, use another Compose step to calculate the difference between the two tick values and convert it into the number of days between the two dates using the following expression.


 


div(div(div(mul(sub(outputs('EndTimeTickValue'),outputs('StartTimeTickValue')),100),1000000000), 3600), 24)
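
To see why that nested expression yields days: a tick is a 100-nanosecond interval, so multiplying the tick difference by 100 gives nanoseconds, dividing by 1,000,000,000 gives seconds, dividing by 3,600 gives hours, and dividing by 24 gives days. Here is the same arithmetic as a small Python sketch; the tick values are made up purely for illustration:

# Hypothetical tick values (100-ns intervals since 0001-01-01) for "now" and an expiry date.
start_ticks = 637_500_000_000_000_000   # StartTimeTickValue, e.g. ticks(utcnow())
end_ticks   = 637_526_000_000_000_000   # EndTimeTickValue, e.g. ticks of the secret's endDateTime

nanoseconds = (end_ticks - start_ticks) * 100
seconds = nanoseconds // 1_000_000_000
hours = seconds // 3_600
days_til_expiration = hours // 24        # same result as the nested div() expression
print(days_til_expiration)               # about 30 days for these sample values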


 



  32. Set the variable daysTilExpiration to the output of the previous calculation.

  33. Set variable (String) – html – creates the HTML table.  The content of this step is as follows:


 


 


 


 

<tr><td @{variables('styles').cellStyle}><a href="https://ms.portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/ApplicationMenuBlade/Credentials/appId/@{variables('appId')}/isMSAApp/">@{variables('appId')}</a></td><td @{variables('styles').cellStyle}>@{variables('displayName')}</td><td @{if(less(variables('daystilexpiration'),15),variables('styles').redStyle,if(less(variables('daystilexpiration'),30),variables('styles').yellowStyle,variables('styles').cellStyle))}>@{variables('daystilexpiration')} </td><td @{variables('styles').cellStyle}>Secret</td><td @{variables('styles').cellStyle}>@{formatDateTime(item()?['endDateTime'],'g')}</td></tr> 

 


 


 


 


 



  34. Another foreach will be used to loop through each of the certificates within the current Azure AD application being processed.  This is a duplication of steps 25 through 33, except that it uses the keyCredentials variable as its input, compares the future date against the endDateTime of the certificate currently being processed, and the Set variable – html step is as follows:


 


 


 


 

<tr><td @{variables('styles').cellStyle}><a href="https://ms.portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/ApplicationMenuBlade/Credentials/appId/@{variables('appId')}/isMSAApp/">@{variables('appId')}</a></td><td @{variables('styles').cellStyle}>@{variables('displayName')}</td><td @{if(less(variables('daystilexpiration'), 15), variables('styles').redStyle, if(less(variables('daystilexpiration'), 30), variables('styles').yellowStyle, variables('styles').cellStyle))}>@{variables('daystilexpiration')} </td><td @{variables('styles').cellStyle}>Certificate</td><td @{variables('styles').cellStyle}>@{formatDateTime(item()?['endDateTime'], 'g')}</td></tr> 

 


 


 


 


 


ForEach Key Credentials loop


 



  35. Immediately following the foreach – apps loop, the final step in the Do until loop is a Set variable – NextLink action, which stores the dynamic @odata.nextlink URL parsed from the JSON of the API call.

  36. Append to variable (String) – html – Immediately after the Do until loop ends, we close out the HTML body and table by appending </tbody></table> to the variable named html.


 


Yes branch


 



  37. Finally, send the HTML in a Send an e-mail action, using the variable html for the body of the e-mail.


 

End of the flow including send email


  


And below is the resulting e-mail received when the flow runs at its scheduled time.  Included is a hyperlink for each application that takes you directly to where you need to update the client secret and/or certificates for that application within the Azure portal.


Sample e-mail


 

 


Thanks for reading!

More NFC card reading accessories released for Surface Go


As a fan of Surface Go and Surface Go 2, I sometimes get asked about a little-known capability: Near field communication (NFC), the technology behind card readers. Even though we have only limited support for NFC, there are a growing number of third-party solutions that take advantage of the card reading capability.


 




 


All Surface Go for Business and Surface Go 2 for Business devices are equipped to take advantage of greater third-party support for passwordless authentication.  Azure AD now supports FIDO2 security keys as an authentication method for signing into operating systems, applications, and services. Organizations can issue these keys to everyday information workers.


 


And we’re seeing more innovations in the marketplace that build on the NFC capabilities in Surface Go: 



AuthenTrend Technology ATKey.Card


Our friends in Azure recently partnered with AuthenTrend Technology to use its ATKey.Card smart badge security card with Surface Go 2.  To learn more, check out this recent Ignite session:


 


(Video: Go passwordless Hands-on Tour in Azure AD)


 


 


Imprivata OneSign


The increasing use of Surface Go as a shared mobile device in health care settings means protecting personal medical information has never been more critical. To meet this challenge, Imprivata OneSign provides an NFC solution that enables health care providers to simply tap their badge (instead of typing a username and password) to access the device and applications, which protects private health information with minimal disruption to the user.


 


 “There’s tons of sensitive data on these devices that need to be secured but protecting these devices can’t be disruptive for the end-user clinicians focused on patient care,” Imprivata explains in the following video. The “seamless process gives them access to that tablet without having to enter a username or password or having to put a call into IT because they forgot those passwords.”


 


(Video: Seamless access to Surface Go tablets with Imprivata OneSign)


 


The Joy Factory aXtion Pro MPA NFC for Surface Go


This waterproof case features an NFC range extender for use in health care settings.  Infused with an antimicrobial agent to help protect against bacteria and mold growth, the military-grade certified aXtion Pro MPA for Surface Go features a built-in rotating module with hand strap and non-slip kickstand. To learn more, see this short demo:


 


(Video: The Joy Factory aXtion Pro MPA for Surface Go demo)


 


FAQ


Here are some answers to questions that typically come up when talking about the NFC functionality in Surface Go.


 


Is NFC available on all Surface Go and Surface Go 2 devices?



  • NFC is only available on Surface Go for Business and Surface Go 2 for Business devices.


Can the NFC be disabled through UEFI or DFCI?



  • Not at this time.


Can Go NFC be used to issue (digital wallet) or process (point of sale) payments?



  • No. The NFC component does not include a secure element, and the interface is not HID but a simple I2C.


Is multi factor authentication on Surface Go compliant with FIDO2.0 standards?



  • Yes, when combined with compliant authentication solutions and servers, such as the AuthenTrend ATKey.Card, Windows Hello, and Azure AD.


Can I access and use NFC from the front of the device?



  • Yes, but only if the card has an independent power source, like the AuthenTrend card. Passive cards can only be read from the back of the device, at a very close proximity of ~10 mm.


How can I troubleshoot multiple failed read attempts?



  • Recall the location of the effective read area on the device.

  • Remove any other NFC tags or NFC-enabled cards in the vicinity. Limited NFC support is available for ISO/IEC 14443-A tag types 1 and 2 with an antenna diameter between 15 mm and 17 mm.

  • We recommend using the Mifare Classic 1K card type.

  • Try keeping your badge in a nylon sleeve rather than a hard plastic case.

  • You might find this tool useful for troubleshooting: Springcard.


What are some other commercial uses?



  • Proximity-based apps. Applications that take advantage of proximity and location by using the RFID capability in Surface Go  and a proximity sensor in Windows 10.

  • Consumer apps.  RFID-enabled apps capable of directing consumers to target websites. For example,  users can swipe an RFID-enabled prescription container that opens relevant product information.


Learn more


Application Insights Azure DevOps Release Annotations Implementation Update


Visualizations & Workbooks are a key component of the App Insights experience. They enable customers to monitor performance / failure trends & debug issues as they occur. 


 


While performance data is critical to understanding the health of your application, by itself it lacks the full context to help you understand why performance issues may be happening. Release Annotations are a simple way to add context and quantify the impact of an Azure DevOps release on your metrics.


 

 



 


Annotations can be automatically created by the Azure Pipelines build system. You can also create annotations to flag any event you like by creating them from PowerShell.


 


If your subscription has an Application Insights resource linked to it and you use one of the following deployment tasks, then you don’t need to configure anything else.


 



Task code | Task name | Versions
AzureAppServiceSettings | Azure App Service Settings | Any
AzureRmWebAppDeployment | Azure App Service deploy | V3 and above
AzureFunctionApp | Azure Functions | Any
AzureFunctionAppContainer | Azure Functions for container | Any
AzureWebAppContainer | Azure Web App for Containers | Any
AzureWebApp | Azure Web App | Any

 


You can also write custom annotations by using an inline PowerShell script.


 


Release annotations are a feature of the cloud-based Azure Pipelines service of Azure DevOps & only available for Azure DevOps repos today.


 


If you’re using the App Insights release task today, please delete it and switch to the new implementation.


 


Learn more:


Release Annotations documentation


Migrate content from Box, Dropbox, and Google Workspace into Microsoft 365 – release update


Microsoft focuses on providing a seamless move to Microsoft 365 with as much expertise and tooling as we and our ecosystem can offer – across the globe. Our goal is to help you move to the cloud with confidence.


 


At times, you need to move content cloud-to-cloud. We’re pleased to highlight Mover integration progress, bringing more of their technology directly inside the SharePoint admin center in Microsoft 365. Now, the Migration Manager admin tab is the home for managing all content migrations into Microsoft 365 (primarily into OneDrive, SharePoint, and Microsoft Teams) – without leaving the service.


 


Connect your Box, Dropbox, or Google Workspace account to Microsoft 365 to move files and folders into OneDrive, SharePoint, and Microsoft Teams.


Microsoft offers numerous content migration tools and services to assist your migration into Microsoft 365 – from assessment, to planning and onboarding. And we work closely with our 3rd-party migration partners to optimize their offerings as well.


 


Migrate your files and folders from Box [roadmap ID: 68816]


SharePoint and Microsoft 365 admins require support to migrate content from Box into Microsoft 365; that’s Box files and folders, as well as conversion of Box notes into Word documents, to whichever destination you choose in OneDrive, SharePoint, and Teams. Now it’s more centrally located and takes fewer clicks to discover content and move it into Microsoft 365.


 


After clicking “Get Started” from the main Migration Manager page, Box users are scanned automatically. You can also review reports and logs pre-migration to investigate any possible issues that might block your migration.


When you connect to a Box enterprise account, the service discovers users and their files. The service will automatically map to an individual’s OneDrive accounts, and you can manually map to a specific OneDrive user account, or route to SharePoint sites or a Teams channel for content meant to be in shared spaces.


 


See Migrate Box to Microsoft 365 with Migration Manager to learn more.


 


Migrate your files and folders from Dropbox [roadmap ID: 82015]


Similar to the Box movement of content into Microsoft 365, this release makes it so you can take the same action – migrate Dropbox folders, files, and users to OneDrive, SharePoint, and Teams in Microsoft 365, enabling collaboration to take place on a single platform, closer to where you manage much of your work and productivity.


 


As you connect to a Dropbox for Business account, the service begins discovering users and their files. The service will automatically map to an individual’s OneDrive accounts, and you can also manually map to a specific OneDrive user account, SharePoint site, or a Teams channel.


 


After clicking “Get Started” from the main Migration Manager page, Dropbox files and folders are scanned automatically. You can also review reports and logs pre-migration to investigate any possible issues that might block your migration.


Note: To access, you must be a global admin or OneDrive/SharePoint admin to the Microsoft 365 tenant where you want to migrate your content.


 


See Migrate Dropbox to Microsoft 365 with Migration Manager to learn more.


 


Migrate your files and folders from Google Workspace [roadmap ID: 82014]


To cover the spectrum of customer needs, we, too, have released the ability to move content from Google Workspace – helping you move documents, data, and users to OneDrive, SharePoint, and Teams in Microsoft 365 and collaborate all in one place.


 


As you connect to a Google enterprise account, the service begins discovering drives and their files. The service will automatically map to an individual’s OneDrive accounts, and you can also manually map to a specific OneDrive user account, SharePoint site, or a Teams channel.


 


After clicking “Get Started” from the main Migration Manager page, Google Workspace files and folders are scanned automatically. You can also review reports and logs pre-migration to investigate any possible issues that might block your migration.


Note: To access, you must be a global admin or OneDrive/SharePoint admin to the Microsoft 365 tenant where you want to migrate your content.


 


See Migrate Google Workspace to Microsoft 365 with Migration Manager to learn more.


 


Additional resources



 


What’s next…


As we continue to invest across the migration offerings, we are excited to expand our cloud-to-cloud capabilities to allow moving content from Egnyte into Microsoft 365 [roadmap ID: 82016]. And before you move any file or folder from on-premises into Microsoft 365, you need to discover content and plan for the migration. Soon, Migration Manager will provide content discovery so admins can best understand what content they have, decide what to migrate and what to remediate. If you are interested, you can complete this form for the Migration discovery preview.


 


Regardless of your organization’s size, data scale or information complexity, you can migrate documents and sites into OneDrive, SharePoint, and Teams in Microsoft 365 successfully. And we are here to help.


 


Use more of what SharePoint and Microsoft 365 offer, and let us know what you think


In addition to the above updates now rolling out to Microsoft 365, we encourage you to learn more about all migration offerings. Mover supports numerous cloud-to-cloud migration scenarios alongside the SharePoint Migration Tool (SPMT), which targets migrating content from on-premises SharePoint sites and file shares to Microsoft 365, FastTrack planning and onboarding, and a strong migration partner ecosystem – collectively the broadest set of offerings to assist your migration into Microsoft 365.


 


Our goal is to empower you and every person on your team to achieve, and move, more. Let us know what you need next. We are always open to feedback via UserVoice and continued dialog in the SharePoint community within the Microsoft Tech Community, and we always have an eye on tweets to @SharePoint. Let us know.


 


Thanks, Mark Kashman, senior product manager – Microsoft

New transactable offers from Datadog, Datometry, and Sycomp in Azure Marketplace

Microsoft partners like Datadog, Datometry, and Sycomp deliver transact-capable offers, which allow you to purchase directly from Azure Marketplace. Learn about these offers below:


Datadog: Datadog is a SaaS monitoring and security platform for cloud applications. It integrates and automates infrastructure monitoring, application performance monitoring, and log management to provide unified, real-time observability of your entire technology stack. Enable digital transformation, drive collaboration across teams, secure applications and infrastructure, and more.



Datometry Hyper-Q for Azure Synapse Analytics: Datometry Hyper-Q is a virtualization platform that makes existing Teradata applications interoperate with Microsoft Azure Synapse Analytics at a fraction of the time, cost, and risk associated with a conventional migration. Gain an edge over your competition by using Hyper-Q to rapidly move to the cloud without leaving any applications or business logic behind.



Sycomp Storage Fueled by IBM Spectrum Scale: Sycomp’s solution deploys IBM Spectrum Scale storage clusters with Red Hat Enterprise Linux 7.8 based on your business needs. The offer is intended for customers seeking a resilient, performance-oriented storage platform, such as Microsoft Azure HPC, Azure AI, and Azure Machine Learning clients, along with those moving Hadoop workloads to the cloud.