Dockershim deprecation and AKS

This article is contributed. See the original author and article here.

The huge success of Docker in energizing the tech community around containers has been truly amazing to see. Because containers are the foundation of Kubernetes, Docker was naturally a core part of the platform. As Kubernetes matured, and the needs of the community and our customers grew, the tight coupling of Docker to Kubernetes through dockershim (the interface between the Kubernetes platform and the Docker runtime) needed to evolve. On 2 December 2020, with the release of Kubernetes 1.20, the Kubernetes project announced that it was deprecating Docker as a container runtime, used through dockershim, in a future release of Kubernetes.


 


With the upcoming Kubernetes 1.24 release, dockershim will be removed.


 


If you are using a supported Kubernetes version in Azure Kubernetes Service (AKS):



  • For Linux node pools, there is no action required on your part: containerd is already the default runtime on AKS for Kubernetes 1.19 and greater, which covers all supported versions.

  • For Windows Server 2019 node pools, in January 2021, we announced the general availability of containerd as the default container runtime for Kubernetes 1.23 and greater. If you are using a cluster on a Kubernetes version prior to 1.23, you can create a new node pool with containerd enabled before the switchover to containerd as the default in 1.23, and then move your pods to the new pool (see the sketch after this list).

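For the node pool option above, a minimal sketch with Azure PowerShell might look like the following. All names are placeholders, and the exact parameters (plus any steps needed to enable containerd on versions earlier than 1.23) should be checked against the AKS and Az.Aks documentation for your cluster.

# Minimal sketch (placeholder names): add a new Windows node pool, then move workloads
# off the old pool so they land on nodes running containerd.
Import-Module Az.Aks

New-AzAksNodePool -ResourceGroupName 'myResourceGroup' `
    -ClusterName 'myAKSCluster' `
    -Name 'npwct' `
    -OsType Windows `
    -VmSize 'Standard_D4s_v3' `
    -Count 3

# Drain the old pool's nodes so pods reschedule onto the new pool (node names are placeholders):
#   kubectl cordon aksnpwin000000
#   kubectl drain aksnpwin000000 --ignore-daemonsets --delete-emptydir-data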

 


We work very closely with the community through our participation in the Kubernetes Special Interest Groups (SIGs), from Storage to Windows and everything in between, and I’d like to thank both the community and our engineering teams for the work that went into reaching this milestone.

External Identities B2C supports Authenticator apps, and new data residency pricing


This article is contributed. See the original author and article here.

Hello friends,


 


Happy new year everyone! With the new year come new possibilities. Today I’m excited to announce two Azure AD External Identities updates including the public preview of multi-factor authentication (MFA) with time-based one-time passcode (time-based OTP) for B2C users and an important change to our support for data residency in Azure AD B2C directories.


 


Strengthen MFA for B2C users with time-based OTP


Rising fraud and security attacks make it critical to protect consumer accounts with more secure forms of MFA. By incorporating time-based OTP through an authenticator app in your B2C user flows, you can provide a higher level of security compared to existing email and phone factors, without incurring additional telephony charges. Learn more from my colleague, Alex Weinert, about why we believe app-based MFA is more secure than email and phone MFA alone.


 


Time-based OTP for your user accounts can be configured with any authenticator application. We recommend setting up time-based OTP with Microsoft Authenticator, which uses encrypted bi-directional communication for authentication status and supports additional context and controls that make it easier for your users to help protect themselves.


In this B2C user flow, a Contoso customer is prompted to complete authentication using time-based OTP with the Microsoft Authenticator application.


Read the documentation to learn how to set up time-based OTP for Azure AD B2C.
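For background only: the passcode itself is derived from the shared secret provisioned when the user registers the authenticator app, following the time-based OTP standard (RFC 6238). The PowerShell sketch below illustrates that derivation; it is not part of the Azure AD B2C configuration, and the function name and parameters are purely illustrative.

# Illustrative only: how an authenticator app derives a 6-digit time-based OTP (RFC 6238)
# from a Base32-encoded shared secret. Not needed for configuring Azure AD B2C.
function Get-IllustrativeTotp {
    param(
        [Parameter(Mandatory)] [string]$Base32Secret,
        [int]$Digits = 6,
        [int]$PeriodSeconds = 30
    )

    # Decode the Base32 secret into raw key bytes.
    $alphabet = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ234567'
    $bits = -join ($Base32Secret.ToUpper().TrimEnd('=').ToCharArray() |
        ForEach-Object { [Convert]::ToString($alphabet.IndexOf($_), 2).PadLeft(5, '0') })
    $keyBytes = for ($i = 0; $i + 8 -le $bits.Length; $i += 8) {
        [Convert]::ToByte($bits.Substring($i, 8), 2)
    }

    # Time counter: number of 30-second intervals since the Unix epoch, big-endian.
    $counter      = [long][math]::Floor([DateTimeOffset]::UtcNow.ToUnixTimeSeconds() / $PeriodSeconds)
    $counterBytes = [BitConverter]::GetBytes($counter)
    if ([BitConverter]::IsLittleEndian) { [Array]::Reverse($counterBytes) }

    # HMAC-SHA1 plus dynamic truncation (RFC 4226) yields the short numeric code.
    $hmac    = [System.Security.Cryptography.HMACSHA1]::new([byte[]]$keyBytes)
    $hash    = $hmac.ComputeHash($counterBytes)
    $offset  = $hash[$hash.Length - 1] -band 0x0F
    $binCode = (($hash[$offset] -band 0x7F) -shl 24) -bor
               (($hash[$offset + 1] -band 0xFF) -shl 16) -bor
               (($hash[$offset + 2] -band 0xFF) -shl 8) -bor
               ($hash[$offset + 3] -band 0xFF)

    ([int]($binCode % [math]::Pow(10, $Digits))).ToString().PadLeft($Digits, '0')
}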


 


Data residency pricing update


We understand how important it is for our customers to have control over their data and to comply with local data residency requirements. As a first step in supporting this business-critical need, earlier this year we announced the general availability of Azure AD B2C data residency in Australia. To support the increased demand for this capability, beginning mid-2022, current and new customers who have data residency configured for Australia or other specific countries/regions will incur an add-on charge of $0.02 per monthly active user (MAU).


 





 


Based on your feedback, we are continuing to grow our data residency offerings so that you can select the country/region you need to meet your data storage requirements.


 


We love hearing from you, so please share your feedback on these updates through the Azure forum or by tagging @AzureAD on Twitter.


 


Robin Goldstein 


Twitter: @RobinGo_MS


 


 


Learn more about Microsoft identity: 



 

Microsoft Defender for Office 365 Ninja Training: January 2022 Update


This article is contributed. See the original author and article here.

We have published a few Microsoft Defender for Office 365 resources over the past few months, and these are now included in the Ninja Training. If you want to refresh your knowledge and get updated, here is what has been added since the last release in September 2021.


 


Legend:

  • Product videos
  • Webcast recordings
  • Tech Community
  • Docs on Microsoft
  • Blogs on Microsoft
  • GitHub
  • Interactive guides
  • ⤴ External link

Module (ordered by Competency Level) – What’s new:

  • Email Security – Fundamentals: Module 3. Configuration (Part 1)
  • Email Security – Fundamentals: Module 5. General Awareness
  • Email Security – Intermediate: Module 11. Reports/Custom Reporting
  • Security Operations – Advanced: Module 4. Migration
  • Security Operations – Advanced: Module 6. Attack Simulation Training
  • Security Operations – Advanced: Module 7. General Awareness

The evolution of retail store into an experience center


This article is contributed. See the original author and article here.

With more and more customers looking to digital channels for product information, feedback, and insights, the role of the store is changing from a place that simply houses and transacts products to another integral step in building and differentiating the customer experience. While this transformation has been taking place over the past several years, the recent impact on in-person sales has accelerated the discussion among retailers about what the role of the retail store will be in the future. Customer experience has been top of mind for retailers for years, but experience means different things to different people: the in-store experience for a fashion retailer is dramatically different from that of a grocery store, and each retailer has to define it for their own business. Microsoft sets out to help retailers bridge the gap between customer expectation and delivered experience. By using the intelligent and connected tools available in Microsoft Dynamics 365, retailers can streamline the buying journey and ensure consistent, personalized customer experiences across all relevant channels.

Dynamics 365 Commerce combined with Dynamics 365 Customer Insights enables retailers to streamline in-store practices and bring relevant customer data to sales agents, when and where needed, to deliver real-time personalization for customers in-store.

Transforming retail experiences to meet customer’s expectations

Mattress Firm is a great example of a retailer that has redefined its role as part of the customer sleep journey. By moving its focus from selling mattresses to helping customers get a better night’s sleep, the company transformed its perspective on the in-store experiences needed to reflect that focus.


“Customers are just asking for these elevated retail experiences, and all of these things require technology, data, or both,” said Jonathan Sider, CIO and COO of e-commerce, Mattress Firm.

To evolve the retail store, organizations need to deliver more robust, sensory experiences and personalized customer service. The main ways through which this can be accomplished are personalization, seamless purchasing, and expanded in-store experiences.

The value of data and personalization

Consumers today respond better to personalized experiences. In the context of brick-and-mortar retail, this means being able to connect a customer’s online activities and past purchases with an in-store team member. One means of effective personalization is to deliver customer intelligence data, such as tailored recommendations, to the point in the customer journey where this information will have the greatest impact, such as point of sale (POS) terminals in-store or to sales associate handheld devices. Automation in the retail store is about improving the customer journey while also simplifying and removing as many low-value manual processes as possible from team member responsibilities, such as inventory management, ordering, and fulfillment.

Retailers like GNC are looking to Microsoft and Dynamics 365 to help meet and exceed customer expectations at every touchpoint.


“We expect to provide our customers with a highly personalized experience tailored to their wellness journey. Whether they’re an 18-year-old high school athlete or a 70-year-old retiree, we want them to get the right message about products, programs, or deals that will appeal to them. We chose Customer Insights because with it, we have the data infrastructure to really do that right.”—Lauren Mannetti, Vice President, Marketing, GNC.

These forms of automation have rapidly become vital to providing the seamless omnichannel experience that customers demand. But, as we’ve discussed before in our Exceed customer expectations with seamless and unified commerce experiences blog post, this requires connecting data across back-end systems, which companies often overlook while focusing on their front-end process and online sales channels.

Seamless, frictionless purchasing is now an expectation

Consumers have mostly moved away from cash and now desire a fast, contactless, friction-free buying process, ideally without lines. For retail, this requires truly “instant checkout,” which means: Scan. Pay. Done. No lines. No hassle. No wait. The entire process occurs through the mobile point of sale application on a phone or tablet. Combining this with checkout stations and roaming team members positioned throughout the stores allows customers to bag their purchases and go on their way.

Customers like LIDS are using Dynamics 365 Commerce to get out from behind the counter and engage with customers in the aisles with a user-friendly, touch-based point of sale device. With a wide variety of products and sizes, transactions are smoother when the employee can work with a customer and look up what’s in stock right from the sales floor.

“Our greatest assets are our store employees and managers; they care about the POS, and they want technology. When they walk into the store and see something that’s not a touchscreen, they kind of disconnect,” said Nick Corthier, Chief Financial Officer, Lids.

Dynamics 365 also simplifies the payment experience with native integration with payment providers like Adyen, thereby enabling an easy and truly unified commerce solution for retailers.

Expanding in-store experiences

It’s up to every retailer to define what the role of their stores will be, given the range and variety of retail offerings. One thing is for sure: retailers that adapt more quickly are more likely to create differentiated value in the market and ultimately define what the future of their retail vertical should look like. Technology is helping these retail leaders set the pace, stand up new models, and raise the bar, especially for premium brands.

Gibson Brands is looking to lead the way in building a best-in-class retail experience for music enthusiasts. Gibson Garage, a new entertainment and retail outlet in downtown Nashville, features digitally fueled in-store experiences for customers. The Gibson Garage allows customers to see the guitar they’re buying, hear somebody play it, get excited about their purchase, and enjoy the entire music-centered hangout experience, complete with musicians and gorgeous, iconic instruments. It becomes a place that musicians and music lovers alike want to gather and spend time in, the way book lovers would in a bookstore and café.


“Our legacy systems were unstable, unsustainable, and not optimized for the current ways of working…They weren’t talking to each other. Dynamics 365 has brought the company to the leading edge of technology in terms of enterprise resource planning (ERP). It’s stepped up our game and unlocked so many possibilities that we’re just scratching the surface. We’re continually going through, refining processes, and unlocking different aspects of the ERP to figure out what works for us in a system,” said Mallory McClain, Dealer Service Supervisor, Gibson Brands.

These are examples of combining theatre-like retail settings, personalization, and automation to create a retail sales experience that is differentiated and pushes the boundaries of what we typically think of as shopping.

What’s next?

Many companies have merged data across disparate systems to rise to the new expectations of retail as an experience. These merchants have made investments in technology to move to an integrated unified commerce solution, like Dynamics 365 Commerce.

Visit our Dynamics 365 retail page to learn how Dynamics 365 can help you deliver on your customers’ expectations by evolving your retail experiences and how Microsoft can support your business for growth.

The post The evolution of retail store into an experience center appeared first on Microsoft Dynamics 365 Blog.


Runbook to manage Azure Firewall back-ups


This article is contributed. See the original author and article here.

Azure Firewall is a managed, stateful network security service whose premium features, such as TLS inspection and URL filtering, recently became generally available across most Azure regions.


Across different virtual networks and subscriptions, rules are created for network segmentation and access control. Managing network traffic may require you to audit rules for utilization or flow hit count, or to fall back to a previous working configuration.


This runbook helps you create point-in-time back-up copies of Azure Firewall together with its Firewall Policy; when scheduled through an Automation account, it can take daily or weekly snapshots and store them in a specified path.


For this runbook, Azure Blob Storage is used to store the Azure Firewall configuration, covering both the network infrastructure and the firewall policy, as it exists at the time of the export. You can edit the template to specify another storage method. We also discuss how to redeploy a firewall to a known configuration using one of the backed-up templates. For more information on other network resources that you may want to adapt this runbook to, please see the Export-AzResourceGroup module.



Requirements



  • Automation account

  • Storage Account

  • Runbook


 


Set up Automation account


Go to the search bar and type Automation account. Create a new Automation account. When done, go to the Automation account and, in the Settings blade under Account settings, create a “Run As” account. This provides the service principal that the runbook will later use to sign in automatically.
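If you prefer to script this step, a rough equivalent with Azure PowerShell is below (resource names are placeholders used throughout this walkthrough; the Run As account itself is still created from the portal as described above).

# Create the Automation account that will host the runbook (placeholder names).
New-AzAutomationAccount -ResourceGroupName 'rg-fw-backup' `
    -Name 'aa-fw-backup' `
    -Location 'westus2'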




This runbook is a PowerShell runbook, so we need to confirm that the account has access to the required modules. On the blade of the Automation account you just created, go to Modules and then search the Gallery to import the following three prerequisites (a scripted equivalent follows the list):


– Az.Accounts


– Az.Network


– Az.Resources
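A scripted equivalent of the module import, assuming the placeholder account names above and the default PowerShell Gallery package URIs, could look like this:

# Import the required Az modules into the Automation account from the PowerShell Gallery.
# Az.Accounts should finish importing before the modules that depend on it are added.
foreach ($module in 'Az.Accounts', 'Az.Network', 'Az.Resources') {
    New-AzAutomationModule -ResourceGroupName 'rg-fw-backup' `
        -AutomationAccountName 'aa-fw-backup' `
        -Name $module `
        -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/$module"
}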




Create Storage account


Next, we create a storage account to store each back-up in Azure Blob Storage. Go to the search bar and search for Storage account. Create a storage account and select the Cool access tier.
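Scripted, the storage account creation might look like the following (placeholder names; the Cool access tier keeps costs down for back-ups that are written often but read rarely):

# Create the storage account that will hold the firewall back-ups.
New-AzStorageAccount -ResourceGroupName 'rg-fw-backup' `
    -Name 'fwbackupstore01' `
    -Location 'westus2' `
    -SkuName 'Standard_LRS' `
    -Kind 'StorageV2' `
    -AccessTier 'Cool'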




Next, we create the Runbook.


On the Automation account blade, click on Runbooks and create one. For this example, I created a runbook named AzFwBackUp.
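The equivalent PowerShell (placeholder resource names, matching the runbook name used here) would be roughly:

# Create an empty PowerShell runbook named AzFwBackUp in the Automation account.
New-AzAutomationRunbook -ResourceGroupName 'rg-fw-backup' `
    -AutomationAccountName 'aa-fw-backup' `
    -Name 'AzFwBackUp' `
    -Type PowerShell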




 


After creating the runbook, copy the code from our Azure Network Security GitHub repository and paste it into the editor, starting at line one. The code has three functions: 1) create the storage container, 2) export the firewall configuration and save it to storage, and 3) purge the older back-ups. It uses the Get-AzFirewall and Get-AzFirewallPolicy cmdlets to create the snapshot instances.
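As a condensed sketch of the export step only (variable names are illustrative; the full runbook in the repository also creates the container and purges back-ups older than the retention date), the core of the logic looks roughly like this:

# Capture the firewall and its policy, export both as one ARM template, and upload it.
$azureFirewall  = Get-AzFirewall -Name $AzureFirewallName -ResourceGroupName $ResourceGroupName
$firewallPolicy = Get-AzFirewallPolicy -Name $FirewallPolicyName -ResourceGroupName $ResourceGroupName

$backupFile = "AzureFirewall_$(Get-Date -Format 'yyyyMMddHHmmss').json"
$localPath  = Join-Path $env:TEMP $backupFile

Export-AzResourceGroup -ResourceGroupName $ResourceGroupName `
    -Resource @($azureFirewall.Id, $firewallPolicy.Id) `
    -SkipAllParameterization `
    -Path $localPath -Force

# Upload the exported template to the blob container used for the back-ups.
$storageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageKey
Set-AzStorageBlobContent -File $localPath -Container $ContainerName -Blob $backupFile `
    -Context $storageContext -Force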


 


Click Save and then click on the Test pane.




 


On the next page, you will be prompted for a few parameters which you have set up earlier:



  • Provide the resource group name, the Azure Firewall name, and the Firewall Policy name, all of which you can find in your resource group.

  • Provide the storage account name created in the earlier step and the storage key. The storage key can be obtained under “Access Keys” in the Storage Account blade.

  • Give it a blob container name and specify a retention date. Back-ups older than this date will be deleted at the next run.




 


Test the script


Click Start to begin the dry run to confirm you can store a copy of the current configuration. Once done, you can then proceed to create a schedule to make this run periodically.
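The same test can also be kicked off from PowerShell with Start-AzAutomationRunbook; the parameter names below are illustrative and should be matched to the parameters actually defined in the runbook you pasted in.

# Run the runbook once with explicit parameters (all values are placeholders).
Start-AzAutomationRunbook -ResourceGroupName 'rg-fw-backup' `
    -AutomationAccountName 'aa-fw-backup' `
    -Name 'AzFwBackUp' `
    -Parameters @{
        ResourceGroupName  = 'rg-network'
        AzureFirewallName  = 'azfw-prod'
        FirewallPolicyName = 'azfw-prod-policy'
        StorageAccountName = 'fwbackupstore01'
        StorageKey         = '<storage-account-key>'
        ContainerName      = 'firewallbackups'
        RetentionDays      = 30
    }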




The back-up should be available in your storage account’s container in .json format.


 


Create a schedule


To create a schedule, go to the runbook that was created and, on the Runbook blade, select Schedules -> Add a schedule. Give it a name, select “Recurring,” and configure it to run every week or month.




Click “Link the schedule to the runbook,” select the schedule you just created, and then click “Configure parameters and run settings.” Fill in this form as you did earlier.
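If you prefer to script the schedule, a sketch with the Az.Automation cmdlets (placeholder names, weekly recurrence, and the same illustrative parameters as above) is:

# Create a weekly schedule and link it to the runbook, passing the back-up parameters.
New-AzAutomationSchedule -ResourceGroupName 'rg-fw-backup' `
    -AutomationAccountName 'aa-fw-backup' `
    -Name 'WeeklyFirewallBackup' `
    -StartTime (Get-Date).AddDays(1) `
    -WeekInterval 1

Register-AzAutomationScheduledRunbook -ResourceGroupName 'rg-fw-backup' `
    -AutomationAccountName 'aa-fw-backup' `
    -RunbookName 'AzFwBackUp' `
    -ScheduleName 'WeeklyFirewallBackup' `
    -Parameters @{
        ResourceGroupName  = 'rg-network'
        AzureFirewallName  = 'azfw-prod'
        FirewallPolicyName = 'azfw-prod-policy'
        StorageAccountName = 'fwbackupstore01'
        StorageKey         = '<storage-account-key>'
        ContainerName      = 'firewallbackups'
        RetentionDays      = 30
    }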


The runbook should now be all set to create back-ups as .json templates that can be used to restore the firewall to an earlier configuration. The storage account will store your back-ups as specified.




 


Restore Azure Firewall


To restore an Azure Firewall (both the firewall infrastructure and the firewall policy) to an earlier configuration, run the following command in Cloud Shell:


 

New-AzResourceGroupDeployment -name $azurefirewallname -ResourceGroupName $resourcegroupname -TemplateFile $filepath
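
If the template lives in the blob container created earlier, pull it down first and set the variables used by the command above; everything below is a placeholder example.

# Download a backed-up template from the blob container, then reuse it as $filepath above.
$storageContext = New-AzStorageContext -StorageAccountName 'fwbackupstore01' -StorageAccountKey '<storage-account-key>'
Get-AzStorageBlobContent -Container 'firewallbackups' `
    -Blob 'AzureFirewall_20220119103000.json' `
    -Destination '.' `
    -Context $storageContext

$azurefirewallname = 'azfw-prod'
$resourcegroupname = 'rg-network'
$filepath          = './AzureFirewall_20220119103000.json'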

 


 


Note that rule collection groups (RCGs) are treated as dependency objects, and these objects cannot be deployed in parallel. All collection groups reference the policy as a dependency (dependsOn in the ARM template).


This means the deployment will first update the policy and then try to update all rule collection groups in parallel, which may fail because of the policy dependency. (This is currently by Azure Resource Manager (ARM) design and is a roadmap item.)


Hence, rule collection groups must be deployed one after the other by chaining them with the “dependsOn” tag in your firewall back-up file.


As an example, the rule collection group below has been edited to follow the rule processing order.


The format is to first reference the firewall policy, and then the rule collection group object that must be deployed before this one.


 


      "type": "Microsoft.Network/firewallPolicies/ruleCollectionGroups",
      "apiVersion": "2020-11-01",
      "name": "SOC-NS-FWPolicy_premium/DefaultNetworkRuleCollectionGroup",
      "location": "westus2",
      "dependsOn": [
        "[resourceId('Microsoft.Network/firewallPolicies', 'SOC-NS-FWPolicy_premium')]",
        "[resourceId('Microsoft.Network/firewallPolicies/ruleCollectionGroups', 'SOC-NS-FWPolicy_premium', 'DefaultDnatRuleCollectionGroup')]"
      ]


 


(Also, confirm that the Key Vault information is not missing after the redeployment is complete.)


 


In summary, you may need to export firewall settings when creating child policies, restoring from a bad configuration, auditing rules, and so on. By setting up a frequent back-up schedule, you can create a historical record of your configuration.
For more information about Azure Firewall use cases and governance, see the Azure Network Security Tech Community blog.