[Guest Blog] Mixed Reality and the Human Component of Innovation

This article is contributed. See the original author and article here.

This article was written by community member Alexandra Petty, a Mixed Reality Evangelist and Change Management Consultant in the Netherlands as part of our new #HumansofMixedReality guest blogger series, which seeks to humanize the world of mixed reality. In this series, we’ll spotlight the wonderful people working in the mixed reality space globally. Read on to hear her story about why she believes this technology can truly transform lives. 

Meet Alexandra – this is her story:

Meet Alexandra Petty, Mixed Reality Evangelist and Change Management Consultant in the Netherlands

Let me start off by saying that I am a CODA, i.e., a child of deaf adults. I love discovering how we can make innovative technology work for all of us and our loved ones, and make this world and our lives more meaningful. I’ve personally seen how technology has empowered my mother to make meaningful connections through the use of her smartphone. It opened up a world of possibilities for her, and now she can leverage all forms of content – from video calling to speech and text. I cannot wait to see how we continue to push the boundaries of technology to help empower people with various accessibility needs.

New technologies are very exciting – and not just from a technical perspective. I think it’s exciting from a human perspective. How can it work for us? How can we use it to improve someone’s quality of life? How can we use this to improve the quality of our work?

Now focusing back on a topic much closer to our hearts here in the Mixed Reality Community – what can Mixed Reality really do? I think, anything. With near limitless possibilities, driving Mixed Reality to its boundaries is exciting. Sometimes so exciting that we lose sight of the human component. Many end-users out there aren’t even aware that this technology exists, let alone understand how it works. We need to help bridge that knowledge gap so others can also learn and benefit from this technology.

Beyond that, as with any technology innovation, there is a strong need for the human perspective as well. Privacy concerns and ethics matter. Just because we can do it doesn’t mean we should do it – especially now, as we discover more about artificial intelligence and learn to contextualize and visualize data within our environment.

I invite you to join me in a thought experiment:

Imagine you’re an engineer tasked with correctly positioning the side mirrors on a car. Every shift you make needs to be recalculated to see if it fits within the norms of many different countries. Now imagine you could just do that with the help of Mixed Reality, edge computing and AI. No more having to always go back to the drawing board – you can perfect your work immediately, in real-time. It’s a very clear value-add to a business process and someone’s day-to-day work. Better yet, the scenarios in which this can be applied are endless.

Sound revolutionary? This scenario is not new – it was already shown at Microsoft Ignite three years ago, with a little more context and accuracy, of course.

Let’s continue our thought experiment:

Now imagine working as a healthcare professional and being able to see patient information at a glance, right in your field of view, while you’re talking to your patient. No more rifling through documents to look up a file, or spending time at a desk instead of spending critical face-to-face time with your patient. All of your patient’s information can easily be pulled up and reviewed while sitting face-to-face.

The technology is there – this is already possible using Mixed Reality, edge computing, AI and Azure services. The ethics, privacy laws and, most importantly, global awareness are where the challenges lie, and these are things that all of us as Mixed Reality professionals continue to work on daily. So few people are actually aware of what Mixed Reality truly is. How can we, as pioneers and early adopters, the driving force behind this technology, help our fellow human beings understand how this can work for them?

Here’s where the strong human perspective at the conception of any innovation becomes critical.

Let’s say that, as a business, you understand the need for innovation. You have:

  • Your scenarios well thought out and clearly identified
  • The means to research and fund this innovation
  • The conclusion that this would benefit your employees and ultimately your business

What next? How do you make this change successful within your organization? How do you ensure that your employees will have room to process this change, to adapt, learn and truly embrace this technology? The answer is simple: You need to have a human-centered lens, understand how humans perceive innovation and change, and then help facilitate them through the tech adoption process.

Involve the change management team, your early adopters and identify ambassadors right from the start. When implementing Mixed Reality solutions, there is also a strong behavioral change involved. Unlike typical desktop/ line of business applications, you’re not simply replacing one application with another. You’re moving into an entirely new space, often with a new form factor that people are not familiar with, and may find intimidating. Think about how ubiquitous mobile phones are today, and how almost everyone is glued to one wherever they go – that was not always the case. We all have to start somewhere with new technology and build familiarity as we go. This is why we need to demystify Mixed Reality, and open the floodgates of knowledge so everyone can learn. 

As a change management consultant, one of the challenges I’ve seen with implementing innovative technologies is that the excitement gets overwhelming and distracts from the key focus areas. People get excited by shiny new things, and can sometimes overlook the importance of getting the brass tacks of implementing and adopting these new technologies right. Don’t get me wrong – excitement and enthusiasm are good. However, losing sight of who will be impacted by this change, and failing to have a comprehensive change management plan to support your employees through the process, is not.

Want to drive mixed reality adoption in your organization?

  • Your change management team should be working together with your project team as soon as possible. There is a lot of work to be done. Everyone driving this change will also have to learn to use this new technology. They will go through their own personal change management process as they learn to adapt themselves, before considering the bigger picture of how it impacts the organization and most importantly, who.
  • Focus on your people – the humans who will be using this technology. Best practices gathered over the years have shown that the human aspect of change is extremely important and will make or break your project. It determines how quickly the change rolls out across your organization, and how soon it starts generating revenue and netting a true return on investment (ROI).

Where am I in my personal Mixed Reality journey? I’m fortunate to have joined an exciting new startup in the Netherlands that shares my vision. I am excited to explore this new frontier where innovation and people come together, powered by the wonders of mixed reality. I cannot wait to share my findings and my journey with everyone. Today, I work on a lot of Dynamics 365 Remote Assist customer projects to help them infuse MR into their business, and there will be more mixed reality goodness to come!

I hope you enjoyed my story and that my blog left you with some food for thought.

#HumansofMixedReality #CareerJourney #ChangeManagement

Security Controls in Azure Security Center: Apply adaptive application control

As part of our recent Azure Security Center (ASC) blog series, we are diving into the different controls within ASC’s Secure Score. In this post we will be discussing the “Apply adaptive application control” security control.

This security control contains up to 7 recommendations, depending on the resources you have deployed within your environment, and it is worth a maximum of 1 point (2%) that counts towards your overall Secure Score. To understand Azure Security Center’s secure score, make sure you read this article. These recommendations are meant to keep your resources safe and improve your security hygiene.
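To make the point arithmetic concrete, here is a minimal sketch (a hypothetical helper, not part of any Azure SDK) of how a single control’s point value maps to a share of your overall Secure Score. The total of 50 achievable points is purely an illustrative assumption, consistent with 1 point equating to 2%:

```python
# Hypothetical helper (not part of any Azure SDK): compute the share of
# the overall Secure Score that a single security control contributes.
def control_share(control_points: float, total_points: float) -> float:
    """Percentage of the overall Secure Score one control is worth."""
    return control_points / total_points * 100.0

# Assuming 50 total achievable points for illustration,
# a 1-point control works out to 2% of the overall score:
print(control_share(1, 50))  # 2.0
```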

Apply adaptive application control contains the following 7 recommendations, depending on your environment:

  • Log Analytics agent should be installed on your virtual machine 

  • Monitoring agent should be installed on your machines

  • Log Analytics agent should be installed on your Windows-based Azure Arc machines

  • Log Analytics agent should be installed on your Linux-based Azure Arc machines 

  • Log Analytics agent health issues should be resolved on your machines 

  • Adaptive application controls for defining safe applications should be enabled on your machines 

  • Allowlist rules in your adaptive application control policy should be updated 


The example screenshot below shows an environment in which only 6 of those recommendations are within the scope of the Apply adaptive application control security control, because recommendations that do not apply to any resource within your environment do not appear.


Image 1 – Recommendations within the Apply adaptive application control


Like the rest of the Secure Score controls, all these recommendations must be remediated in order to get the full points and drive up your Secure Score (you can review all of the recommendations here). Some might have a “Quick Fix!” button as well – no excuses not to use it: it simplifies remediation and enables you to quickly increase your secure score, improving your environment’s security. To understand how Quick Fix works, please make sure to visit here.

Category #1: Log Analytics agent should be installed on your virtual machine


To monitor for security vulnerabilities and threats, Azure Security Center depends on the Log Analytics agent. The agent collects various security-related configuration details and event logs from connected machines, and then copies the data to your Log Analytics workspace for further analysis. Without the agent, Security Center cannot collect security data from the VM, and some security recommendations and alerts will be unavailable. Within 24 hours, Security Center will determine that the VM is missing the extension and recommend that you install it via this security control. You can manually install the agent with the help of this recommendation, or, if you have auto-provisioning turned on, Security Center installs the extension automatically whenever it identifies a missing agent, which in turn reduces management overhead. Refer to this article to understand the deployment options. Several questions arise at this point, such as how auto-provisioning works when an agent is already installed; to understand that, please read this information.


The following recommendations belong to this category:

  • Monitoring agent should be installed on your machines.

  • Log Analytics agent should be installed on your Windows-based Azure Arc machines. This recommendation applies to Windows-based Azure Arc machines

  • Log Analytics agent should be installed on your Linux-based Azure Arc machines. This recommendation applies to Linux-based Azure Arc machines


Alternatively, to fix this recommendation, you can visit our GitHub repository and leverage the automations we have published there.
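For readers who prefer infrastructure-as-code over portal buttons, a deployment of the agent extension might look roughly like the following ARM template fragment. Treat this as a sketch: the VM name and workspace values are placeholders, and you should verify the extension details against the current Azure documentation before use.

```json
{
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "apiVersion": "2020-06-01",
  "name": "myVm/MicrosoftMonitoringAgent",
  "location": "[resourceGroup().location]",
  "properties": {
    "publisher": "Microsoft.EnterpriseCloud.Monitoring",
    "type": "MicrosoftMonitoringAgent",
    "typeHandlerVersion": "1.0",
    "autoUpgradeMinorVersion": true,
    "settings": { "workspaceId": "<log-analytics-workspace-id>" },
    "protectedSettings": { "workspaceKey": "<log-analytics-workspace-key>" }
  }
}
```

On Linux VMs the corresponding extension is published as OmsAgentForLinux; with auto-provisioning turned on, Security Center deploys the same extension for you.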

Category #2: Log Analytics agent health issues should be resolved on your machines


You’ll notice this recommendation when Azure Security Center finds the Log Analytics agent unhealthy, which means a VM is unmonitored by Security Center because it does not have a healthy Log Analytics agent extension. This could be due to several reasons; one of them is that the agent cannot connect to and register with Security Center because it has no access to the required network resources. Read more about this scenario here. To fully benefit from all of Security Center’s capabilities, the Log Analytics agent extension is required.


For more information about the reasons Security Center is unable to successfully monitor VMs and computers initialized for automatic provisioning, see Monitoring agent health issues.

NOTE: The above recommendations (Category #1 and #2) to install the agent and to resolve agent health issues are prerequisites. You might observe that these recommendations also show up in a different security control, and if they were remediated there, they will not appear here in this security control.

Category #3: Adaptive application controls for defining safe applications should be enabled on your machines


Application allowlisting is not a new concept. One of the biggest challenges of dealing with an application allowlist is maintaining that list. The traditional approach of using AppLocker in Windows is a good solution, but it still has the overhead of keeping up with the applications and making the initial baseline work properly for your needs.

Adaptive application controls are one of the advanced protection features you can benefit from when you turn Azure Defender on; they fall under Cloud Workload Platform Protection (CWPP).


Adaptive application controls help harden your VMs against malware by making it easier to control which applications can run on your Azure VMs. Azure Defender has built-in intelligence that allows you to apply allowlist rules based on machine learning. This intelligence analyzes the processes running in your VMs, creates a baseline of applications, and groups the virtual machines. From there, recommendations are provided that allow you to automatically apply the appropriate allowlist rules. The use of machine learning intelligence makes it super simple to configure and maintain the application allowlist.
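To illustrate the general idea (and only the idea – this is not how Azure Defender is implemented internally), allowlist-based auditing can be sketched as: learn a baseline of processes during an observation period, then flag anything that runs outside that baseline.

```python
# Illustrative sketch of allowlist auditing (hypothetical helpers, not
# Azure Defender's actual algorithm): learn a baseline, then audit.
def build_baseline(observed_runs: list) -> set:
    """Baseline = every executable seen during the learning period."""
    return {exe for run in observed_runs for exe in run}

def audit(baseline: set, running: list) -> list:
    """Return running processes that fall outside the allowlist."""
    return [exe for exe in running if exe not in baseline]

baseline = build_baseline([["sqlservr.exe", "w3wp.exe"], ["w3wp.exe"]])
print(audit(baseline, ["w3wp.exe", "miner.exe"]))  # ['miner.exe']
```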

With this feature, you’re able to audit or alert on applications that are not part of the allowlist. These can even be malicious applications that might otherwise be missed by endpoint protection solutions, or applications with known vulnerabilities. By default, Azure Defender enables application control in Audit mode; no enforcement options are available at the time of writing.

Adaptive application controls do not support Windows machines for which an AppLocker policy is already enabled through either group policy objects (GPOs) or local security policy.


Hope this helps you understand why it is so important to enable them. Learning about adaptive application controls is essential for anyone looking to gain more granular control and security within their environment, so make sure to read our documentation.

Category #4: Allowlist rules in your adaptive application control policy should be updated


This recommendation is displayed when Azure Defender’s machine learning identifies potentially legitimate behavior that hasn’t previously been allowed. It suggests adding new rules to the existing policy to reduce the number of false positives in adaptive application control violation alerts. To edit the application control policy, please refer to this article for more information.

Next Steps


As with all security controls, you need to make sure to remediate all recommendations within the control that apply to a particular resource in order to gain credit towards your secure score.

I hope you enjoyed reading this blog post as much as I enjoyed writing it, and that you learned how this specific control can assist you in strengthening your Azure security posture. For more information, check out:

  • The main blog post to this series (found here)

  • The docs article about Secure Score (this one)


Reviewer


Special Thanks to @Yuri Diogenes, Principal Program Manager in the CxE ASC Team for reviewing this article.

Getting Started with DevOps for Azure SQL | Data Exposed

“Databases-as-Code” is an important principle in improving predictability in developing, delivering, and operating Azure SQL databases. In the first part of this two-part series with Arvind Shyamsundar, we quickly survey the different tools and methodologies available and then show you how to get started with GitHub Actions for a simple CI/CD pipeline deploying changes to an Azure SQL DB.
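As a taste of what such a pipeline can look like, below is a rough sketch of a GitHub Actions workflow that deploys a DACPAC to an Azure SQL database. The server name, secret name and package path are placeholders, and the inputs shown follow the azure/sql-action v1 action – check its documentation for the current syntax before relying on this.

```yaml
name: deploy-azure-sql
on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      # Fetch the repository containing the database project/DACPAC.
      - uses: actions/checkout@v2

      # Deploy the schema packaged as a DACPAC to Azure SQL.
      - uses: azure/sql-action@v1
        with:
          server-name: myserver.database.windows.net
          connection-string: ${{ secrets.AZURE_SQL_CONNECTION_STRING }}
          dacpac-package: ./db/MyDatabase.dacpac
```

Storing the connection string as an encrypted repository secret keeps credentials out of the workflow file itself.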

Azure Unblogged – GitHub

Today, I am pleased to share with you a new episode of Azure Unblogged. I chat with Martin Woodward, Director of Developer Relations at GitHub. Martin and I discuss why GitHub is something that IT Pros and System Administrators should look at learning, the new features GitHub Actions and GitHub Codespaces and how they integrate with Azure, as well as the forthcoming GitHub Universe.

You can watch the full video here or on Microsoft Channel 9.

I hope you enjoyed the video. If you have any questions, feel free to leave a comment, and if you want to check out some of the resources Martin mentioned, please see the links below:


Azure Sphere OS version 20.12 is now available for evaluation

The Azure Sphere OS version 20.12 is now available for evaluation in the Retail Eval feed. The retail evaluation period provides 14 days for backwards compatibility testing. During this time, please verify that your applications and devices operate properly with this release before it is deployed broadly via the Retail feed. The Retail feed will continue to deliver OS version 20.10 until we publish 20.12 in two weeks. For more information on retail evaluation see our blog post, The most important testing you’ll do: Azure Sphere Retail Evaluation.

Azure Sphere OS version 20.12


The 20.12 release includes the following bug fixes and enhancements in the Azure Sphere OS. It does not include an updated SDK. 

  • Reduced the maximum transmission unit (MTU) from 1500 bytes to 1420 bytes.

  • Improved device update in congested networks.

  • Fixed an issue wherein the Wi-Fi module stops scanning but does not respond with a completion event if a background scan is running and the active Wi-Fi network is deleted.

  • Fixed a bug wherein I2CMaster_Write() returns EBUSY when re-sideloading the app interrupts operation.

Azure Sphere SDK version 20.11


On Nov 30, we released version 20.11 of the Azure Sphere SDK. The 20.11 SDK introduces the first Beta release of the azsphere command line interface (CLI) v2. The CLI v2 Beta is installed alongside the existing CLI on both Windows and Linux, and it works with both the 20.10 and 20.12 versions of the OS. For the purpose of retail evaluation, continue to use the CLI v1. For more information on the v2 CLI and a complete list of additional features, see Azure Sphere CLI v2 Beta.

For more information on Azure Sphere OS feeds and setting up an evaluation device group, see Azure Sphere OS feeds. 

For self-help technical inquiries, please visit Microsoft Q&A or Stack Overflow. If you require technical support and have a support plan, please submit a support ticket in Microsoft Azure Support or work with your Microsoft Technical Account Manager. If you would like to purchase a support plan, please explore the Azure support plans.

Azure Service Fabric 7.2 Fourth Refresh Release

The Azure Service Fabric 7.2 fourth refresh release includes stability fixes for standalone and Azure environments, and has started rolling out to the various Azure regions. The updates for the .NET SDK, Java SDK and Service Fabric runtime will be available through Web Platform Installer, NuGet packages and Maven repositories in 7-10 days within all regions.

You will be able to update to the 7.2 fourth refresh release through a manual upgrade on the Azure Portal or via an Azure Resource Manager deployment. Due to customer feedback on releases around the holiday period, we will not begin automatically updating clusters set to receive automatic upgrades.

  • Service Fabric Runtime
    • Windows – 7.2.445.9590
    • Service Fabric Standalone Installer Package for Windows Server – 7.2.445.9590
  • .NET SDK
    • Windows .NET SDK – 4.2.445
    • Microsoft.ServiceFabric – 7.2.445
    • Reliable Services and Reliable Actors – 4.2.445
    • ASP.NET Core Service Fabric integration – 4.2.432
  • Java SDK – 1.0.6

Key Announcements

  • .NET 5 apps for Windows on Service Fabric are now supported as a preview. Look out for the GA announcement of .NET 5 apps for Windows on Service Fabric in the coming weeks.

  • .NET 5 apps for Linux on Service Fabric will be added in the Service Fabric 8.0 release (Spring 2021).

  • Windows Server 20H2 is now supported as of the 7.2 CU4 release.


For more details, please read the release notes.