Make the ‘choice’ to use custom fields


You can create a type of custom field that allows you to quickly choose from several options. Choice custom fields are great for scenarios such as having a Risk column with “High”, “Medium”, and “Low” values. Also, try using them for rough planning of tasks such as “This week”, “Next Week”, and “Later”. Like other custom fields, you can use them in the Grid, in the Board view, and in the task details card. Try them out!




Getting Started


As with other custom fields, click the “Add column” button and choose the “New field” option.




Then, select the “Choice” field type.




 


Now give your field a name and a set of options. Use the delete icon next to a choice, or the Add a Choice button at the bottom, to manage how many choices you have. You can drag the handle on the left side to reorder choices, or press Ctrl + Up/Down on Windows (Command + Up/Down on Mac) to reorder them with your keyboard.




 


Frequently Asked Questions


How many choices can I make?


20 for now. We will support 25 soon. Let us know if you need more!


 


How can I use emojis in my fields?


Windows and Mac both include an emoji picker. On Windows, press Win + ; (the Windows key plus semicolon) to open it. On Mac, press Control + Command + Space.


 


Are choice fields included when I make copies of a project?


Yes! Both the fields and the values are included when you copy a project.


 


Where can I go to learn about the other types of custom fields?


Head over to our support site to learn more about the types of custom fields available in Project for the web.


 


Are these “enterprise” fields? Can I build reports using them?


No. Like the other types of custom fields, this field type is only visible in Project for the web.

Zero Trust, The Essentials video series


This video series shows how you can adopt a Zero Trust approach to security and benefit from the core ways in which Microsoft can help. In the past, your defenses may have been focused on protecting network access with on-premises firewalls and VPNs, assuming everything inside the network was safe. But as corporate data footprints have expanded beyond the corporate network, into the cloud or a hybrid across both, the Zero Trust security model has evolved to address a more holistic set of attack vectors.


 




 


Based on the principles of “verify explicitly”, “apply least privileged access” and “always assume breach”, Zero Trust establishes a comprehensive control plane across multiple layers of defense:



  • Identity

  • Endpoints

  • Applications

  • Network

  • Infrastructure

  • Data


Introduction to Zero Trust 


Identity:


Join our host, Jeremy Chapman, as he unpacks the foundational layer of the model: identity. As the primary control plane for Zero Trust, identity acts as the front door for people, service accounts, and devices as each requests access to resources. Identity is at the core of the Zero Trust concepts of never trust, always verify, and granting the appropriate level of access through the principle of least privilege.


 


Zero Trust | Identity


Endpoints & Applications:


See how you can apply Zero Trust principles and policies to your endpoints and apps, the conduits for users to access your data, network, and resources. For Zero Trust, endpoints refer to the devices people use every day, both corporate and personally owned computers and mobile devices. The prevalence of remote work means devices can be connected from anywhere, and the controls you apply should be correlated to the level of risk at those endpoints.


 


For corporate managed endpoints that run within your firewall or your VPN, you will still want to use the principles of Zero Trust: verify explicitly, apply least privileged access, and assume breach. Jeremy Chapman walks through your options, controls, and recent updates to implement the Zero Trust security model.


 


Zero Trust | Endpoints & Applications

Microsoft Viva, The Essentials video series




 


This series of videos shows team leaders and admins the underlying tech and options for enabling and configuring the four core modules of Microsoft Viva. Viva is the new employee experience platform that connects learning, insights, resources, and communication. It has a unique set of curated and AI-enriched experiences built on top of and integrated with the foundational services of Microsoft 365.


 



Introduction to Microsoft Viva 


 


Microsoft Viva’s 4 core modules:



  • Viva Topics — builds a knowledge system for your organization

  • Viva Connections — boosts employee engagement

  • Viva Learning — creates a central hub to discover learning content and build new skills

  • Viva Insights — recommends actions to help improve productivity and wellbeing


 


Microsoft Viva Topics:


Viva Topics builds a system that transforms information into knowledge and actively delivers it to you in the context of your work. As many of us are working remotely or in more hybrid office environments, it can be harder to stay informed. With Topics, we connect you to the knowledge and the people closest to it. CJ Tan, Lead Program Manager, joins host Jeremy Chapman to cover the overall experience for users, knowledge managers, and admins.


 


Microsoft Viva Topics


 


Microsoft Viva Connections:


Viva Connections is specifically about boosting employee engagement. This spans everyone in your organization, from everyday users and specific groups in departments to frontline workers. It expands on your SharePoint home site and newsfeed and is designed to offer a destination that delivers personalized news, conversations, and commonly used resources. Adam Harmetz, lead engineer, joins host Jeremy Chapman to walk through the user experience, how to set it up, and options for personalizing information sharing by role.


 


Microsoft Viva Connections


 


Microsoft Viva Learning:


With Viva Learning, you have a center for personalized skill development that offers a unique social experience where learning content is available in the flow of work. It recommends and manages the progress of key trainings all from one place and is built on top of SharePoint, Microsoft Search, Microsoft Teams, Microsoft Graph, and Substrate. Swati Jhawar, Principal Program Manager for Microsoft Viva, joins Jeremy Chapman to share options for setup, learning content curation, and integration with your existing learning management system.


 


Microsoft Viva Learning


 


Microsoft Viva Insights:


With hybrid work at home and in the office as the new normal, Viva Insights gives individuals, managers, and leaders the insight to develop healthier work habits and a better work environment. It is an intelligent experience designed to leverage MyAnalytics, Workplace Analytics, and Exchange Online to deliver insights that recommend actions to help prioritize well-being and productivity. Engineering leader Kamal Janardhan joins Jeremy Chapman for a deep dive and a view of your options for configuration.


 


Microsoft Viva Insights

Deploy PyTorch models with TorchServe in Azure Machine Learning online endpoints


With our recent announcement of support for custom containers in Azure Machine Learning comes support for a wide variety of machine learning frameworks and servers including TensorFlow Serving, R, and ML.NET. In this blog post, we’ll show you how to deploy a PyTorch model using TorchServe.


The steps below reference our existing TorchServe sample here.


 


Export your model as a .mar file


To use TorchServe, you first need to export your model in TorchServe’s model archive (.mar) format. Follow the PyTorch quickstart to learn how to do this for your PyTorch model.


Save your .mar file in a directory called “torchserve.”
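 

For reference, a typical export with TorchServe’s torch-model-archiver tool looks something like the sketch below. The file names here (model.py, densenet161-8d451a50.pth, index_to_name.json) come from the TorchServe densenet161 example and are placeholders for your own model artifacts.

 

# Install the archiver tool that ships alongside TorchServe
pip install torch-model-archiver

# Package the model code and weights into densenet161.mar
torch-model-archiver --model-name densenet161 \
    --version 1.0 \
    --model-file model.py \
    --serialized-file densenet161-8d451a50.pth \
    --extra-files index_to_name.json \
    --handler image_classifier

# Place the archive where the rest of this walkthrough expects it
mkdir -p torchserve
mv densenet161.mar torchserve/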


 


Construct a Dockerfile


In the existing sample, we have a two-line Dockerfile:


 


 

FROM pytorch/torchserve:latest-cpu

CMD ["torchserve","--start","--model-store","$MODEL_BASE_PATH/torchserve","--models","densenet161.mar","--ts-config","$MODEL_BASE_PATH/torchserve/config.properties"]

 


 


Modify this Dockerfile to pass the name of your exported model from the previous step as the “--models” argument.
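 

For example, if your archive from the previous step were named my_model.mar (a placeholder name), the CMD line would become:

 

CMD ["torchserve","--start","--model-store","$MODEL_BASE_PATH/torchserve","--models","my_model.mar","--ts-config","$MODEL_BASE_PATH/torchserve/config.properties"]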


 


Build an image


Now, build a Docker image from the Dockerfile in the previous step, and store this image in the Azure Container Registry associated with your workspace:


 


 

# Look up the default workspace configured for the Azure CLI
WORKSPACE=$(az config get --query "defaults[?name == 'workspace'].value" -o tsv)

# Get the name of the Azure Container Registry attached to the workspace
ACR_NAME=$(az ml workspace show -w $WORKSPACE --query container_registry -o tsv | cut -d'/' -f9-)

if [[ $ACR_NAME == "" ]]
then
    echo "Could not find a container registry for workspace $WORKSPACE, exiting"
    exit 1
fi

# Log in to the registry, then build and push the image with ACR Tasks
az acr login -n $ACR_NAME
IMAGE_TAG=${ACR_NAME}.azurecr.io/torchserve:8080
az acr build $BASE_PATH/ -f $BASE_PATH/torchserve.dockerfile -t $IMAGE_TAG -r $ACR_NAME

 


 


Test locally


Ensure that you can serve your model by doing a local test. You will need to have Docker installed for this to work. Below, we show you how to run the image, download some sample data, and send a test liveness and scoring request.


 


 

# Run image locally for testing
docker run --rm -d -p 8080:8080 --name torchserve-test \
    -e MODEL_BASE_PATH=$MODEL_BASE_PATH \
    -v $PWD/$BASE_PATH/torchserve:$MODEL_BASE_PATH/torchserve $IMAGE_TAG

# Check Torchserve health
echo "Checking Torchserve health..."
curl http://localhost:8080/ping

# Download test image
echo "Downloading test image..."
wget https://aka.ms/torchserve-test-image -O kitten_small.jpg

# Check scoring locally
echo "Uploading testing image, the scoring is..."
curl http://localhost:8080/predictions/densenet161 -T kitten_small.jpg

docker stop torchserve-test

 


 


Create endpoint YAML


Create a YAML file that specifies the properties of the managed online endpoint you would like to create. In the example below, we specify the location of the model we will use as well as the Azure Virtual Machine size to use when deploying.


 


 

$schema: https://azuremlsdk2.blob.core.windows.net/latest/managedOnlineEndpoint.schema.json
name: torchserve-endpoint
type: online
auth_mode: aml_token
traffic:
  torchserve: 100

deployments:
  - name: torchserve
    model:
      name: torchserve-densenet161
      version: 1
      local_path: ./torchserve
    environment_variables:
      MODEL_BASE_PATH: /var/azureml-app/azureml-models/torchserve-densenet161/1
    environment:
      name: torchserve
      version: 1
      docker:
        image: {{acr_name}}.azurecr.io/torchserve:8080
      inference_config:
        liveness_route:
          port: 8080
          path: /ping
        readiness_route:
          port: 8080
          path: /ping
        scoring_route:
          port: 8080
          path: /predictions/densenet161
    instance_type: Standard_F2s_v2
    scale_settings:
      scale_type: manual
      instance_count: 1
      min_instances: 1
      max_instances: 2

 


 


Create endpoint


Now that you have tested locally and you have a YAML file, you can create your endpoint:


 


 

az ml endpoint create -f $BASE_PATH/$ENDPOINT_NAME.yml -n $ENDPOINT_NAME

 


 


Send a scoring request


Once your endpoint finishes deploying, you can send it unlabeled data for scoring:


 


 

# Get accessToken
echo "Getting access token..."
TOKEN=$(az ml endpoint get-credentials -n $ENDPOINT_NAME --query accessToken -o tsv)

# Get scoring url
echo "Getting scoring url..."
SCORING_URL=$(az ml endpoint show -n $ENDPOINT_NAME --query scoring_uri -o tsv)
echo "Scoring url is $SCORING_URL"

# Check scoring
echo "Uploading testing image, the scoring is..."
curl -H "Authorization: {Bearer $TOKEN}" -T kitten_small.jpg $SCORING_URL

 


 


Delete resources


Now that you have successfully created and tested your TorchServe endpoint, you can delete it.


 


 

# Delete endpoint
echo "Deleting endpoint..."
az ml endpoint delete -n $ENDPOINT_NAME --yes

# Delete model
echo "Deleting model..."
az ml model delete -n $AML_MODEL_NAME --version 1

 


 


Next steps


Read our documentation to learn more and see our other samples.


 

4 post-pandemic retail trends


The last few decades have seen monumental change in the retail industry. Technology has untethered the shopper from the store and allowed retail to take place anywhere, at any time. In hindsight, these changes occurred just in time: without the e-commerce revolution of the past two decades, retailers would have faced a significantly more difficult situation during the recent pandemic. Nonetheless, innovative companies have been able to adapt and adjust to these challenging times.

Businesses looked to technology to enable the agility not only to meet emerging customer needs but also to drive lasting impact on their organization’s business efficiency, customer experience, and market position. Patagonia partnered with Microsoft and Dynamics 365 to navigate the market turmoil and continue to innovate toward its business goals.


The pandemic forced many retailers to rethink almost everything related to the purchase experience. Where and how can customers pick up and return their purchased items? How can they pay? What is the utility of a store in an online-first shopping environment? Questions like these are now being answered in ways that will reverberate through the industry long after the pandemic has receded.

Dynamics 365 Commerce user interface across different devices and channels

Dynamics 365 has helped equip retailers with a flexible, unified, and seamless buying experience for customers and partners. With Dynamics 365 Commerce, businesses have been able to engage across traditional and emerging channels while allowing consumers the option to buy when, how, and where they want, on any device, by delivering a frictionless and consistent experience across online and offline channels.


As we continue through the pandemic, both shoppers and retailers will have to adapt to a new normal. Let’s take a closer look at some trends that will persist in retail for the time to come.

1. Flexible order fulfillment

2020 ushered in the mass adoption of non-traditional fulfillment opportunities. One of these new methods that became essential during the past year is BOPIS, or buy-online-pick-up-in-store. As the pandemic unfolded, retailers saw a 208 percent increase in BOPIS from the previous year as customers gravitated to the increased convenience and choice that flexible order fulfillment offered. This accelerated a trend that had been gaining ground in some industries, like grocery, and saw it spread to retail sectors of all kinds.

2. Flexible returns

Like flexible order fulfillment, flexible returns provide the customer with increased convenience, this time related to items they no longer need or are not satisfied with. Returns are costly, and COVID-19 supercharged their numbers, with 2020 seeing consumers return $428 billion in merchandise, with $102 billion attributed to online purchasing, up 38.5 percent from 2019. As we move forward, the challenge will be for retailers to manage costs around returns without depriving consumers of the ease of returns they came to expect during the pandemic.

3. Contactless payments

Of all the trends mentioned in this blog, contactless payment via radio frequency identification (RFID) and near-field communication technology was perhaps the most widely adopted before the pandemic. Even at that, the need for increased cleanliness protocols and a desire to avoid touching surfaces where possible meant that the use of contactless payments has increased by 30 percent since COVID-19 started. With continued growth expected, savvy retailers will no longer allow the purchase experience to be tethered to specific locations or modes of payment.

4. Omnichannel retail

Like contactless payments, omnichannel retail plans had already been a high priority for many companies pre-COVID-19. However, the pandemic forced many retailers to speed up transitions from mostly brick-and-mortar to a proper omnichannel solution. Omnichannel has become an absolute requirement for modern retailers to survive. Part of this transition includes a re-thinking of the physical store itself. Rather than acting as the critical location in the retail journey, a place for customers to browse for and purchase items, the store now serves as a place for customers to see, touch, and feel items they have discovered through other shopping channels before final purchase.

Microsoft is helping retailers deliver on their brand promises

In each of the trends mentioned, consumers rushed to welcome the added flexibility in the retail experience. As we continue through the pandemic and return to a new normal, consumers will be reluctant to give up their newfound conveniences.

Cloud for Retail offer and affiliated retail scenario outcomes

At NRF in January, we announced Microsoft Cloud for Retail to help retailers deliver seamless experiences across the entire buying journey, and we recently announced even more capabilities within the Microsoft Cloud for Retail public preview to help businesses build flexibility into their systems and thrive in the post-pandemic world.

What’s next?


Join in the Azure Sentinel Hackathon 2021!




 


Today, we are announcing the 2nd annual Hackathon for Azure Sentinel! This hackathon challenges security experts around the globe to build end-to-end cybersecurity solutions for Azure Sentinel that deliver enterprise value by collecting data, managing security, detecting, hunting, investigating, and responding to constantly evolving threats. We invite you to participate for a chance to solve this challenge and win a piece of the $19,000 cash prize pool*. This online hackathon runs from June 21 to October 4, 2021, and is open to individuals, teams, and organizations globally.



Azure Sentinel provides a platform for security analysts and threat hunters of all levels to leverage existing content such as workbooks (dashboards), playbooks (workflow orchestrations), analytic rules (detections), and hunting queries, and to build custom content and solutions. Azure Sentinel also provides APIs for integrating different types of applications with Azure Sentinel data and insights. Here are a few examples of end-to-end solutions that unlock the potential of Azure Sentinel and drive enterprise value.




You can discover more examples by reviewing content and solutions in the Azure Sentinel GitHub repo and blogs. You can also refer to last year’s Azure Sentinel Hackathon for ideas!


 


Prizes


In addition to learning more about Azure Sentinel and delivering cybersecurity value to enterprises, this hackathon offers the following awesome prizes for top projects:



  • First Place (1) – $10,000 USD cash prize  

  • Second Place (1) – $4,000 USD cash prize

  • Runners Up (2) – $1,500 USD cash prize each

  • Popular Choice (1) – $1,000 USD cash prize

  • The first 10 eligible submissions also qualify to receive $100 each.


Note: Refer to the hackathon’s official rules for details on the project types that qualify for each prize category.


In addition, the four winning projects will be heavily promoted on Microsoft blogs and social media so that your creative projects are widely known. The judging criteria are the quality of the idea, value to the enterprise, and technical implementation. Refer to the Azure Sentinel Hackathon website for further details and to get started.


 


Judging Panel


Judging commences immediately after the hackathon submission window closes on October 4th, 2021. We’ll announce the winners on or before October 27th, 2021. Our judging panel currently includes the following influencers and experts in the cybersecurity community.



  • Ann Johnson – Corporate Vice President, Cybersecurity Solutions Group, Microsoft

  • Vasu Jakkal – Corporate Vice President, Microsoft Security, Compliance and Identity

  • John Lambert – Distinguished Engineer and General Manager, Microsoft Threat Intelligence Center

  • Nick Lippis – Co-Founder, Co-Chair ONUG

  • Andrii Bezverkhyi – CEO & founder of SOC Prime, inventor of Uncoder.IO


 


Next Steps



Let the #AzureSecurityHackathon begin!


 


*No purchase necessary. Open only to new and existing Devpost users who are the age of majority in their country. Game ends October 4th, 2021 at 9:00 AM Pacific Time. Refer to the official rules for details. 


 

Migrate & Modernize Linux VMs and Databases into Azure

Migrate & Modernize Linux VMs and Databases into Azure

This article is contributed. See the original author and article here.

Look at Azure as a platform for running your Linux virtual machine and open source database workloads. Check out options for how you can lift and shift existing VMs and databases to Azure and modernize them using cloud native approaches. Matt McSpirit, from the Azure engineering team, joins Jeremy Chapman to show how Azure supports open source platforms across operating systems, with different Linux distros as well as their publishers and open source databases.


 




 


Azure has been working with Red Hat, SUSE, Canonical, Flatcar, Elastic and HashiCorp, and open source databases like MySQL, Postgres, Cassandra, and MariaDB for years. More than 60% of our marketplace solutions run on Linux, and we support open source native PaaS services as well. Beyond the workload level, we contribute back to the upstream Linux and Kubernetes communities that many of the modern and cloud native architectures rely on.


 



 





QUICK LINKS:


01:09 — Run Linux VMs in Azure


03:01 — Move an open source app from on-prem into Azure


06:04 — How to migrate VMs


07:36 — How to move database into Azure


10:52 — Repackage your VM to run as a container


12:40 — Configure an app


13:31 — Other options


14:48 — Wrap up


 


Link References:


To find information related to Linux running on Azure, check out https://azure.com/Linux


Go to Azure migrate and test out a migration at https://aka.ms/azuremigrate


Find the tools to migrate your data stores at https://aka.ms/datamigration


Deploy Red Hat solutions on Azure at https://Azure.com/RedHat


Run SUSE Linux on Azure at https://Azure.com/SUSE


For more on Azure, go to https://Azure.com/AzureMigrate


 


Unfamiliar with Microsoft Mechanics?


We are Microsoft’s official video series for IT. You can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.



 


Keep getting this insider knowledge, join us on social:











– Up next, we’ll look at Azure as a platform for running your Linux virtual machine and open source database workloads, from your options for how you can lift and shift existing VMs and databases to Azure, and also modernize them using cloud-native approaches. So, today I’m joined by Matt McSpirit from the Azure engineering team, and also no stranger to Microsoft Mechanics. Welcome back.


 


– Hi, thanks for letting me back on.


 


– And thanks for joining us today. So, we wanted to do a show based on Azure support for open source platforms across operating systems, with different Linux distros and also their publishers and open source databases. You might be surprised that more than 50% of the workloads running in Azure are actually running on Linux VMs.


 


– Yeah, that’s right. And we’ve been working with Red Hat, SUSE, Canonical, Flatcar, Elastic and HashiCorp, and open source databases like MySQL, Postgres, Cassandra, MariaDB, and more for years. And it’s actually more than 60% of our marketplace solutions that run on Linux, and we support open source native PaaS services too. Then beyond the workload level, we also contribute back to the upstream Linux and Kubernetes communities that many of the modern and cloud native architectures rely on.


 


– Okay. So let’s unpack this a bit more starting with your options then that we have for running infrastructure in VMs. So, if you’re a Linux shop maybe running multiple platforms, what’s the best way to think about running Linux VMs in Azure?


 


– Well, nearly every business or organization out there is running on multiple platforms. So there isn’t really a concept of a pure Linux shop or a Microsoft shop these days. And both platforms have a lot of pros and cons, and we’ve done a ton of work for performance, reliability, manageability, and security to make Azure the best home for running any open source workload. Now, starting at the foundational level; as I mentioned, we’re working with the leading Linux distros to optimize the kernels for Azure, including tuning the kernel for a Hypervisor, providing updates and patches and performance enhancements for running open source databases. And we also work closely with Red Hat for managed services like Azure Red Hat OpenShift, and SUSE with SAP enhancements. So we want to make sure when you bring your workloads to Azure, there’s benefit in every step of the way, from onboarding to operation, and you gain more security than you may have had on premises, in your private cloud or in another cloud. And whether you’re starting green field or bringing what you’ve already got running through Azure, we’ve got you covered.


 


– Okay, so spinning up a couple of VMs, I think from Azure is pretty straightforward. But, what if you got dozens or hundreds of VMs that constitute your apps, how would I bring those into Azure?


 


– Well, Azure Migrate is, as we know, your one-stop shop in Azure for bringing in virtual machines, databases, complete applications, even entire VMware sites into Azure. And of course, you can rebuild or rehydrate everything using automation for the apps you install in VMs running in Azure. And those will work the same as you’d expect. But unless you’ve fully automated that process, you’ll likely save a ton of time using Azure Migrate.


 


– Great. So, let’s make this real though. Can you show us how you’d move an open source app then from an on-prem system into Azure?


 


– Sure. Now, first I’ll start by showing you our app called Airsonic. It’s an open source Java app that you can find on GitHub and it’s used to host podcasts, as you can see here. Now, it’s running in an on-prem VMware environment and consists of a frontend VM running on Apache Tomcat on Ubuntu, and a backend VM with MySQL also running on Ubuntu. And I want to migrate and modernize the app. So here’s what I’m going to do. We want to start by lifting and shifting the frontend VM into Azure. Now, as I mentioned, the backend database is running in MySQL on a Linux VM. And instead of lifting and shifting that, I’m going to migrate the data directly from the VM into Azure MySQL, a PaaS service, so that I don’t have to worry about managing that VM once the data’s in Azure. And finally, we’ll take the frontend VM from the first step and containerize it so that it runs as a container in the Azure Kubernetes Service. And this step is all about modernization and being able to take advantage of cloud native, scalable compute. And of course, that last step of containerizing the app, well, you can do that from anywhere. It doesn’t need to be currently residing in Azure. I just wanted to start by showing you a lift and shift VM-to-VM migration, because it’s probably where most people will start. So, I’m in Azure Migrate, I’ve already performed a VM discovery on my on-premises VMware environment. And you can see we’ve got hundreds of Linux virtual machines here.


 


– Right. And by the way, if you want to see how that process works for Azure Migrate, we’ve got a complete step-by-step guide. So check out our recent show on Azure virtual machine migration using Azure Migrate. Now the process is the same by the way, for both Linux as well as Windows.


 


– Absolutely. So in my case, since I’ve already run the discovery, I just need to search for the VMs I want to migrate. So in this case, I’m going to search for Woodgrove, and you’ll see that these two VMs that make up our app both are running Ubuntu with two cores and four gigs of RAM. And if I click into software inventory, for this one you’ll see everything running in each machine. I can also see dependencies, which is all the TCP/IP traffic connecting to our VM. This way I can ensure I migrate everything I need to. Now looking at port 3306, you’ll see the MySQL server. And if I switch tabs to my assessment and click into assess Azure VM, then edit our assessment properties, you’ll see all of the options for basic properties, VM size and pricing. Now I’m going to close that and next, I’ll create an assessment using those two VMs. I’ll give it a name, AirsonicPAYG, pay as you go. Now I’ll create a group with my two VMs and I’ll call it Airsonic. Then I’ll search for Woodgrove again, select my two VMs and finally hit create assessment. Now, if I click into the assessment, you’ll see the details for Azure readiness, with cost details for storage and compute. And when I click into Azure readiness, I can also see the recommended VM size, in my case both the standard F1. If I click into cost details, you’ll see the cost estimates broken down by compute and storage. Now I can tweak all of these values, but everything looks good to me and we can start migrating.


 


– So now that you have the assessment complete, how do you go about doing the actual migration?


 


– Well, with Azure Migrate, you could move both these VMs and the tools even scale to thousands of VMs if you’ve got a load of apps. But in my case, I’m only replicating the one VM for the front end. So, I’m still in Azure Migrate and below the assessment tools are the migration tools. So I’m going to click into replicate. I’m going to choose my virtualization platform, in this case VMware, select my appliance and hit next. And here I could import my assessment to migrate. But since I just want to migrate the one VM, I’ll specify the settings manually. I’m going to search again for Woodgrove and choose the Airsonic frontend VM and hit next. And here I need to enter standard options in Azure as the target settings, like location, subscription, VNet, etc. Now to save time everything’s filled in, and so I’m going to hit next. Now I can choose my VM size and I’ll just pick a standard D2a_v4 and hit next. I’ll keep all selected in the disks tab, and hit next. And now I’m ready to replicate. So let’s go ahead and do that. That’ll save the contents of the VM into my storage account. And back on the Azure Migrate tab, you’ll see our replication has started. If I click into it I can test from here, but to save a step, I’m just going to go straight into Migrate. Select my server and hit migrate, and in just a moment, it’s now a running clone of my original VM. And as you can see, it’s running now in Azure. Here, I could just continue using the on-premises database if I needed to keep it on-prem, but I’d just need to make sure this VM could reach it, and then just redirect the app’s DNS settings to this new IP.


 


– Okay. So in this case, your app is still running, but now you’ve got the front end as a VM in Azure, but the database is left running as VM in your on-prem environment. So, how would I move the database then into Azure?


 


– Well, we’ve got tools that can help with that as well. So, instead of replicating and migrating as a VM, I’m going to convert it to the PaaS service. That way in the future, I don’t need to worry about managing that underlying VM. So let’s do that. We’ll use the Azure Database Migration Service to migrate our database’s contents to Azure MySQL. Now DMS works for both MySQL and Postgres. The first thing I needed to do in this case was create a MySQL instance in Azure, which I have done in advance. Now in the MySQL Workbench, I set up a tab for my source VM’s database on-prem and one for my target in Azure. In the source, I can see all of my tables, but in the target you’ll see there aren’t any tables. Now using the CLI, I’ll run a mysqldump against my source database. Enter the source database password. And with this command, I can see it’s already created a dump, and this is a pretty small database. Now I just need to import the dump I just created into our target database in Azure with this command containing the address. I’ll enter the target’s database password, and this is just copying a schema and a table structure over to the target instance, but not the data yet. And once my dump has moved into the target, I can go back to the MySQL Workbench, and in the target database tab I’ll refresh and I can now see all of my tables are there. But they’re still empty, so let’s fill them. So next I’ll head over to the database migration service in Azure. I’ve already created a DMS instance in advance, but now I’ll create a migration project. I’ll give it a name, Airsonic test migration. I’ll choose the target server type and you’ll see additional options here, including Postgres and Mongo, but we’re using MySQL. And now I’ll hit create and run activity. It’s going to ask me for source details, server name or IP. I’ll enter the IP. I’ll keep port 3306, enter my username, root, and my password, and then I’ll move to the target settings. So I’m going to paste in the server name, its address, my username, Airsonic admin and password. And now I can choose the databases I want to migrate. I only need to move the Airsonic one. So next, I’ll configure the migration settings and you’ll see DMS found our 36 tables. I’ll take a look at everything and keep all the tables selected and move on to the summary. And now I just need to give the activity a name, Airsonic migration 1, and hit start migration. Now I can monitor everything here, but since our database is only a little bigger than two MB, if I hit refresh, you’ll see it’s complete. Now, if I click into it, you’ll see my tables are all complete as well. So now, it’s up and running in our Azure instance. And if you have a large database, we’ve got super fast, parallelized Azure DMS migration for MySQL, where we’ve been able to burst up to 188 GiB per hour, which is great to minimize the servicing window. And if you need to, you can even migrate deltas between the first and final migration. Now, finally, we would normally update the connection strings so our app knows about the new location of our database. But we’re going to wait, because I want to modernize the front end to run in containers, and that’s going to take just a few minutes. But this will also open up better scalability, and I won’t need to maintain the Ubuntu VM.
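 

For reference, the schema-only dump and import described above might look like the following from the command line. The host names, user names, and database name here are placeholders.

 

# Dump only the schema (no data) from the on-prem source database
mysqldump --no-data -h 10.0.0.4 -u root -p airsonic > airsonic_schema.sql

# Import the schema into the Azure Database for MySQL target
mysql -h airsonic-mysql.mysql.database.azure.com -u airsonicadmin -p airsonic < airsonic_schema.sql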


 


– So how do you repackage or convert then your front end VM and everything inside of it so it can run as a container in Azure without having to rewrite the app?


 


– So for that, I can use another tool from Azure Migrate called app containerization to containerize the front end VM without needing to rewrite the application. So under explore more in the Web apps to Container section, I’ll click into app containerization. And you’ll see this works with ASP.NET apps running in AKS and Java web apps running on Apache Tomcat. Now from there, I’ll download the tool and with it running, I’ll need to specify the app type, in our case, Java. I’m going to walk through the prereqs and make sure SSH is enabled, then continue. And now I need to log in with my Azure credentials, select my tenant and subscription and continue. And now I can enter the IP address or fully qualified domain name of my server. And I’ll use the IP here and enter the username and password. Now, once I validate connectivity, I can continue and that will discover the apps running on my server, and you’ll see it found our Airsonic app. So I just need to select it and enter the target container, Airsonic V1. And here, I’ll take a look at the parameters. I’ll keep all of these settings and check all the boxes for username, password, and URL and click apply. Now I need to make a slight edit to the app folder location to map everything correctly. So I’ll add a new folder and enter the path. Then I’ll change the target storage to a persistent volume and save my changes. And now we can move on to the build. Now I just need to choose my Azure container registry and select my container. I’ll hit build. And after a moment, it succeeds. So now I can move on to deployment specifications, where I just need to select my AKS cluster called Contoso-AKS, and then continue. Now I’ll specify my file share with subscription, storage account, and the file share. Now to save time, I’ve configured these in advance. Now in the deployment configuration, I need to configure the app. So here’s where I can check the prefix string, ports, replicas, load balancer type, and I’ll keep what’s there, and enter my username, password, and the URL. I’ll keep the storage config as it is, and I’ll hit apply. And now we’re ready to deploy. So I’ll do that, and once it’s finished, I can click into the link here and see the public IP and the resources. And just to test this out, I’m going to use kubectl and run a get service command, and you’ll see our Airsonic containers running along with the external IP and port that we just saw. So now let’s try this in the browser. I’m going to paste in the address and there’s our app, and I’ll make sure everything works and it’s looking good. So now we’ve migrated our app into Azure and even modernized it to use MySQL PaaS and cloud native containers. So it’s going to be way more scalable and easier to manage.
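 

The service check described above is a standard kubectl query; the service name will be whatever the containerization tool generated for your app:

 

# List services and look for the Airsonic entry's external IP and port
kubectl get service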


 


– Okay. So now that your app is running in Azure, what are some of the other things that you can do?


 


– Well, there’s a lot. But to look at just a few highlights, firstly, there are a ton of options to take advantage of for high availability. So those start with VM options for availability sets and availability zones for redundancy, all the way to disaster recovery solutions to ensure your services are as resilient as you need them to be. Now moving up the stack into the management layer, we’ve also integrated Linux and open source databases. So, all the scaling and elasticity in Azure works for Linux: virtual machine scale sets, diagnostics, monitoring, and software update management are all built in. You can use the Azure portal to manage all of the Linux-based services. And you can take advantage of proactive security with Azure Defender and the Azure Security Center to keep your infrastructure and your data protected. Plus, there are AI and ML capabilities that can be applied to your Linux stack, enhancing your application and workloads with cognitive services or machine learning services. And if your organization uses managed Linux services, we’ve worked closely with Red Hat and SUSE to offer unique, integrated support experiences where you can raise tickets and our support team will work with Red Hat or SUSE support teams to triage cases together. And in fact, Azure is the only cloud service doing that today.


 


– Right. And these are just a few examples of how Microsoft is working with the open source community. So Matt, if anyone is watching and they want to get started, what do you recommend?


 


– To find just about everything related to Linux running on Azure, check out azure.com/linux. And once you’re ready to test out a migration, you can get to Azure migrate at aka.ms/azuremigrate. And to find the tools to migrate your data stores, check out aka.ms/datamigration. We’ve also got a ton of learning content available on Microsoft Learn.


 


– Thanks Matt, for the comprehensive overview and look at what Microsoft’s doing with the open source community and also how you’d bring your open source apps into Azure. Of course, keep watching Microsoft Mechanics for the news and deep dives in the latest tech. And be sure to subscribe if you haven’t already, and thanks so much for watching.




Saving time and money with Azure Maps – Smart Streetlamp


Azure Maps Location and a Smart Streetlight


Those of us who live in city centers regularly benefit from a connected urban environment through initiatives that are making our cities “smarter”. From the Azure IoT and Maps perspective, we look at this same benefit from the other direction: what does it mean for one location in a city to be made smarter, and what benefits do we get without perhaps realizing it? To illustrate this, let us look at the typical upgrade path from a traditional streetlamp to a smart streetlamp.


















 



What can a streetlight do autonomously?


 


The simplest examples would be using a more efficient bulb and adding an ambient light sensor, so that energy is not spent lighting a bulb, even an efficient LED bulb, when it is not needed. Other environmental sensors, such as temperature and humidity, could allow the lamp to decide not to operate if it is too hot or too humid. However smart it may seem, this autonomous light would still require someone to report any failures, and it would still need to be checked on a regular basis.



 



What are the benefits we can realize quickly if that streetlight, with these simple sensors, is now connected?


 


If the bulb fails, or the light chooses not to operate for a reason such as excessive heat or humidity, an alert can be sent to have the light serviced. If the smart streetlight loses connectivity, it can still operate autonomously, but an alert can be generated at the reporting center that this smart device has stopped communicating and should be checked. A connected device can decide how often it wants to report, but it can also be asked for status as needed. This bi-directional communication removes the need to check a device unless there is a reported reason to do so, saving time, resources, and money!
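 

As a rough sketch of that on-demand status request, assuming the lamp is connected to Azure IoT Hub and implements a direct method named getStatus (the hub and device names below are hypothetical):

 

# Ask a specific streetlamp for its status via an IoT Hub direct method
# (requires the azure-iot extension for the Azure CLI; names are placeholders)
az iot hub invoke-device-method \
    --hub-name smart-city-hub \
    --device-id lamp-0042 \
    --method-name getStatus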



 



 


What extra benefits can we realize if that streetlight has sensors that go beyond the function of the streetlight itself?


 


Some cities have chosen to add microphones and listen for abnormal sounds such as a gunshot.  Based on the decibels of the sound at various smart streetlamp locations, police and ambulance resources can be sent close to where the sound originated.  In higher traffic regions, a camera may also be added for extra security and information gathering.


 



In all cases, it is important to know where the lamp is located and that is where Azure Maps pulls it all together. 



If you would like to dig deeper into Smart Devices, Azure IoT and Azure Maps, I have the following recommended links:



Thank you for your interest in Azure Maps. If you have any questions feel free to post them in our Q&A section, we are here to help!

Securely collaborate with guests using Azure AD guest access reviews


Companies collaborate with hundreds of clients, partners, and vendors every day. Today’s organizations use many applications and devices, and managing digital identities for these guests increases the risk of security breaches. More than 40% of IT leaders said that they want an identity governance solution that improves their security posture, according to an internal Microsoft survey.


 




 


These decision-makers’ top concern is the increased risk of security breaches due to distributed access to company resources. This problem is exacerbated as more companies adopt hybrid work and require secure collaboration with external users. IT admins have no way to track usage or answer the following questions:


 



  • What content are users interacting with?

  • How long have the resources been shared?

  • Are accounts still active?

  • Are user privileges at risk of expiring?


 


Organizations can manage guest access with automated reviews


More than 70% of survey respondents said they either don’t have a process for managing guest accounts or they manually manage guest accounts. Manual processes often involve reliance on custom scripts or middleware, increasing the chance of human error that leads to elevated security risk. Also, an IT admin can never know all of the external users who require access to company resources. Business managers are the ones who are best suited for identity and access management activities for their guests and external partners.   


 


An Azure Active Directory Identity Governance solution empowers Microsoft customers to securely collaborate with guests across organizational boundaries. Customers can set up automated, periodic access reviews using an intuitive interface that provides smart recommendations, ensuring that guests gain the right access to the right resources for the right amount of time.


 


Once guests no longer require access to sensitive data, companies can automatically revoke their access to those resources. If a business owner or a manager isn’t in Azure AD, guests can review their own membership in a group.
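 

As a rough sketch of what that automation can look like, an access review of a group’s guests can be created programmatically through Microsoft Graph’s accessReviewScheduleDefinition API. The token, group ID, and field values below are placeholders, and the request body is trimmed; consult the Graph documentation for the full schema, including recurrence settings.

 

# Sketch: create a guest access review through Microsoft Graph
# ($TOKEN and <group-id> are placeholders)
cat > review.json <<'EOF'
{
  "displayName": "Quarterly guest access review - Sales",
  "scope": {
    "@odata.type": "#microsoft.graph.accessReviewQueryScope",
    "query": "/groups/<group-id>/transitiveMembers/microsoft.graph.user/?$filter=(userType eq 'Guest')",
    "queryType": "MicrosoftGraph"
  },
  "reviewers": [
    { "query": "/groups/<group-id>/owners", "queryType": "MicrosoftGraph" }
  ]
}
EOF

curl -X POST "https://graph.microsoft.com/v1.0/identityGovernance/accessReviews/definitions" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d @review.json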


 




Figure 1: Access review features enable customers to securely manage guest access at scale.


 


Automated provisioning and deprovisioning of guest access to sensitive data enables customers to move away from custom scripts and reduces the errors associated with manual processes. Automated provisioning and deprovisioning of guest access to SaaS applications ensures that the only way guests can access these apps is through permissions set up by the organization, not decisions made on a case-by-case basis by an IT admin.


 


In large organizations, business managers are best suited to manage guest access for collaboration. Azure AD governance features put control firmly in the hands of business managers who are best suited to provide appropriate levels of access to sensitive data to external users. By delegating to non-administrators, customers can ensure that the right people are managing access to their department’s sensitive data. Delegation of responsibility reduces the IT helpdesk burden and frees up the IT staff for more strategic initiatives.


 


The response from Azure AD governance customers has been positive:


“Azure Active Directory guest access reviews give us that ability to be agile in our collaboration with external parties, with the right level of control, so our security, legal, and data privacy people are comfortable.” ~ Avanade


 


Microsoft customers in regulated industries and those that work with the government have to regularly demonstrate to auditors the effectiveness of their controls over access rights. Azure AD access reviews for guests enable these customers to easily prove to auditors that their organization has the appropriate controls in place. Azure AD provides a centralized view of all access reviews with a simple interface involving very few configuration steps, enabling IT admins to see which resources a user can or cannot access across a multi-cloud, multi-device, and fragmented application landscape.


 


Watch our video review of guest user access across all Microsoft 365 groups and Microsoft Teams for a step-by-step overview of Azure AD Access Reviews. To learn more about Microsoft Identity Governance solutions, visit our website.


 


 


 


Learn more about Microsoft identity:


Microsoft 365 Developer Podcast – Microsoft Graph Connectors with Brian T. Jackett


Jeremy and Paul talk with Brian T. Jackett about the Graph Connectors API.




Listen to the show here: Microsoft Graph Connectors with Brian T. Jackett (m365devpodcast.com) 


Links from the show:



Microsoft News



Community Links