As you know, there are three types of Project Operations deployments. To keep the comparison understandable and simple, here is a short summary comparing Project Operations Lite, Project Operations for Resource/Non-stocked, and Project Operations for Production/Stocked, including whether the Finance & Operations module is set up … Continue reading: Difference between Project Operations Deployment Types
Microsoft Graph API is a superb way to leverage Microsoft 365 services as a developer. If you're wondering where to get started, this post is for you! It covers Microsoft Graph Explorer, how to start using the Microsoft Graph API, important notes, and the official link to the Microsoft Graph … Continue reading: Getting Started with Microsoft Graph API
On December 14, 2023, the Ambassador Projects Demo Day was held. The event brought together rising developers from all over the world to collaborate and create innovative solutions to real-world problems.
We would like to extend our sincerest thanks to the Gold Leads Amtaulla Bohara, Anushka Bhatnagar, Arpita Das, Hadil BenAmor, John Aziz, Mudasir Murtaza, Muhammad Samiullah, and Rohit Yadav for their hard work and dedication in putting together such an amazing event and leading this past cycle of the projects program. Without their tireless efforts, this event would not have been possible.
The winning team was Digital Digesters. Their project, YouTube Summarizer, was chosen as the winner because of its innovative approach to solving a real-world problem. YouTube Summarizer is an AI tool to transcribe YouTube videos. The judges were impressed with the team’s ability to work together and create a solution that was both practical and innovative. Congratulations to Danton Kipkurui, Ian Peter, Madhav Gaur, and Shreyanshi Rathi.
Other teams that participated in the Ambassadors Projects demo day included Onboarding Software, Catalyst, Data Sensei, and Inclusi-AI Vitality. Each team worked tirelessly to create innovative solutions to real-world problems. Although they did not win, their projects were impressive and showed great promise.
Onboarding Software: Build a healthy eco-community by integrating recruiting software that will aid in maintaining a diverse workforce and equip recruiters with the ability to hire talent from all over the world.
Data Sensei: DataSensei-DBInsights is a dedicated project aimed at empowering individuals and businesses with the knowledge and skills essential for proficient database management and administration. In a world where data is a valuable asset, our mission is to provide clear and comprehensive guidance on database technologies, best practices, and real-world applications.
Team Catalyst: an AI chat bot for the Microsoft Learn Student Ambassadors program. Powered by ChatGPT-4, Amy is not just any bot; she’s been meticulously trained on the Student Ambassador Handbook. Whether you’re a new ambassador or a seasoned member, Amy is here to provide precise and insightful answers to all your MLSA (Microsoft Learn Student Ambassadors) Program-related queries.
Team Inclusi-AI-Vitality: A comprehensive mental well-being app powered by Flask, Next.js, OpenAI API, and Azure services. This project aims to provide users with a personalized mental well-being app that offers a range of features to support their emotional well-being. The app utilizes Flask as a backend framework, Next.js for a dynamic frontend, OpenAI API for natural language processing and conversational AI, and Azure services for cloud hosting and scalability.
Overall, this cycle of Ambassador Projects was a huge success. The event brought together some of the brightest minds in the industry and showcased some truly innovative solutions to real-world problems. We look forward to seeing what the future holds for these talented developers.
How to use Azure Maps to Build a Taxi Hailing Web App
Learn how simple it is to set up an Azure Maps Resource account and quickly create applications that have beautiful maps that can be used in a range of solutions. In this tutorial we are going to create a simple and fast taxi hailing Web application with only HTML, CSS and Vanilla JavaScript.
The Taxi Hailing Web Application
Roam Rides is a fictional company that wants to extend its static web application with the capability to calculate a route and generate a price for a trip along that route. In this tutorial we are going to add basic map functionality and features using a CDN link that loads the Azure Maps SDK into our project.
In the search box, type in Azure Maps, then select Azure Maps Accounts as shown in the picture below.
Select + Create. On the Create an Azure Maps Account resource page, create a new resource group or use a preferred one, then enter a name for the Azure Maps resource you want to create. For example, I will name the resource Roam; the name should be unique.
Click Review + Create.
After the deployment is done select Go to resource and on the left panel select Authentication and copy the Primary Key.
Build the Taxi Hailing Website and Add Azure Maps Map Control
This section introduces you to the Azure Maps Map Control and shows how you can use the Azure Maps SDK CDN directly to start practicing and building applications with Azure Maps in a fast, clean, and easy way, just to get your feet wet. To show you this, we are going to build a taxi hailing app. To get all the code used in this tutorial, feel free to fork the azureMaps-roamrides repository from GitHub.
Create an index.html, index.css, map.js and index.js file.
In the index.html file add the following html code.
The HTML head code:
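The original listing did not survive extraction, so here is a minimal sketch of what the head might look like, based on the SDK references described next. The CDN links are the standard Azure Maps Web SDK URLs; the service module reference and local file names are assumptions inferred from the rest of the tutorial.

<head>
    <title>Roam Rides</title>
    <!-- Azure Maps Web SDK map control styles and script, served from the CDN. -->
    <link rel="stylesheet" href="https://atlas.microsoft.com/sdk/javascript/mapcontrol/2/atlas.min.css" type="text/css" />
    <script src="https://atlas.microsoft.com/sdk/javascript/mapcontrol/2/atlas.min.js"></script>
    <!-- Service module, used later for the MapsURL/RouteURL route calculation. -->
    <script src="https://atlas.microsoft.com/sdk/javascript/service/2/atlas-service.min.js"></script>
    <!-- Our map logic; referenced here so it is loaded first. -->
    <script src="./map.js"></script>
    <link rel="stylesheet" href="./index.css" />
</head>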
Notice the SDK files we have just imported into the project:
We first add a reference to the map’s CSS style sheet, which is a CDN link. We then add a global version of the Web SDK, which is also served from the CDN.
We then add a reference for map.js as a script tag in the head, in order for it to be loaded first.
Add the rest of the HTML code, whose visible text consists of the Roam Rides heading, the tagline "Get a ride to your destination.", and a let's go button.
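The original markup is likewise lost, so here is a plausible sketch of the body. The element ids (myMap, startLocation, endLocation) are assumptions taken from what the JavaScript below reads, and the datalists are assumptions based on the location-suggestion code described later.

<body onload="GetMap()">
    <h1>Roam Rides</h1>
    <p>Get a ride to your destination.</p>
    <!-- The route code later splits these values on ':' and ',' (a "name: lat,long" shape). -->
    <input id="startLocation" list="startSuggestions" placeholder="Pick up street" />
    <datalist id="startSuggestions"></datalist>
    <input id="endLocation" list="endSuggestions" placeholder="Drop street" />
    <datalist id="endSuggestions"></datalist>
    <button id="go">let's go</button>
    <!-- The map control renders into this element. -->
    <div id="myMap"></div>
    <script src="./index.js"></script>
</body>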
Next let’s add some code to map.js.
The following code creates a GetMap() function that will create a map object.
let map, datasource, client;
function GetMap(){
//Instantiate a map object, assigning it to the outer map variable so later sections can use it
map = new atlas.Map('myMap', {
// Replace with your Azure Maps subscription key. https://aka.ms/am-primaryKey
authOptions: {
authType: 'subscriptionKey',
subscriptionKey: '<key in your primary subscription key here>'
}
});
}
We then add this section to the function. This part of the code creates a data source and adds map layers only when all the map resources are ready and have been loaded.
//Wait until the map resources are ready.
map.events.add('ready', function() {
//Create a data source and add it to the map.
datasource = new atlas.source.DataSource();
map.sources.add(datasource);
//Add a layer for rendering the route lines and have it render under the map labels.
map.layers.add(new atlas.layer.LineLayer(datasource, null, {
strokeColor: '#b31926',
strokeWidth: 5,
lineJoin: 'round',
lineCap: 'round'
}), 'labels');
//Add a layer for rendering point data.
map.layers.add(new atlas.layer.SymbolLayer(datasource, null, {
iconOptions: {
image: ['get', 'icon'],
allowOverlap: true
},
textOptions: {
textField: ['get', 'title'],
offset: [0, 1.2]
},
filter: ['any', ['==', ['geometry-type'], 'Point'], ['==', ['geometry-type'], 'MultiPoint']] //Only render Point or MultiPoints in this layer.
}));
In this next section, still under the GetMap() function, we pick the latitude and longitude out of the input boxes in the HTML document. JavaScript’s split method derives the coordinates from each input value, which has the shape name: lat,long. We can then calculate the route and retrieve the necessary information.
//Create the GeoJSON objects which represent the start and end points of the route.
//starting coordinates
let start_lat=parseFloat(startLocation.value.split(':')[1].split(',')[0])
let start_long=parseFloat(startLocation.value.split(':')[1].split(',')[1])
var startPoint = new atlas.data.Feature(new atlas.data.Point([start_long,start_lat]), {
title: `${startLocation.value.split(':')[0]}`,
icon: "pin-red"
});
//destination coordinates
let end_lat=parseFloat(endLocation.value.split(':')[1].split(',')[0])
let end_long=parseFloat(endLocation.value.split(':')[1].split(',')[1])
var endPoint = new atlas.data.Feature(new atlas.data.Point([end_long,end_lat]), {
title: `${endLocation.value.split(':')[0]}`,
icon: "pin-round-red"
});
//Use MapControlCredential to share authentication between a map control and the service module.
var pipeline = atlas.service.MapsURL.newPipeline(new atlas.service.MapControlCredential(map));
//Construct the RouteURL object
var routeURL = new atlas.service.RouteURL(pipeline);
//Start and end point input to the routeURL
var coordinates= [[startPoint.geometry.coordinates[0], startPoint.geometry.coordinates[1]], [endPoint.geometry.coordinates[0], endPoint.geometry.coordinates[1]]];
//Make a search route request
routeURL.calculateRouteDirections(atlas.service.Aborter.timeout(10000), coordinates).then((directions) => {
//Get data features from response
var data = directions.geojson.getFeatures();
datasource.add(data);
});
});
Lastly, we add the following code to the GetMap() function. It creates a date formatter for your locale to display the pickup and drop-off times, and then appends the data and information for the requested route. How does it do this? We use the fetch API to get a response from the Azure server, which serves the route calculation result. You can use Postman to test some of the endpoints shared in the code.
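That listing is not preserved either, so here is a rough sketch of the idea. The Route Directions REST endpoint is the documented Azure Maps API; the subscriptionKey variable, the #price element, and the fare formula are purely illustrative assumptions.

//Format times for the user's locale.
var timeFormatter = new Intl.DateTimeFormat(navigator.language, { hour: 'numeric', minute: 'numeric' });
//Ask the Azure Maps REST API for the route summary between the two points.
fetch(`https://atlas.microsoft.com/route/directions/json?api-version=1.0&query=${start_lat},${start_long}:${end_lat},${end_long}&subscription-key=${subscriptionKey}`)
    .then(response => response.json())
    .then(result => {
        var summary = result.routes[0].summary;
        //A made-up fare: base charge plus a rate per kilometer.
        var price = (2 + (summary.lengthInMeters / 1000) * 0.5).toFixed(2);
        document.querySelector('#price').textContent =
            `Pickup ${timeFormatter.format(new Date(summary.departureTime))}, ` +
            `drop-off ${timeFormatter.format(new Date(summary.arrivalTime))}, $${price}`;
    });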
The following JavaScript code uses the fetch API to suggest locations as you type into the text boxes. The getLocations() function retrieves these locations with the help of the fetch API, calling the endpoint specified above, which returns a number of results as a response. The getLocations() function then appends these results to the datalist elements.
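Here is a hedged sketch of such a function, assuming the Azure Maps fuzzy search endpoint, the same subscriptionKey variable as above, and the datalist ids from the markup sketch earlier:

async function getLocations(query, datalistId) {
    //Fuzzy search returns addresses and points of interest matching the typed text.
    const url = `https://atlas.microsoft.com/search/fuzzy/json?api-version=1.0&query=${encodeURIComponent(query)}&subscription-key=${subscriptionKey}`;
    const response = await fetch(url);
    const data = await response.json();
    const datalist = document.getElementById(datalistId);
    datalist.innerHTML = '';
    data.results.forEach(result => {
        const option = document.createElement('option');
        //Store the suggestion in the "name: lat,long" shape the route code splits on.
        const name = result.poi ? result.poi.name : result.address.freeformAddress;
        option.value = `${name}: ${result.position.lat},${result.position.lon}`;
        datalist.appendChild(option);
    });
}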
Now our last step is to open the index.html page and see if the web app works. If it does, it should look like the one shown in the screenshot below.
Type in a pick-up street and a drop-off street, and observe the let’s go label change to show the pricing.
There we have it. You have successfully helped Roam Rides achieve their goal. Congratulations on implementing Azure Maps in a web application.
We continue to expand the Azure Marketplace ecosystem. For this volume, 164 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
Get it now in our marketplace
Akamai Segmentation: Akamai Segmentation simplifies and accelerates segmentation projects, providing affordable and context-rich visibility into traffic across cloud, PaaS, on-prem, and hybrid environments. The infrastructure-agnostic tool manages policies, attaches policies to workloads, and provides an overlay segmentation solution that reduces friction and decreases convergence time.
Atlassian Migration Suite: Solidify’s Atlassian migration suite helps users easily migrate Jira issues, test data, and Confluence pages to Microsoft Azure DevOps. The migrator PRO tool allows for the migration of all issues, including attachments and comments, while maintaining issue hierarchy and links.
HumanSoft – HR Solution Built on Power Platform: HumanSoft is comprehensive HCM software for large or medium-sized organizations. It integrates with Microsoft Teams and covers the employee lifecycle from hiring to retirement. HumanSoft has a self-service portal, reporting and analytics, recruitment and onboarding/offboarding features, learning and development elements, and more.
Insight for Web Server (IWS): Insight for Web Server is a security solution that acts as a reverse proxy response scanner for HTTP/HTTPS endpoints, protecting against various types of information leakage and malicious attacks. It complements existing security systems by extending the protection profile to include protection against outgoing information leakage.
Jira Migrator Pro: This tool from Solidify streamlines the migration process from Jira to Microsoft Azure DevOps. It ensures comprehensive migration of all issues, attachments, comments, issue hierarchy, and links. The PRO version offers advantages such as custom mapping for objects and arrays, user mapping automation, and priority support. It also allows for easy configuration and workflow insights.
Kollective EdgeCache: EdgeCache is a video caching platform that streamlines bandwidth requests, groups streaming traffic to a central point, and limits concurrent connections to supercharge network efficiencies. It requires no additional hardware, and it can handle requests behind a firewall, enable backhaul network setups, and simplify tunneled remote worker scenarios.
RidgeBot: RidgeBot enables enterprises, governments, web app teams, and others to affordably and efficiently test their systems. The system automates the penetration testing process and emulates adversary attacks to validate an organization’s cybersecurity posture. RidgeBot provides a clearer picture of security gaps, which lets security experts devote more energy to research.
Go further with workshops, proofs of concept, and implementations
AI Leadership/Learn GPT: 2-Day Workshop: ACP’s workshop offers two phases of keynote speeches, knowledge transfer, and interactive elements. It aims to provide solutions to technical challenges and potential in your existing specialist areas, with a focus on the use of Microsoft Azure OpenAI Service for generative AI. The workshop also covers cost estimates, content transfer, and workflow creation.
AI Use Case Workshop (Generative AI/Microsoft 365 Copilot): This workshop from KPCS identifies opportunities for implementing generative AI in businesses. The solution proposal involves using AI services such as Microsoft 365 Copilot for enhancing productivity, Azure Cognitive Services for building intelligent apps, and other Microsoft AI services that align with business requirements.
Azure Cost Optimization: Intellias will identify areas of overspending and potential cost-saving opportunities for cloud-based systems. Intellias will assess Azure deployments, implement Azure Monitor and Azure Advisor, and prepare an optimization proposal based on cost-benefit analysis. You can expect reduced cloud computing platform bills, improved profitability and observability, optimized resource utilization, and enhanced performance.
Azure Networking: NW Computing will implement two Meraki vMX virtual appliances to extend your SD-WAN to Microsoft Azure in availability zones for redundancy and high availability. Azure Functions will be set up for automatic failover and updating of user-defined routes. Best-practice Azure security and management will be implemented.
Azure Virtual Desktop Basic Implementation: Chalkline will set up and manage Azure Virtual Desktop so your business can access desktops and applications from anywhere, with robust security and cost-efficient infrastructure. Start with a discovery call to ask questions and learn about Chalkline’s approach toward providing fully managed solutions.
Azure Virtual Desktop Proof of Concept: Chalkline will set up and manage Azure Virtual Desktop so your business can access desktops and applications from anywhere, with robust security and cost-efficient infrastructure. This proof of concept includes a discovery call, migration and modernization, a testing session, a review session, and a Q&A.
Azure Virtual Desktop Workshop for SMB: Chalkline will set up and manage Azure Virtual Desktop so your business can access desktops and applications from anywhere, with robust security and cost-efficient infrastructure. This workshop covers Azure Virtual Desktop basics, exploring benefits, security, cost optimization, customization, and migration.
BI4ALL Databricks Lakehouse Test Automation: 8-Day Implementation: This test framework from BI4ALL is designed for Microsoft Azure Databricks and detects errors and anomalies. The framework promotes transparency in data quality processes, fostering a culture of continuous improvement. It empowers data professionals to fortify the foundation of their analytical solutions and enhances the quality and reliability of data-driven endeavors.
CAF Workshop: The Microsoft Cloud Adoption Framework provides a set of practices and recommended steps to successfully adopt the cloud. Sii will use CAF to guide your business through strategy, planning, readiness, adoption, and secure governance and management. Topics include defining motivations, creating a cloud adoption plan, migrating to Microsoft Azure, and implementing best practices for security and compliance.
Cloud-Native AI-Infused Application Design: 2-Day Workshop: Reply offers AI-integrated applications for businesses to redesign customer experience, enable intelligent manufacturing processes, and improve knowledge management. Reply uses OpenAI services on Microsoft Azure and provides support from vision to product-ready implementation. This workshop will involve developing a common vision, analyzing use cases, and creating a detailed road map.
Community Training – Platform Support: Wipfli offers a platform for organizations to provide digital skills training across large regions. Wipfli contributes ongoing support for adoption and customer success, which includes answering functionality questions, coordinating with a Microsoft support team, and offering technology guidance.
Customer Service Gen AI Bot – 4 Week Implementation: Generative AI has revolutionized customer service by providing swift, accurate responses to inquiries, streamlining communication, and freeing up human agents for more complex tasks. Decision Inc. will implement a Customer Service Automation Bot for improved customer experiences, increased efficiency, and strengthened relationships.
Data Foundation Advisory Services: SoftwareOne offers enterprises a suite of services to establish, standardize, and optimize data infrastructure. This engagement includes envisioning, discovery, and adoption of required changes based on priorities from the discovery phase. Workshops and knowledge transfer sessions will be included.
Data Governance with Purview: Inetum offers a complete data governance solution based on Microsoft Purview. Its consultancy teams help organizations build data trust through collaborative data quality, offering consulting, coaching, data architecture, and deployment services. The approach combines tools to design and deploy governance with the efficiency provided by Microsoft Purview, saving time and improving business efficiency.
Demand Forecasting Accelerator: 8-Week Proof of Concept: LTIMindtree’s Demand Forecasting Accelerator uses machine learning models to predict future demand at a distributor SKU level, enabling better product planning and coordination in the supply chain. The solution identifies the best machine learning model for different product categories and logs different metrics for better tracking.
Demand Forecasting: 10-Week Implementation: Tiger’s forecasting solution on Microsoft Azure improves forecast accuracy, reduces inventory stockouts, controls sourcing and manufacturing, and facilitates labor planning. The solution can be scaled across categories and markets quickly and seamlessly. Key deliverables include harmonized data, finalized modeling dataset, and model documentation.
Gen AI ISO27001 Policy Bot: 4 Week Implementation: The ISO 27001 Policy Bot from Decision Inc. streamlines access to information, enhances productivity, and promotes a self-service culture. It ensures compliance and aligns employees with organizational goals, while also facilitating continuous learning.
Gen AI-Powered Strategy and Lateral Thinking Workshop: Brainstorm, a tool from Slalom, combines Microsoft Azure OpenAI and Slalom’s facilitation frameworks to generate 40 percent more ideas during client workshops. Kick-start innovation conversations and enable teams to continue ideating after this workshop session from Slalom, maintaining access to all ideas using Azure data storage solutions.
HCLTech’s PDOC Application Migration and Modernization Services: HCLTech helps enterprises move and update their legacy applications to modern cloud-native platforms and architectures. This service covers the entire lifecycle of cloud transformation, from assessment to optimization, and includes cloud re-platforming, re-architecting, re-engineering, and DevOps services.
HCLTech’s PDOC Data Modernization and Migration Services: HCLTech offers data modernization and migration services to help enterprises improve and maximize their data assets. This comprehensive approach covers the entire data lifecycle, from discovery to governance, and includes prebuilt tools and a flexible, scalable solution.
IBM Consulting Global Hybrid Retail (HC4R) – Store: IBM Hybrid Cloud for Retail is an integrated suite of assets that helps retailers create a unified store operating platform. It features AI-driven workflows, modern user experiences, and next-gen performance insights. The modular solution works alongside existing technology to deliver seamless omnichannel experiences, empowering associates and optimizing operations.
Infinity LAMPS: 4-Week Implementation: LAMPS is a platform for automating the migration of SAP workloads to Microsoft Azure. LTIMindtree’s deployment process involves setting up a secure Azure subscription, identifying use cases, creating process flow models, building workflows, and conducting end-user testing. Deliverables include deployment, automation of three workflows, validation, and a road map for automation.
LTIMindtree Assessment and Proof of Concept for Azure VMware Solution: LTIMindtree offers end-to-end migration solutions for on-premises infrastructure to Microsoft Azure using Microsoft Azure VMware Solution. This offer includes assessment, design, migration, validation, management, and operation of client infrastructure.
LTIMindtree KenAI: 6-Week Proof of Concept: LTIMindtree KenAI is an integrated automation framework that helps accelerate MLOps on Microsoft Azure. It standardizes and streamlines the AI/ML journey by leveraging built-in components to operationalize models with speed and scalability. KenAI provides observability across models, insights around drift management, ground truth evaluation, and model explanations.
LTIMindtree REDAR Consulting: LTIMindtree REDAR is an AI-driven solution suite that helps manage product portfolios in e-commerce marketplaces. It provides actionable recommendations on pricing, new product opportunities, and promotional strategies. The suite is powered by Microsoft Azure and offers smarter and faster insights to decode market demand for a sustainable product portfolio.
Money in Motion: 6-Week Proof of Concept: Money in Motion, an AI/ML-driven data monetization solution using Microsoft Azure, helps banks tap into the immense value inherent in their payment transactions data, grow their customer base, and deepen product penetration. The solution from LTIMindtree also offers low-cost and sustainable funds, data-driven insights, and hyper-personalized recommendations.
Proof of Concept SAP on Azure Migration: Vnomic automates and engineers SAP landscape deployment and management on Microsoft Azure, reducing costs and time to value from months to hours. It meets SAP and Azure best practices, improves IT staff productivity, and accelerates cloud migration projects. The solution also optimizes infrastructure utilization.
SUSE Rancher Enterprise Kubernetes Management Platform: 3-Week Implementation: SUSE Rancher is an open-source enterprise computing platform for running Kubernetes clusters on-premises, in the cloud, and at the edge. This offer from Frontier Digital lets organizations operate Kubernetes with a NeuVector zero-trust container security platform for safeguarding cloud-native applications.
SUSE Rancher Enterprise Kubernetes Management Platform: 1-Week Proof of Concept: SUSE Rancher is an open-source enterprise computing platform for running Kubernetes clusters on-premises, in the cloud, and at the edge. Frontier Digital offers a code-first proof of concept that includes a NeuVector zero-trust container security platform.
Introduction
Migrating a full-stack application can be an intricate job, even if you are using the same technologies on different clouds. Several things need to be done to end up with a fully functional application. If both platforms support your programming language and version, that’s one thing to put aside, and you can start figuring out how to connect the database. Databases differ in the language they speak; a toolkit like SQLAlchemy speaks multiple dialects, such as MySQL’s, and facilitates communication with the database. The last problem is to provide the application with the credentials it needs, in a way it understands, to establish a connection. Once that’s done and the database is up and running, there comes the part where you look for a tool to import your data. Luckily, the mysql CLI provides a command that you can use to import your data.
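To make the connection-string point concrete, here is a minimal SQLAlchemy sketch. The hostnames and credentials are hypothetical; only the URL format is standard (Azure MySQL flexible servers use the <server>.mysql.database.azure.com hostname pattern):

from sqlalchemy import create_engine

# The dialect and driver live in the URL, so moving clouds is mostly a URL change.
google_engine = create_engine("mysql+pymysql://appuser:secret@10.0.0.5:3306/appdb")
azure_engine = create_engine("mysql+pymysql://appuser:secret@myserver.mysql.database.azure.com:3306/appdb")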
In this blog, you will go through a step-by-step guide: from preparing your full-stack web application for Azure and exporting your data from Google Cloud SQL, to deploying your application to Azure App Service, migrating your MySQL database to Azure Database for MySQL, and connecting it to your application.
We have you covered whether you already have a full-stack application running on Google Cloud or are looking to bring your first full-stack application to the internet. You’ll learn everything you need to deploy your website to Microsoft Azure.
What will you learn?
In this blog, you’ll learn to:
Export a database from Google Cloud SQL to Cloud Storage and save it locally.
Create an Azure Web App to host your application and a MySQL database to store your data.
Fork a GitHub repository and configure continuous deployment from GitHub to Azure App service.
Modify the application environment variables to bind the app with the database.
Import data to MySQL database using mysql CLI inside Azure App service SSH session.
What is the main objective?
Migrating a full-stack application, including a Python web app and a MySQL database, from Google Cloud to Microsoft Azure.
Prerequisites
An Azure subscription.
If you don’t already have one, you can sign up for an Azure free account.
For students, you can use the free Azure for Students offer, which doesn’t require a credit card, only your school email.
Step 1: Export your Data from Google Cloud SQL.
Step 2: Create an Azure Web App and a MySQL Database.
Step 3: Fork the Code and Configure Azure App Service Deployment.
Step 4: Configure Azure App Service with your Relational Database.
Step 5: Import your Data into Azure MySQL using Azure App Service.
Step 1: Export your Data from Google Cloud SQL
Google Cloud SQL provides you with the ability to export your database as a SQL dump file which can be used to recreate the whole database with all its tables and data anywhere you want.
In this step, you export your data from Cloud SQL to have a portable and reusable copy of your entire database.
Complete the following steps to export your data from Cloud SQL in Google Cloud:
1. Sign in to the Google Cloud console.
2. Type cloud sql in the search bar at the top of the console page and select SQL from the options that appear.
3. Select the Instance ID of the Cloud SQL instances that you want to export.
4. Select Export from the top navigation menu to export your database.
5. Perform the following tasks to export data to Cloud Storage:
- File format: Select SQL.
- Data to export: Select the name of the database that has your tables and data.
- Destination: Select Browse to choose a cloud storage bucket. Currently, the only supported destination is Google Cloud Storage.
6. Select the + icon to create a new bucket.
7. Enter a globally unique name for your bucket followed by selecting CREATE. Leave all the other options to the default values as you will delete this bucket later.
8. Select CONFIRM to proceed with the creation process. This prompt asks if you want to make the bucket open for public access or keep it private; private will work for this tutorial.
9. Select the SELECT button to select the newly created bucket to save your data inside.
10. Select EXPORT to confirm your selection and initiate the data export process.
11. Select the name of the file from the notification pane at the bottom right of the screen to redirect you to the storage bucket that has the exported file.
12. Select the DOWNLOAD button to download the data locally to your device.
13. Select DELETE to delete the bucket after the download finishes as you no longer need it.
Congratulations! You successfully exported your database from Google Cloud SQL. The application source code is available on GitHub, so there is no need to do anything on the application side. In the next step, you’ll create an Azure Web App and a MySQL database.
If you don’t have a database on Google Cloud and want to follow along, you can use my data export file Cloud_SQL_Export_2023-10-15 (22_09_32).sql from john0isaac/flask-webapp-mysql-db on GitHub.
Step 2: Create an Azure Web App and a MySQL Database
Azure App Service is an HTTP-based service for hosting web applications, REST APIs, and mobile back ends. You can develop in your favorite language, whether that’s .NET, .NET Core, Java, Node.js, PHP, or Python. Applications run and scale with ease on both Windows and Linux-based environments.
In this step, you create an Azure App service to host your Python application and a MySQL database to store the migrated data.
Complete the following steps to create an Azure Web App and a MySQL database in the Azure portal:
1. Sign in to the Azure portal.
2. Type app services in the search bar at the top of the portal page and select App Service from the options that appear.
3. Select Create from the navigation menu followed by selecting Web App + Database.
4. Perform the following tasks:
In the Project Details section,
- Subscription: Select your preferred subscription.
- Resource group: Select Create new under (New) Resource group, enter a unique name for the resource group, then select OK.
- Region: Select a region close to you for the best response times.
In the Web App Details section,
- Name: Enter a unique name for your application. This will also be the subdomain of your deployed website.
- Runtime stack: Select Python 3.8.
In the Database, Azure Cache for Redis, and Hosting sections,
- Engine: Select MySQL – Flexible Server.
- Server name: Enter a unique name for your server. This is the place that will host your different database instances.
- Database name: Enter a unique name for your database. This is the instance that will store your tables and data.
- Add Azure Cache for Redis?: Select No. Azure Cache for Redis is a high-performance caching service that provides an in-memory data store for faster retrieval of data, but it will incur more charges to your account.
- Hosting Plan: Select Basic. You can scale it up later; the difference between the plans is their capabilities and the cost per service you receive.
5. Select Review + create.
6. Save the Database details in a safe place as you need them to connect to your database. This is the only time that you have access to the database password.
7. Select Create to initiate the deployment process.
8. After the deployment finishes, select Go to resource to inspect your created resource. Here, you can manage your resource and find important information like the Deployment Center and configuration settings for your website.
Congratulations! You successfully created a web application and a database with a single button. This enables you to deploy your code and migrate your data later, as the website and database are initially empty. In the next step, you will get the website code and deploy it to Azure App Service.
Step 3: Fork the Code and Configure Azure App Service Deployment
The sample code you are using is an Artists Booking Venues Web Application powered by Python (Flask) and MySQL Database.
In this step, you’ll:
Fork a GitHub repository on GitHub.
Configure continuous deployment from the Deployment center on Microsoft Azure.
1. Navigate to the flask-webapp-mysql-db repository on GitHub.
2. Select Fork to create a copy of the source code in your own GitHub account.
3. Navigate back to your newly created deployment on Microsoft Azure. Select Deployment Center.
4. To link your GitHub repository with the Web App, perform the following tasks:
- Source: Select GitHub.
- Signed in as: Select your preferred account.
- Organization: Select your organization. This is your GitHub username if you haven’t forked the repository to an organization.
- Repository: Select the name of the forked repository, flask-webapp-mysql-db.
- Branch: Select main.
5. Select Save to confirm your selections.
6. Wait for the deployment to finish. You can view the GitHub Actions deployment logs by selecting the Build/Deploy Logs.
7. Once the deployment is successful, select the website URL from the deploy job to view the live website.
Congratulations! You successfully deployed a website to Azure App Service and as you can see the website works as expected.
But if you try to navigate to any page that needs to make a call to the database you get the following error.
Let’s go ahead and solve this error by configuring the database.
Step 4: Configure Azure App Service with your Relational Database
This web application uses SQLAlchemy ORM (Object Relational Mapping) capabilities to map Python classes defined in models.py to database tables.
It also handles the initialization of a connection to the database and uses the create_all() function to initiate the table creation process.
But how do you trigger this function to make all of that happen?
If you navigate to the beginning of app.py, you will find that in order for the application to call the setup_db() function, it needs an environment variable called DEPLOYMENT_LOCATION.
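As a hypothetical sketch of what that gate might look like (the real app.py in the repository may differ; setup_db() and the value azure come from this article, everything else is illustrative):

import os
from flask import Flask
from models import setup_db  # configures SQLAlchemy and calls create_all()

app = Flask(__name__)

# Pick the database configuration that matches where the app is running.
deployment_location = os.environ.get("DEPLOYMENT_LOCATION")
if deployment_location == "azure":
    setup_db(app)  # reads the Azure MySQL connection settings from the environment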
You may wonder why we are using this. The answer is quite simple: different deployment locations require different database configurations.
Feel free to check out the difference in the environment folder.
Let’s go ahead and define this environment variable to start the database creation process.
1. Navigate back to your web app on Azure and select Configuration from the left side panel under the Settings label.
2. From the Configuration window, select + New application setting to add a new environment variable.
3. Add the following name and value in the input text fields, followed by selecting OK.
- Name: DEPLOYMENT_LOCATION
- Value: azure
4. Confirm that DEPLOYMENT_LOCATION is in the list of Application settings then, select Save followed by selecting Continue.
5. Wait a couple of seconds then, refresh the website to see the update.
Congratulations! It works. But wait a minute… Where is the data? Everything is blank! You haven’t imported the database yet, but the website is now connected to the database and the tables have been created. This means you can insert new data from the website, and update and delete it, but you don’t have access to the old data yet. In the next step, you will work on importing your data using the SSH feature of Azure App Service.
Step 5: Import your Data into Azure MySQL using Azure App Service
This application and database are deployed to a virtual network, so you can’t access them unless you use a virtual machine deployed to the same virtual network. That’s why you are going to make use of the SSH feature of your web app to access the database through the web app and import your data.
To import the data you need a database dump or a .sql file uploaded to your GitHub repository. If you don’t have one, you can use the database export included in the repository that you forked.
Let’s go ahead and SSH into the website.
1. Navigate back to your web app and select SSH from the left side panel under the Developer Tools label.
2. Select Go -> to open the SSH session in a new window.
Inside the SSH session, perform the following tasks:
3. Execute this command to update the installed packages.
apt-get update
4. Execute this command to install MySQL, as it doesn’t come preinstalled. If prompted Do you want to continue? type y and press Enter.
apt-get install default-mysql-server
5. Execute this command to import your .SQL file data to the MySQL database. The file referred to in this command was uploaded with the website data from GitHub.
mysql --host=$AZURE_MYSQL_HOST --user=$AZURE_MYSQL_USER --password=$AZURE_MYSQL_PASSWORD $AZURE_MYSQL_NAME < 'Cloud_SQL_Export_2023-10-15 (22_09_32).sql' --ssl
Note that I had to clean up the SQL exported from Google Cloud a little, but I didn’t add anything to it; I just removed unnecessary statements to avoid errors in the SSH session.
6. Navigate back to the website, refresh any page and you’ll find all the data there.
Congratulations! You have come a long way, taking the data and web application from Google Cloud to Microsoft Azure through all the steps in this blog.
Clean Up
You can now safely delete the Google Cloud SQL database and disable your App Engine or even delete the whole project.
Once you finish experimenting on Microsoft Azure, you might want to delete the resources so they don’t consume any more money from your subscription.
You can delete the resource group, which deletes everything inside it, or delete the resources one by one; that’s totally up to you.
Conclusion
Congratulations, you have learned and applied all the concepts behind taking an existing Python web application and a MySQL database and migrating them to Microsoft Azure.
This gives you the ability to build your own web applications on Azure and explore other databases like Azure Cosmos DB or Azure Database for PostgreSQL. In the end, you just need a connection string to connect with a different database and a dialect to translate your code into a language that the database understands. You have also learned that you can deploy your website to Microsoft Azure by selecting your website’s programming language, with no extra configuration or file creation needed.
We continue to expand the Microsoft AppSource ecosystem. For this volume, 115 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
Akamai Segmentation: This product from Akamai Technologies simplifies and accelerates segmentation projects, providing affordable and context-rich visibility into traffic across cloud, PaaS, on-prem, and hybrid environments. It offers a single infrastructure-agnostic tool to manage policies across all environments, attaches policies to workloads, and provides an overlay segmentation solution that reduces friction and decreases convergence time.
Drill Down Combo Bar PRO (Pin): From ZoomCharts, Drill Down Combo Bar PRO (Pin) is a customizable chart used by 80% of Fortune 200 companies. It offers multiple chart types, stacking and clustering options, on-chart interactions, and full customization. Popular use cases include sales and marketing, human resources, accounting and finance, and manufacturing. The visual is mobile-friendly and comes with 30 days of free access to paid features.
Drill Down Timeline PRO (Pin): This customizable visual from ZoomCharts allows for time-based data visualization and drill-down to specific periods. The visual is suitable for various industries, including banking, sales, IT, logistics, and manufacturing. It offers in-app purchases and comes with 30 days free access to paid features. ZoomCharts visuals are mobile-friendly and offer interactive drilldowns, smooth animations, and rich customization options.
Drill Down Waterfall PRO (Pin): This customizable chart from ZoomCharts, known for its smooth animations, offers intuitive and interactive drilldowns, zooming, and rich customization options for creating reports that empower business users. Popular use cases include accounting, inventory management, human resources, and sales/marketing. The visual is mobile-friendly and comes with 30 days free access to paid features.
GDPR Registry: A data registry is essential for companies to keep track of personal data and potential breaches. From PD-Consult, GDPR Registry keeps track of all the personal data a company gathers. It contains information on what, where, why, and how long data is kept. In case of a data leak, the breach registry is also included.
HumanSoft – HR Solution Built on Power Platform: HumanSoft from Genbil Software is comprehensive HCM software for medium to large organizations. It integrates with Microsoft Teams and covers the entire employee lifecycle from hiring to retiring, including core HR functions, self-service portal, recruitment, onboarding/offboarding, learning and development, performance and recognition, talent and succession management, and reporting and analytics.
Kollective EdgeCache: Kollective’s EdgeCache is a video-caching platform that streamlines bandwidth requests, groups streaming traffic to a central point, and limits concurrent connections to supercharge network efficiencies. It requires no additional hardware and can handle requests behind a firewall, enable backhaul network setups, and simplify tunneled remote worker scenarios.
Go further with workshops, proofs of concept, and implementations
Contract Management System: 4-Week Implementation: From LTIMindtree, the Contract Manager app automates creation, review, approval, and signing. It eliminates manual activities, enhances collaboration, increases efficiency, and provides an audit trail. The implementation process includes discovery, assessment, design, and MVP. The deliverables include a technical design document, architecture diagram, recommendations, and app inventory.
Copilot Discovery Assessment: This MindWorks discovery assessment prepares customers for Microsoft Copilot adoption. It helps identify business scenarios, optimize workplace technology, and leverage AI. It includes the Envision Workshop, which showcases Copilot use cases and designs custom use cases for corporations. Also included is the Advisory, which helps build AI and Copilot strategy, calculate ROI, and develop an implementation road map.
Crafting Your Vision with the AI and Copilot Blueprint: 10-Week Program: From Changing Social, this program offers AI-powered training and readiness services to help organizations fully utilize Microsoft 365. The service includes a comprehensive introduction to AI, deep dives into Copilot’s functionalities, and training on apps and permissions. The program also includes stakeholder mapping, pilot deployment, and executive coaching.
Customer Portal: 4-Month Implementation: From Spyglass MTG, this implementation helps organizations create customer collaboration portals using Microsoft Power Pages. The process involves reviewing current engagement methods, identifying gaps, designing and implementing the portal, and providing post-implementation support. The result is a fully functional portal that enhances customer engagement and communication strategies.
Deployment Advisor: 1-Week Implementation: This tool from LTIMindtree compares Microsoft Dynamics 365 CRM solutions and counts components in both environments, helping end users understand differences between solutions. It saves time during development and deployment, improves solution quality, and predicts and prevents issues that could impact revenue. Ideal for comparing instances and ensuring they’re in sync.
Dynamics 365 Business Central QuickStart Basic: Prysma offers implementation services for Microsoft Dynamics 365 Business Central, including financial management, sales, inventory, and purchasing. The team provides consultation, training, data migration, and support. The client is responsible for data extraction and validation, and for designating a key user. Joint tasks include defining analytical dimensions and configuration.
Dynamics 365 Business Central Workshop: From Covenant Technology Partners, this workshop identifies a company’s needs in Microsoft Dynamics 365 Business Central through staff interviews and process analysis. It evaluates current accounting software, maps current and future processes, conducts gap analysis, and creates a road map for implementing solutions. The workshop can be conducted in two four-hour sessions or multiple days.
Enhance Identity Security with Entra ID P2: 6-Week Implementation/Proof of Concept: This service from Bulletproof provides Microsoft 365 customers with best practice implementations and understanding of Entra ID P2, which offers a comprehensive solution with features such as multi-factor authentication, conditional access policies, identity protection, privileged identity management, and password-less authentication.
Expense720 Onboarding: Expense720 is a simple-to-use app that digitizes invoice- and expense-handling processes, helping to avoid common problems such as time-consuming tasks, slow approvals, compliance challenges, security concerns, and inefficiencies. On720.com offers onboarding services to ensure successful implementation and setup, with consultancy available for more advanced use cases.
FSM Mobility: 8-Week Consultation and Implementation Workshop: FSM Mobility Application streamlines field technicians’ work order journey with immediate data sync across devices. LTIMindtree offers a free workshop to assess field service productivity and create a modernization road map. The Field Service Mobile Application complements the Field Service Module and can be customized as per business requirements.
HCLTech’s PDOC Digital Workplace Migration and Modernization Services: HCLTech offers services for Microsoft 365, covering modernization assessment, migration services, Windows 11 transformation, Azure Virtual Desktop migration and implementation, virtual workplace, and workplace automation solutions. Its services enable enterprises to enhance productivity, collaboration, security, and user experience.
Incident Response Retainer: Sentinel Technologies offers incident response services for Microsoft 365, providing remote and on-site support, tried and tested tools, full-scope forensics analysis, and technical expertise beyond cyber security. Sentinel’s around-the-clock rapid response helps minimize business impacts and restore normal service. Insurance and third-party friendly. It has proactive services, discounted rates, contracted service levels, and flexible use provisions.
MDS AI + Copilot Technical Advisory Workshop: 3-Day Engagement: This Maureen Data Systems workshop includes three defined milestones to help customers assess different Copilot solutions and align their enterprise productivity with their business objectives. At the end of the workshop, customers will have identified specific enterprise AI objectives and received guidance from experienced IT professionals.
MDS Microsoft 365 Copilot Extensibility Workshop: 3-Day Engagement: This Maureen Data Systems workshop covers three defined milestones and includes sharing Microsoft Azure concepts and delivering a detailed road map. Outcomes include a Copilot customization and connection strategy, evaluation of content and data operations, technical insight, and responsible AI practices.
Microsoft 365 Compliance Consulting: Fujitsu offers specialized consulting services for achieving data security and compliance for Microsoft Azure and 365. Fujitsu, a global system integrator with extensive experience in compliance consulting for Azure and Microsoft 365, identifies personal data, assesses existing privacy practices, and provides recommendations for improvement.
Microsoft 365 Copilot Immersive Workshop: 2-Day Workshop: This service from Noventiq helps organizations explore the possibilities of AI with Microsoft Copilot and create an implementation and adoption plan tailored to their needs. The workshop showcases Copilot capabilities within Microsoft 365 and aims to accelerate AI and Copilot journeys, empower users, increase user satisfaction and engagement, and drive innovation and transformation.
Microsoft 365 Copilot Optimization: Alfa Connections offers a workshop to help customers migrate data to Microsoft 365 Copilot effectively and securely. The workshop provides reports and suggestions on data challenges, governance, security, and adoption. The engagement includes identifying high-value scenarios, showcasing intelligence, and developing an implementation plan.
Microsoft 365 Copilot Readiness: Alfa Connections offers a workshop to help customers migrate data to Microsoft 365 Copilot effectively and securely. The workshop provides reports and suggestions on data challenges, governance, security, and adoption. The engagement includes identifying high-value scenarios, showcasing intelligence, and developing an implementation plan.
Microsoft 365 Copilot Readiness Workshop: Advaiya Solutions’ Microsoft 365 Copilot workshop offers insights into its capabilities, live demos of common tasks, and strategies for harnessing its power. Professionals across various industries can attend to gain insights into the potential impact of Copilot on productivity, efficiency, and the overall digital experience.
Microsoft 365 Copilot Workshop: This Trans4mation IT workshop targets business decision makers, IT professionals, and adoption managers. It offers insights into AI integration and practical applications tailored to participants’ needs. The workshop covers Microsoft 365 Copilot, with inspiring presentations on AI possibilities and strategic planning for implementation. Participants gain practical skills and strategies for effective AI integration.
Microsoft 365 Information Protection: 1-Day Workshop: Grant Thornton offers a workshop that provides strategic and technical advice on Microsoft 365 implementation, including discovery and classification of data, labeling strategy, metrics development, system integration, and data governance. The workshop helps establish goals, identify solutions, develop a roadmap, enhance security posture, and determine metrics for program accomplishments and risks.
Microsoft Dynamics 365 Sales Copilot Training: 3-Hour Session: Advaiya Solutions’ Microsoft Dynamics 365 Sales Copilot Training helps sellers and administrators maximize seller-customer engagement through an intelligent assistant. The three-hour session covers using Copilot with CRM, Teams, and Outlook, and attendees will gain proficiency in Copilot and effective prompts. Customizations and configurations can also be added.
Securing On-Prem AD with Defender for Identity: 3-Week Implementation/Proof of Concept: This service from Bulletproof helps detect advanced attacks in hybrid environments by monitoring user behavior and activities with learning-based analytics. It protects user identities and credentials stored in Active Directory and provides clear incident information on a simple timeline for fast triage.
SVA Workshop: Microsoft and Data Protection: Available only in German, this workshop from SVA discusses the challenges of protecting IT systems from unwanted disruptions and attacks, particularly in the context of cloud computing. It explores the legal and regulatory frameworks necessary for efficient use of cloud services, with a focus on Microsoft’s approach to data privacy and security in the German and European markets.
Agenda
This article demonstrates how to use either SAS token authentication or managed identity from API Management to make requests to Azure Storage, and compares the two options.
Comparison
The choice between Managed Identity and SAS Token depends on factors such as the level of control required, the duration of access, and the specific security requirements of your application. Both options offer different levels of access control and security features for accessing Azure Storage.
Azure Managed Identity vs. SAS (Shared Access Signature) Token
Azure Managed Identity

Advantages:
- Provides an automatic and seamless way to authenticate with Azure Storage.
- Allows you to specify the necessary scopes and permissions required for accessing Azure Storage; you can assign specific roles to the managed identity.
- RBAC roles can be assigned at a granular level to control access to Azure Storage resources.
- Offers a secure way to access Azure Storage, as it eliminates the need to store and manage secrets or credentials in your application code.

Disadvantages:
- While Managed Identity offers RBAC, the level of granularity might be limited compared to SAS tokens.
- Managed Identity tokens have a default lifetime and are automatically refreshed by Azure; there is limited control over token expiration.

SAS Token

Advantages:
- Allows you to define specific permissions and access levels for resources in Azure Storage, including read, write, delete, or list operations.
- Tokens are generated for specific resources or containers in Azure Storage, providing a more restricted access scope compared to managed identities.
- You can set an expiration time for the token, after which it becomes invalid. This provides an additional layer of security and helps control access to storage resources.
- Grants temporary access to specific resources or containers without the need for permanent credentials.

Disadvantages:
- Requires manual generation and management, which can be cumbersome and time-consuming, especially when dealing with multiple client applications or frequent token rotation.
- Tokens have an expiration date, and once expired they become invalid. This requires additional effort to generate and distribute new tokens, which may impact the continuity of your application.
- If not properly secured, an exposed SAS token could lead to unauthorized access to your Azure Storage resources. Secure handling and storage of SAS tokens is crucial to prevent potential security breaches.
- Revoking access granted through a SAS token can be challenging, as it usually requires updating the access policy or generating a new token. This might cause inconvenience and delay if you need to revoke access quickly.
It is crucial to select the appropriate authentication method for accessing Azure Storage based on your specific use cases. For instance, if the permission for your client applications is permanent and long-term, it may be preferable to leverage Azure Managed Identity, as the assigned permission remains in place indefinitely. On the other hand, if you only need to grant temporary access to your client applications, it is more suitable to use SAS Token. SAS tokens can be created with an expiration date, and the permission will automatically expire once the token becomes invalid. This grants more control over the duration of access for temporary scenarios.
Below are the instructions to implement both Azure Managed Identity and SAS Token authentication options.
OPTION 1: Authentication via managed identity
This shows you how to create a managed identity for an Azure API Management instance and how to use it to access Azure Storage. A managed identity generated by Microsoft Entra ID allows your API Management instance to easily and securely access Azure Storage which is protected by Microsoft Entra ID. Azure manages this identity, so you don’t have to provision or rotate any secrets.
Configuration
The initial step involves enabling the managed identity for your APIM service, followed by assigning the appropriate permissions for blob uploading. Go to the “Managed identities” blade first to enable the system-assigned identity.
To add the storage permission, navigate to the same blade and click the “Azure role assignments” button. It is important to consider the role assignment carefully for your specific use cases, as there are multiple built-in roles available for authorizing access to blob data with Microsoft Entra ID. For testing purposes, you can grant the “Storage Blob Data Contributor” role to the managed identity.
For more detailed information regarding the built-in roles for blobs, please refer to the documentation provided below.
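If you would rather script the role assignment than use the portal blade, a hedged sketch (continuing from the $apim object above; the storage account resource ID is a placeholder) could look like this:

```powershell
# Minimal sketch: grant the APIM system-assigned identity the
# "Storage Blob Data Contributor" role scoped to one storage account.
$scope = "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mystorage"
New-AzRoleAssignment -ObjectId $apim.Identity.PrincipalId `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope $scope
```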
In the inbound section, I included a policy for authenticating with the managed identity and for rewriting the blob path. This example takes the name of the storage account from a named value.
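The article’s exact policy isn’t reproduced here, but a rough sketch of such an inbound section might look like the following; the named value storage-account-name, the container name, and the {filename} template parameter are all assumptions:

```xml
<inbound>
    <base />
    <!-- Rewrite the incoming request to the blob path; "container" and
         {filename} are placeholders for this sketch. -->
    <rewrite-uri template="/container/{filename}" />
    <!-- Point the backend at the storage account held in a named value. -->
    <set-backend-service base-url="https://{{storage-account-name}}.blob.core.windows.net" />
    <!-- Headers expected by the Blob service Put Blob operation. -->
    <set-header name="x-ms-version" exists-action="override">
        <value>2021-08-06</value>
    </set-header>
    <set-header name="x-ms-blob-type" exists-action="override">
        <value>BlockBlob</value>
    </set-header>
    <!-- Acquire a token for Azure Storage with the system-assigned identity. -->
    <authentication-managed-identity resource="https://storage.azure.com/" />
</inbound>
```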
Within the on-error section, I use a policy to convert the error response from the Storage side into JSON, because it is generated in XML format and is harder to troubleshoot as-is.
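A minimal on-error sketch using APIM’s built-in xml-to-json policy:

```xml
<on-error>
    <base />
    <!-- Storage error bodies come back as XML; convert them to JSON so
         the test console output is easier to read. -->
    <xml-to-json kind="direct" apply="always" consider-accept-header="false" />
</on-error>
```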
Error example:
Test
Simply use the test panel to run a test and check that everything works. The response comes back with 201 Created when the file has been uploaded successfully.
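Outside the portal, you could exercise the API with a quick call like the sketch below; the APIM hostname, API path, and subscription key are placeholders:

```powershell
# Minimal sketch: upload a local file through the APIM endpoint and
# inspect the status code. All names are placeholders.
$resp = Invoke-WebRequest -Method Put `
    -Uri "https://my-apim.azure-api.net/storage/test.txt" `
    -Headers @{ "Ocp-Apim-Subscription-Key" = "<your-key>" } `
    -InFile ".\test.txt"
$resp.StatusCode   # expect 201 when the upload succeeds
```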
The file upload to the storage container was successful.
OPTION 2: Access Storage through SAS token
This method accesses the storage account from the APIM service using a SAS token. Storing the SAS token as a named value lets you reuse it across policies.
One thing to be careful about is that the SAS token must be handled and maintained manually, because there is no integration between the storage account and Key Vault for key regeneration.
Prerequisite
A SAS token is required before implementation, and there are several ways to create one. You can generate the token from the Azure portal by selecting “Shared Access Signature” from the menu and providing the necessary information.
Additionally, both Azure PowerShell and Azure CLI can be utilized to generate the token.
– With Azure PowerShell, the examples within the documentation below can help you create the SAS token.
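For illustration, a hedged Azure PowerShell sketch that issues a container-scoped SAS; the account, container, permissions, and lifetime are all assumptions:

```powershell
# Minimal sketch: create a container-scoped SAS token with create/write
# permissions, valid for 7 days. Requires the Az.Storage module;
# "my-rg", "mystorage", and "upload" are placeholder names.
$key = (Get-AzStorageAccountKey -ResourceGroupName "my-rg" -Name "mystorage")[0].Value
$ctx = New-AzStorageContext -StorageAccountName "mystorage" -StorageAccountKey $key
New-AzStorageContainerSASToken -Name "upload" `
    -Permission "cw" `
    -ExpiryTime (Get-Date).AddDays(7) `
    -Context $ctx
```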
After generating the SAS token, let’s move on to the API Management service and set up the required configuration to store the SAS token in Named values for the APIs to reference.
Policy
In the inbound section, I added the policy for rewriting the blob path and attaching the SAS token. This example takes the name of the storage account from a named value.
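Again as a rough sketch rather than the article’s exact policy, the SAS variant might append the token from a named value (assumed here to be called storage-sas-token and stored without a leading “?”):

```xml
<inbound>
    <base />
    <!-- Rewrite to the blob path and append the SAS token held in the
         named value "storage-sas-token". "container" and {filename}
         are placeholders. -->
    <rewrite-uri template="/container/{filename}?{{storage-sas-token}}" />
    <set-backend-service base-url="https://{{storage-account-name}}.blob.core.windows.net" />
    <set-header name="x-ms-blob-type" exists-action="override">
        <value>BlockBlob</value>
    </set-header>
</inbound>
```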
Within the on-error section, the XML error response from the Storage side is again converted into JSON to make it easier to troubleshoot; the same approach shown in Option 1 applies here.
Error example:
Test
As before, simply use the test panel to run a test and check that everything works. The response comes back with 201 Created when the file has been uploaded successfully.
The file upload to the storage container was successful.
Conclusion
In conclusion, both Azure Managed Identity and SAS Token authentication methods offer secure ways to upload blob files to Azure Storage from API Management.
Azure Managed Identity provides seamless authentication and eliminates the need to store and manage credentials, improving security. It allows for granular access control through RBAC and is suitable for permanent permission scenarios. However, it is limited to Azure services and depends on Microsoft Entra ID (Azure AD).
SAS Token authentication offers greater flexibility with temporary access and fine-grained control over permissions. It allows for the generation of tokens with specific expiration dates, providing enhanced security. However, SAS token management can be more complex, requiring manual generation and distribution of tokens.
When choosing between the two methods, consider how long the permissions need to last and the level of control required. Azure Managed Identity is ideal for long-term permissions, while SAS tokens suit temporary, time-bound access. Assess your specific use case to determine the most secure and convenient authentication approach for uploading blob files to Azure Storage from API Management.
This article is contributed. See the original author and article here.
Hello Friends! It’s been a crazy 2023, and I have to believe that 2024 will be equally fast paced. I posted a video with 3 tips to start the year strong in 2024, given all we are working on across Microsoft 365, Copilot, and employee experience areas at Microsoft. I hope you find it mostly fun and also useful.
I know you will appreciate the downloadable copies of our LinkedIn Adoption Newsletter that I’ve created for you. If you subscribe to our Microsoft Adoption News on LinkedIn, that is wonderful; but if not, I wanted to give you a special link here so you don’t have to click around. These documents have all the recent resources we’ve posted on adoption.microsoft.com (AMC), information from across our community, and notes about AMC itself. I’ve also included the digital download of an Adoption News Special Supplement – It’s not about the AI, it’s about the Trust – On Becoming an AI Powered Organization. LinkedIn article links are here for the main Newsletter and here for the supplemental paper in case you’d rather engage or share from there.
In this paper I discuss hypotheses and practices yielded from observing early adoption of Copilot experiences, reviewing research, and listening to adoption specialists around the world. I’m interested to hear your perspective on the micro-action mapping and the insights that precede it.
Heather Cook and I will return on Monday, January 8th with another fast-paced episode of Mondays at Microsoft to help us keep track of the changes we find ourselves navigating. I hope you can join us when we stream live at 8am Pacific for 30 minutes, or catch the replay.
More than anything I want to say thank you for all you are doing across the communities and in your organization. We are the compass that will keep AI adoption on track. My team and I are so thrilled to share this journey with you!
This article is contributed. See the original author and article here.
Hi everyone! Brandon Wilson here once again with this month’s “Check This Out!” (CTO!) guide, and apologies for the delay!
These posts are only intended to be your guide, to lead you to some content of interest, and are just a way we are trying to help our readers a bit more, whether that is learning, troubleshooting, or just finding new content sources! We will give you a bit of a taste of the blog content itself, provide you a way to get to the source content directly, and help to introduce you to some other blogs you may not be aware of that you might find helpful.
From all of us on the Core Infrastructure and Security Tech Community blog team, thanks for your continued reading and support!
This blog post will assume that you have a fundamental understanding of Windows containers. If that isn’t the case, then I highly recommend reading Get started: Run your first Windows container.
Many developers and IT admins are in the midst of migrating long-standing applications into containers to take advantage of the myriad benefits made available with containerization.
NOTE: Not all applications are able to equally take advantage of the benefits of containerization. It is another tool for the toolbox to be used at your discretion.
But moving an existing application into a container can be a bit tricky. With this blog post I hope to help make that process a little bit easier for you.
At Microsoft, we’re committed to providing our customers with the tools they need to succeed wherever they are. By extending Azure services to the customer’s preferred environments like System Center, we empower customers with access to Azure’s potential along with a consistent experience across their hybrid estate. Today we’re excited to deliver on that commitment as we announce that System Center Virtual Machine Manager (SCVMM) enabled by Azure Arc is now generally available to manage SCVMM resources in Azure.
It’s been almost two weeks since the first post-End of Life Patch Tuesday for Windows Server 2012/R2. To receive that critical security patch from November’s Patch Tuesday, your servers must be enrolled in Extended Security Updates. Fortunately, it’s not too late. You can enroll in WS2012 ESUs enabled by Azure Arc anytime, with just a few steps!
Optimizing your Azure cloud investments is crucial for your organization’s success, helping you minimize unnecessary expenses and ultimately drive better ROI. At Microsoft, we’re committed to optimizing your Azure environments and teaching you how to do it, with resources, tools, and guidance that support continuous improvement of your cloud architectures and workloads in both new and existing projects. Our wide range of optimization skilling opportunities is designed to help you confidently achieve your cloud goals and become more effective and efficient through a deeper knowledge of successful cloud operations.
I am excited to announce a comprehensive refresh of the Well-Architected Framework for designing and running optimized workloads on Azure. Customers will not only get great, consistent guidance for making architectural trade-offs for their workloads, but they’ll also have much more precise instructions on how to implement this guidance within the context of their organization.
Are you looking for a way to accelerate your cloud journey and optimize your IT infrastructure, data, and applications? If so, you might be interested in the brand-new Azure Expert Assessment offering! It is being launched as a new option within the Microsoft Solution Assessment Program. This is a free one-to-one offering from Microsoft that helps you plan your cloud adoption by collaborating with a Certified Azure Expert, who will personally guide you through the assessment and make remediation recommendations for your organization.
We are excited to announce that Azure is making it easier for customers to reduce Compute costs by providing the ability to hibernate Virtual Machines (VMs). Starting today, customers can hibernate their VMs and resume them at a later time. Hibernating a VM deallocates the machine while persisting the VM’s in-memory state. While the VM is hibernated, customers don’t pay for the Compute costs associated with the VM and only pay for the storage and networking resources associated with it. Customers can later start these VMs back up when needed, and all the apps and processes that were previously running simply resume from their last state.
Ransomware attacks can cause significant damage to organizations and individuals, including data loss, security breaches, and costly business disruptions. When successful, they can disable a business’s core IT infrastructure and cause destruction with a debilitating impact on the physical and economic security, or even the safety, of a business. Unfortunately, over the last few years, ransomware attacks have grown significantly in both number and sophistication. Having a sound BCDR strategy in place is essential to meeting your overall goals when it comes to ensuring security against ransomware attacks and minimizing their possible impact on your business. To make sure all customers are well protected against such attacks, Azure Backup provides a host of capabilities, some built-in and others optional, that significantly improve the security of your backups. In this article, we discuss some such capabilities offered by Azure Backup that can help you prepare better to recover from ransomware attacks as well as other data loss scenarios.
Reducing spend is more important than ever given today’s dynamic economy. Today’s businesses strive to create efficiencies that safeguard against unpredictable shifts in the economy, beat competitors, and prioritize what matters most. But accomplishing this is less about cutting costs and more about the ability to continuously optimize your cloud investments. Continuous optimization can help you drive innovation, productivity, and agility and realize an ongoing cycle of growth and innovation in your business.
In this article, we will see how Azure Site Recovery offers an automated way to help you ensure that all your DR data, to which you would fail over, is safe and free of any malware using Microsoft Defender for Cloud.
Azure Site Recovery helps ensure business continuity by keeping business apps and workloads running during outages. Site Recovery replicates workloads running on physical and virtual machines (VMs) from a primary site to a secondary location. After the primary location is running again, you can fail back to it. Azure Site Recovery provides Recovery Plans to impose order, and automate the actions needed at each step, using Azure Automation runbooks for failover to Azure, or scripts.
In July this year, we announced the launch of Azure Migrate and Modernize, and Azure Innovate, our flagship offerings to help accelerate your move to the cloud. Azure Migrate and Modernize helps you migrate and modernize your existing applications, data, and infrastructure to Azure, while Azure Innovate helps you with your advanced innovation needs, such as infusing AI into your apps and experiences, advanced analytics, and building custom cloud-native applications.
While there are multiple methods for obtaining explicit outbound connectivity to the internet from your virtual machines on Azure, there is also one method for implicit outbound connectivity – default outbound access. When virtual machines (VMs) are created in a virtual network without any explicit outbound connectivity, they are assigned a default outbound public IP address. These IP addresses may seem convenient, but they have a number of issues and therefore are only used as a “last resort”…
Azure DDoS Protection is a service that constantly innovates to protect customers from ever-changing distributed denial-of-service (DDoS) attacks. One of the major challenges of cloud computing is ensuring that customer solutions maintain security and application availability. Microsoft has been addressing this challenge with its Azure DDoS Protection service, which launched in public preview in 2017 and became generally available in 2018. Since its inception, Microsoft has renamed the service to better reflect its capabilities and features. We’ll discuss how this protection service has transformed through the years and provide more insight into the levels of protection offered by the separate tiers.
As the holiday season approaches, businesses and organizations should brace for an increase in Distributed Denial of Service (DDoS) attacks. Historically, this period has seen a spike in such attacks, targeting sectors like e-commerce and gaming that experience heightened activity. DDoS threats persist throughout the year, but the holiday season’s unique combination of increased online activity and heightened cyber threats makes it a critical time for heightened vigilance.
We are excited to announce that Personal Desktop Autoscale on Azure Virtual Desktop is generally available as of November 15, 2023! With this feature, organizations with personal host pools can optimize costs by shutting down or hibernating idle session hosts, while ensuring that session hosts can be started when needed.
In this post, I want to talk about Microsoft Assessments but more specifically Microsoft Assessments Milestones because they are a very useful tool which is not widely used.
In case you don’t know what Microsoft Assessments are, they are a free, online platform that helps you evaluate your business strategies and workloads. They work through a series of questions and recommendations that result in a curated guidance report that is actionable and informative.
In the following sections of this blog, I will provide a step-by-step guide to help you migrate away from MMA to AMA. This guide is designed to make the transition as smooth and seamless as possible, minimizing any potential disruptions to your monitoring workflow.
But that is not all. To make things even easier, there is a GitHub site that hosts the necessary binaries for this migration process. These binaries will be used to install a set of utilities in Azure, including a process dashboard. This dashboard will provide you with a visual representation of the migration process, making it easier to track and manage.
Ok, let’s get into today’s topic, which is removing SMBv1 from domain controllers. Like my previous blog on NTLM, a lot of great content has already been written on SMBv1. My objective is not to rehash the why, but rather to focus on how you can take action in a production environment.
For Post #1, I offer you a quick’n’easy way to use Intune Remediations to get some info from Windows PCs.
Last reboot dates/times are frequently used as simple indicators of life for devices. I was asked if this is captured anywhere in Intune and, oddly, I’d never looked – but as I went hunting through Intune (Portal and Graph), the more I looked, the more I couldn’t find it anywhere obvious. “Surely it can’t be THIS hard…?”
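As a taste of where that hunt ends up, a detection script for an Intune Remediation that surfaces the last boot time can be as small as this sketch:

```powershell
# Minimal sketch of a Remediations detection script: write the last boot
# time to stdout so it appears in the remediation's detection output.
$os = Get-CimInstance -ClassName Win32_OperatingSystem
Write-Output "Last reboot: $($os.LastBootUpTime)"
exit 0  # exit code 0 = "compliant"; we only want the output captured
```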
Azure Services and the solutions you deploy into Azure are connected to the Microsoft global wide-area network, also known as the Microsoft Global Network or the Azure Backbone. There are a few different ways to connect to an Azure service from a subnet, depending on your requirements around securing access to these services. Your requirements should dictate which method you choose. There are some common misconceptions around connectivity, and the goal of this article is to provide some clarity around connecting to Azure Services.
Buenos días and welcome to número tres in the holiday ’23 series.
This one is sure to please the crowd – it’s the NEW AND IMPROVED easy to setup/deploy/use solution for when IT Ops/Support needs a local admin ID and password to perform some management task(s) on a Windows endpoint.
Server migrations to Azure Virtual Machines either through Azure Migrate or via a redeploy approach can benefit from Azure policies to accelerate adoption of Azure first party services across BCDR, Security, Monitoring and Management.
Our Cloud Adoption Framework’s guidance for Azure Landing Zones already provides a good baseline of recommended Azure policies. However, a variation to this baseline is described in this article with a focus on newly migrated Azure Virtual Machine resources.
How can a SOC team ingest and analyze Windows Logs with Microsoft Sentinel? What are the main options to ingest Windows Logs into a Log Analytics Workspace and use Microsoft Sentinel as a SIEM to manage security incidents from events recorded on these logs?
Traditionally, during the setup of an access package, you could specify who can request access, including users and groups in the organization’s directory or guest users. Now, you have the option to use an automatic assignment policy to manage access packages. This policy includes membership rules that evaluate user attribute values to determine access. You can create one automatic assignment policy per access package, which can assess built-in user attributes or custom attribute values generated by third-party HR systems and on-premises directories. Behind the scenes, Entitlement Management automatically creates dynamic security groups based on the policy rules, which are adjusted as the rules change.
When your Windows products reach the end of support, Extended Security Updates (ESUs) are there to protect your organization while you modernize your estate. To take advantage of this optional service, you’d purchase and download ESU product keys, install them, and finally activate the extended support.
You can now get three additional years of Extended Security Updates (ESUs) if you need more time to upgrade and modernize your Windows Server 2012, Windows Server 2012 R2, or Windows Embedded Server 2012 R2 on Azure. This also applies to Azure Stack HCI, Azure Stack Hub, and other Azure products.
Universal Print is a cloud-based print solution that enables a simple, rich, and secure print experience for users while also reducing time and effort for IT pros. By shifting print management to the cloud, IT professionals can simplify administration and end-users can easily print, reducing the expense of organizations’ print infrastructure.
Today, we start to roll out Copilot in Windows (in preview) for Windows 10, version 22H2 to Windows Insiders in the Release Preview Channel. Bringing Copilot to Windows 10 enables organizations managing both Windows 11 and Windows 10 devices to continue considering a rollout of Copilot in Windows and provide this powerful productivity experience to more of their workforce.