Learning How to Transition Your SQL Server Skills to Azure SQL | Data Exposed

This article is contributed. See the original author and article here.

The Azure Data team at Microsoft recently launched FOUR new ways for you to learn Azure SQL. They’ll show you how to translate your existing SQL Server expertise to Azure SQL, including Azure SQL Database and Azure SQL Managed Instance, through all-new content on YouTube, GitHub, and Microsoft Learn. After completing the video series, learning path, or workshop, you will have foundational knowledge of what to use when, as well as how to configure, secure, monitor, and troubleshoot Azure SQL. In this episode, Bob Ward, Anna Hoffman, and Marisa Brasile share the details of their all-new content and how to get started.

 

Watch on Data Exposed

 

Resources:
YouTube: https://aka.ms/azuresql4beginners

GitHub: https://aka.ms/azuresqlworkshop

Microsoft Learn: https://aka.ms/azuresqlfundamentalsb

 

View/share our latest episodes on Channel 9 and YouTube!

Connecting the Wio Terminal to Azure IoT

This article is contributed. See the original author and article here.

It’s been a few months now since I started playing with the Wio Terminal from Seeed Studio. It is a pretty complete device that can be used to power a wide range of IoT solutions—just look at its specifications!

 

[Image: Wio Terminal hardware overview]
  • Cortex-M4F running at 120 MHz (can be overclocked to 200 MHz) from Microchip (ATSAMD51P19);
  • 192 KB of RAM, 4 MB of Flash;
  • Wireless connectivity: Wi-Fi 2.4 & 5 GHz (802.11 a/b/g/n) and BLE 5.0, powered by a Realtek RTL8720DN module;
  • 2.4″ LCD screen, 320×240 pixels;
  • microSD card reader;
  • Built-in sensors and actuators: light sensor, LIS3DH accelerometer, infrared emitter, microphone, buzzer, 5-way switch;
  • Expansion ports: 2x Grove ports, 1x Raspberry Pi-compatible 40-pin header.

 

Wireless connectivity, extensibility, processing power… on paper, the Wio Terminal must be the ideal platform for IoT development, right? Well, ironically, one thing it doesn’t do out-of-the-box is to actually connect to an IoT cloud platform!

 

You will have guessed it by now… In this blog post, you’ll learn how to connect your Wio Terminal to Azure IoT. More importantly, you will learn about the steps I followed, giving you all the information you need in order to port the Azure IoT Embedded C libraries to your own IoT device.

Connecting your Wio Terminal to Azure IoT

 

I have put together a sample application that should get you started in no time.

 

You will need a Wio Terminal, of course, an Azure IoT Hub instance, and a working Wi-Fi connection. The Wio Terminal will need to be connected to your computer over USB—kudos to Seeed Studio for providing a USB-C port, by the way!—so it can be programmed.

 

Here are the steps you should follow to get your Wio Terminal connected to Azure IoT Hub:

 

  1. If you don’t have an Azure subscription, create one for free before you begin.
  2. Create an IoT Hub and register a new device (i.e. your Wio Terminal). Using the Azure portal is probably the most beginner-friendly method, but you can also use the Azure CLI or the VS Code extension. The sample uses symmetric keys for authentication.
  3. Clone and open the sample repository in VS Code, making sure you have the PlatformIO extension installed.
  4. Update the application settings (include/config.h) file with your Wi-Fi, IoT Hub URL, and device credentials.
  5. Flash your Wio Terminal. Use the command palette (Windows/Linux: Ctrl+Shift+P / macOS: ⇧⌘P) to execute the PlatformIO: Upload command. The operation will probably take a while to complete as the Wio Terminal toolchain and the dependencies of the sample application are downloaded, and the code is compiled and uploaded to the device.
  6. Once the code has been uploaded successfully, your Wio Terminal LCD should turn on and start logging connection traces.
    You can also open the PlatformIO serial monitor to check the logs of the application (PlatformIO: Serial Monitor command).

 

> Executing task: C:\Users\kartben\.platformio\penv\Scripts\platformio.exe device monitor <
--- Available filters and text transformations: colorize, debug, default, direct, hexlify, log2file, nocontrol, printable, send_on_enter, time
--- More details at http://bit.ly/pio-monitor-filters
--- Miniterm on COM4  9600,8,N,1 ---
--- Quit: Ctrl+C | Menu: Ctrl+T | Help: Ctrl+T followed by Ctrl+H ---
Connecting to SSID: WiFi-Benjamin5G
......
> SUCCESS.
Connecting to Azure IoT Hub...
> SUCCESS.

 

Your device should now be sending its accelerometer sensor values to Azure IoT Hub every 2 seconds, and be ready to receive commands remotely sent to ring its buzzer.
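To make the message shape concrete, here is a rough sketch of a telemetry payload matching the device model’s imu field; the helper function and the sample values are hypothetical, not taken from the sample application:

```python
import json

def build_imu_telemetry(x: float, y: float, z: float) -> str:
    """Build a JSON telemetry payload matching the 'imu' field of the
    dtmi:seeed:wioterminal;1 model (an Object with x/y/z doubles)."""
    return json.dumps({"imu": {"x": x, "y": y, "z": z}})

payload = build_imu_telemetry(0.01, -0.02, 0.98)
print(payload)  # {"imu": {"x": 0.01, "y": -0.02, "z": 0.98}}
```

A payload of this shape is what Azure IoT Explorer decodes against the model when monitoring telemetry.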

 

Please refer to the application’s README to learn how to test that the sample is working properly using Azure IoT Explorer.

 

[Animation: Azure IoT Explorer – Monitoring Telemetry]

[Animation: Azure IoT Explorer – Sending Commands]


It is important to mention that this sample application is compatible with IoT Plug and Play. It means that there is a clear and documented contract of the kind of messages the Wio Terminal may send (telemetry) or receive (commands).

 

You can see the model of this contract below—it is rather straightforward. It’s been authored using the dedicated VS Code extension for DTDL, the Digital Twin Description Language.

 

 

{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:seeed:wioterminal;1",
  "@type": "Interface",
  "displayName": "Seeed Studio Wio Terminal",
  "contents": [
    {
      "@type": [
        "Telemetry",
        "Acceleration"
      ],
      "unit": "gForce",
      "name": "imu",
      "schema": {
        "@type": "Object",
        "fields": [
          {
            "name": "x",
            "displayName": "IMU X",
            "schema": "double"
          },
          {
            "name": "y",
            "displayName": "IMU Y",
            "schema": "double"
          },
          {
            "name": "z",
            "displayName": "IMU Z",
            "schema": "double"
          }
        ]
      }
    },
    {
      "@type": "Command",
      "name": "ringBuzzer",
      "displayName": "Ring buzzer",
      "description": "Rings the Wio Terminal's built-in buzzer",
      "request": {
        "name": "duration",
        "displayName": "Duration",
        "description": "Number of milliseconds to ring the buzzer for.",
        "schema": "integer"
      }
    }
  ]
}

 

 

When connecting to IoT Hub, the Wio Terminal sample application “introduces itself” as conforming to the dtmi:seeed:wioterminal;1 model.
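For MQTT-based devices, this “introduction” is conveyed through a model-id query parameter appended to the MQTT username. The sketch below illustrates that convention; the api-version value shown is an assumption and may differ from the current documentation:

```python
from urllib.parse import urlencode

def build_mqtt_username(hub_hostname: str, device_id: str, model_id: str,
                        api_version: str = "2020-09-30") -> str:
    """Build the MQTT username an IoT Plug and Play device uses to announce
    the DTDL model it implements when connecting to IoT Hub."""
    query = urlencode({"api-version": api_version, "model-id": model_id})
    return f"{hub_hostname}/{device_id}/?{query}"

print(build_mqtt_username("myhub.azure-devices.net", "wioterminal",
                          "dtmi:seeed:wioterminal;1"))
# myhub.azure-devices.net/wioterminal/?api-version=2020-09-30&model-id=dtmi%3Aseeed%3Awioterminal%3B1
```

The hub and device names above are placeholders; only the model ID comes from the sample.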

 

This allows you (or anyone who will be creating IoT applications integrating with your device, really) to be sure there won’t be any impedance mismatch between the way your device talks and expects to be talked to, and what your IoT application does.

 

A great example of why being able to automagically match a device to a corresponding DTDL model is useful can be illustrated with the way we used the Azure IoT Explorer earlier. Since the device “introduced itself” when connecting to IoT Hub, and since Azure IoT Explorer has a local copy of the model, it automatically showed us a dedicated UI for sending the ringBuzzer command!

 

[Image: Azure IoT Explorer – dedicated UI for the ringBuzzer command]

 

Azure SDK for Embedded C

 

In the past, adding support for Azure IoT to an IoT device using the C programming language required either using the rather monolithic Azure IoT C SDK (for example, it is not trivial to bring your own TCP/IP or TLS stack) or implementing everything from scratch using the public documentation of Azure IoT’s MQTT front-end for devices.

 

Enter the Azure SDK for Embedded C!

 

The Azure SDK for Embedded C is designed to allow small embedded (IoT) devices to communicate with Azure services.

 

The Azure SDK team has recently started to put together a C SDK that specifically targets embedded and constrained devices. It provides a generic, platform-independent infrastructure for manipulating buffers, logging, JSON serialization/deserialization, and more. On top of this lightweight infrastructure, client libraries for services such as Azure Storage or Azure IoT have been developed.

 

You can read more on the Azure IoT client library here, but in a nutshell, here’s what I had to implement in order to use it on the Wio Terminal:

 

  • As the sample uses symmetric keys to authenticate, we need to be able to generate a security token.
    • The token needs to have an expiration date (typically set to a few hours in the future), so we need to know the current date and time. We use an NTP library to get the current time from a time server.
    • The token includes an HMAC-SHA256 signature string that needs to be base64-encoded. Luckily, the recommended WiFi+TLS stack for the Wio Terminal already includes Mbed TLS, making it relatively simple to compute HMAC signatures (e.g. mbedtls_md_hmac_starts) and perform base64 encoding (e.g. mbedtls_base64_encode).
  • The Azure IoT client library helps with crafting MQTT topics that follow the Azure IoT conventions. However, you still need to provide your own MQTT implementation. In fact, this is a major difference from the historical Azure IoT C SDK, which had the MQTT implementation baked in. Since it is widely supported and just worked out-of-the-box, the sample application uses the PubSubClient MQTT library from Nick O’Leary.
  • And of course, one must implement their own application logic. In the context of the sample application, this meant using the Wio Terminal’s IMU driver to get acceleration data every 2 seconds, and hooking up the ringBuzzer command to actual embedded code that… rings the buzzer.
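As an illustration of the token-generation step described above, here is a sketch of the widely documented Azure IoT SAS token construction (an HMAC-SHA256 signature over the URL-encoded resource URI and expiry, base64-encoded), shown in Python rather than the device’s C code; the hub/device names and the key are made-up placeholders:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote

def generate_sas_token(resource_uri: str, device_key_b64: str,
                       ttl_seconds: int = 3600) -> str:
    """Generate a SharedAccessSignature for a symmetric-key device:
    sign "<url-encoded resource URI>\n<expiry>" with HMAC-SHA256."""
    expiry = int(time.time()) + ttl_seconds
    sign_target = f"{quote(resource_uri, safe='')}\n{expiry}".encode()
    key = base64.b64decode(device_key_b64)
    signature = base64.b64encode(
        hmac.new(key, sign_target, hashlib.sha256).digest())
    return (f"SharedAccessSignature sr={quote(resource_uri, safe='')}"
            f"&sig={quote(signature.decode(), safe='')}&se={expiry}")

# Placeholder key for illustration only -- never a real device key:
token = generate_sas_token("myhub.azure-devices.net/devices/wioterminal",
                           base64.b64encode(b"not-a-real-key").decode())
print(token.split("&")[0])
```

On the device, the same computation is done with the Mbed TLS primitives mentioned above.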

Conclusion

I hope you found this post useful! I will soon publish additional articles that go beyond the simple “Hey, my Wio Terminal can send accelerometer data to the cloud!” to more advanced use cases such as remote firmware upgrade. Stay tuned! :)

 

Let me know in the comments what you’ve done (or will be doing!) with your Wio Terminal, and also don’t hesitate to ask any burning question you may have. Of course, you can also always find me on Twitter.

 

Experiencing Data Latency in Azure portal for Log Alerts – 08/06 – Resolved

This article is contributed. See the original author and article here.

Final Update: Thursday, 06 August 2020 13:30 UTC

We’ve confirmed that all systems are back to normal with no customer impact as of 8/6, 13:00 UTC. Our logs show the incident started on 8/06, 10:00 UTC, and that during the 3 hours it took to resolve the issue, customers could have experienced a delay in alerting.

  • Root Cause: The failure was due to some backend dependencies.  
  • Incident Timeline:3 Hours – 8/06, 10:00 UTC through 8/06, 13:00 UTC

We understand that customers rely on Azure Monitor as a critical service and apologize for any impact this incident caused.

-Eric Singleton


Azure Database for MySQL – Service update

This article is contributed. See the original author and article here.

 

The year 2020 has started eventfully, and it continues to present challenging times for people, businesses, and economies around the world. As our CEO Satya Nadella puts it, “We have seen two years of digital transformation in two months.” Azure Database for MySQL service is at the heart of this transformation, empowering online education, video streaming services, digital payment solutions, e-commerce platforms, gaming services, news portals, government and healthcare websites to support unprecedented growth at optimized cost. It’s immensely satisfying to see Azure Database for MySQL enable our customers to meet growing demands for their services during these critical times. Azure Database for MySQL service, with community version of MySQL, is powering mission critical services such as healthcare services for citizens of Denmark, digital payment application for citizens of Hong Kong, music and video streaming platforms for citizens of India, Korea, and Japan, online news websites, and mobile gaming services including our very own Minecraft Realms.

 

MySQL – Popular choice for Internet scale web or mobile applications

MySQL is a popular choice of database engine for designing internet-scale consumer applications, which are highly transactional online applications with short, chatty transactions against a relatively small database. These applications are typically developed in Java or PHP and migrated to run on Azure virtual machine scale sets (VMSS) or Azure App Service, or are containerized to run on Azure Kubernetes Service (AKS). The database is typically required to handle a high volume of incoming transactions. Most of our customers leverage the ProxySQL load-balancing proxy and read replicas to scale out and meet the workload demands of their business. MySQL versions 5.7 and 8.0 continue to be popular choices among our customers for meeting their performance and scale goals.

 

[Image: Reference architecture]

 

What’s new in Azure Database for MySQL?

Over the last six months, we’ve focused on enhancing security and governance for customers, simplifying performance tuning, and reducing cost for our customers, in addition to increasing the regional availability of our optimized large storage platform with 16-TB storage and 20K IOPS scale. This aligns with our promise of making Azure the most secure and trusted cloud for our customers. A complete list of all the features we’ve released is available via Azure updates, but I’ll summarize a few important updates below.

 

Enterprise Grade Security, Compliance & Governance

  • Data encryption at rest using customer-managed keys. Since we launched Azure Database for MySQL to the public, all customer data has always been encrypted at rest using service-managed keys. The service is fully compliant with PCI DSS, HIPAA, and FedRAMP certifications. With this release, we allow our customers to bring their own key for encryption of their data at rest. This was one of the most highly requested asks from our finance, healthcare, and government customers to meet their compliance and regulatory requirements. Learn more about this feature here.
  • Infrastructure double encryption. This feature provides an additional layer of protection for customers’ data at rest. Infrastructure double encryption uses the FIPS 140-2 validated cryptographic module, but with a different encryption algorithm, and the key it uses is managed by the Azure Database for MySQL service. Infrastructure double encryption is not enabled by default, since the additional layer of encryption can have a performance impact. Learn more about this feature here.
  • Minimum TLS version enforcement ability. Azure Database for MySQL currently supports TLS v1.0, 1.1, and 1.2. Our recommendation is to upgrade to TLS v1.2 to enhance security, and with this feature, we allow customers to control and enforce the right behavior for their MySQL servers from the server side. Server administrators can simply go to the Azure portal and set the minimum TLS version on the server side to meet compliance. Security administrators can define the right policies at the subscription or organization level using Azure Policy to ensure the minimum TLS version for all MySQL servers in the Azure subscription meets the compliance and regulatory requirements defined by the organization. Learn more about this feature here.

         [Image: Minimum TLS version setting in the Azure portal]

 

 

  • Private Link for Azure Database for MySQL. Azure Private Link is the most secure way to isolate and connect to Azure Database for MySQL, either within the same Azure region or across regions. Customers can also use this feature to disable public endpoints, which ensures that there aren’t any connections coming from public endpoints. Learn more about this feature here.
  • Azure Active Directory Authentication for Azure Database for MySQL. Azure Active Directory authentication allows customers to connect to their database by using identities defined in Azure AD and to manage credentials in a central place. For consistent role management, database access can be managed by using Active Directory groups. Learn more about this feature here.
  • Governance capability by enforcing Azure policies – All of the above security features are opt-in, and driving the right security practices and behavior is a shared responsibility. To standardize and enforce these security controls at an organization level, customers can leverage Azure Policy, an Azure service used to create, assign, and manage policies. These policies enforce different rules and effects on resources to ensure that they stay compliant with corporate standards and service-level agreements. Azure Policy meets this need by evaluating resources for non-compliance with assigned policies, while ensuring that all data stored by Azure Policy is encrypted at rest. We are glad to announce the integration of Azure Database for MySQL with Azure Policy to enforce these compliance requirements at scale.

We’ve created an Azure Policy GitHub repository that contains quick-start samples from the community. For more information about using these sample policies, see the article here.

 

Intelligent Performance

Besides the security controls, the engineering team focused on making life easier for DevOps teams who are tasked with managing the performance of large fleets of servers. Intelligent performance is our differentiated feature set, which includes Query Store, Query Performance Insight, and performance recommendations. It allows DevOps teams to better understand their workloads, visually inspect them to identify bottlenecks, and see a list of recommendations for improving the performance of database workloads.

For some canonical workloads like WordPress, we are taking a step forward by allowing users to configure an optimized performance configuration for their Azure Database for MySQL server using resource tags. To take advantage of this feature and drive performance optimizations for WordPress applications, users can simply set the following resource tag on the Azure Database for MySQL server used for the WordPress application at the time of server creation. Learn more about this feature here.

  • Name: AppProfile
  • Value: WordPress

We are constantly using pattern analysis to programmatically analyze the metrics telemetry of servers and provide targeted Advisor recommendations that enable users to improve the performance of their MySQL servers out of the box, without any code changes. The most recent recommendation we added is to increase the tmp_table_size and max_heap_table_size values for servers that are impacted by temp table spills to storage. Increasing these server parameter values on impacted servers can improve overall workload performance out of the box. For Azure Database for MySQL servers impacted by temp table spills to storage, customers will see the recommendation to increase the values in the Advisor recommendations blade in the portal. Learn the latest about this feature here.

As the scale demands of a workload increase, customers can leverage read replicas to scale out and ProxySQL to transparently split reads and writes from the application. In some scenarios, there may still be a high amount of thread churn inside the server, with short bursts of highly concurrent transactions limiting transaction throughput due to high CPU contention. To minimize this and improve performance out of the box, we released the thread pool feature, which can be enabled using server parameters in the Azure Database for MySQL service. Learn more about this feature here.

 

Cost Optimization

Reducing cost is top of mind for customers, and therefore it is a top priority for us too. We released a few features in this area to ensure customers provision the right size for their workloads and benefit from capacity commitments. This is an active area of investment for us and we are committed to doing more here, so please stay tuned.

  • Recommendation to optimize cloud spend – The Recommendations feature gives daily insights about the database server to help users optimize performance and cost. Azure Advisor is a personalized cloud consultant that helps users follow guidelines to optimize their Azure deployments. We started off with performance-based recommendations, but we have now expanded the portfolio to include cost-optimization recommendations through right-sizing and Reserved Instances.
  • 3-year RI expansion – We started off by providing 1-year RIs, announced at Microsoft Ignite 2019. However, learning from customer feedback, we have quickly expanded support to 3-year RIs as well, to let customers save costs on long-term commitments. Learn more about reserved instances and their use here.

Large storage with up to 16 TB. With our new storage infrastructure that supports up to 16 TB, we have made a number of optimizations in the storage engine, including switching to snapshot-based backups. The 16-TB storage also supports up to 20K IOPS for higher concurrent scaling. In a subset of Azure regions, all newly provisioned servers can support up to 16-TB storage and 20K IOPS. We are also working towards rolling out this storage infrastructure in the remaining Azure regions, where it will be the default storage option going forward.

 

Getting Started with Azure Database for MySQL Service

Users new to Azure Database for MySQL can get started by leveraging the QuickStart articles in our documentation.

The connection to Azure Database for MySQL requires users to specify the username in the format username@servername. For more information on this requirement, read more here.
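As a quick illustration with hypothetical server and user names, the login name passed to any MySQL client would be assembled like this:

```python
def mysql_login_name(user: str, server: str) -> str:
    """Azure Database for MySQL expects the login name in the
    form username@servername."""
    return f"{user}@{server}"

# Hypothetical names for illustration only:
print(mysql_login_name("myadmin", "mydemoserver"))  # myadmin@mydemoserver
# The host used to connect is the full server FQDN, e.g.:
host = "mydemoserver.mysql.database.azure.com"
```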

 

[Image: MySQL Workbench connection]

 

Migrating to Azure Database for MySQL service

Customers looking to migrate their database to Azure Database for MySQL can use one of the following approaches:

  • Dump and Restore – For offline migrations where users can afford some downtime, leverage dump and restore using community tools like mysqldump/mydumper. Read more in our documentation. For migrating large databases, leverage the best practices shared by our field customer engineers, who work closely with some of our mission-critical customers.
  • Azure Database Migration Service – For seamless migrations to Azure Database for MySQL service with minimal downtime, customers use the Azure Database Migration Service. Learn more about this service in our documentation. The best practices for migrating MySQL databases using Azure Database Migration service can be found here.
  • Data-in replication – Data-in replication, which relies on binlog-based replication, can also be leveraged for minimal-downtime migrations. It is preferred by hands-on experts who are looking for more control over the migration. You can read more in our documentation.

To migrate users from an existing environment to an Azure Database for MySQL server, leverage the script documented here.

 

Planned Maintenance Notification

If you want to get alerted about upcoming planned maintenance to your Azure Database for MySQL server, we recommend subscribing to planned maintenance notifications. Learn more about this feature here.

 

Stay updated

To stay updated around the latest development with Azure Database for MySQL service, we recommend the following:

 

Feedback

We are constantly looking at ways to improve our service and prioritize highly requested items. If you have any feedback, you can leverage the following forums:

  • UserVoice
  • Use Azure Portal to leave us your feedback
 

      [Image: Leaving feedback in the Azure portal]

 

Questions

If our documentation fails to provide clarity, we encourage customers to contact us with questions.

 

Support

For support with an existing Azure Database for MySQL server, use the Azure portal to open a support request with us.

 

Experiencing Data Access Issue in Azure portal for Log Analytics – 08/06 – Resolved

This article is contributed. See the original author and article here.

Final Update: Thursday, 06 August 2020 06:26 UTC

We’ve confirmed that all systems are back to normal with no customer impact as of 8/6, 06:03 UTC. Our logs show the incident started on 8/05, 21:41 UTC and that during the 8 hours 22 minutes that it took to resolve the issue, customers using the AUDIT_LOG_REST_API in the Australia Southeast Region could have experienced a delay with ingested data.

  • Root Cause: The failure was due to a bad deployment.
  • Incident Timeline: 8 Hours & 22 minutes – 8/05, 21:41 UTC through 8/06, 06:03 UTC

We understand that customers rely on Azure Log Analytics as a critical service and apologize for any impact this incident caused.

-Eric Singleton


Assess your AWS virtual machines with Azure Migrate

This article is contributed. See the original author and article here.

Over the last few months I’ve been doing a “summer tour” of user groups, delivering a talk entitled “Start your datacentre transformation journey with Azure Migrate”. During my talk I mainly focus on customer journeys that move resources from on-premises to the cloud. However, due to some questions I’ve had from the audience, I want to change focus a little and share how you can use Azure Migrate if you are looking to move from another cloud provider to Azure.

 

The first step of any migration journey, regardless of your starting point and destination, is an assessment: gathering information about your current environment. I talked about why I think it’s so important and what information you should be gathering in my Datacentre Migration Checklist blog post.

 

The Azure Migrate: Server Assessment tool can help assess not only your VMware and Hyper-V virtual servers or physical servers, but also those living in other clouds. In this video I show you the process of assessing your AWS virtual machines with a view to moving them to Azure. You can watch the full video here or on Microsoft Channel 9.

 

 

 

 

You can find more information here: 

 

 

I hope you enjoyed the video. If you have any questions, feel free to leave a comment.