Introduction to Inclusion and Accessibility with Microsoft

Link to closed captioning version: https://www.youtube.com/watch?v=qM7wFZo5yfI&t=175s.

Hi. I’m Jenny Lay-Flurrie, Chief Accessibility Officer at Microsoft. We’re going to talk about accessibility, so first let’s explain what this is. There are lots of examples of accessibility in the world. There’s accessibility of buildings and physical space with ramps, power door openers and more. And there’s also digital accessibility. Making websites, software, and games accessible and inclusive for people with disabilities. All this and a lot more are great. Because at the core, accessibility is about creating experiences that are inclusive of the one billion people with disabilities around the world. With accessibility, we have both an opportunity and a responsibility to create inclusive tech that works for all of us. Accessibility is NOT optional. It is a key priority for Microsoft.

Technology can connect people in how they communicate, how they learn, transact, and experience the world. And when tech is inclusive, we can connect people and information in amazing ways. On the flip side, if accessibility is not considered and your process does not prioritize accessibility, you have the power to exclude people, which is clearly not what we want to do. So, if we’re really going to lean into our mission to empower every person and every organization on the planet to achieve more, we have to think about accessibility and embed it into the DNA of Microsoft. It’s an ecosystem. It starts with the people that we hire and empower, right the way through to our marketing and communications, to the standards to which we hold our suppliers, vendors, and partners, to our products, our innovations, and our workplace. Because we have an amazing opportunity to explore the great potential and hard questions of how to create the next generation of accessible tech and the wave of innovation that comes with it. Just imagine what we can do together.

We’re going to take you on a journey through time and space to illustrate some of the common scenarios. Just remember that everyone’s experience is completely different. And we’re going to share just a few stories that will be helpful in understanding accessibility for now and the future. Thank you for investing your time to watch this today.

Meet Jenny Lay-Flurrie, Chief Accessibility Officer at Microsoft

When it comes to assistive technologies, the person leading the way for Microsoft is its Chief Accessibility Officer, Jenny Lay-Flurrie. She’s from Birmingham, England, is profoundly deaf, works in Seattle, Washington, USA, and is passionate about the importance of putting inclusion at the heart of corporate culture. This is no small undertaking, as it requires a paradigm shift in corporate thinking. But Jenny has never shied away from a fight. This October 1, 2020 interview may give you your first look at this amazing woman. Here’s a more in-depth look into her: https://news.microsoft.com/stories/people/jenny-lay-flurrie.html.

Large-scale Data Analytics with Azure Synapse – Workspaces with CLI

This article is contributed. See the original author and article here.

One of the challenges of large-scale data analysis is being able to get value from data with the least effort. Doing that often involves multiple stages: provisioning infrastructure, accessing or moving data, transforming or filtering data, analyzing and learning from data, automating the data pipelines, connecting with other services that provide input or consume the output data, and more. There are quite a few tools available to address these challenges, but it’s usually difficult to have them all in one place and easily connected.

If this article was helpful or interesting to you, follow @lenadroid on Twitter.

Introduction

This is the first article in this series, which will cover what Azure Synapse is and how to start using it with the Azure CLI. Make sure your Azure CLI is installed and up to date, and add the synapse extension if necessary:

$ az extension add --name synapse
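If you want to confirm the extension was registered, one way (sketched here, output format may vary by CLI version) is to list the installed extensions and look for synapse:

```shell
# List the names of installed Azure CLI extensions; "synapse" should
# appear in this list after the add command above.
InstalledExtensions=$(az extension list --query "[].name" --output tsv)
echo "$InstalledExtensions"
```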

What is Azure Synapse?
Azure offers the Synapse Analytics service, which aims to provide managed support for distributed data analysis workloads with less friction. If you’re coming from a GCP or AWS background, the closest counterparts to Azure Synapse in those clouds are products like BigQuery and Redshift. Azure Synapse is currently in public preview.

Serverless and provisioned capacity
In the world of large-scale data processing and analytics, features like autoscaling clusters and pay-for-what-you-use pricing have become must-haves. In Azure Synapse, you can choose between serverless and provisioned capacity, depending on whether you need the flexibility to adjust to bursts or have a predictable resource load.
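As a sketch of the provisioned option: once a workspace exists (the $SynapseWorkspaceName and $ResourceGroup variables are defined in the walkthrough below), a dedicated SQL pool can be created with a chosen performance level. The pool name here is hypothetical:

```shell
# Hypothetical name for a dedicated (provisioned) SQL pool.
SqlPoolName='sqlpool01'

# Provision a dedicated SQL pool at the smallest performance level (DW100c).
az synapse sql pool create \
  --name $SqlPoolName \
  --performance-level DW100c \
  --workspace-name $SynapseWorkspaceName \
  --resource-group $ResourceGroup
```

The serverless SQL endpoint, by contrast, comes with the workspace and needs no provisioning.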

Native Apache Spark support
Apache Spark has demonstrated its power in data processing for both batch and real-time streaming models. It offers great Python and Scala/Java support for data operations at large scale. Azure Synapse provides built-in support for data analytics using Apache Spark. It’s possible to create an Apache Spark pool, upload Spark jobs, or create Spark notebooks for experimenting with the data.
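For example, a small Spark pool can be created with the CLI. This is a minimal sketch with a hypothetical pool name and sizing; $SynapseWorkspaceName and $ResourceGroup are the variables defined in the walkthrough below:

```shell
# Hypothetical pool name and sizing -- adjust to your workload.
SparkPoolName='sparkpool01'

# Create a small Apache Spark pool for notebooks and batch jobs.
az synapse spark pool create \
  --name $SparkPoolName \
  --workspace-name $SynapseWorkspaceName \
  --resource-group $ResourceGroup \
  --spark-version 2.4 \
  --node-count 3 \
  --node-size Medium
```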

SQL support
In addition to Apache Spark support, Azure Synapse has excellent support for data analytics with SQL.
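For instance, once a workspace exists you can query it through the serverless (on-demand) SQL endpoint with any standard SQL Server client. A sketch using sqlcmd, assuming sqlcmd is installed and your client IP is allowed through the workspace firewall; the variables are defined in the walkthrough below:

```shell
# The serverless (on-demand) endpoint follows the pattern
# <workspace-name>-ondemand.sql.azuresynapse.net.
ServerlessEndpoint="${SynapseWorkspaceName}-ondemand.sql.azuresynapse.net"

# Run a trivial query against the serverless endpoint.
sqlcmd -S "$ServerlessEndpoint" -d master \
  -U "$SqlUser" -P "$SqlPassword" \
  -Q "SELECT @@VERSION;"
```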

Other features
Azure Synapse provides smooth integration with Azure Machine Learning and Spark ML. It enables convenient data ingestion and export using Azure Data Factory, which connects with many Azure and third-party data sources and destinations. Data can be effectively visualized with Power BI.

At Microsoft Build 2020, Satya Nadella announced Azure Synapse Link, functionality that will help you get insights from real-time transactional data stored in operational databases (e.g., Cosmos DB) with a single click, without the need to manage data movement.

Get started with Azure Synapse Workspaces using Azure CLI

Prepare the necessary environment variables:

$ StorageAccountName='<come up with a name for your storage account>'
$ ResourceGroup='<come up with a name for your resource group>'
$ Region='<come up with a name of the region, e.g. eastus>'
$ FileShareName='<come up with a name of the storage file share>'
$ SynapseWorkspaceName='<come up with a name for Synapse Workspace>'
$ SqlUser='<come up with a username>'
$ SqlPassword='<come up with a secure password>'

Create a resource group as a container for your resources:

$ az group create --name $ResourceGroup --location $Region

Create a Data Lake storage account:

$ az storage account create \
  --name $StorageAccountName \
  --resource-group $ResourceGroup \
  --location $Region \
  --sku Standard_GRS \
  --kind StorageV2

The output of this command will be similar to:

{
  "accessTier": "Hot",
  "creationTime": "2020-05-19T01:32:42.434045+00:00",
  "customDomain": null,
  "enableAzureFilesAadIntegration": null,
  "enableHttpsTrafficOnly": false,
  "encryption": {
    "keySource": "Microsoft.Storage",
    "keyVaultProperties": null,
    "services": {
      "blob": {
        "enabled": true,
        "lastEnabledTime": "2020-05-19T01:32:42.496550+00:00"
      },
      "file": {
        "enabled": true,
        "lastEnabledTime": "2020-05-19T01:32:42.496550+00:00"
      },
      "queue": null,
      "table": null
    }
  },
  "failoverInProgress": null,
  "geoReplicationStats": null,
  "id": "/subscriptions/<subscription-id>/resourceGroups/Synapse-test/providers/Microsoft.Storage/storageAccounts/<storage-account-name>",
  "identity": null,
  "isHnsEnabled": null,
  "kind": "StorageV2",
  "lastGeoFailoverTime": null,
  "location": "eastus",
  "name": "<storage-account-name>",
  "networkRuleSet": {
    "bypass": "AzureServices",
    "defaultAction": "Allow",
    "ipRules": [],
    "virtualNetworkRules": []
  },
  "primaryEndpoints": {
    "blob": "https://<storage-account-name>.blob.core.windows.net/",
    "dfs": "https://<storage-account-name>.dfs.core.windows.net/",
    "file": "https://<storage-account-name>.file.core.windows.net/",
    "queue": "https://<storage-account-name>.queue.core.windows.net/",
    "table": "https://<storage-account-name>.table.core.windows.net/",
    "web": "https://<storage-account-name>.z13.web.core.windows.net/"
  },
  "primaryLocation": "eastus",
  "provisioningState": "Succeeded",
  "resourceGroup": "<resource-group-name>",
  "secondaryEndpoints": null,
  "secondaryLocation": "westus",
  "sku": {
    "capabilities": null,
    "kind": null,
    "locations": null,
    "name": "Standard_GRS",
    "resourceType": null,
    "restrictions": null,
    "tier": "Standard"
  },
  "statusOfPrimary": "available",
  "statusOfSecondary": "available",
  "tags": {},
  "type": "Microsoft.Storage/storageAccounts"
}

Retrieve the storage account key:

$ StorageAccountKey=$(az storage account keys list \
  --account-name $StorageAccountName \
  | jq -r '.[0] | .value')
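If you prefer not to depend on jq, the CLI’s built-in --query option (JMESPath) can extract the same value; this is an equivalent sketch of the lookup above:

```shell
# Same key lookup, using the CLI's JMESPath query instead of jq.
StorageAccountKey=$(az storage account keys list \
  --account-name $StorageAccountName \
  --query '[0].value' \
  --output tsv)
```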

Retrieve the storage endpoint URL:

$ StorageEndpointUrl=$(az storage account show \
  --name $StorageAccountName \
  --resource-group $ResourceGroup \
  | jq -r '.primaryEndpoints | .dfs')

You can verify the storage account key and endpoint at any time:

$ echo "Storage Account Key: $StorageAccountKey"
$ echo "Storage Endpoint URL: $StorageEndpointUrl"

Create a file share:

$ az storage share create \
  --account-name $StorageAccountName \
  --account-key $StorageAccountKey \
  --name $FileShareName

Create a Synapse Workspace:

$ az synapse workspace create \
  --name $SynapseWorkspaceName \
  --resource-group $ResourceGroup \
  --storage-account $StorageAccountName \
  --file-system $FileShareName \
  --sql-admin-login-user $SqlUser \
  --sql-admin-login-password $SqlPassword \
  --location $Region

The output of the command should show the successful creation:

{
  "connectivityEndpoints": {
    "dev": "https://<synapse-workspace-name>.dev.azuresynapse.net",
    "sql": "<synapse-workspace-name>.sql.azuresynapse.net",
    "sqlOnDemand": "<synapse-workspace-name>-ondemand.sql.azuresynapse.net",
    "web": "https://web.azuresynapse.net?workspace=%2fsubscriptions%2f<subscription-id>%2fresourceGroups%2f<resource-group-name>%2fproviders%2fMicrosoft.Synapse%2fworkspaces%2f<synapse-workspace-name>"
  },
  "defaultDataLakeStorage": {
    "accountUrl": "https://<storage-account-name>.dfs.core.windows.net",
    "filesystem": "<file-share-name>"
  },
  "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.Synapse/workspaces/<synapse-workspace-name>",
  "identity": {
    "principalId": "<principal-id>",
    "tenantId": "<tenant-id>",
    "type": "SystemAssigned"
  },
  "location": "eastus",
  "managedResourceGroupName": "<managed-resource-group-id>",
  "name": "<synapse-workspace-name>",
  "provisioningState": "Succeeded",
  "resourceGroup": "<resource-group-name>",
  "sqlAdministratorLogin": "<admin-login>",
  "sqlAdministratorLoginPassword": "<admin-password>",
  "tags": null,
  "type": "Microsoft.Synapse/workspaces",
  "virtualNetworkProfile": null
}

After you have successfully created these resources, you can go to the Azure Portal and navigate to the resource called $SynapseWorkspaceName within the $ResourceGroup resource group. You should see a page similar to this:

[Screenshot: the Synapse workspace overview page in the Azure Portal]
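To connect to the workspace endpoints from your own machine, you will likely also need a firewall rule for your client IP. A minimal sketch, using a hypothetical rule name and a placeholder IP address:

```shell
# Hypothetical rule name; replace ClientIp with your actual public IP.
FirewallRuleName='allowMyIp'
ClientIp='<your public IP address>'

# Allow your client IP through the workspace firewall.
az synapse workspace firewall-rule create \
  --name $FirewallRuleName \
  --workspace-name $SynapseWorkspaceName \
  --resource-group $ResourceGroup \
  --start-ip-address "$ClientIp" \
  --end-ip-address "$ClientIp"
```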

What’s next?

You can now load data and experiment with it in Synapse Studio, create Spark or SQL pools and run analytics queries, connect to Power BI to visualize your data, and much more.
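When you are done experimenting, you can clean everything up by deleting the resource group, which removes the workspace, storage account, and file share along with it:

```shell
# Delete the resource group and everything in it. --yes skips the
# confirmation prompt; --no-wait returns immediately while deletion
# continues in the background.
az group delete \
  --name $ResourceGroup \
  --yes \
  --no-wait
```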

Stay tuned for the next articles in this series to learn more! Thanks for reading!

If this article was interesting to you, follow @lenadroid on Twitter.

Secure isolation guidance for Azure and Azure Government

One of the most common concerns for public sector cloud adoption is secure isolation among tenants when multiple customer applications and data are stored on the same physical hardware, as described in our recent blog post on secure isolation. To provide customers with more detailed information about isolation in a multi-tenant cloud, Microsoft has published Azure guidance for secure isolation, which provides technical guidance to address common security and isolation concerns pertinent to cloud adoption. It also explores design principles and technologies available in Azure and Azure Government to help customers achieve their secure isolation objectives. The approach relies on isolation enforcement across compute, storage, and networking, as well as built-in user access control via Azure Active Directory and Microsoft’s internal use of security assurance processes and practices to correctly develop logically isolated cloud services. Read more on our Azure Gov blog here.

About the Author 

[Photo of the author]

As a Principal Program Manager with Azure Government Engineering, @StevanVidich is focused on Azure security and compliance. He publishes and maintains Azure Government documentation and works on expanding Azure compliance coverage.

Experiencing Data Ingestion Latency Issue in Azure portal for Log Analytics – 09/02 – Investigating

Initial Update: Wednesday, 02 September 2020 16:36 UTC

We are aware of issues within Log Analytics and are actively investigating. Some customers may experience intermittent data latency and incorrect alert activation for Heartbeat, Perf, and SecurityEvent in the East US region.

  • Workaround: None
  • Next Update: Before 09/02 20:00 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Saika