Using A Function App with a Dedicated Static IP

This article is contributed. See the original author and article here.

function-app-vnet-integration-dmz-reference-architectures.png

Create an HTTP Trigger Function



  • Create an HTTP Trigger C# function with a name of your choosing.

  • Use an App Service hosting plan SKU of Standard or Premium; both support VNET integration. You may use the Standard plan for this setup, as it is cheaper.

  • Replace the out-of-the-box code for the HTTP Trigger with the following code.




#r "Newtonsoft.Json"
#r "System.Text.Json"

using System;
using System.IO;
using System.Net;
using System.Collections.Generic;
using System.Globalization;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;

public class Repository
{
    [JsonPropertyName("name")]
    public string Name { get; set; }

    [JsonPropertyName("description")]
    public string Description { get; set; }

    [JsonPropertyName("html_url")]
    public Uri GitHubHomeUrl { get; set; }

    [JsonPropertyName("homepage")]
    public Uri Homepage { get; set; }

    [JsonPropertyName("watchers")]
    public int Watchers { get; set; }

    [JsonPropertyName("pushed_at")]
    public string JsonDate { get; set; }

    public DateTime LastPush =>
        DateTime.ParseExact(JsonDate, "yyyy-MM-ddTHH:mm:ssZ", CultureInfo.InvariantCulture);
}

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    string name = req.Query["name"];
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    string responseMessage = string.IsNullOrEmpty(name)
        ? "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
        : $"Hello, {name}. This HTTP triggered function executed successfully.";

    // Testing only; disregard for the demo.
    // ReadTextFile(log);

    log.LogInformation("Performing ProcessRepositories code section...");
    var repositories = await ProcessRepositories();
    foreach (var repo in repositories)
    {
        if (repo == null)
        {
            continue;
        }
        log.LogInformation($"Repo Name: {repo.Name}");
        log.LogInformation($"Repo Description: {repo.Description}");
    }

    return new OkObjectResult(responseMessage);
}

// Ref URL: https://github.com/dotnet/samples/blob/master/csharp/getting-started/console-webapiclient/Program.cs
private static async Task<List<Repository>> ProcessRepositories()
{
    var client = new HttpClient();
    client.DefaultRequestHeaders.Accept.Clear();
    client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/vnd.github.v3+json"));
    client.DefaultRequestHeaders.Add("User-Agent", ".NET Foundation Repository Reporter");

    var streamTask = client.GetStreamAsync("https://api.github.com/orgs/dotnet/repos");
    var repositories = await System.Text.Json.JsonSerializer.DeserializeAsync<List<Repository>>(await streamTask);
    return repositories;
}

// Ref URL: https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/file-system/how-to-read-from-a-text-file
private static void ReadTextFile(ILogger log)
{
    try
    {
        // Read the file as one string (a UNC path reachable through the VNET).
        string text = System.IO.File.ReadAllText(@"\\10.10.2.5\test\testfile.txt");
        // Log the file contents. Variable text is a string.
        log.LogInformation("Contents of testfile.txt = {0}", text);
    }
    catch (Exception ex)
    {
        log.LogInformation("Exception = {0}", ex.Message);
    }
}


Run the HTTP Trigger and see the Function run successfully


 


Configure Function’s Application Setting



  • Create a Config Variable WEBSITE_VNET_ROUTE_ALL in Application settings for the function and set the value to 1.
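If you prefer scripting this step, the same application setting can be applied with the Azure CLI. This is a sketch; the resource group and function app names below (demo-rg, demo-func-app) are hypothetical placeholders for your own resources:

```shell
# Route all outbound traffic from the function app into the VNET.
# demo-rg and demo-func-app are hypothetical names; substitute your own.
az functionapp config appsettings set \
  --resource-group demo-rg \
  --name demo-func-app \
  --settings WEBSITE_VNET_ROUTE_ALL=1
```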


Set-website-vnet-route-all-to-1-function-app-configuration.png


 


Create a VNET and required subnets


VNET-with-AzureFirewallSubnet.png

Subnet-for-VNET-integration-resized.png

VNET-integrationsubnet-delegate-subnet-to-server-farmpng-resized.png


  • Ensure a subnet named AzureFirewallSubnet is created, dedicated to the Azure Firewall.

  • In the subnet used for VNET integration, set the Delegate subnet to a service option to Microsoft.Web/serverFarms.
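The portal steps above can also be scripted: the VNET integration between the function app and the delegated subnet can be wired up with the Azure CLI. The resource names here (demo-rg, demo-func-app, demo-vnet, integration-subnet) are hypothetical:

```shell
# Connect the function app to the delegated VNET integration subnet.
# All names are hypothetical placeholders for your own resources.
az functionapp vnet-integration add \
  --resource-group demo-rg \
  --name demo-func-app \
  --vnet demo-vnet \
  --subnet integration-subnet
```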


Create Simple NSG



  • Create an out-of-the-box NSG and associate it with the VNET integration subnet.


InboundNSG-resized.png


OutboundNSG-resized.png


 


Create an Azure Firewall



  • Using the AzureFirewallSubnet subnet, create an Azure Firewall. Once provisioned, note down the Firewall private IP.


Create-AzureFirewall-resized.png


  • Click on the public IP name listed under Firewall public IP and note down the Firewall public IP, in this case 52.252.28.12. The name is provided during Azure Firewall creation.


Azurefirewall-public-ip-configuration-resized.png


  • Click Rules under Settings for the Firewall. Navigate to Application Rules and create an application rule collection as shown below. Note: to test the scenario, the rule Action will be toggled between Deny and Allow.


Edit-Firewall-Rule-Deny-resized.png


  • Set the values as shown above.

  • Set the target FQDN to api.github.com.

  • Set Protocol:Port to Https:443.

  • Set the source to the CIDR of the VNET integration subnet; this is the easiest way. Alternatively, you may set the source to the IP assigned to the function app from the VNET integration subnet. You can find it by enabling diagnostic logs for the Azure Firewall and reviewing the log details, where the assigned IP address is listed.
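The rule settings above can also be created from the Azure CLI (this requires the azure-firewall CLI extension). A minimal sketch; the firewall name, collection name, and subnet CIDR (10.10.1.0/24) are hypothetical:

```shell
# Allow HTTPS to api.github.com from the VNET integration subnet.
# To test the Deny case, re-create the collection with --action Deny.
# demo-rg, demo-firewall, fn-app-rules, and 10.10.1.0/24 are hypothetical.
az network firewall application-rule create \
  --resource-group demo-rg \
  --firewall-name demo-firewall \
  --collection-name fn-app-rules \
  --name allow-github-api \
  --priority 100 \
  --action Allow \
  --source-addresses 10.10.1.0/24 \
  --protocols Https=443 \
  --target-fqdns api.github.com
```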


Create Route Table



Create-route-table-resized.png




  • Create a route whose next hop is the private IP of the virtual appliance, in this case the private IP of the Azure Firewall.
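As a scripted alternative, the route table, the default route through the firewall, and the subnet association can be created with the Azure CLI. The names and the firewall private IP (10.10.0.4) below are hypothetical stand-ins for the values noted earlier:

```shell
# Create a route table and a 0.0.0.0/0 route via the firewall private IP.
# demo-rg, demo-rt, demo-vnet, integration-subnet, 10.10.0.4 are hypothetical.
az network route-table create --resource-group demo-rg --name demo-rt

az network route-table route create \
  --resource-group demo-rg \
  --route-table-name demo-rt \
  --name default-via-firewall \
  --address-prefix 0.0.0.0/0 \
  --next-hop-type VirtualAppliance \
  --next-hop-ip-address 10.10.0.4

# Associate the route table with the VNET integration subnet.
az network vnet subnet update \
  --resource-group demo-rg \
  --vnet-name demo-vnet \
  --name integration-subnet \
  --route-table demo-rt
```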



Set-route-for-route-table-resized.png


Test the scenario





  • Set the application rule to Deny and run the Azure Function. The Function responds with an error in the output window, because the Firewall is blocking the outbound traffic routed through the public static IP.




  • Now set the application rule in the Azure Firewall to Allow and run the Azure Function. The call now succeeds, retrieving the GitHub repositories per the logic in the HTTP Trigger function; this is viewable in the output window.




  • The public static outbound IP address is the Firewall public IP, in this case 52.252.28.12. This is the IP to share with vendors or third-party software providers so that it can be allow-listed, and calls from the function will arrive via this public static IP.



Take your security to the next level with professional security services


Not every organization has the capacity or the expertise to maintain a dedicated security operations team. Others may need a second set of eyes to review alerts in their network, or simply want to ensure that they keep up with the latest techniques used by adversaries. Some may want an additional assessment of their security posture or an evaluation of possible vulnerabilities in their network, while others need immediate help with an ongoing breach.


Microsoft security services, together with a vast network of partners, help address these challenges across global regions for public companies, private industries, and government entities, protecting against the most sophisticated adversaries. Joining forces with our extensive ecosystem of leading services partners, who provide offerings such as managed threat hunting and managed detection and response, enables us to offer the best security solutions from Microsoft alongside the services your organization needs to secure your business.


To help you discover the range of security services offerings available to you, we’re excited to announce a new professional services catalog now available in Microsoft 365 Defender. At the moment, you can find it under Endpoints > Partners & APIs > Professional Services at security.microsoft.com. While it’s in the Endpoints section for the time being, you’ll find partners that support and build on many of our security products.


 


We’ve organized both first and third-party services along the following categories:



  • Educate and maintain your internal team’s security capabilities to prevent, detect, contain, and remediate threats.

  • Evolve your organization’s security posture through improved processes and technologies that modernize threat detection, containment, and remediation capabilities.

  • Protect your organization by proactively evaluating its ability to effectively prevent, detect, and respond to cyber threats before they disrupt your business.

  • Respond to security incidents quickly, effectively, and at scale with complete incident response including investigation, containment, remediation, and crisis management.

  • Use managed security services that help organizations detect threats early and minimize the impact of a breach.


 


blogpic1.png


Figure 1: Image of Professional services catalog


 


Within each of the categories, you’ll see both Microsoft and third-party services that align to it along with a short description.


Click View to open up a fly-out screen on the right with additional details about the service, a typical engagement duration, a description of outcomes, and a link directly to the partner page about the service.


 


prof_servicesBlog.gif


Figure 2: Professional services catalog experience


 


Microsoft professional services are here to help
Microsoft has a range of security services that are offered to customers, including:



  • Microsoft Threat Experts – Microsoft Threat Experts – Targeted Attack Notifications is a managed threat hunting service. Once you apply and get accepted, you’ll receive targeted attack notifications from Microsoft threat experts, so you won’t miss critical threats to your environment. These notifications will help you protect your organization’s endpoints, email, and identities. Microsoft Threat Experts – Experts on Demand lets you get expert advice about threats your organization is facing. It’s available as a subscription service.

  • Microsoft Managed Desktop – If you are looking for someone to manage your desktop security, Microsoft Managed Desktop is a unique ITaaS solution that combines endpoint management and security monitoring and response in a way that gives users a secure and productive device experience that IT pros can trust.

  • Microsoft Detection and Response Team (DART) – If you have experienced a breach, the Microsoft Detection and Response Team will help your organization establish visibility of attacker activity, instantly start remediation, limit financial impact, get you back to business faster, and help you become cyber-resilient.

  • Microsoft Consulting Services – Microsoft Consulting Services help you prepare for the future, identify risks, and upgrade your environment by applying enterprise technology to business problems and guiding digital transformation. Through a rich set of security solutions, Microsoft Consulting Services can help you modernize your security posture by applying Zero Trust principles and modernize your Security Operations.

  • Compromise Recovery Security Practice (CRSP) – Part of Microsoft Consulting Services, the CRSP team can help you recover your environment post-security breach or ransomware attack.


 


Designed to deliver best-of-breed security, Microsoft offers partners opportunities to extend their existing security offerings on top of our open framework and a rich and complete set of APIs, allowing them to build extensions and integrations to Microsoft’s security platform. Security vendors interested in becoming our partners can use this page to get started.


As always, we welcome your feedback and would be glad to keep in touch.


Already a partner? Want to be in the catalog?
Please contact us at M365D_Prof_Serv_Cata@microsoft.com with your offering information and we will be happy to discuss your nomination for the catalog.


 





 


 

How-to find out the SQL MI Service Tier basic details with T-SQL


We are starting a series of blog posts on how to find out the necessary information about Azure SQL Managed Instance (SQL MI) for your automation scripts and tools. We hope it will help you to build some extraordinary applications on the product that we love building.


There are already known ways to determine these properties through the Azure portal or through your favorite scripting language (PowerShell, Azure CLI, or any other); we just wanted to provide some useful information on the good, old, original, and ever-useful T-SQL.



Even though Azure SQL Managed Instance shares its origins with SQL Server, there are some specifics of SQL MI that cannot be found in the on-premises or Infrastructure as a Service (IaaS) offerings of SQL Server. Those differences start with the Service Tier, which is part of the Platform as a Service (PaaS) offering.
So here are some of the basic how-tos:


 


How-to ensure that Managed Instance is your engine


This information and recommendation have been public since the launch of Azure SQL Managed Instance: on SQL MI, the SERVERPROPERTY('EngineEdition') function returns the value 8.


 

IF (SERVERPROPERTY('EngineEdition') = 8)
BEGIN
    PRINT 'This is an Azure SQL Managed Instance.';
END
ELSE
BEGIN
    PRINT 'This is NOT an Azure SQL Managed Instance.';
END

 


Check the SQL MI.PNG


On the image provided, you can see the output, if you are running your code on Azure SQL Managed Instance.


 


 


How-to find out the Service Tier of your Managed Instance


Right now there are two offerings, GP (General Purpose) and BC (Business Critical), each one focusing on a specific type of workload with specific requirements. There is no guarantee that this will never change, and the alternative ways of finding it out are risky, so here is the official way: query the sys.server_resource_stats DMV, which provides a good amount of useful and important information regarding the Managed Instance service tier.

The Azure SQL MI Platform as a Service (PaaS) offering is very dynamic: a tier can be changed by the user at almost any moment with just a couple of clicks or a single T-SQL batch. This makes it quite different from a regular SQL Server installation, where things are less likely to change with the same ease.


In order to obtain the current SQL MI Service Tier, get the latest available value for the SKU from the above-mentioned sys.server_resource_stats view:


 

SELECT TOP 1 sku AS ServiceTier
    FROM [sys].[server_resource_stats]
    ORDER BY end_time DESC;

 


ServiceTier.PNG

In my case, I have executed this query on a General Purpose instance, as you can see in the results pane. You can easily save this value into a variable or use it directly in an IF statement to build some logic around your code.


 


 


How-to find out used Hardware Generation


It is important to know the generation of the hardware being used by your Azure SQL Managed Instance. Besides telling you whether you are running the latest and greatest generation of hardware available, some of the constraints (such as RAM limits) are tied to it, and understanding and checking this information might be important for troubleshooting and/or optimization.


Currently, only Generation 5 (Gen 5) is available for your SQL MI, but at the launch of SQL MI a couple of years ago there were two different generations available, including the now-obsolete Generation 4 (Gen 4), and one day in the future there might be others.


 


To check this information, we shall be using the sys.server_resource_stats DMV again, this time selecting the [hardware_generation] column as the result:


 


 

SELECT TOP 1 hardware_generation AS HardwareGeneration
    FROM [sys].[server_resource_stats]
    ORDER BY end_time DESC;

 


Hardware Generation.PNG


 


In this blog post we have provided just a couple of T-SQL scripts to help you better discover the SQL MI environment. Stay tuned for even more insights that we are looking to bring to you.

Azure Marketplace new offers – Volume 152



We continue to expand the Azure Marketplace ecosystem. For this volume, 99 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Applications


A10 Training - Virtual Lab (Japan).png

A10 Training – Virtual Lab (Japan): This virtual lab for A10 Networks’ Cloud Access Proxy and SSL Insight solutions allows users to customize the lab environment for troubleshooting, operational checks, and proofs of concept. This offer is available only in Japanese.


Augury Machine Health.png

Augury Machine Health: Augury’s Halo Machine Health combines advanced sensors with powerful AI-based diagnostics and collaboration tools to help teams understand when machines are at risk and how to intervene.


Azure DevOps to Zendesk connector.png

Azure DevOps to Zendesk connector: This connector from IntegrateCloud integrates Microsoft Azure DevOps with Zendesk, allowing support agents to create Azure DevOps work items from Zendesk ticket forms.


BHC3 AI Suite.png

BHC3 AI Suite: BakerHughesC3.ai brings together the energy technology expertise of Baker Hughes with C3.ai’s AI software for digital transformation in the oil and gas industry. The core of BHC3 AI Suite is an extensible, model-driven abstraction layer that dramatically enhances data scientist and application developer productivity.


Carebook Digital Pharmacy Platform.png

Carebook Digital Pharmacy Platform: Carebook is an all-in-one, customer-facing digital pharmacy solution. Engage your customers, easily connect with patients, provide online prescription refills, and seamlessly integrate with third-party e-commerce and loyalty systems.


Certis Smart Integrated Transport Hub.png

Certis Smart Integrated Transport Hub: Smart Transport Hub from Certis delivers a platform capable of transforming traditional transportation frameworks into intelligent, proactive networks. Support effective project development and asset management while securing the long-term viability of your transportation system.


Circit - Audit Confiirmations.png

Circit – Audit Confirmations: Circit, an open banking and audit platform, connects audit firms to banks and other evidence providers to facilitate more efficient and higher-quality audits. Circit automates the audit confirmation process using platform tools and open banking connectivity.


Cloud-Native CMS_DXP for Drupal.png

Cloud-Native CMS/DXP for Drupal: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides Drupal 9.1.2, Apache 2.4.46, MySQL 5.7.34, PHP 7.3.28, phpMyAdmin 5.1.0, and Docker 20.10.6 on CentOS 7.9.


Cloud-Native Database for PostgreSQL.png

Cloud-Native Database for PostgreSQL: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides PostgreSQL 9.4, 9.6, 10, or 11, along with pgAdmin 5.2 and Docker 20.10.6 on CentOS or Ubuntu.


Cloud-Native Database for RethinkDB.png

Cloud-Native Database for RethinkDB: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides RethinkDB 2.4.1 and NGINX 1.20.0 on Ubuntu 20.04. RethinkDB is a NoSQL database that stores schema-less JSON documents.


Cloud-Native In-memory Database for Redis.png

Cloud-Native In-memory Database for Redis: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides Redis 3.0, 3.2, 4.0, 5.0, or 6.0, along with NGINX 1.20.0 and RedisInsight on CentOS or Ubuntu.


Cloud-Native MQ for Apache ActiveMQ Power by VMLab.png

Cloud-Native MQ for Apache ActiveMQ: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides Apache ActiveMQ 5.16.1 and OpenJDK 1.8.0 on CentOS or Ubuntu. Apache ActiveMQ is a popular and powerful open-source messaging and integration patterns server.


Collabspace ARCHIVE- Content Protection and Search.png

Collabspace ARCHIVE: Content Protection and Search: Collabspace ARCHIVE features versioning, email thread archiving, and file preview, along with data encryption and Write-Once-Read-Many (WORM) storage to prevent tampering or deletion. 


Compose Low-Code Platform on Azure.png

Compose Low-Code Platform on Azure: Governments and businesses alike can use Compose to achieve their digitization and productivity goals. The low-code application development platform is well-suited for data gathering, process automation, and case management.


Content Collaboration Platform based on ownCloud.png

Content Collaboration Platform based on ownCloud: This image offered by VMLab provides version 10.7 of ownCloud on CentOS 7.9. ownCloud is client–server software for creating file-hosting services. Also included are Apache 2.4.46, PHP 7.4.19, MySQL 5.7.34, Redis 5.0.9, ONLYOFFICE Docs Community 6.1.1, phpMyAdmin 5.1.0, and Docker 20.10.6.


Creative Work Subscription and Publishing Platform.png

Creative Work Subscription and Publishing Platform: This image offered by VMLab provides Ghost 4.5.0 on CentOS 7.9. Ghost is a headless Node.js content management system for professional publishing. Also included are NGINX 1.20.0, MySQL 5.7.34, Node.js 14.17, Docker 20.10.6, and phpMyAdmin 5.1.0.


Data Exploration and Visualization for Superset.png

Data Exploration and Visualization for Superset: This preconfigured image offered by VMLab provides Apache Superset 1.0, NGINX 1.20.0, PostgreSQL 10.16, Docker 20.10.6, Redis 3.2.12, and PgAdmin 5.2 on Ubuntu 20.04. Apache Superset is an open-source app for data exploration and data visualization.


DeepVu Aluminum Price Forecasting.png

DeepVu Aluminum Price Forecasting: DeepVu offers machine learning models to predict the price of a futures contract of a commodity that is traded on the Chicago Mercantile Exchange or the Shanghai Futures Exchange.


DevOps Automation Server for Jenkins.png

DevOps Automation Server for Jenkins: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides Jenkins 2.277.4, OpenJDK 1.8.0, and NGINX 1.20.0 on CentOS or Ubuntu. Jenkins is an automation server with a plug-in ecosystem for supporting delivery pipelines.


EDB Enterprise Plan - PostgreSQL.png

EDB Enterprise Plan – PostgreSQL: The EDB Enterprise Plan features tools, support, and a PostgreSQL database extended by EDB’s enterprise-class software. This provides compatibility with Oracle Database, so you can migrate quickly to EDB on Microsoft Azure.


EY OpsChain Traceability.png

EY OpsChain Traceability: EY OpsChain Traceability brings trust via transparency into any supply chain. The solution is part of the EY OpsChain product suite, built on the EY Blockchain SaaS platform, which is hosted on Microsoft Azure.


Fortpress Starter.png

Fortpress Starter: Fortpress is an all-in-one application management system for developing blogs, web apps, websites, APIs, and more. Fortpress provides an integrated code editor for HTML, JavaScript, and CSS.


Hardened Windows Server 2019.png

Hardened Windows Server 2019: This image of Windows Server 2019 is preconfigured by The Center for Internet Security (CIS) to the recommendations in the associated CIS Benchmark and the U.S. Department of Defense’s Security Technical Implementation Guide (STIG).


HealthEC Population Health Management Platform.png

HealthEC Population Health Management Platform: HealthEC’s population health management solution aggregates claims and clinical data to deliver key insights into cost drivers and evaluate quality measures that impact performance-based reimbursements while enabling the care team to be a change agent.


Hólos - Fluxos.png

Hólos – Fluxos: Hólos from MPS Informática provides the resources necessary for the development and implementation of business process management and electronic document management. This app is available only in Portuguese.


Horizon Protection Engine.png

Horizon Protection Engine: Horizon Protection Engine integrates with cloud, hybrid-cloud, and on-premises architectures and allows users to utilize Microsoft Azure Information Protection to secure their business applications. This app is available only in Italian.


Incorta Intelligent Ingest for Microsoft Azure Synapse.png

Incorta Intelligent Ingest for Microsoft Azure Synapse: With Incorta Intelligent Ingest, you can seamlessly integrate your Oracle application/ERP data into Microsoft Azure for rapid analytics in Azure Synapse. Incorta fast-tracks the process for preparing data by automating the data modeling and staging steps. 


IN-D Aadhaar Number Masking.png

IN-D Aadhaar Number Masking: Aadhaar is a random 12-digit number issued to residents of India. Banks, hotels, and many other companies store Aadhaar images of their customers. IN-D Aadhaar Number Masking can verify and mask these images at the click of a button, helping companies maintain regulatory compliance.


Joshua Tree.png

Joshua Tree: Joshua Tree is an eco-design tool for clothing makers that collects and consolidates internal data as well as audit data carried out with suppliers, with a focus on environmental and social performance. This app is currently available only in French.


LEA365.png

LEA365: LEA is a chatbot included in Aurera’s MyCoach 365 change management solution. LEA integrates with Microsoft Teams to assist users and optimize their experience. This app is available in French.


MAESTRO.png

MAESTRO: Maestro is an IoT platform developed on Microsoft Azure components that provides end-to-end integration across operational, strategic, and tactical data flows. Sensors and machines can be managed and monitored remotely, alarms and notifications can be generated, and there is a dashboard for users.


Magento Server.png

Magento Server: This server image offered by Cloud Infrastructure Services provides Magento Open Source, Apache Web Server, MySQL Server, ElasticSearch, and all the required PHP modules on Ubuntu Server 20.04. Magento Open Source is a free e-commerce platform.


Mingdao APaaS and Zero-Code Platform.png

Mingdao APaaS and Zero-Code Platform: This preconfigured image offered by VMLab provides Mingdao On-Premise 2.4.1 and Docker 20.10.6 on CentOS 7.9. Mingdao is a rapid design and development tool for enterprise software. This app is available in English, Simplified Chinese, and Traditional Chinese.


Motif Discovery.png

Motif Discovery Application: Improve productivity and learn more about your machine data with Motif Discovery Application, a tool that automatically processes time series data and creates training data for machine learning models. Discovered patterns are clustered based on similarity.


Web App Runtime for Node.js.png

Node.js Runtime for Web App: This image offered by VMLab provides Node.js 10, 12, or 13 on CentOS 7.9. Node.js is a JavaScript runtime. Also included are NGINX 1.20.0, MySQL 5.7, Redis 5.0.9, Docker 20.10.6, phpMyAdmin 5.1.0, adminMongo 0.0.23, PM2 4.5.6, Express 4.16.1, and your choice of several versions of MongoDB.


Numerator Consumer Data Capture Link.png

Numerator Consumer Data Capture Link: Historically, consumer purchase data has been controlled by retailers, leaving brands with limited access. Numerator Link from Market Track democratizes data access (purchase or viewership), and brands can access consumer data from permissioned user accounts.


Open-Source Agile and Scrum Project Management.png

Open-Source Agile and Scrum Project Management: This preconfigured image offered by VMLab provides Zentao 12.4.3 on CentOS 7.9. Zentao is open-source project management software. Also included are Apache 2.4.46, PHP 7.2.34, MySQL 5.6.51, Redis 5.0.9, phpMyAdmin 5.1.0, and Docker 20.10.6.


Open-Source Cloud-Native ERP and CRM.png

Open-Source Cloud-Native ERP and CRM: This preconfigured image offered by VMLab provides Odoo Community Edition 12, 13, or 14 on Ubuntu 20.04. Odoo is a suite of open-source business apps. Also included in the image are NGINX 1.20.0, PostgreSQL 11.11, PgAdmin 5.2, and Docker 20.10.6.


Open-Source Collaboration Platform for Developers.png

Open-Source Collaboration Platform for Developers: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides Mattermost Team Edition 5.34.0, an open-source messaging platform, on Ubuntu 18.04. Also included are NGINX 1.20.0, MySQL 5.7.34, Docker 20.10.6, and phpMyAdmin 5.1.0.


Open-Source IT Monitor Platform.png

Open-Source IT Monitor Platform: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides Zabbix 5.2 on Ubuntu 20.04. Zabbix is an open-source distributed monitoring solution. Also included in the image are NGINX 1.20.0, MySQL 8.0.25, phpMyAdmin 5.1.0, and Docker 20.10.6.


petranna.db.MySQL managed service.png

petranna.db.MySQL managed service: The petranna.db.MySQL service allows an unlimited number of tenants on the same RDBMS instance, leading to a significant reduction in costs and faster time-to-market for SaaS applications.


Web App Runtime for PHP (LAMP).png

PHP Runtime for Web App (LAMP): This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides PHP 5.6, 7.0, 7.2, or 7.4, along with Apache 2.4.46, Redis 5.0.9, phpMyAdmin 5.1.0, and Docker 20.10.6 on CentOS 7.9. Also included is MySQL 5.7 or 8.0.


Web App Runtime for PHP (LEMP).png

PHP Runtime for Web App (LEMP): This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides PHP 5.6, 7.0, 7.2, or 7.4, along with NGINX 1.20, Redis 5.0.9, phpMyAdmin 5.1.0, and Docker 20.10.6 on CentOS 7.9. Also included is MySQL 5.7 or 8.0.


Pobuca Customer Voice.png

Pobuca Customer Voice: Pobuca Customer Voice helps you know your customers by collecting and processing their feedback. Analyze all customer communications in natural language from voice recordings, emails, chats, and social media, and extract customer experience alerts and insights.


Pobuca Knowledge.png

Pobuca Knowledge: Pobuca Knowledge leverages AI to analyze conversations from all customer touchpoints and extract Q&As, which you can use for your knowledge base and to engage your customer service agents.


Redmine Open-Source Project Management.png

Redmine Open-Source Project Management: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides Redmine 4.2.1, NGINX 1.20.0, MySQL 5.7.34, phpMyAdmin 5.1.0, and Docker 20.10.6 on Ubuntu 20.04. Redmine is a project management web app that uses the Ruby on Rails framework.


Sail Indoor Positioning.png

Sail Indoor Positioning: Sail is an indoor positioning platform that enables mobile app developers to create indoor navigation, location analytics, and personalization services. Sail uses machine learning to continually reconfigure the system, removing the need for any fingerprinting or calibration.


SailPoint Access Risk Management Solution.png

SailPoint Access Risk Management Solution: SailPoint Access Risk Management automates real-time access risk analysis, simplifies GRC (governance, risk, and compliance) processes, and identifies a potential user’s risks before access is granted.



SDx Transact – Smart Cloud: HxGN SDx provides a central digital platform to optimize the design, engineering, construction, operation, and maintenance of industrial assets. Ensure safe and efficient operations in your facility.



Seatti Desk Booking & Team Collaboration: Whether you’re working on-site, at home, on a business trip, or in a café, Seatti lets you plan your workspace locations and collaborate with your team. Stay connected in a hybrid, distributed workplace.



Secure Store: The prescriptive analytics of Secure Store Version 5.4, built with artificial intelligence, allows retailers to make informed decisions to improve profitability, reduce shrink, and achieve rapid return on investment.



Senserva for Azure: Senserva, a cloud security posture management solution, produces priority-based risk assessments for all the Microsoft Azure objects that Senserva manages, enabling customers to perform optimal discovery and remediation.



Soft-ex Unified Analytics for Microsoft Teams: The latest release of Soft-ex’s unified communication analytics solution offers a new user interface with a widget-driven dashboard. The configurable solution integrates with Microsoft Teams to deliver granular visibility.



TACHYUS. Production Optimization for Oil & Gas: The Tachyus platform and its Data Physics technology empower oil producers to make data-driven decisions that maximize asset value. Data Physics combines reservoir physics and machine learning to rapidly integrate data sources and identify optimal injection and drilling plans.



TeamViewer IoT Edge: TeamViewer IoT lets you remotely control edge devices. To process events, such as machine failure or anomalies, TeamViewer IoT includes data processing on the edge as well as cloud-based predictive maintenance solutions.



TelluCare Digital Supervision: Digital Supervision from Tellu allows healthcare personnel to use cameras to watch over patients in nursing homes or home-care environments. Camera images can also be anonymized for privacy reasons.



Verita Research Partner (VRP) Isolation Service: The VRP Isolation Service uses novel AC electrokinetics-based microarrays to capture DNA fragments greater than 300 bp, which are primarily derived from cell necrosis or extracellular vesicles carrying proteins and RNA. After isolation, these biomarkers can be investigated using standard lab techniques.



Visual Machine Learning Stack for TensorFlow: This preconfigured image offered by VMLab, an authorized reseller for Websoft9, provides TensorFlow 2.5.0, TensorBoard, NGINX 1.20.0, and Python 3.8.5 on Ubuntu 20.04. TensorFlow is an end-to-end open-source platform for machine learning.



VMware Tanzu Standard: VMware Tanzu Standard Edition simplifies the operation of Kubernetes for multi-cloud and hybrid-cloud deployments, centralizing management and governance for many clusters and teams across Microsoft Azure and edge or on-premises environments.



Windows Server 2019 with SQL Express: This image offered by Belinda CZ s.r.o. provides Windows Server 2019 with SQL Express. Also included is SQL Server Management Studio, an integrated environment for managing SQL databases.



WordPress: This image offered by Niles Partners provides WordPress 5.7.1, a popular publishing platform for blogs and websites. WordPress offers customization through themes, extensions, and plug-ins.



Consulting services



4-Week Data Science Proof of Concept:  Hitachi Solutions experts will work with your team to identify data and deliver a robust modeling environment designed to accelerate your cloud journey using Hitachi Solutions Advanced Analytics DataLab and Microsoft Azure.



App Modernization: 10-Day Assessment: Cloud Temple will help move your modernization project into the Microsoft Azure cloud by defining stakes, realizing a target architecture, and building scenarios using services such as Azure Migrate. This offer is available only in France.



Azure Backup: 1-Hour Briefing: Learn how Chunghwa Telecom can provide your enterprise with remote monitoring and management, restoration exercises, and disaster recovery execution to meet all your Microsoft Azure cloud backup needs. This offer is available only in Chinese.



Azure Check-up: The Azure Check-up from Unica ICT Solutions is a quick and easy way to map the status of your Microsoft Azure environment based on criteria such as security, continuity, performance, and cost. This offer is available only in Dutch. 



Azure Datacenter Foundation – Implementation: This one-week implementation by timengo includes a roadmap report with recommendations and configurations for a Microsoft Azure foundation based on timengo’s reference architecture. This offer is available only in Danish.



Azure Quick Start: 4-Day Implementation Service: Bytes Software Services will explain Microsoft Azure capabilities and how they relate to your strategic adoption of the cloud platform, giving you an understanding of the requirements and blockers for implementation.



Azure Virtual Desktop: 2-Week Implementation: This implementation by Cloud Services provides your organization with assessment, configuration, cost estimates, testing, and fine-tuning to set up and manage compliant and fully secured virtualized desktop environments.



Azure Virtual Desktop: 3-Day Assessment: This assessment by +Aliance provides a cost estimate for transitioning to Azure Virtual Desktop, allowing you to enable remote work, reduce office costs, and combine offices in one infrastructure. This offer is available only in Russian.



Azure Virtual Desktop: 5-Day Proof of Concept: This offer from Cisilion provides everything you need (demo, design, deployment, configuration, testing, and handover) to understand and prove the value of Azure Virtual Desktop across key stakeholders within your organization.



Azure VMware Solutions: 4-Week Implementation: The Logicalis Azure VMware Solution (AVS) combines best-in-class technologies with a proven methodology so you can innovate, optimize your business, and achieve your transformation vision without a hassle.



Celonis Proof of Value (1 Week): This offer from diconium digital solutions proves the value of Celonis big data technology process mining, helping you visualize your business processes like never before and turn them into extraordinary experiences. This offer is available only in German.



Cloud Readiness-Check: 4-Hour Assessment: Experts from redIT will examine your infrastructure and apps, then show you how to optimally move local IT to the cloud with a combination of Microsoft Azure, Microsoft 365, and Dynamics 365. This offer is available only in German.



Cloud Security: 1-Week Assessment: This Ironstone offer provides a thorough analysis of, and useful insights into, your cloud security setup. Get a clear report assessing cloud security risks, recommendations, and actions so your organization can take a proactive approach.



Cogmation: 2-Week Implementation: This offer from L&T Technology Services provides end-to-end test automation for multipurpose applications and embedded devices via Cogmation on Azure Virtual Machines, a highly customizable and easily deployable test automation solution.



Confidential Compute 10-Week Assessment: Does your data require conclusive assurances of security and fall under specific regulatory demands? This offer from PwC can help you exercise full control over your data in the cloud and utilize it to further enable your business. 



Corrosion Detection Using Azure 8-Week Proof of Concept: This consulting offer from Affine shows how Microsoft Azure deep learning can detect and classify corrosion via imagery. The solution uses Azure IoT Hub, Azure Machine Learning, Azure Synapse Analytics, and more.



Data Analytics on Azure: 2-Week Implementation: Turn raw data into insights that drive decision-making and revenue with Navisite services and Microsoft Azure. Navisite data experts will help you define, design, and build a transformative analytics-driven organization.



DIP Capital Markets: 4-Week Proof of Concept: Neudesic experts will help you deliver next-level insights across private equity assets by using its Document Intelligence Platform, knowledge mining framework, and preconfigured architectures to quickly deploy AI and use cases.



DynTek Managed Security Services: Using the power of Azure Sentinel, DynTek will actively monitor your environment to help you sort through the noise of false positives in order to hunt for security threats and turn alerts into actionable intelligence. 



Easy Cloud and Applications: 3-Weeks Briefing: Over the course of this three-week consulting engagement, the experts at RCR will help you create and build sustainable, scalable, and cost-efficient cloud applications using all the power that Microsoft Azure has to offer.



EPAM Azure Security: 6-Weeks Assessment: EPAM’s security assessment uses a combination of industry best practices and frameworks to understand the current security posture, critical gaps, and future state of your enterprise Microsoft Azure cloud environment.



ETL Conversion to ADF: 2-Day Assessment: Bitwise will provide a high-level time and cost estimate to migrate your on-premises data integration jobs to Microsoft Azure Data Factory using automated extract-transform-load (ETL) conversion.



Evolution ASEINFO: Implementation in 4 Weeks: Simplify your organization’s personnel records management, payroll, recruitment, evaluation, training, and more with this offer from Asesores en Informática, available only in Spanish, to implement Evolution ASEINFO on Microsoft Azure.



F&R Starter Kit Proof of Concept: 4-Hour Proof of Concept: This proof of concept from Bardess Group delivers dynamic finance, accounting, and reporting charts to compare results to prior period, budget, and/or forecast without all the traditional manual work required. 



FAR Accelerator – A 1-Day Project Scope Workshop: Learn how you can benefit from Bardess Group’s finance, accounting, and reporting accelerator on Microsoft Azure to pull data from various sources for presentation in browser-based applications that are accessible anywhere.



Image Analytics: 2-Week Proof of Concept: The experts from Cognilytic Technologies will work with you to build cutting-edge facial recognition or object detection scenarios to extract specific information using Microsoft Azure Cognitive Services pretrained models. 



Innovation Factory: 6-Week Implementation: Create cloud-native applications running on Microsoft Azure with Devoteam Alegri’s Innovation Factory. The dedicated packages of this offer will help you understand, explore, and realize your business case.



Managed IT Services: Big Idea Technology provides complete outsourced IT management services (virtual CIO, network monitoring, backup and disaster recovery, and more) to small to midsized businesses, and performs project-based engagements in most IT disciplines.



Managed Services for Azure from CentriLogic: CentriLogic’s certified experts will work with you to design, deploy, and continuously support a right-sized environment on Microsoft Azure that is suited to your organization’s workloads and applications.



Microsoft 365 Cloud Security: 1-Day Assessment: Direkt Gruppe experts will provide concrete recommendations for your organization to deploy specific security features (Azure Information Protection, Advanced Threat Protection, and more) in your Microsoft 365 environment.



Modern Data Platform Pilot: 8-Week Implementation: Digia’s fast-track implementation of a modern data platform on Microsoft Azure and Microsoft Power BI empowers a future of data-driven decision-making through dashboards and key performance indicators (KPIs).



NNIT Managed Azure Virtual Desktop Service: NNIT provides a coherent approach to optimize your virtual desktop solution, focusing on operational value and consistent processes. NNIT can also handle round-the-clock day-to-day operations and required governance processes.



Promotions Optimization Using Azure: 6-Week Proof of Concept: Affine offers robust marketing campaign design and management to consumer and retail-sector firms with this solution that uses Microsoft Azure services such as Azure Data Factory and Azure Functions.



Satisnet Cyber Security Gap Analysis Assessment: Satisnet provides a complete cybersecurity gap analysis to highlight where your organization should focus. Deliverables include a report on how to use the Microsoft security stack, namely Azure Defender and Microsoft 365 Defender.



Smart SAP on Azure: 4-Week Implementation: Streamline your transformation to SAP HANA with Microsoft Azure via myCloudDoor, which has implemented hundreds of SAP on Azure solutions and earned the SAP on Azure advanced specialization designation from Microsoft.



Threat Management Assessment: IBM’s three-day assessment identifies gaps in your Microsoft Azure hybrid-cloud security program and provides directional recommendations to improve security operations, incident response, compliance, and governance.



Web Application Modernization: 1-Day Assessment: In this session, Sela’s experts will use tools and discussions to assess your web application’s readiness for modernization to Microsoft Azure Web Apps, which can greatly improve scalability and availability. 



Windows Server and SQL Migration: 1-Hour Assessment: Zones offers services to enhance your security posture and extend the life of your Windows Server 2008 and SQL 2008 family of products by assessing, designing, and implementing their migration to Microsoft Azure.



Five Practical Tips to Limitless Data Integration Using Azure Data Factory | Data Exposed

This article is contributed. See the original author and article here.

Azure Data Factory (ADF) is a fully managed, easy-to-use, serverless data integration solution to ingest all your on-premises, hybrid, and multi-cloud data. In this episode with Wee Hyong Tok, you will learn about the latest product updates to ADF and how ADF is integrated with Azure Synapse Analytics and Azure Purview.


 


Watch on Data Exposed



Resources:

Get nostalgic with new Microsoft Teams backgrounds



It’s that time of the week again: #ThrowbackThursday. The day we reflect on old pictures, funny GIFs, and the nostalgic moments that have defined us over the years.

The post Get nostalgic with new Microsoft Teams backgrounds appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Train smarter with NVIDIA pre-trained models and TAO Transfer Learning Toolkit on Microsoft Azure



One of the many challenges of deploying AI at the edge is that IoT devices have limited compute and memory resources, so it is extremely important that your model is accurate and compact enough to deliver real-time inference at the edge. Balancing a model's accuracy against its size is always a challenge: smaller, shallower networks suffer from poor accuracy, while deeper networks are not suitable for the edge. Additionally, achieving state-of-the-art accuracy requires collecting and annotating large sets of training data and deep domain expertise, which can be cost-prohibitive for many enterprises looking to bring their AI solutions to market faster. NVIDIA's catalog of pre-trained models and the Transfer Learning Toolkit (TLT) can help you accelerate your model development. TLT is a core component of NVIDIA TAO, an AI-model-adaptation platform. TLT provides a simplified training workflow geared toward non-experts, so you can quickly get started building AI using pre-trained models and guided Jupyter notebooks. TLT also offers several performance optimizations that make the model compact for high throughput and efficiency, accelerating your computer vision and conversational AI applications.


 


Training is compute-intensive, requiring access to powerful GPUs to speed up the time to solution. Microsoft Azure offers several GPU-optimized virtual machines (VMs) with access to NVIDIA A100, V100, and T4 GPUs.


In this blog post, we will walk you through the entire journey of training an AI model, from provisioning a VM on Azure to training with NVIDIA TLT in the Azure cloud.


 


Pre-trained models and TLT


 


Transfer learning is a training technique in which the features learned by one model are reused in another. You start with a pretrained model whose weights and biases have already been learned from representative datasets, then retrain it with custom data in a fraction of the time it takes to train from scratch.
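The idea can be sketched in a few lines of plain NumPy (an illustrative toy, not TLT itself): a fixed random projection stands in for the frozen pretrained backbone, and only a small logistic-regression head is trained on its features.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Frozen backbone": a fixed random projection standing in for pretrained features.
W_backbone = rng.normal(size=(8, 4))

def backbone(x):
    return np.tanh(x @ W_backbone)  # frozen: these weights are never updated

# Toy binary classification data.
X = rng.normal(size=(64, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def loss(w, b):
    p = 1 / (1 + np.exp(-(backbone(X) @ w + b)))
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

# Trainable "head": logistic regression on the frozen features.
w, b = np.zeros(4), 0.0
initial = loss(w, b)
for _ in range(200):  # gradient descent updates only the head
    F = backbone(X)
    p = 1 / (1 + np.exp(-(F @ w + b)))
    w -= 0.5 * F.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

assert loss(w, b) < initial  # fine-tuning only the head already reduces the loss
```

Because only the small head is updated, training converges in a fraction of the steps full training would need, which is the same economy transfer learning buys at scale.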





Figure 1 – End-to-end AI workflow


 


The NGC catalog, NVIDIA’s hub of GPU-optimized AI and HPC software, contains a diverse collection of pre-trained models for computer vision and conversational AI use cases that span industries from manufacturing to retail to healthcare and more. These models have been trained on images and large sets of text and speech data to provide you with a highly accurate starting point. For example, people detection, people segmentation, and body pose estimation models can be used to extract occupancy insights in smart spaces such as retail, hospitals, factories, and offices. Vehicle and license plate detection and recognition models can be used for smart infrastructure. Automatic speech recognition (ASR) and natural language processing (NLP) models can be used for smart speakers, video conferencing, automated chatbots, and more. In addition to these use-case-specific models, you also have the flexibility to use general-purpose pre-trained models from popular open model architectures such as ResNet, EfficientNet, YOLO, and UNet, which cover general object detection, classification, and segmentation tasks.


 


Once you select your pre-trained model, you can fine-tune it on your dataset using TLT. TLT is a low-code, Jupyter notebook-based workflow that lets you adapt an AI model in hours rather than months. The guided Jupyter notebooks and configurable spec files make it easy to get started.


 


Here are a few key features of TLT that optimize inference performance:



  • Model pruning removes nodes from neural networks while maintaining comparable accuracy, making the model compact and optimal for edge deployment without sacrificing accuracy.

  • INT8 quantization enables the model to run inference at lower INT8 precision, which is significantly faster than running at FP16 or FP32 floating-point precision.


Pruning and quantization can be achieved with a single command in the TLT workflow.
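Conceptually, the two optimizations look like this in a few lines of NumPy (an illustrative sketch of magnitude pruning and symmetric INT8 post-training quantization, not the TLT implementation):

```python
import numpy as np

rng = np.random.default_rng(42)
weights = rng.normal(size=(256,)).astype(np.float32)

# Magnitude pruning: zero out the ~60% of weights closest to zero.
threshold = np.quantile(np.abs(weights), 0.6)
pruned = np.where(np.abs(weights) < threshold, 0.0, weights).astype(np.float32)
sparsity = float(np.mean(pruned == 0.0))

# Symmetric INT8 post-training quantization:
# map [-max|w|, +max|w|] onto integer levels [-127, 127].
scale = float(np.abs(pruned).max()) / 127.0
q = np.clip(np.round(pruned / scale), -127, 127).astype(np.int8)
dequantized = q.astype(np.float32) * scale

# The round-trip error is bounded by half a quantization step.
max_err = float(np.abs(dequantized - pruned).max())
print(f"sparsity={sparsity:.2f}, scale={scale:.5f}, max error={max_err:.5f}")
```

The pruned network stores mostly zeros (which can be skipped at inference time), and the quantized weights occupy a quarter of the memory of FP32 while keeping the reconstruction error within half a quantization step.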


Setup an Azure VM


 


We start by setting up an appropriate VM in the Azure cloud. You can choose from the following VM series powered by NVIDIA GPUs: ND A100 v4, NCv3, and NC T4_v3. For this blog, we will use the NCv3 series, which comes with V100 GPUs. For the base image on the VM, we will use the NVIDIA GPU-optimized image from the Azure Marketplace; it includes all the lower-level dependencies, which reduces the friction of installing drivers and other prerequisites. Here are the steps to set up the Azure VM.


 


Step 1 – Pull the GPU-optimized image from the Azure Marketplace by clicking the “Get it Now” button.




Figure 2 – GPU optimized image on Azure Marketplace


 


Under “Software plan,” select version v21.04.1 to get the latest release, which includes the latest NVIDIA drivers and CUDA toolkit. Once you select the version, you will be directed to the Azure portal, where you will create your VM.




Figure 3 – Image version selection window


 


Step 2 – Configure your VM


In the Azure portal, click “Create” to start configuring the VM.




Figure 4 – Azure Portal


 


This opens the following page, where you can select your subscription, resource group, region, and hardware configuration. Provide a name for your VM. When you are done, click the “Review + Create” button at the end to do a final review.


Note: The default disk space is 32 GB. A disk larger than 128 GB is recommended for this experiment.


 




Figure 5 – Create VM window


 


Do a final review of the offering you are creating. Once done, click the “Create” button to spin up your VM in Azure.


Note: Once the VM is created, you will start incurring costs, so please review the pricing details.


 




Figure 6 – VM review


 


Step 3 – SSH into your VM


Once your VM is created, SSH into it using your username and the VM’s domain name or IP address.


 


 

ssh <username>@<IP address>

 


 


 


Training 2D Body Pose with TLT


 


In this step, we will walk through training a high-performance 2D body pose model with TLT. This fully convolutional model consists of a backbone network and an initial prediction stage that does pixel-wise prediction of confidence maps (heatmaps) and part-affinity fields (PAFs), followed by multistage refinement (0 to N stages) of the initial predictions. The model is further optimized by pruning and quantization, which allows it to run in real time on edge platforms like NVIDIA Jetson.
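As a toy illustration of the heatmap half of that pipeline, decoding a keypoint location from a confidence map is essentially an argmax over the map (a sketch only; the real model adds part-affinity fields and refinement stages):

```python
import numpy as np

def decode_keypoint(heatmap):
    """Return (row, col, confidence) of the strongest peak in a confidence map."""
    idx = np.unravel_index(np.argmax(heatmap), heatmap.shape)
    return int(idx[0]), int(idx[1]), float(heatmap[idx])

# Synthetic 2D Gaussian confidence map for a "joint" centered at (row=12, col=20).
h, w = 32, 48
ys, xs = np.mgrid[0:h, 0:w]
heatmap = np.exp(-((ys - 12) ** 2 + (xs - 20) ** 2) / (2 * 2.0 ** 2))

row, col, conf = decode_keypoint(heatmap)
assert (row, col) == (12, 20) and conf == 1.0
```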


In this blog, we will focus on how to run this model with TLT on Azure, but if you would like to learn more about the model architecture and how to optimize the model, check out the two-part blog on training and optimizing 2D body pose with TLT – Part 1 and Part 2. Additional information about this model can be found in the NGC model card.


 


Step 1 – Setup TLT


TLT requires a Python virtual environment. Run the commands below to set one up.


 


 

sudo su - root
usermod -a -G docker azureuser                  # let the VM user run Docker
apt-get -y install python3-pip unzip
pip3 install virtualenvwrapper
export VIRTUALENVWRAPPER_PYTHON=/usr/bin/python3
source /usr/local/bin/virtualenvwrapper.sh
mkvirtualenv launcher -p /usr/bin/python3       # create and activate the "launcher" virtualenv

 


 


 


Install JupyterLab and the TLT Python package. TLT uses a Python launcher to launch training runs; the launcher automatically pulls the correct Docker image from NGC and runs training in it. Alternatively, you can manually pull the Docker container and run the commands directly inside it. For this blog, we will use the launcher.


 


 

pip3 install jupyterlab
pip3 install nvidia-pyindex
pip3 install nvidia-tlt

 


 


 


Check that TLT is installed properly by running the command below, which lists the AI tasks supported by TLT.


 


 

tlt info --verbose

Configuration of the TLT Instance

dockers:
    nvcr.io/nvidia/tlt-streamanalytics:
        docker_tag: v3.0-py3
        tasks:
            1. augment
            2. classification
            3. detectnet_v2
            4. dssd
            5. emotionnet
            6. faster_rcnn
            7. fpenet
            8. gazenet
            9. gesturenet
            10. heartratenet
            11. lprnet
            12. mask_rcnn
            13. retinanet
            14. ssd
            15. unet
            16. yolo_v3
            17. yolo_v4
            18. tlt-converter
    nvcr.io/nvidia/tlt-pytorch:
        docker_tag: v3.0-py3
        tasks:
            1. speech_to_text
            2. text_classification
            3. question_answering
            4. token_classification
            5. intent_slot_classification
            6. punctuation_and_capitalization
format_version: 1.0
tlt_version: 3.0
published_date: mm/dd/yyyy

 


 


 


Log in to NGC and download the Jupyter notebooks.


 


 

docker login nvcr.io                 # authenticate with your NGC API key

cd /mnt/
sudo chown azureuser:azureuser /mnt/ # use the VM's larger data disk for the samples

wget --content-disposition https://api.ngc.nvidia.com/v2/resources/nvidia/tlt_cv_samples/versions/v1.1.0/zip -O tlt_cv_samples_v1.1.0.zip

unzip -u tlt_cv_samples_v1.1.0.zip -d ./tlt_cv_samples_v1.1.0 && cd ./tlt_cv_samples_v1.1.0

 


 


 


Start your Jupyter notebook and open it in your browser.


 


 

jupyter notebook --ip 0.0.0.0 --port 8888 --allow-root

 


 


 


Step 2 – Open Jupyter notebook and spec file


In the browser, you will see all the CV models supported by TLT. For this experiment, we will train a 2D body pose model, so click the “bpnet” model in the Jupyter notebook. The same directory also contains Jupyter notebooks for popular networks such as YOLOv3/v4, Faster R-CNN, SSD, UNet, and more; you can follow the same steps to train any of those models.




Figure 7 – Model selection from Jupyter


 


Once you are inside, you will find a few config files and a specs directory, which contains all the spec files used to configure training and evaluation parameters. To learn more about all the parameters, refer to the 2D body pose documentation.




Figure 8 – Body Pose estimation training directory


Step 3 – Step through the guided notebook


Open ‘bpnet.ipynb’ and step through the notebook. In it, you will find the learning objectives and all the steps to download the dataset and pre-trained model, run training, and optimize the model. For this exercise, we will use the open-source COCO dataset, but you are welcome to use your own custom body pose dataset; section 3.2 in the notebook covers using a custom dataset.




Figure 9 – Jupyter notebook for training


 


In this blog, we demonstrated a body pose estimation use case with TLT, but you can follow the same steps to train any computer vision or conversational AI model. NVIDIA pre-trained models, the Transfer Learning Toolkit, and GPUs in the Azure cloud simplify the journey and lower the barrier to getting started with AI. The availability of GPUs in Microsoft Azure lets you start training quickly without investing in your own hardware infrastructure, and lets you scale computing resources based on demand.


By leveraging the pre-trained models and TLT, you can easily and quickly adapt models for your use cases and develop high-performance models that can be deployed at the edge for real-time inference.


 


Get started today with NVIDIA TAO TLT on Azure Cloud.


 


Resources:



 

Excel Influencer Teaching The TikTok Masses



What do you get when you combine music, dancing and Excel? The answer could only be Kat Norton.


 


In a little over a year, the Office Apps and Services MVP has made a name for herself as Miss Excel on the world’s biggest social media platforms.


 


In short 15-second videos, Kat tackles big Excel concepts, like how to create automated checklist hacks and interactive heat maps in the spreadsheet program, all the while dancing to a carefully curated soundtrack.


 


It’s perhaps this latter ingredient that is most integral to Kat’s rapid rise. Not only does the “Chief Excel Officer” pack as much information into each clip as possible, but she makes it engaging for the audience to digest. For the uninitiated, this video that teaches dynamic dropdown menu tips and tricks to the sound of Snoop Dogg’s “Drop It Like It’s Hot” is a great place to start.


 


“As well as the content, a lot of people love my energy with it,” Kat says.


 




 


Kat’s rise in popularity has been sudden and impressive. The 28-year-old New Yorker started uploading videos in June of last year. By her fourth upload on TikTok, Kat was already notching video views of more than 100,000. By her fourth week, Kat’s account had attracted millions of viewers and more than 100,000 followers.


 


Soon after successfully branching out to Instagram and creating her own online courses, Kat had enough momentum to quit her corporate job and focus full-time on Miss Excel. Moreover, Kat also teaches Excel skills to businesses, as well as schools like the New York Institute of Technology, which commission her to speak.


 


“It’s almost like a science experiment to see how far it can go,” Kat says. “I always go to bed with a smile on my face – it’s amazing to be able to help people learn new things. It’s so rewarding.”


 


Another major milestone came in June of this year as Kat earned her first Microsoft MVP title. 


 


Kat says she cannot wait to work further with MVPs and be a part of the community of experts. “It’s such an honor to work with MVPs, there are so many brilliant minds and I learn so much from being around the community. It’s a humbling experience to work with them,” Kat says.


 


The sky is the limit for Kat. The Excel expert says she looks forward to building more courses and expanding her content into other Microsoft programs like PowerPoint and Word. For the moment, however, Kat says she is more than happy to take things as they come.


 


“I couldn’t have predicted anything that’s happened so far, I’m constantly surprised by the opportunities that appear and emails that land in my inbox,” Kat says.


 


“If there are other MVPs out there that are thinking to do something similar with social media, jump in if it’s something you feel called to do. It’s a trial by fire but you learn a lot along the way.”


 


Kat invites any MVPs with questions about social media – or indeed Excel – to get in touch. For more on Kat, check out her TikTok and Instagram.


 



Building a digital guide dog for railway passengers with impaired vision





 


 


Background


 




 


Catching your train on time can be challenging under the best of circumstances. Trains typically stop for only a few minutes, leaving little room for mistakes. At Munich Main station, for example, around 240 express trains and 510 regional trains leave from 28 platforms per day. Some trains can also be quite long, up to 346 meters (1,135 ft) for express ICE trains. It is extremely important to quickly find the correct platform and the correct platform section, and it is convenient to enter the train through the door closest to a reserved seat. This already challenging adventure becomes even more difficult if a vision impairment forces a customer to rely exclusively on auditory or tactile feedback. When traveling autonomously, without assistance, it is common practice to walk along the outside of a train, continuously tapping it with a white cane, to discover opened and closed doors (image above). While this works in principle, the practice has limitations in both speed and reliability. We therefore partnered with Deutsche Bahn Systel GmbH to build the Digital Guide Dog, an AI-powered smartphone application that uses computer vision together with auditory and haptic feedback to guide customers to the correct platform section and train car door. In this blog post, we share some of the details and unique challenges that we experienced while building the AI model behind this application. Before we jump into the details, let’s watch a brief video describing our customer story.


 


 


https://play.vidyard.com/Ervptr2VDm9mPr7VWyjaMn.html?


We also have a German version of this video: https://videos.microsoft.com/customer-stories/watch/FdKfxkZx7VRheMuFJKdThZ? 


 


Approach


 


[Image: bounding boxes drawn around opened and closed train doors in a camera frame]


 


At its core, the application relies on an object detection model, which draws bounding boxes around every opened and closed door in the camera image. Using the coordinates of the corners of these bounding boxes, the application can then guide customers in the correct direction (image above). Even though this is probably one of the most common and canonical applications of AI these days, a couple of unique challenges made this project interesting. First, it is important to select an AI model that considers the context around a door to decide whether it is looking at an opened door, a closed door, or something else entirely. Second, model errors can have detrimental, even fatal, consequences, because of the hazards that come with the territory of being inside a busy train station. Third, the model has to process video frames at a high frame rate, directly on the smartphone. In the following sections, we talk in more detail about how we tackled each of these challenges.
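To make the bounding-box-to-guidance step concrete, here is a minimal sketch, not the production app: it assumes detections arrive as `(label, confidence, x1, y1, x2, y2)` tuples in pixel coordinates, and all names, labels, and thresholds are illustrative choices of ours rather than details from the actual application.

```python
def guide_to_door(detections, image_width, min_confidence=0.5):
    """Return a coarse direction toward the nearest open door, or None."""
    open_doors = [d for d in detections
                  if d[0] == "door_open" and d[1] >= min_confidence]
    if not open_doors:
        return None

    # Prefer the largest box, which is presumably the closest door.
    def area(d):
        _, _, x1, y1, x2, y2 = d
        return (x2 - x1) * (y2 - y1)

    best = max(open_doors, key=area)
    center_x = (best[2] + best[4]) / 2
    offset = center_x - image_width / 2
    if abs(offset) < image_width * 0.1:   # roughly centered in the frame
        return "ahead"
    return "right" if offset > 0 else "left"

detections = [
    ("door_closed", 0.9, 10, 40, 110, 300),
    ("door_open",   0.8, 400, 50, 520, 310),
]
print(guide_to_door(detections, image_width=640))  # prints "right"
```

In the real application this direction would be rendered as auditory or haptic feedback rather than printed text.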


 


Considering the context of an object


[Image: a train car with a closed door on the left and an opened door on the right]


It is important to select an AI model that considers the context around a door to decide whether it is looking at an opened door, a closed door, or something else entirely. For example, in the image above, the model has to recognize the closed door on the left and the opened door on the right. The tricky part for the opened door is that it contains the same door leaves that would represent a closed door (except that they would be touching each other). This gets even trickier for doors that have only one door leaf, pushed to one side. It would be a catastrophic failure if the model recognized that door leaf as a closed door. This situation would overwhelm many computer vision algorithms that treat object detection and classification as two separate problems. Many such algorithms rely on approaches related to selective search (see this blog post, for example), in effect resizing and moving a window across an image and then classifying the objects contained in the window. We therefore chose YOLO (You Only Look Once) v5, because it reformulates object detection and classification as a single problem, taking the entire image into consideration. We started from a model that had been pretrained on the ImageNet dataset and fine-tuned it using Azure Machine Learning, including hyperparameter sweeps with HyperDrive.
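Like most detectors, YOLO v5 emits many overlapping candidate boxes per object and merges them with non-maximum suppression (NMS) before the results reach downstream logic. The pure-Python sketch below illustrates IoU-based NMS; it is for illustration only and is not the ultralytics implementation, which operates on tensors.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_threshold=0.45):
    """Keep the highest-scoring box in each group of overlapping boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in keep):
            keep.append(i)
    return keep

boxes = [(0, 0, 100, 200), (5, 5, 105, 205), (300, 0, 400, 200)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # prints [0, 2]: the near-duplicate box is dropped
```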


 


Error Analysis


The second challenge was that we had to ensure that the model could be trusted to guide customers to their doors. A train station contains many potential dangers, above all the risk of falling onto the train tracks and being run over by a train. We therefore had to take great care in preparing the model for various potential scenarios and in understanding its limitations exactly, so that we could communicate them clearly to users. We carefully curated an annotated image dataset that covers various train models and model years, diverse perspectives on doors, and diverse surroundings. In addition to training the model on the objects we were interested in, we also trained it to recognize objects that could be mistaken for doors (e.g., gaps between cars, and windows). We then partnered with a team at Microsoft Research to perform error analysis (released as open source in the form of Jupyter notebook widgets). In essence, this approach involves assigning features to images, such as train model and year, or distance and angle to doors, and then training a decision tree that aims to predict model errors based on these features.
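The core idea behind this kind of error analysis can be illustrated with a hand-rolled cohort breakdown: group images by a metadata feature and compare error rates per group. The sketch below is a simplified stand-in for the Microsoft Research tooling, and the feature names and records are invented for illustration.

```python
from collections import defaultdict

# Hypothetical per-image records: metadata features plus whether the
# detector made an error on that image. All values are made up.
records = [
    {"train_model": "ICE",      "distance": "near", "error": False},
    {"train_model": "ICE",      "distance": "far",  "error": True},
    {"train_model": "ICE",      "distance": "far",  "error": True},
    {"train_model": "regional", "distance": "near", "error": False},
    {"train_model": "regional", "distance": "far",  "error": False},
]

def error_rate_by(records, feature):
    """Error rate per cohort, i.e., per distinct value of one feature."""
    totals, errors = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[feature]] += 1
        errors[r[feature]] += r["error"]   # bool counts as 0 or 1
    return {v: errors[v] / totals[v] for v in totals}

print(error_rate_by(records, "distance"))
# In this toy data, far-away doors fail far more often than near ones.
```

A decision tree trained on the same features, as in the actual project, generalizes this idea to combinations of features rather than one feature at a time.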


 


CoreML


One remaining challenge was to convert the YOLO v5 model from PyTorch to CoreML, so that it could process camera images in real time on the smartphone. Running on-device was necessary to avoid the costs of transferring data between the phone and the cloud, to reduce processing latency, and, most importantly, to address privacy concerns by ensuring that camera images are neither intercepted nor stored (see this repository for how to anonymize images when creating a dataset). Model conversion to CoreML can be accomplished using Core ML Tools. To achieve high enough image throughput, we ensured that all neural network operations are supported by the smartphone’s Neural Engine. This required us to explore various changes to the model architecture. We then used HyperDrive to combine a search over these changes with a search over common hyperparameters (e.g., learning rate, weight decay, momentum) to optimize model speed and accuracy.
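"High enough image throughput" ultimately means that average per-frame latency must fit within the camera's frame budget (about 33 ms at 30 FPS). The harness below shows that arithmetic with a trivial stand-in for the model; the real measurement, of course, had to run on-device with the converted CoreML model, and the function name and defaults here are our own.

```python
import time

def fits_frame_budget(model_fn, frame, target_fps=30, warmup=3, runs=20):
    """Average latency of model_fn on frame, compared to the frame budget."""
    for _ in range(warmup):          # warm up caches / JIT before timing
        model_fn(frame)
    start = time.perf_counter()
    for _ in range(runs):
        model_fn(frame)
    avg_ms = (time.perf_counter() - start) / runs * 1000
    budget_ms = 1000 / target_fps    # ~33.3 ms per frame at 30 FPS
    return avg_ms, avg_ms <= budget_ms

# Stand-in for a converted model: a trivial function, far under budget.
avg_ms, ok = fits_frame_budget(lambda frame: sum(frame), list(range(100)))
print(f"{avg_ms:.3f} ms per frame, within budget: {ok}")
```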


 


Conclusion


In this blog post, we shared our learnings about the unique challenges we encountered while working on a project that initially appeared to be a canonical use case of a computer vision model for object detection. In future work, we plan to expand the scope of the application to further improve the autonomy of passengers with impaired vision. Please let us know your thoughts in the comments below.

MAR-10337802-1.v1: DarkSide Ransomware

MAR-10337802-1.v1: DarkSide Ransomware

This article is contributed. See the original author and article here.

This report is provided “as is” for informational purposes only. The Department of Homeland Security (DHS) does not provide any warranties of any kind regarding any information contained herein. The DHS does not endorse any commercial product or service referenced in this bulletin or otherwise.

This document is marked TLP:WHITE–Disclosure is not limited. Sources may use TLP:WHITE when information carries minimal or no foreseeable risk of misuse, in accordance with applicable rules and procedures for public release. Subject to standard copyright rules, TLP:WHITE information may be distributed without restriction. For more information on the Traffic Light Protocol (TLP), see http://www.cisa.gov/tlp.

Description

This Malware Analysis Report (MAR) is the result of analytic efforts by the Cybersecurity and Infrastructure Security Agency (CISA). CISA processed three (3) files associated with a variant of DarkSide ransomware. NOTE: CISA has no evidence that this variant is related to the pipeline incident, referred to in Joint Cybersecurity Advisory AA21-131A: DarkSide Ransomware: Best Practices for Preventing Business Disruption from Ransomware Attacks.

Ransomware is designed to encrypt the victim’s files in order to extort a ransom for their recovery. DarkSide is ransomware-as-a-service (RaaS): the developers of the ransomware receive a share of the proceeds from the cybercriminal actors who deploy it, known as “affiliates.” This DarkSide ransomware variant executes a dynamic-link library (DLL) program used to delete Volume Shadow copies available on the system. The malware collects and encrypts system information, sends it to the threat actor’s command and control (C2) domains, and generates a ransom note to the victim.

CISA is distributing this MAR, which includes suggested response actions and recommended mitigation techniques, to help network defenders identify and mitigate risks.

For a downloadable copy of IOCs, see: MAR-10337802-1.v1.WHITE.stix.

Click here for a PDF version of this report.

Submitted Files (3)

156335b95ba216456f1ac0894b7b9d6ad95404ac7df447940f21646ca0090673 (156335b95ba216456f1ac0894b7b9d…)

3ba456cafcb31e0710626170c3565aae305bc7c32a948a54f0331d0939e0fe8a (045621d9.BMP)

f6fba207c71d1f53f82d96a87c25c4fa3c020dca58d9b8a266137f33597a0b0e (README.045621d9.TXT)

Domains (2)

baroquetees.com

rumahsia.com

IPs (2)

176.103.62.217

99.83.154.118

156335b95ba216456f1ac0894b7b9d6ad95404ac7df447940f21646ca0090673

Tags

downloader, loader, ransomware, trojan

Details
Name 156335b95ba216456f1ac0894b7b9d6ad95404ac7df447940f21646ca0090673.dll
Size 55810 bytes
Type PE32 executable (DLL) (GUI) Intel 80386, for MS Windows
MD5 f587adbd83ff3f4d2985453cd45c7ab1
SHA1 2715340f82426f840cf7e460f53a36fc3aad52aa
SHA256 156335b95ba216456f1ac0894b7b9d6ad95404ac7df447940f21646ca0090673
SHA512 37acf3c7a0b52421b4b33b14e5707497cfc52e57322ad9ffac87d0551220afc202d4c0987460d295077b9ee681fac2021bbfdebdc52c829b5f998ce7ac2d1efe
ssdeep 768:u2v9Ij6f3J8OT1PMK30DbQDH2doyomHRL83M4/NShWxEs0l29SFd2Xyj09rLd:fmET1PMK3qbpHY3M4wWmXgSFTSrLd
Entropy 6.789366
Antivirus
Ahnlab Ransomware/Win.DarkSide
Antiy Trojan[Ransom]/Win32.DarkSide.gen
Avira TR/AD.DarkSideRansom.muasl
BitDefender Trojan.GenericKD.46189032
ClamAV Win.Packed.DarkSide-9262656-0
Comodo Malware
Cyren W32/Trojan.HLZV-8042
ESET a variant of Win32/Filecoder.DarkSide.B trojan
Emsisoft Trojan.GenericKD.46189032 (B)
Ikarus Trojan-Ransom.DarkSide
K7 Trojan ( 005795061 )
Lavasoft Trojan.GenericKD.46189032
McAfee GenericRXOX-NH!F587ADBD83FF
NANOAV Trojan.Win32.Encoder.iuukal
Quick Heal Trojanransom.Encoder
Symantec Downloader
Systweak trojan-ransom.darkside
TACHYON Ransom/W32.DarkSide.55810
TrendMicro Ransom.17F5A898
TrendMicro House Call Ransom.17F5A898
VirusBlokAda BScope.TrojanRansom.Convagent
Zillya! Trojan.Encoder.Win32.2315
YARA Rules

No matches found.

ssdeep Matches

No matches found.

PE Metadata
Compile Date 2021-04-05 18:09:20-04:00
Import Hash 6c8408bb5d7d5a5b75b9314f94e68763
PE Sections
MD5 Name Raw Size Entropy
db99af79840cc24e4a2bc8920af97c4d header 1024 1.699168
6738c20d4ea897835026864651841fca .text 37376 6.090461
4e6ca671cfd10e3aa0e2dcd99bc287b6 .text1 1024 5.130274
c0265513cd36f1d659cc71bd70bfef58 .rdata 512 3.215043
3853bbcd5344aff518bb2f1ccbd05bdd .data 12288 7.713634
4d2b117a0087a34a0cb8575f34413c47 .ndata 3584 7.935769
Packers/Compilers/Cryptors
Borland Delphi 3.0 (???)
Relationships
156335b95b… Connected_To baroquetees.com
156335b95b… Connected_To rumahsia.com
156335b95b… Dropped 3ba456cafcb31e0710626170c3565aae305bc7c32a948a54f0331d0939e0fe8a
156335b95b… Dropped f6fba207c71d1f53f82d96a87c25c4fa3c020dca58d9b8a266137f33597a0b0e
Description

This artifact is a 32-bit DLL that is a DarkSide ransomware variant. The program is named ‘encryptor2.dll’. When executed, it invokes the Volume Shadow service (vssvc.exe) to delete any Volume Shadow copies available on the system.

The malware collects information on the system to include the operating system, default language, username, hostname, domain, and operating system (OS) architecture. This information is encrypted and sent to one of the following command-and-control (C2) domains:

—Begin C2 Domains—
baroquetees[.]com
rumahsia[.]com
—End C2 Domains—

The malware reads the system GUID and uses the value to generate a unique eight character hexadecimal extension that it appends to the encrypted files. This extension is also used as the name of the running service the program uses to encrypt the user’s data.

—Begin Service Example—
HKLM\System\CurrentControlSet\services\.045621d9
HKLM\System\CurrentControlSet\services\.045621d9\DisplayName Data: “.045621d9”
HKLM\System\CurrentControlSet\services\.045621d9\ObjectName Data: “LocalSystem”
HKLM\System\CurrentControlSet\services\.045621d9\ImagePath Data: <Path to the DLL>
—End Service Example—

This variant of the malware contains a hard-coded key ‘_M8607761bf3212d6’ that it uses to decrypt an embedded base64 encoded configuration that runs the ransomware program. The program is configured to avoid encrypting any files located in directories that contain the following strings:

—Begin Avoided Directories—
$recycle.bin
config.msi
$windows.~bt
$windows.~ws
windows
appdata
application data
boot
google
mozilla
program files
program files (x86)
programdata
system volume information
tor browser
windows.old
intel
msocache
perflogs
x64dbg
public
all users
default
—End Avoided Directories—

Any files with the following extensions will not be encrypted:

—Begin File Extensions—
.386
.adv
.ani
.bat
.bin
.cab
.cmd
.com
.cpl
.cur
.deskthemepack
.diagcab
.diagcfg
.diagpkg
.dll
.drv
.exe
.hlp
.icl
.icns
.ico
.ics
.idx
.ldf
.lnk
.mod
.mpa
.msc
.msp
.msstyles
.msu
.nls
.nomedia
.ocx
.prf
.ps1
.rom
.rtp
.scr
.shs
.spl
.sys
.theme
.themepack
.wpx
.lock
.key
.hta
.msi
.pdb
.sql
—End File Extensions—

Before the encryption routine starts, the program will check to determine if any of the following processes are running, and shut them down:

—Begin Running Processes—
oracle
ocssd
dbsnmp
synctime
agntsvc
isqlplussvc
xfssvccon
mydesktopservice
ocautoupds
encsvc
firefox
tbirdconfig
mydesktopqos
ocomm
dbeng50
sqbcoreservice
excel
infopath
msaccess
mspub
onenote
outlook
powerpnt
steam
thebat
thunderbird
visio
winword
wordpad
notepad
—End Running Processes—

The following services will also be terminated:

—Begin Terminated Services—
.vss
.sql
svc$
memtas
mepocs
sophos
veeam
backup
GxVss
GxBlr
GxFWD
GxCVD
GxCIMgr
—End Terminated Services—

After the encryption routine runs, a bitmap image file is created in the path C:\ProgramData with the same name as the encryption extension, e.g. ‘045621d9.BMP’. The following registry keys are created that generate a ransom note wallpaper on the user’s desktop:

—Begin Wallpaper Registry Keys—
HKU\.DEFAULT\Control Panel\Desktop\Wallpaper Data: <Path to .BMP file>
HKCU\Control Panel\Desktop\Wallpaper    Data: <Path to .BMP file>
—End Wallpaper Registry Keys—

The .BMP file contains instructions to the victim for recovering data (Figure 1).

In each directory that the program has encrypted files, a ransom note is dropped with the naming format ‘README.<UniqueID>.TXT’. The file contains instructions for the victim to follow to recover files.

The following is an example of the recovery instructions:

—Begin Recovery Instructions—

———– [ Welcome to DarkSide ] ————->

What happend?
———————————————-
Your computers and servers are encrypted, backups are deleted. We use strong encryption algorithms, so you cannot decrypt your data.
But you can restore everything by purchasing a special program from us – universal decryptor. This program will restore all your network.
Follow our instructions below and you will recover all your data.

What guarantees?
———————————————-
We value our reputation. If we do not do our work and liabilities, nobody will pay us. This is not in our interests.
All our decryption software is perfectly tested and will decrypt your data. We will also provide support in case of problems.
We guarantee to decrypt one file for free. Go to the site and contact us.

How to get access on website?
———————————————-
Using a TOR browser:
1) Download and install TOR browser from this site: hxxps[:]//torproject.org/
2) Open our website: hxxp[:]//dark24zz36xm4y2phwe7yvnkkkkhxionhfrwp67awpb3r3bdcneivoqd.onion/ZWQHXVE7MW9JXE5N1EGIP6IMEFAGC7LNN6WJCBVKJFKB5QXP6LUZV654ASG7977V

When you open our website, put the following data in the input form:
Key:

lmrlfxpjZBun4Eqc4Xd4XLJxEOL5JTOTLtwCOqxqxtFfu14zvKMrLMUiGV36bhzV5nfRPSSvroQiL6t36hV87qDIDlub946I5ud5QQIZC3EEzHaIy04dBugzgWIBf009Hkb5C7IdIYdEb5wH80HMVhurYzet587o6GinzDBOip4Bz7JIznXkqxIEHUN77hsUM8pMyH8twWettemxqB3PIOMvr7Aog9AIl1QhCYXC1HX97G5tp7OTlUfQOwtZZt5gvtMkOJ9UwgXZrRSDRc8pcCgmFZhGsCalBmIC08HCA40P7r5pcEn2PdBA6tt5oHma19OMBra3NwlkZVUVfIql643VPuvDLNiDtdR1EZhP1vb2t2HsKlGOffG7ql9Y2JWcu2uwjqwVdSzQtlXWM6mEy3xdm3lcJnztQ5Nh7jJ7bYgAb1hODbN9UektcOzYC0e0ZqjPVLY3opxNvYgCk8Bz9clmNXqsvMjBQXJQVb8o0IPMcDjYyhJuG0EevGlAWVq8WGS7JraW22zvlz8SQ4HdgUEJR0VbrsitXqIbIF9S2XGZmtxEsRStAey

!!! DANGER !!!
DO NOT MODIFY or try to RECOVER any files yourself. We WILL NOT be able to RESTORE them.
!!! DANGER !!!

—End Recovery Instructions—

Screenshots

Figure 1. -


What is a MIFR? A Malware Initial Findings Report (MIFR) is intended to provide organizations with malware analysis in a timely manner. In most instances this report will provide initial indicators for computer and network defense. To request additional analysis, please contact CISA and provide information regarding the level of desired analysis.

What is a MAR? A Malware Analysis Report (MAR) is intended to provide organizations with more detailed malware analysis acquired via manual reverse engineering. To request additional analysis, please contact CISA and provide information regarding the level of desired analysis.

Can I edit this document? This document is not to be edited in any way by recipients. All comments or questions related to this document should be directed to the CISA at 1-888-282-0870 or CISA Service Desk.

Can I submit malware to CISA? Malware samples can be submitted via three methods:

CISA encourages you to report any suspicious activity, including cybersecurity incidents, possible malicious code, software vulnerabilities, and phishing-related scams. Reporting forms can be found on CISA’s homepage at www.cisa.gov.