A special offer for small businesses using Google’s legacy G Suite

This article is contributed. See the original author and article here.

Now through July 2023, small businesses like yours can get a 60 percent discount on a 12-month Microsoft 365 Business Basic, Business Standard, or Business Premium subscription.

The post A special offer for small businesses using Google’s legacy G Suite appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Romance scams in 2021: Fraudsters to the left of you, fakers to the right

This article was originally posted by the FTC. See the original article here.

Love happens year-round, not only on Valentine’s Day. Unfortunately, so do romance scams. So, along with sharing (or not) some chocolate, make Valentine’s Day a time to share some ways to spot and avoid romance scams with the people you care about. According to a new FTC report, people sent $547 million to online romance scammers last year.

While many of the people who told the FTC they were defrauded said they were contacted on a dating app, romance scammers found them on social media, too. In fact, more than a third of the people who lost money to an online romance scam said the contact started on Facebook or Instagram, often through an unexpected private message.

Romance scammers typically spin complicated stories to convince people to send money. In 2021, people reported scammers asking them to send money for one (imaginary) health or financial crisis after another. Other scammers pretended to be successful cryptocurrency investors and used romance to lure people into sending money for bogus investments.

Scammers ask to get paid in ways that let them get money quickly and anonymously. In 2021, about one in four people used a gift card to send money to a romance scammer. The most money was reported lost — $139 million — through payments made in cryptocurrency.

How can you avoid a romance scam?

  • If someone appears on your social media and rushes you to start a friendship or romance, slow down.
  • Don’t send a reload, prepaid, or gift card; don’t wire money; and don’t send cryptocurrency to someone you met online.
  • If you suspect a romance scam, cut off contact. Tell the online app or social media platform right away, and then tell the FTC at ReportFraud.ftc.gov.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Azure Marketplace new offers – February 10, 2022

This article is contributed. See the original author and article here.

We continue to expand the Azure Marketplace ecosystem. For this volume, 118 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Get it now in our marketplace


BDRSuite Backup for Microsoft 365: Safeguard your entire Microsoft 365 domain, including emails, contacts, calendars, OneDrive for Business, SharePoint Online, and Microsoft Teams, with agentless and automated backups using BDRSuite Backup for Microsoft 365.
Cado Response: Cado Response is a forensic and incident response platform built for enterprise cloud environments. The solution’s advanced capability allows experts to detect previously undetected threats across dynamic and ephemeral resources, including containers.

DNS Guard: This comprehensive security solution from BGProtect deters malicious activity on the network across all your devices and Domain Name System (DNS) assets. DNS Guard quickly identifies the root cause of current and past attacks and offers mitigation strategies.



EDUardo Sustainability Simulation: EDUardo’s eco module trains companies to minimize environmental impact using lifelike simulations in real time. Participants can learn about sustainable production processes while navigating the challenges of green transformation.



EuroLinux: EuroLinux is an enterprise-class Linux operating system based on Red Hat Enterprise Linux source code. Functionally compatible with RHEL, Oracle Linux, and CentOS Linux, EuroLinux is designed for both servers and workstations.


HoloMaintenance: HoloMaintenance uses Microsoft HoloLens and Microsoft Azure services to provide real-time, remote repair and maintenance support to onsite operators. Experts can guide frontline workers through complex repairs, thus saving time and cost.

Jetdocs: Automate and create your own internal ticketing and approval system across different service teams with Jetdocs. Fully integrated with Microsoft Teams, Jetdocs comes with over 80 templates built to work right out of the box and improve response time.



Jetveo Platform and App Builder: The Jetveo Platform and App Builder offers the best of both worlds by combining an intuitive user interface with the power of C# and low-code programming. It enables developers to provide business solutions quickly and efficiently.



Labelbox: Designed to help AI teams build and operate production-grade machine learning systems, Labelbox allows AI teams to rapidly create training data with minimal human supervision and improve model performance within a unified platform. 



LAMP Stack by Cloudwrxs: This Cloudwrxs LAMP image enables developers to easily build websites and applications by providing a secure, stable, and high-performance coding environment. The image comes with the latest releases of PHP, Apache, and MariaDB on Linux.



Microsoft 365 GroupMGR: This governance tool provides an overview of all existing groups, users, and their related assets in the Microsoft 365 environment. Integrated with Microsoft Teams and SharePoint, GroupMGR keeps all your data secure and private. 


Pavooq: Pavooq is an online platform that visually analyzes the quality of your workplace communication. Using graphs and supporting stats, it can measure your team’s cohesion and reveal pitfalls and potential risks caused by employee communication patterns.


ProntoForms for SharePoint: Empower your field service staff with the ProntoForms mobile solution, which allows technicians to collect complex data on their mobile devices, access company data remotely while in the field, and automatically share results with back-office systems such as Microsoft SharePoint and Microsoft OneDrive.


Reverse Lit Lights Detection API: VicissimDet, the Vehicle Reverse Lit Lights Detector API, enables applications to detect from an image whether a vehicle’s reverse lights are activated. It can be used to develop surveillance apps that help regulate traffic and parking violations.



SSHepherd: SSHepherd is a critical component of your multi-layered security strategy. Reduce external brute force attacks by hiding SSH/RDP attack surfaces from hackers’ scans, automatically or manually terminate sessions based on rogue behavior, and more.



Vitis 2021.1 Development VM – CentOS 7.8: This offer from Xilinx provides a Vitis 2021.1 AI acceleration solution on CentOS 7.8. Use it to accelerate vision and image processing, data analytics, machine learning, quantitative finance, and other diverse workloads.



Vitis 2021.1 Development VM – Ubuntu 18.04: This offer from Xilinx provides a Vitis 2021.1 AI acceleration solution on Ubuntu 18.04. Use it to accelerate vision and image processing, data analytics, machine learning, quantitative finance, and other diverse workloads. 



Vitis 2021.1 Development VM – Ubuntu 20.04: This offer from Xilinx provides a Vitis 2021.1 AI acceleration solution on Ubuntu 20.04. Use it to accelerate vision and image processing, data analytics, machine learning, quantitative finance, and other diverse workloads.



WordPress: This offer from ATH Infosystems provides an image of WordPress optimized for production environments on Microsoft Azure. WordPress is a free and open-source content management system (CMS) written in PHP and paired with a MySQL or MariaDB database.



Workday for Microsoft Teams: This solution by Workday allows employees to quickly access data and tasks without having to leave their collaboration environment. Give feedback to team members, request time off, and submit expenses without leaving Microsoft Teams.



Xilinx Alveo U250 2021.1 Deployment VM-CentOS 7.8: Xilinx’s offer provides a pre-installed Xilinx runtime and deployment shell for accelerating applications on the Alveo U250 accelerator card. This Azure-based virtual machine comes preconfigured with CentOS 7.8.



Xilinx Alveo U250 2021.1 Deployment VM-Ubuntu 18.04: This offer from Xilinx provides a pre-installed Xilinx runtime and deployment shell for accelerating applications on the Alveo U250 accelerator card. This Azure-based virtual machine comes preconfigured with Ubuntu 18.04.



Xilinx Alveo U250 2021.1 Deployment VM-Ubuntu 20.04: This offer from Xilinx provides a pre-installed Xilinx runtime and deployment shell for accelerating applications on the Alveo U250 accelerator card. This Azure-based virtual machine comes preconfigured with Ubuntu 20.04.



Go further with workshops, proofs of concept, and implementations



AI & Machine Learning: 11-Day Proof of Concept: Experts from DCube will review your technical and business requirements and demonstrate how Microsoft Azure data and analytics services can help you build a machine learning solution. This offer is available only in French.



Apps Modernization: 4-Week Proof of Concept: Develonica’s proof of concept is designed to help you modernize your in-house legacy applications using Microsoft Azure, Microsoft Teams, and Microsoft Power Platform. Learn how to maximize the ROI on your IT investment.



Azure Business Continuity: 2-Week Implementation: Minimize the impact of cyberattacks with Zak Solutions’ disaster recovery implementation using Microsoft Azure Site Recovery and Microsoft Operations Management Suite (OMS) monitoring solution.



Azure Container Quick Start: 3-Day Workshop: In this hands-on workshop for IT professionals who want to get a better understanding of the Microsoft Azure container services ecosystem, UMB’s experts will demonstrate the architectural possibilities, agility, and resource efficiency of Azure Kubernetes Services.



Azure Data Factory: 1-Day Workshop: Create data-driven workflows and transform data at scale with no code or maintenance requirement in this workshop by SWORD. The experts from SWORD will provide an overview of hybrid ETL and ELT pipelines within the Azure Data Factory visual environment.



Azure Design & Migration: 6-Week Implementation: Byte’s team will assess, map, design, and migrate your data and workloads to Microsoft Azure with zero-to-minimal disruption. Fast track your digital transformation journey using Microsoft’s Cloud Adoption Framework (CAF).


Azure DevOps Migration: 2-Week Proof of Concept: Plan your TFS to Azure DevOps migration with DCube’s solution. Experts will show you how to design and develop workflows for your organization’s code build and deployment. This offer is available only in French.



Azure Landing Zone: 3-Week Implementation: Maximize your cloud adoption benefits with Cegeka’s implementation of a secure and compliant Azure Landing Zone based on Microsoft’s Cloud Adoption Framework (CAF).



Azure Networking Services: 4-Week Proof of Concept: Projetlys will review your company’s on-premises applications and then prepare them for migration to Microsoft Azure while taking into account compatibility constraints, business impacts and cost assessment. This offer is available only in French.


Azure Purview: 2-Week Implementation: This pilot engagement is designed for customers who want to start with workloads that offer quick wins before committing to a full deployment. Data#3 will onboard your first set of data in Azure Purview and help you uncover valuable insights, along with best practices and tools to plan your migration.


Azure Purview: 5-Day Proof of Concept: For organizations that are considering Azure Purview to meet their data governance requirements, Data#3’s proof of concept provides a platform to evaluate Azure Purview. Participants will receive a qualified overview and a road map of recommended next steps and best practices.



Azure Purview: 8-Week Implementation: The experts at Data#3 will simplify the deployment process of Azure Purview and help you efficiently manage your data across the enterprise. Help your data curators access the information they need to make better decisions.



Azure Synapse Analytics & Customer Insights: 1-Day Workshop: Elevate your customer engagement strategy with Agile Solutions’ workshop. Learn to use the Azure Synapse data platform to identify and convert potential sales leads and gain insights to retain and grow your existing customer base.



Azure Synapse Analytics: 1-Day Workshop: SWORD will provide an overview of the capabilities of Azure Synapse Analytics along with Power BI. Learn how advanced predictive analytics and visualization can offer valuable business insights and transform your work processes.



Azure Virtual Desktop Virtualization: 2-Day Workshop: Experts from novaCapta will demonstrate the benefits of implementing Microsoft Azure Virtual Desktop or Windows 365 so your employees can access a secure, productive, and collaborative workplace from anywhere.



Azure Virtual Desktop: 3-Week Proof of Concept: Enable a secure, remote desktop experience for employees while taking advantage of Microsoft Azure Virtual Desktop’s security features. Fellowmind’s proof of concept will deliver an actionable roadmap for full implementation.



Azure VMware Solution: 2-Week Proof of Concept: Computacenter will demonstrate the value of Azure VMware solution and will utilize a 3-node trial cluster for this proof of concept. A roadmap to accelerate your cloud adoption will be provided.



Cloud Application Migration: 2-Day Workshop: R Systems will evaluate your on-premises workloads and resources, including virtualized Windows and Linux machines, and prepare them for migration to Microsoft Azure. Training will be offered in Polish or English.



Cloud Engagements: 3-Day Implementation: Leverage Xpand IT’s extensive experience in Microsoft Azure technology in these two prepackaged offers that will help you lay down the foundations for building a cloud-native business.



Cognitive Plant Operations Adviser: 8-Week Implementation: The deep causal reasoning embedded in TCS’ solution will make your plant operations predictive, prescriptive, and future-ready. Using Microsoft Azure cognitive services, this solution can reduce your industry operations and maintenance costs.



Data & AI Opportunity Catalogue: 1-Day Workshop: AI Consulting Group’s opportunity catalogue will expose your organization to the art of possibilities. You will uncover key AI adoption opportunities available through Microsoft Azure AI and ML accelerators that can be customized to your business needs.



Data Modernization on Azure: 6-Day Proof of Concept: The experts from DCube will support you in the development and implementation of your data platform using a host of Microsoft Azure data and analytics services. Learn how your enterprise can increase scalability, optimize performance, and reduce cost. This offer is available only in French.



Data Platform: 60-Day Implementation: Empower your business with a scalable and affordable modern data platform. Using a combination of Microsoft solutions, Long View Systems consultants will help you change the way you create, consume, and communicate information.



Digital Product Workbench: 8-Week Implementation: With this consulting offer from Virtusa, you can see how the digital product workbench implemented on Microsoft Azure can help accelerate and innovate your product development cycle while reducing costs.



Endpoint Management: 5-Day Workshop: In this engagement, experts from vNext IQ will show you how to manage and protect your devices, apps, and users with Microsoft Defender for Endpoint. Integrate intelligent security and risk-based controls into your existing environment.



IoT Solution Design Workshop: 5-Day Workshop: T-Systems Multimedia Solutions will collaborate with you to develop a use case to implement a host of Microsoft Azure services so you can deploy smart industrial IoT solutions. Gain real-time insights and optimize your production processes. This offer is available only in German.



Legacy System Modernization: 5-Week Implementation: The experts from Kanda Software will modernize your existing on-premises legacy applications using a multitude of modalities like refactoring, rehosting, or rebuilding as per your technical and business requirements.



Linux OSS DB Migration: 8-Week Implementation: In this end-to-end engagement, Bosch will migrate your Linux and open source databases to Microsoft Azure. This implementation will enable your organization to execute a secure migration strategy while optimizing cloud spend.



Microsoft Azure + Citrix: 5-Day Workshop: Learn to extend your Citrix applications and desktops to Microsoft Azure in this productivity without limits workshop. eGroup will provision the necessary resources so your users can access apps and data as and when needed.



Microsoft Azure + Zerto: 5-Day Workshop: eGroup will help you investigate the usage of Microsoft Azure as a disaster recovery replication target utilizing Zerto virtual replication. The goal is to test and manage your recovery solution while running it in a disaster-recovery mode.



Microsoft Azure Sentinel: 5-Week Workshop: ProArch will empower your IT team to identify and quickly triage security alerts and proactively block threats to your Microsoft 365 cloud and on-premises environments by utilizing Microsoft Sentinel.



Microsoft Azure Virtual Desktop: 1-Week Workshop: eGroup will tailor a Microsoft Azure Virtual Desktop workshop to fit your unique business needs. This offer comes with a customized setup, deployment, and UI so users can access applications as needed while keeping your data safe.



Migrating Workloads to Cloud: 12-Week Implementation: Kanda Software consultants will plan, prepare, migrate, and run your workloads to a new environment in Microsoft Azure. Reduce costs while increasing the performance and availability of your cloud applications.



OneData Master & Metadata Tool: 2-Week Implementation: OneDNA offers an easy-to-use tool to manage your master and metadata from a central location. This Microsoft Azure-based tool allows you to securely store your data on any platform of your choice.


Shift Analytics on Azure: 1-Week Proof of Concept: In this proof of concept, experts from SWORD will demonstrate the advantages of Azure Synapse Analytics as they support and optimize your cloud integration journey.


Turnkey Business Ready Azure: 2-Week Implementation: Infield’s solution offers an array of fixed-cost plans for any industry. The goal is to provide a future-ready Azure governance model to enable cost management, automation, monitoring, and compliance validation.



Contact our partners



  • Aireen
  • App Modernization Asset Discovery: 3-Week Assessment
  • Architecture Technical Engagement (ATE): 10-Day Evaluation
  • Aspen Unified PIMS
  • Avanade Mainframe Modernization
  • Azure Arc Consulting: 5-Day Assessment
  • Azure Cost Optimization: 2-Week Assessment
  • Azure Data & Analytics: Half-Day Assessment
  • Azure Migration: 4-Week Assessment
  • Azure Milestone Application: 4-Hour Assessment
  • Azure Purview Data Governance: 3-Day Briefing
  • Azure Synapse Migration: 6-Week Assessment
  • Azure VMware AVS Consulting: 5-Day Assessment
  • Azure VMware Solution: 15-Day Assessment
  • Barracuda Data Inspector
  • BEXTLabs – Virtual Machines
  • Bimasakti Property and Tenancy Management System
  • Cognigy AI
  • Conscia Cyberdefense Enterprise Solution
  • Conscia Cyberdefense XDR Protection
  • Corporate Instructor – New-Age Corporate Training
  • Credivera
  • Cross-Play Enablement Blueprint (CEB)
  • CSG Journey Orchestration
  • Data Security Protection Toolkit (DSPT) View
  • Database Performance Diagnostics: 4-Week Evaluation
  • Electronic Design Automation: Root Cause Analysis
  • E-Test Center
  • Expense Management
  • FEITIAN Passwordless Key Azure Active Directory Sign-In
  • FintechX Open Banking & Open Finance API Platform
  • GravityZone Security for Containers
  • GravityZone Security for Virtualized Environments
  • HARAGO, KIRAGO – Azure Virtual Desktop Management Portal
  • Helpdesk for Microsoft 365
  • HR Master
  • Identity Troubleshooter
  • Jalios Workplace
  • Magnifica Business
  • Managed Detection & Response (MDR) for Endpoints
  • Managed XDR for Cloud
  • Managed XDR for Hybrid
  • Microsoft Dynamics 365 Online Training
  • Move to Cloud: 11-Day Assessment
  • Multiplayer Mode Enablement (MME)
  • Node-RED Professional
  • One Control Tower
  • QueryPie – Data Governance Solution
  • QVine Soar Cloud Consulting: Business Automation, Data Management, AI/ML
  • SAP on Azure Consulting: 1-Week Assessment
  • SCORE+: HEDIS Performance Management Suite
  • SkyStep Orthotics ERP SaaS
  • TeamViewer for Frontline Workers
  • Text Classification Engine: 2-Hour Evaluation Briefing
  • UniteAR Augmented Reality Platform
  • Verme Workforce Management Platform
  • VisitTracker for Microsoft Teams
  • Zero Trust Security Assessment

CISA Adds 15 Known Exploited Vulnerabilities to Catalog

This article is contributed. See the original author and article here.

CISA has added 15 new vulnerabilities to its Known Exploited Vulnerabilities Catalog, based on evidence that threat actors are actively exploiting the vulnerabilities listed in the table below. These types of vulnerabilities are a frequent attack vector for malicious cyber actors of all types and pose significant risk to the federal enterprise.

CVE Number | CVE Title | Remediation Due Date
CVE-2021-36934 | Microsoft Windows SAM Local Privilege Escalation Vulnerability | 2/24/2022
CVE-2020-0796 | Microsoft SMBv3 Remote Code Execution Vulnerability | 8/10/2022
CVE-2018-1000861 | Jenkins Stapler Web Framework Deserialization of Untrusted Data Vulnerability | 8/10/2022
CVE-2017-9791 | Apache Struts 1 Improper Input Validation Vulnerability | 8/10/2022
CVE-2017-8464 | Microsoft Windows Shell (.lnk) Remote Code Execution Vulnerability | 8/10/2022
CVE-2017-10271 | Oracle Corporation WebLogic Server Remote Code Execution Vulnerability | 8/10/2022
CVE-2017-0263 | Microsoft Win32k Privilege Escalation Vulnerability | 8/10/2022
CVE-2017-0262 | Microsoft Office Remote Code Execution Vulnerability | 8/10/2022
CVE-2017-0145 | Microsoft SMBv1 Remote Code Execution Vulnerability | 8/10/2022
CVE-2017-0144 | Microsoft SMBv1 Remote Code Execution Vulnerability | 8/10/2022
CVE-2016-3088 | Apache ActiveMQ Improper Input Validation Vulnerability | 8/10/2022
CVE-2015-2051 | D-Link DIR-645 Router Remote Code Execution | 8/10/2022
CVE-2015-1635 | Microsoft HTTP.sys Remote Code Execution Vulnerability | 8/10/2022
CVE-2015-1130 | Apple OS X Authentication Bypass Vulnerability | 8/10/2022
CVE-2014-4404 | Apple OS X Heap-Based Buffer Overflow Vulnerability | 8/10/2022

Binding Operational Directive (BOD) 22-01: Reducing the Significant Risk of Known Exploited Vulnerabilities established the Known Exploited Vulnerabilities Catalog as a living list of known CVEs that carry significant risk to the federal enterprise. BOD 22-01 requires FCEB agencies to remediate identified vulnerabilities by the due date to protect FCEB networks against active threats. See the BOD 22-01 Fact Sheet for more information.

Although BOD 22-01 only applies to FCEB agencies, CISA strongly urges all organizations to reduce their exposure to cyberattacks by prioritizing timely remediation of Catalog vulnerabilities as part of their vulnerability management practice. CISA will continue to add vulnerabilities to the Catalog that meet the specified criteria.

Leverage Azure Databricks jobs orchestration from Azure Data Factory

This article is contributed. See the original author and article here.

This post was authored by Leo Furlong, a Solutions Architect at Databricks.


Many Azure customers orchestrate their Azure Databricks pipelines using tools like Azure Data Factory (ADF). ADF is a popular service in Azure for ingesting and orchestrating batch data pipelines because of its ease of use, flexibility, scalability, and cost-effectiveness. Many Azure Databricks users leverage ADF not only for ingesting raw data into data landing zones in Azure Data Lake Storage Gen2 (ADLS) or Azure Blob Storage, but also for orchestrating the execution of Azure Databricks notebooks that transform data into a curated Delta Lake using the medallion architecture.


 


In its current form, ADF customers can execute Azure Databricks jobs using the execute Notebook, Python, or Jar activities. Under the covers, these activities create a job in Azure Databricks by submitting to the Runs submit API and checking for status completion using the Runs get API. ADF customers can also execute an existing Azure Databricks job or Delta Live Tables pipeline to take advantage of the latest job features in Azure Databricks. It is extremely easy to execute an Azure Databricks job in ADF using native ADF activities and the Databricks Jobs API. The approach is similar to how you can execute an Azure Databricks Delta Live Tables pipeline from ADF. Additionally, you can have ADF authenticate to Azure Databricks using a personal access token (PAT), Azure Active Directory (Azure AD) token, or Managed Identity, with the last option being the best practice and least complex.


 


Configuration for Executing Azure Databricks Jobs from ADF
The sections below walk through how to build and configure a modular ADF pipeline that can execute any defined Azure Databricks job using out-of-the-box ADF pipeline activities and managed identity authentication. The full sample code can be found in the following Gists (regular and with parameters). You can also program the pipeline yourself using the following steps.




Figure 1 – Modular ADF pipeline for executing Azure Databricks jobs using managed identities (MI)



Step 1 – Create ADF pipeline parameters and variables


The pipeline has 3 required parameters:



  1. JobID: the ID for the Azure Databricks job, found on the main screen of the Azure Databricks Jobs UI.
  2. DatabricksWorkspaceID: the ID for the workspace, which can be found in the Azure Databricks workspace URL.
  3. WaitSeconds: the number of seconds to wait between each check of the job status.




Figure 2 – ADF pipeline parameters


 


 




 Figure 3 – Example Azure Databricks Jobs UI



The pipeline also has one variable, called JobStatus, with a default value of “Running”. This variable tracks the job status while the Azure Databricks job is running; when the job status changes, the ADF pipeline updates the variable.


 




Figure 4 – ADF pipeline variables


 


Step 2 – Execute the Azure Databricks Run Now API


The first step in the pipeline is to execute the Azure Databricks job using the Run Now API. This is done using the ADF Web activity and leveraging dynamic expressions. Configure the following values in the web activity (a sketch of the equivalent REST call follows this list):


 


  • URL: click “Add dynamic content” and enter the formula @concat('https://',pipeline().parameters.DatabricksWorkspaceID,'.azuredatabricks.net/api/2.1/jobs/run-now').
  • Method: POST
  • Body: click “Add dynamic content” and enter the formula @concat('{"job_id":',pipeline().parameters.JobID,'}').
  • Integration runtime: select the correct integration runtime for your environment. The integration runtime should have network connectivity to the Azure Databricks workspace.
  • Authentication: select Managed Identity in the drop down menu.
  • Resource: enter the value 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d. This ID represents the identifier for the Azure Databricks login application in Azure and is consistent for all tenants and customers.
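For reference, the sketch below shows roughly the REST call this Web activity makes, with token acquisition handled explicitly. It is only an illustration, not part of the ADF pipeline: the azure-identity and requests packages, the workspace ID, and the job ID are assumptions for the example, and ADF performs the equivalent token acquisition automatically when Managed Identity authentication is selected.

```python
# Illustrative sketch of the Jobs "run now" call the Web activity performs.
# Assumes azure-identity and requests are installed; workspace_id and job_id are placeholders.
import requests
from azure.identity import DefaultAzureCredential

DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"  # Azure Databricks login application

workspace_id = "adb-1234567890123456.7"   # placeholder: taken from the workspace URL
job_id = 123                              # placeholder: taken from the Jobs UI

# Acquire an Azure AD token for the Databricks resource (managed identity or other credential).
token = DefaultAzureCredential().get_token(f"{DATABRICKS_RESOURCE_ID}/.default").token

resp = requests.post(
    f"https://{workspace_id}.azuredatabricks.net/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": job_id},
)
resp.raise_for_status()
run_id = resp.json()["run_id"]   # used later to poll the Runs get API
print(run_id)
```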


 




Figure 5 – Web Activity to execute Azure Databricks job


 




Figure 6 – Dynamically constructed URL


 




Figure 7 – Dynamically constructed body


 


Step 3 – ADF Until activity


The second step in the pipeline is an Until activity, which will be used to check the Azure Databricks job execution status until it completes. All activities inside the Until activity execute until the JobStatus pipeline variable is no longer equal to the value “Running”. Configure the following values in the Until activity:


 


  • Expression: click “Add dynamic content” and enter the formula @not(equals(variables('JobStatus'),'Running')).
  • Timeout: optionally, enter a timeout value for the Until activity that is less than the default.


 




Figure 8 – ADF Until activity


 


To program activities inside the Until activity, click on the pencil button in the Activities menu. Within the Until activity, 3 activities are used to check the Azure Databricks job status, set the ADF pipeline variable, and wait to recheck the job status if it hasn’t already completed.


 




Figure 9 – Check Azure Databricks job status flow


 


Step 4 – Check the Azure Databricks Job status using the Runs get API


The first activity inside the Until activity checks the Azure Databricks job status using the Runs get API. This is done using the ADF Web activity and leveraging dynamic expressions. The Runs get API response not only provides the overall job status; it also provides the status of the individual tasks in a multi-task job, along with the run URLs for navigating to the job run in the Azure Databricks workspace UI to view status or troubleshoot. Configure the following values in the web activity:


 


  • URL: click “Add dynamic content” and enter the formula @concat('https://',pipeline().parameters.DatabricksWorkspaceID,'.azuredatabricks.net/api/2.1/jobs/runs/get?run_id=',activity('Execute Jobs API').output.run_id). Make sure the activity name in the formula matches the name of the first web activity you created in the pipeline.
  • Method: GET
  • Integration runtime: select the correct integration runtime for your environment. The integration runtime should have network connectivity to the Azure Databricks workspace.
  • Authentication: select Managed Identity in the drop down menu.
  • Resource: enter the value 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d. This ID represents the identifier for the Azure Databricks login application in Azure and is consistent for all tenants and customers.


 




 Figure 10 – Get job run status


 


 


Figure 11 – Dynamic job run status expression


 


Step 5 – Set ADF variable with job run status


The second activity inside the Until activity is a Set variable activity which is used to set the value of the pipeline variable JobStatus to the value returned from the Runs get API call. The expression checks whether the API return value of the life_cycle_state field is “PENDING” or “RUNNING” and sets the variable to “Running”. If the life_cycle_state field is not “PENDING” or “RUNNING”, then the variable is set to the result_state field. Configure the following values in the set variable activity:


 


  • Name: in the Name drop down menu, select the JobStatus variable.
  • Value: click “Add dynamic content” and enter the formula below. Make sure the activity name in the formula matches the name of the web activity inside your Until activity.

@if(
    or(
        equals(activity('Check Job Run API').output.state.life_cycle_state, 'PENDING'),
        equals(activity('Check Job Run API').output.state.life_cycle_state, 'RUNNING')
    ),
    'Running',
    activity('Check Job Run API').output.state.result_state
)


 




Figure 12 – Set the variable to the Runs get output


 


Step 6 – Wait to recheck job run status


The third activity inside the Until activity is a Wait activity which is used to wait a configurable number of seconds before checking the Runs get API again to see whether the Azure Databricks job has completed. Configure the following values in the wait activity:


  • Wait time in seconds: click “Add dynamic content” and enter the formula @pipeline().parameters.WaitSeconds.


 




Figure 13 – Wait before rechecking job status
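Taken together, Steps 3 through 6 implement a simple polling loop. The sketch below shows roughly equivalent logic in Python purely as an illustration of what the Until activity does; it is not part of the ADF pipeline. It reuses the workspace_id, token, and run_id placeholders from the earlier run-now sketch, and the 30-second wait stands in for the WaitSeconds parameter.

```python
# Illustrative polling loop equivalent to the Until activity (Steps 3-6).
# Assumes workspace_id, token, and run_id from the earlier run-now sketch.
import time
import requests

wait_seconds = 30          # stands in for the WaitSeconds pipeline parameter
job_status = "Running"     # stands in for the JobStatus pipeline variable

while job_status == "Running":
    resp = requests.get(
        f"https://{workspace_id}.azuredatabricks.net/api/2.1/jobs/runs/get",
        headers={"Authorization": f"Bearer {token}"},
        params={"run_id": run_id},
    )
    resp.raise_for_status()
    state = resp.json()["state"]

    # Same mapping as the Set variable expression in Step 5.
    if state["life_cycle_state"] in ("PENDING", "RUNNING"):
        job_status = "Running"
        time.sleep(wait_seconds)            # Step 6: wait before rechecking
    else:
        job_status = state.get("result_state", state["life_cycle_state"])

print(f"Job finished with status: {job_status}")
```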


 


Use modular ADF pipeline to execute Azure Databricks jobs


The modular pipeline is now complete and can be used for executing Azure Databricks jobs. To use it, call it with the Execute Pipeline activity in the master pipelines that control your orchestration. In the settings of the activity, configure the following values:


 


Invoked pipeline: select “Execute Databricks Job using MI” from the drop-down menu.
Wait on completion: checked.
Parameters: set the values for the pipeline parameters:



  • JobID: the ID for the Azure Databricks job found in the Azure Databricks Jobs UI main screen.

  • DatabricksWorkspaceID: the ID for the workspace which can be found in the Databricks workspace URL.

  • WaitSeconds: the number of seconds to wait between each check of the job status.


 




Figure 14 – Execute Pipeline activity in master pipeline


 


Adding Managed Identity Authentication
Instructions for adding the ADF Managed Identity to the Azure Databricks workspace as a Contributor (workspace admin) can be found in the following blog article.


 


If your organization wants to give the ADF Managed Identity limited permissions, you can instead add the ADF Application ID to the Azure Databricks workspace using the Service Principal SCIM API and then assign permissions to that service principal using the Permissions API. The Application ID for the ADF Managed Identity can be found in Azure Active Directory under Enterprise Applications.
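For illustration only, the sketch below registers a service principal with the ADF Application ID and grants it permission to run a single job. The endpoint paths and payloads are based on the Databricks SCIM and Permissions APIs and should be verified against the current documentation; the workspace ID, admin token, application ID, and job ID are placeholders, and the admin token must belong to a workspace admin.

```python
# Hedged sketch: register the ADF managed identity as a Databricks service principal
# and grant it permission to run one job. Verify endpoints and payloads against the
# current Databricks SCIM and Permissions API documentation.
import requests

adf_application_id = "00000000-0000-0000-0000-000000000000"  # placeholder: ADF app ID from Azure AD
workspace_id = "adb-1234567890123456.7"                      # placeholder: from the workspace URL
job_id = 123                                                 # placeholder: the job ADF will trigger
admin_token = "..."                                          # placeholder: token for a workspace admin

base_url = f"https://{workspace_id}.azuredatabricks.net"
headers = {"Authorization": f"Bearer {admin_token}"}

# 1) Add the service principal via the SCIM API.
requests.post(
    f"{base_url}/api/2.0/preview/scim/v2/ServicePrincipals",
    headers=headers,
    json={
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"],
        "applicationId": adf_application_id,
        "displayName": "adf-orchestrator",
    },
).raise_for_status()

# 2) Grant it permission to trigger runs of a specific job via the Permissions API.
requests.patch(
    f"{base_url}/api/2.0/permissions/jobs/{job_id}",
    headers=headers,
    json={
        "access_control_list": [
            {"service_principal_name": adf_application_id, "permission_level": "CAN_MANAGE_RUN"}
        ]
    },
).raise_for_status()
```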


 


Leveraging cluster reuse in Azure Databricks jobs from ADF


To optimize resource usage with jobs that orchestrate multiple tasks, you can use shared job clusters. A shared job cluster allows multiple tasks in the same job run to reuse the cluster. You can use a single job cluster to run all tasks that are part of the job, or multiple job clusters optimized for specific workloads. To learn more about cluster reuse, see this Databricks blog post.
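As a rough sketch (the job name, task names, notebook paths, and cluster settings are placeholders), a Jobs API 2.1 job definition that shares one job cluster across multiple tasks might look like the following; such a payload could be sent to the Jobs create API to define the job that ADF then triggers.

```python
# Sketch of a multi-task job settings payload (Jobs API 2.1) that reuses one job cluster.
# Job name, task names, notebook paths, and cluster settings are placeholders.
job_settings = {
    "name": "medallion-pipeline",
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "10.4.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 2,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "bronze_to_silver",
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Pipelines/bronze_to_silver"},
        },
        {
            "task_key": "silver_to_gold",
            "depends_on": [{"task_key": "bronze_to_silver"}],
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Pipelines/silver_to_gold"},
        },
    ],
}
# This payload could be posted to the Jobs create API (POST /api/2.1/jobs/create)
# to define the multi-task job that the ADF pipeline above then triggers by job ID.
```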