Announcing two new Microsoft Dynamics 365 Fundamentals certifications!

This article is contributed. See the original author and article here.

People in many different roles and at various stages in their careers can all benefit from the two new fundamentals certifications being added to the Microsoft training and certification portfolio: Microsoft Certified: Dynamics 365 Fundamentals Customer Engagement Apps (CRM) and Microsoft Certified: Dynamics 365 Fundamentals Finance and Operations Apps (ERP). With more and more businesses using Microsoft Dynamics 365, and a growing need for people skilled in using it to help take advantage of its potential, the opportunities for people who can demonstrate their skill with certification are also increasing. See if these two new certifications can help you advance your career.

Microsoft Certified: Dynamics 365 Fundamentals Customer Engagement Apps (CRM)

The training for this certification gives you a broad understanding of the customer engagement capabilities of Dynamics 365. It covers the specific capabilities, apps, components, and life cycles of Dynamics 365 Marketing, Sales, Customer Service, Field Service, and Project Operations, and also the features they share, such as common customer engagement features, reporting capabilities, and integration options. Exam MB-910: Dynamics 365 Fundamentals: Customer Engagement apps (CRM), which you must pass to earn this certification, measures skills in these areas. There are no prerequisites for this certification.

Is this certification right for you?

If you want to gain broad exposure to the customer engagement capabilities of Dynamics 365, and you’re familiar with business operations and IT savvy—with either general knowledge or work experience in information technology (IT) and customer relationship management (CRM)—this certification is for you.  Acquiring these skills and getting certified to validate them can help you advance, no matter where you are in your career or what your role is. Here are just a few examples of who can benefit from this certification:

  • IT professionals who want to show a general understanding of the applications they work with.
  • Business stakeholders and other people who use Dynamics 365 and want to validate their skills and experience.
  • Developers who want to highlight their understanding of business operations and CRM.
  • Students, recent graduates, and people changing careers who want to leverage Dynamics 365 customer engagement capabilities to move to the next level.

Microsoft Certified: Dynamics 365 Fundamentals Finance and Operations Apps (ERP)

The training for this certification covers the capabilities, strategies, and components of these finance and operations areas: Supply Chain Management, Finance, Commerce, Human Resources, Project Operations, and Business Central, along with their shared features, such as reporting capabilities and integration options. Exam MB-920: Dynamics 365 Fundamentals: Finance and Operations apps (ERP), which you must pass to earn this certification, measures skills in these areas. There are no prerequisites for this certification.

Is this certification for you?

If you’re looking for broad exposure to the enterprise resource planning (ERP) capabilities of Dynamics 365 and a better understanding of how finance and operations apps fit within the Microsoft ecosystem—and how to put them to work in an organization—this certification is for you. People who are familiar with business operations, have a fundamental understanding of financial principles, and are IT savvy—with either general knowledge or work experience in IT and the basics of ERP—are good candidates for this certification.

By earning this fundamentals certification, you will gain the skills to solve ERP problems and you can validate those skills to current or potential employers, helping to open career doors. If one of the following groups describes you, earning this certification can benefit you.

  • IT professionals who want to highlight their broad understanding of the applications they work with.
  • Technical professionals and business decision makers who are exploring how Dynamics 365 functionality can integrate with apps they’re using.
  • Business stakeholders and others who use Dynamics 365 and want to learn more.
  • Developers who want to show a deeper understanding of business operations, finance, and ERP.
  • Students, recent graduates, and people changing careers who want to leverage Dynamics 365 finance and operations apps to move to the next level.

Exam MB-901 is being retired

The two exams associated with these new certifications, MB-910 and MB-920, are replacing Exam MB-901: Microsoft Dynamics 365 Fundamentals. Exam MB-901 retires on June 30, 2021, and will no longer be available to take after that date, so if you’re currently preparing for MB-901, make sure you take and pass it before then.

Explore these two new fundamentals certifications and other Dynamics 365 and Microsoft Power Platform fundamentals certifications. While you’re looking for ways to build and validate skills to stay current and enhance your career, why not browse the 16 Dynamics 365 and Microsoft Power Platform role-based certifications too? Find a training and certification that’s a perfect match for where you are now.

Rating Control to represent data from Dataverse in a Canvas Power App | Power Platform

You can certainly use text-based controls to represent data as-is in a Canvas Power App, but why not go the extra mile and make it look more intuitive?

Scenario

Let’s say you have Accounts with some kind of Compliance Rating on them that represents how compliant they are based on certain criteria. The value could be a whole number or a decimal, but the Rating control only represents whole numbers.

So, for this example, I’m using a Rating field of type Whole Number on the Account entity in Dynamics 365.

And the complete dataset looks like this in Dynamics 365 / Dataverse.

Note: I tried a Decimal field, but the values got rounded, so I’m sticking to Whole Number.

Rating Control

  1. Let’s say below is the Gallery and you want to show Ratings in the form of stars instead of the traditional numeric values.
  2. Now, let’s use a Rating control below the names of the Accounts to show the Rating values. Once you connect to the Common Data Service (CDS / Dataverse) data source, select the entity you want to populate the Gallery with. In this example, we are using Accounts, so my Gallery is populated with Account records.
    Select the first record, navigate to the Insert tab, and look for the Input controls as shown below.

    Now, look for the Rating control.

  3. Once you select Rating, it’ll appear repeated on every record, since it’s applied "for each" record in your Gallery control.
  4. I’ve rearranged the controls under the names to make them look tidy.
  5. Let’s make the control read-only by changing its behavior, so that users don’t accidentally touch it and set a value at runtime. That wouldn’t affect the actual data, but the representation would be incorrect.
  6. The Max property represents the length of the rating scale, and the Default value is kept at 0 in case the field value is not set at the source. (I think 1 would still be misrepresenting the data.)
  7. Now, I’ll connect this control to the data source’s field, i.e. the Rating field on the Account entity that holds the rating value.
    On the Default property, I’m setting ThisItem.Rating, where ThisItem represents the row in the Gallery, i.e. the Account record itself, and Rating is the field on Dynamics 365’s Account entity which we saw in the scenario above.
  8. And that’s it. You can save and publish your app and run it.

Rating values

Now that we’ve added the Rating control, let’s run the app and see how it represents the data from the Accounts entity.

And the Ratings represent the below data –

Hope this was useful!

Here are some more Canvas Power Apps posts you might want to check –

  1. Clear a field value & Reset Form in a Canvas Power App [Quick Tip]
  2. Get Dynamics 365 field metadata in a Canvas App using DataSourceInfo function | Common Data Service
  3. Debug Published Canvas Power App with other users using Monitor | Power Platform
  4. Download a File from a Canvas Power App using a button | Power Platform
  5. AddColumns() function to dynamically add columns to a Data table in Canvas Power App | SharePoint List
  6. Implement real-time search in Gallery of CDS records in a Canvas Power App | Power Platform
  7. Implement character length validation in a Canvas Power App | Power Platform
  8. Log Canvas Power App telemetry data in Azure Application Insights | Power Apps
  9. Call HTTP Request from a Canvas Power App using Flow and get back Response | Power Automate
  10. Send a Power App Push Notification using Flow to open a record in Canvas App | Power Automate
  11. Dependent OptionSets in a Canvas Power App for 1:N related CDS entities | Power Platform
  12. Restore older version of a Canvas Power App | Power Platform

Thank you!!

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Entity Icon for entities on Unified Interface | Quick Tip

With the Unified Interface now being the only interface we work with in Dynamics 365 CRM, here’s a quick tip on how and where you can set the entity icon for the Unified Interface.

Default Icons

On each entity, custom or otherwise, here’s where you can set the icon.

  1. Let’s say Commissions is a custom entity which has a default Icon.
  2. This is how it appears in the Unified Interface for any custom entity which doesn’t have any Icon set.
  3. Now, let’s look at how we can set the entity icons.

Set Icons for Entity

  1. Firstly, you need to create a Web Resource of type SVG Icon.
  2. Select the entity you want to set the icon for. Let’s say the Commission entity currently shows the default icon. Select the entity in the Solution and look for Update Icons.
  3. When you click Update Icons, switch to the Unified Interface tab.
  4. Now, in the New Icon field, select the name of the SVG icon from Step #1 above, i.e. cf_commission in this case.
  5. Select OK and publish the changes.
  6. And you’ll find the icon is now updated for the entity wherever it’s used on the Unified Interface. Since the classic UI has been phased out, we don’t need to update the classic icons anymore.

Here are some Dynamics 365 related posts which you might want to check out –

  1. Find deprecated JS code used in your Dynamics 365 environment | Dynamics 365 v9 JS Validator tool | XrmToolBox
  2. Make On-Demand Flow to show up in Dynamics 365 | Power Automate
  3. Track and Set Regarding are disabled for Appointments in Dynamics 365 App For Outlook message | Demystified
  4. Cancelled Bookings Imported in Time Entries in Dynamics 365 PSA issue | [Quick Tip]
  5. Remove ‘This Email has been blocked due to potentially harmful content.’ message in Dynamics 365 Emails | OrgDbSettings utility
  6. Get GUID of the current View in Dynamics 365 CRM JS from ribbon button | Ribbon Workbench
  7. Get Dynamics 365 field metadata in a Canvas App using DataSourceInfo function | Common Data Service
  8. Dynamics 365 App For Outlook missing on SiteMap in CRM? Use shortcut link [Quick Tip]
  9. Debug Ribbon button customization using Command Checker in Dynamics 365 CE Unified Interface
  10. Pass Execution Context to JS Script function as a parameter from a Ribbon button in Dynamics 365 | Ribbon Workbench

Thank you!!

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

HOW TO AUTHOR SCCM REPORTS LIKE A SUPERHERO

This article is contributed. See the original author and article here.

Hi there, I am Matt Balzan, and I am a Microsoft Modern Workplace Customer Engineer who specializes in SCCM, APP-V and SSRS/PowerBI reporting.

Today I am going to show you how to write up SQL queries, then use them in SSRS to push out beautiful dashboards, good enough to send to your boss and hopefully win that promotion you’ve been gambling on!

INGREDIENTS

  • Microsoft SQL Server Management Studio (download from here)
  • A SQL query
  • Report Builder
  • Your brain

Now that you have checked all the above, let us first go down memory lane and do a recap on SQL.

WHAT EXACTLY IS SQL?

SQL (Structured Query Language) is a standard language created in the early 70s for storing, manipulating and retrieving data in databases.

Does this mean that if I have a database with data stored in it, I can use the SQL [or T-SQL] language to grab the data based on conditions and filters I apply to it?   Yep, you sure can!

A database is made up of tables and views and many more items but to keep things simple we are only interested in the tables and views.

DB-BLOG.png

Your table/view will contain columns with headers, and the rows are where all the data is stored.

ROWS-BLOG.png

The rows are made up of these columns, which contain cells of data. Each cell can be designed to hold text, datetime values, or integers. Some cells can have no values (NULL values) and some are mandatory. These settings are usually set up by the database developer, but thankfully, for reporting we need not worry about this. What we need is data for our reports, and to look for data, we use T-SQL!

But wait! – I have seen these queries on the web, and they all look double-Dutch to me!

No need to worry, I will show you how to read these bad boys with ease!

ANATOMY OF A SQL QUERY

We have all been there before, someone sent you a SQL query snippet or you were browsing for a specific SQL query and you wasted the whole day trying to decipher the darn thing!

Here is one just for simplicity:

select Name0,Operating_System_Name_and0 from v_R_System where Operating_System_Name_and0 like '%server%'

My quick trick here is to go to an online SQL formatter, there are many of these sites but I find this one to be perfect for the job: https://sql-format.com/

Simply paste the code in and press the FORMAT button and BOOM!

query.png
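
For reference, here is roughly what the formatter produces for the query above:

SELECT Name0
    ,Operating_System_Name_and0
FROM v_R_System
WHERE Operating_System_Name_and0 LIKE '%server%'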

Wow, what a difference!!! Now that all the syntax has been highlighted and arranged, copy the script, open SSMS, click New Query, paste the script in it and press F5 to execute the query.

This is what happens:

ANATOMY-BLOG.png

  1. The SELECT command tells the query to grab data; however, in this case it will only display two columns (use an asterisk * to get all the columns in the view)
  2. FROM the v_R_System view
  3. WHERE the Operating_System_Name_and0 column contains the keyword server, using the LIKE operator (the % wildcard matches zero, one, or multiple characters)
  4. The rows of data are then returned in the Results pane. In this example, only the server names are returned.

TIPS!

Try to keep all the commands in CAPITALS – this is good practice and helps make the code stand out for easier reading!

ALWAYS use straight single quotes for string values you pass to SQL. One common mistake is to use code copied from a browser or email with different font types (and curly smart quotes). Paste your code into Notepad first, then copy it out again.

Here is another example:   Grab all the rows of data and all the columns from the v_Collections view.

SELECT * FROM v_Collections

The asterisk * means give me every man and his dog. (Please be mindful when using this on huge databases as the query could impact SQL performance!)
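
If you just want a quick look at the data without dragging back the entire view, a common trick is to limit the number of rows returned with TOP. A small sketch (the row count of 100 is arbitrary):

SELECT TOP (100) *
FROM v_Collections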

Sometimes you need data from different views. This query contains a JOIN of two views:

SELECT vc.CollectionName
    ,vc.MemberCount
FROM v_Collections AS vc
INNER JOIN v_Collections_G AS cg ON cg.CollectionID = vc.CollectionID
WHERE vc.CollectionName LIKE '%desktop%'
ORDER BY vc.CollectionName DESC

OK, so here is the above query breakdown:

  1. Grab only the data in the columns vc.CollectionName and vc.MemberCount from the v_Collections view
  2. But first JOIN the view v_Collections_G using the common column CollectionID (this is the relating column that both views have!)
  3. HOWEVER, only keep the rows that have the word 'desktop' in the CollectionName column.
  4. Finally, order the list of collection names in descending order.

SIDE NOTE: The AS command creates an alias for a table, view, or column name. The alias can be anything, but admin folk generally use acronyms of the names (for example, v_Collections becomes vc). Also noteworthy: when you JOIN tables or views you will need aliases, because the joined views will probably have the same column names, so the alias also solves the problem of getting the joined columns mixed up, as the snippet below shows.
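
For example, both of the views in the query above expose a CollectionID column; qualifying each column with its alias keeps them apart. A minimal sketch reusing the same two views (the output column names are arbitrary):

SELECT vc.CollectionID AS CollectionID_From_vC
    ,cg.CollectionID AS CollectionID_From_vCG
FROM v_Collections AS vc
INNER JOIN v_Collections_G AS cg ON cg.CollectionID = vc.CollectionID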

T-SQL reference guide: https://docs.microsoft.com/en-us/sql/t-sql/language-reference?view=sql-server-ver15

OK, SO WHAT MAKES A GOOD REPORT?

It needs the following:

  • A script that runs efficiently and does not impact the SQL server performance.
  • The script needs to be written so that if you decide to leave the place where you work, others can understand and follow it!
  • Finally, it needs to make sense to the audience who is going to view it – Try not to over engineer it – keep it simple, short and sweet.

SCENARIO: My customer wants to have a Task Sequence report which contains their company logo, the task sequence steps and output result for each step that is run, showing the last step run.

In my lab I will use the following scripts. The first one is the main one: it joins the v_R_System view to the vSMS_TaskSequenceExecutionStatus view so that I can grab the task sequence data and the name of the device.

SELECT vrs.Name0
    ,[PackageID]
    ,[AdvertisementID]
    ,vrs.ResourceID
    ,[ExecutionTime]
    ,[Step]
    ,[GroupName]
    ,[ActionName]
    ,[LastStatusMsgID]
    ,[LastStatusMsgName]
    ,[ExitCode]
    ,[ActionOutput]
FROM [vSMS_TaskSequenceExecutionStatus] AS vtse
JOIN v_R_System AS vrs ON vrs.ResourceID = vtse.ResourceID
WHERE AdvertisementID = @ADVERTID
ORDER BY ExecutionTime DESC

The second script is for the @ADVERTID parameter. When the report is launched, the user will be prompted to choose a task sequence which has been deployed to a collection. This @ADVERTID parameter gets passed to the first script, which in turn runs the query to grab all the task sequence data rows.

Also highlighted below, the second script concatenates the column pkg.Name (the name of the task sequence) with the word ' to ' and the column col.Name (the name of the collection), then returns the result as a new column called AdvertisementName.

So, for example purposes, the output will be: INSTALL SERVER APPS to ALL DESKTOPS AND SERVERS – this is great, as we now know which task sequence is being deployed to what collection!

SELECT DISTINCT
    adv.AdvertisementID,
    col.Name AS Collection,
    pkg.Name AS Name,
    pkg.Name + ' to ' + col.Name AS AdvertisementName
FROM v_Advertisement adv
JOIN (SELECT
        PackageID,
        Name
      FROM v_Package) AS pkg
    ON pkg.PackageID = adv.PackageID
JOIN v_TaskExecutionStatus ts
    ON adv.AdvertisementID = (SELECT TOP 1 AdvertisementID
                              FROM v_TaskExecutionStatus
                              WHERE AdvertisementID = adv.AdvertisementID)
JOIN v_Collection col
    ON adv.CollectionID = col.CollectionID
ORDER BY Name

LET US BEGIN BY CREATING THE REPORT

  1. Open Microsoft Endpoint Configuration Manager Console and click on the Monitoring workspace.
  2. Right-click on REPORTS then click Create Report.
  3. Type in the name field – I used TASK SEQUENCE REPORT
  4. Type in the path field – I created a folder called _MATT (this way it sits right at the top of the folder list)
  5. Click Next and then Close – now the Report Builder will automatically launch.
  6. Click Run on the Report Builder popup and wait for the UI to launch. This is what it looks like:

REPORT-BUILDER-BLOG.png

STEP 1 – ADD THE DATA SOURCE

 ADD-DS-BLOG.gif

To get the data for our report, we need to make a connection to the server data source.

Right-click on Data Sources / Click Add Data Source… / Click Browse… / Click on ConfigMgr_<yoursitecode> / Scroll down to the GUID starting with {5C63 and double-click it / Click Test Connection / Click OK and OK.

STEP 2 – ADD THE SQL QUERIES TO THEIR DATASETS

ADD-DATASETS-BLOG.gif

Copy your scripts / Right-click on Datasets / click on Add Dataset… / Type in the name of your dataset (no spaces allowed) / Select the radio button ‘Use a dataset embedded in my report’ / Choose the data source that was added in the previous step / Paste your copied script in the Query window and click OK / Click the radio button ‘Use the current Windows user’ and click OK.

STEP 3 – ADJUST THE PARAMETER PROPERTIES

PARAMS-PROPS-GEN-BLOG.png

 

Expand the Parameters folder / Right-click the parameter ADVERTID / Select the General tab, under the Prompt: field type DEPLOYMENT – leave all the other settings as they are.

PARAMS-PROPS-BLOG.png

 

Click on Available Values / Click the radio button ‘Get values from a query’ / for Dataset: choose ADVERTS, for Value field choose AdvertisementID and for Label field choose AdvertisementName / Click OK.

Now when the report first runs, the parameter properties will prompt the user with the DEPLOYMENT label and grab the results of the ADVERTS query – this will appear on the top of the report and look like this (remember the concatenated column?):

DeploymentLabel.png

OK cool – but we are not done yet. Now for the fun part – adding content!

STEP 4 – ADDING A TITLE / LOGO / TABLE

ADD-TITLE-BLOG.gif

Click the label to edit your title / change the font to SEGOE UI then move its position to the centre / Adjust some canvas space then remove the [&ExecutionTime] field.

 

ADD-LOGO-BLOG.gif

From the ribbon Insert tab / click Image, find a space on your canvas then drag-click an area / Click Import…, choose ALL files (*.*) image types then find and add your logo / Click OK.

 ADD-TABLE-BLOG.gif

Next click on Table, choose Table Wizard… / Select the TASKSEQ dataset and click Next / Hold down shift key and select all the fields except PackageID, AdvertisementID & ResourceID.

Drag the highlighted fields to the Values box and click Next / Click Next to skip the layout options / Now choose the Generic style and click Finish.

Drag the table under the logo.

STEP 5 – SPIT POLISHING YOUR REPORT

TS-SPITPOLISH-BLOG.png

A Placeholder is a field where you can apply an expression for labels or text you wish to show in your report.

In my example, I would like to show the Deployment name which is next to the Task Sequence title:

ADD-PLACEHOLDER-BLOG.gif

In the title text box, right-click at the end of the text TASK SEQUENCE: / Click Create Placeholder… / under Value: click on the fx button / click on Datasets / Click on ADVERTS and choose First(AdvertisementName).

Finally, I wanted the value in UPPERCASE and bold. To do this I changed the text value to =UCASE(First(Fields!AdvertisementName.Value, "ADVERTS")) and clicked OK.

I then selected the <<Expr>> and changed the font to BOLD.

Do not forget to save your report into your folder of choice!

Once you have finished all the above settings and tweaks, you should be good to run the report. Click the Run button from the top left corner in the Home ribbon.

If you followed all the above step by step, you should have now a report fit for your business – this is the output after a Deployment has been selected from the combo list:

FINAL-REPORT-BLOG.png

  • To test the Action Output column, I created a task sequence where I intentionally left some errors in the Run Command Line steps, just to show errors and draw out some detail.
  • To avoid massive row height sizes, I set the cell for this column to font size 8.
  • To give it that Azure report style look and feel, I set only the top border on the second row in the table. You can change this to your own specification.

Please feel free to download my report RDL file as a reference guide (Attachment on the bottom of this page)

LESSONS LEARNED FROM THE FIELD

  • Best advice I give to my customers is to start by creating a storyboard. Draw it out on a sheet of blank or grid paper which gives you an idea where to begin.
  • What data do you need to monitor or report on?
  • How long do these scripts take to run? Test them in SQL SERVER MANAGEMENT STUDIO and note the time in the Results pane (a quick way to capture timings is shown in the example after this list).
  • Use the free online SQL formatting tools to create proper readable SQL queries for the rest of the world to understand!
  • Who will have access to these reports? Ensure proper RBAC is in place.
  • What is the target audience? You need to keep in mind some people will not understand the technology of the data.
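
One simple way to capture those timings in SQL Server Management Studio (a rough sketch; the query in the middle is just a placeholder, swap in your own report query) is to switch on the built-in statistics before you execute:

SET STATISTICS TIME ON;  -- prints parse/compile and execution times to the Messages tab
SET STATISTICS IO ON;    -- prints logical/physical reads per table

SELECT TOP (100) * FROM v_Collections;  -- placeholder: your report query goes here

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;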

COMING NEXT TIME…

My next blog will be a deep dive into report design. I will show you how to manipulate content based on values using expressions, conditional formatting, design tips, best practices and much more. Thanks for reading, keep smiling and stay safe!

DISCLAIMER

The sample files are not supported under any Microsoft standard support program or service. The sample files are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample files and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the files be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

Matt Balzan | Modern Workplace Customer Engineer | SCCM, Application Virtualisation, SSRS / PowerBI Reporting

Deploy Windows SSUs and LCUs together with one cumulative update

This article is contributed. See the original author and article here.

You can now deploy the December 2020 latest cumulative update (LCU) and servicing stack update (SSU) together via our new one cumulative update package, or separately.

On September 9th, 2020, I announced the work in progress to simplify on-premises deployments of servicing stack updates. Today, I am excited to announce that you can take advantage of this new capability using Windows Server Update Services (WSUS) and the Windows Insider Program for Business.

We have released the December 2020 LCU and the December 2020 SSU to WSUS in two ways for devices running Windows 10, version 2004 and later: to the typical Security Updates category and to the Windows Insider Pre-Release category.

To deploy the cumulative update and servicing stack update separately, no special action is needed. Just ensure, as always, that you deploy the SSU prior to deploying the LCU so that both updates install successfully on the device.

To deploy the LCU and SSU together using the new one cumulative update package, simply follow three easy steps.

Step 1: Sync the Windows Insider Pre-release category

  • In the WSUS console, from Products and Classifications, select Windows Insider Pre-Release Product and Upgrades. Sync WSUS.
  • In Microsoft Endpoint Manager Configuration Manager, navigate to the Products tab of Software Update Point Component Properties and select Windows Insider Pre-Release. Select OK to confirm this selection.

windows-insider-pre-release.gif

Step 2: Select the OS version

From the list of All Updates, select the cumulative update for the version of Windows 10 running on the device(s) that will receive the update. Currently, this would be either of the following:

  • 2020-12 Cumulative Update for Windows 10 Version 2004
  • 2020-12 Cumulative Update for Windows 10 Version 20H2

cumulative-update.png

Step 3: Deploy the update

Deploy the update to the desired devices in your organization the same way you would deploy any other monthly cumulative update.

Check your preferred method of reporting and note that your devices are now running the December LCU (KB4592438) and SSU (KB4593175).

That’s it! It’s that simple.

The best part? Like all preview builds published to commercial devices in the Release Preview Channel and to the WSUS Windows Insider Pre-Release category, testing out this new deployment technology for LCUs and SSUs from WSUS is fully supported.

If you run into an issue that prevents you or other users in your organization from deploying or updating using this new one cumulative package, use this online form to request assistance directly from Microsoft Support at no cost to you. Or contact customer support through your typical channel.

Try out this new way of deploying LCUs and SSUs and let us know what you think by commenting below or reaching out to me directly on Twitter @ariaupdated.

Unleash the power of predictive analytics in Azure Synapse with machine learning and AI

This article is contributed. See the original author and article here.

We are really excited to introduce the preview of new machine learning experiences in Azure Synapse Analytics, to make it easier for data professionals to enrich data and build predictive analytics solutions.

AI and machine learning are an important aspect of any analytics solution. By integrating Azure Synapse Analytics with Azure Machine Learning and Azure Cognitive Services, we are bringing together the best of both worlds to empower data professionals with the power of predictive analytics and AI. Data engineers working in Azure Synapse can access models in Azure Machine Learning’s central model registry, created by data scientists. Data engineers can also build models with ease in Azure Synapse, using the code-free automated ML powered by Azure Machine Learning, and use these models to enrich data.

Linking workspaces to enable collaboration between Data professionals and ML professionals


A linked service can be created to enable seamless collaboration between an Azure Synapse workspace and an Azure Machine Learning workspace. Once the workspaces are linked, data professionals in Synapse can use the new machine learning experiences, which aim to make it easier to collaborate across Synapse and Azure ML.

johnmac_MS_5-1607312207773.png


Create the Azure Machine Learning linked service in your Synapse workspace

Seamlessly access Azure Machine Learning models from Synapse


Data professionals working in Azure Synapse can collaborate seamlessly with ML professionals who create models in Azure Machine Learning. These models can be shared and deployed directly in Azure Synapse for enrichment of data.

johnmac_MS_6-1607312207809.png


In the model scoring wizard, enrich with an existing model

By supporting the portable ONNX model format, users can bring a variety of models to Synapse for performant batch scoring, right where the data lives. This removes the need for data movement and ensures that the data remains within the security boundaries defined by Azure Synapse. Columns containing predicted values can easily be appended to the original views and tables that are used to populate your Power BI reports.
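
As a rough illustration (a sketch, not taken from this article), batch scoring an ONNX model in a Synapse dedicated SQL pool uses the T-SQL PREDICT function; the table, model store, and output column names below are hypothetical:

-- Score rows in a hypothetical dbo.CustomerFeatures table with an ONNX model
-- stored as varbinary(max) in a hypothetical dbo.Models table.
SELECT d.CustomerID
    ,p.PredictedChurn
FROM PREDICT(
        MODEL = (SELECT Model FROM dbo.Models WHERE ModelName = 'ChurnModel'),
        DATA = dbo.CustomerFeatures AS d,
        RUNTIME = ONNX)
    WITH (PredictedChurn FLOAT) AS p;

The model scoring wizard in Synapse Studio generates a script broadly along these lines for you, so you don’t have to write it by hand.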

Enrich data with Azure Cognitive Services pre-trained models


Fully integrated data enrichment capabilities powered by Azure Cognitive Services allow Synapse users to enrich data and gain insights by leveraging state-of-the-art pre-trained AI models. The first two models available through the Synapse workspace are Text Analytics (Sentiment Analysis) and Anomaly Detector. In the future you’ll see more pre-trained models available for use.

johnmac_MS_7-1607312207821.png


Leverage Azure Cognitive Services in Azure Synapse for sentiment analysis

johnmac_MS_8-1607312207837.png


Leverage Azure Cognitive Services in Azure Synapse for Anomaly detection

Train models in Synapse using Automated ML powered by Azure Machine Learning


Data professionals can also build models with ease in Azure Synapse, using code-free automated ML powered by Azure Machine Learning. These Automated ML runs will be executed on Synapse serverless Apache Spark pools and tracked in the Azure Machine Learning service.

johnmac_MS_9-1607312207858.png


Select the task type for your Automated ML run in Azure Synapse

All the machine learning experiences in Azure Synapse produce code artifacts, such as PySpark notebooks or SQL scripts, that allow users of all skill levels to easily operationalize their work in data integration pipelines and support end-to-end analytics flows from a single unified Synapse experience.

Get started today


We are expanding Azure Synapse to bring together the best in big data analytics and machine learning so you can leverage the full power of Azure. These new experiences in Synapse Studio will streamline the way data teams collaborate and build predictive analytics solutions. A large number of our customers are already taking advantage of predictive analytics solutions. Learn more about how you can get started on your journey with ML experiences in Azure Synapse by using the links provided below.

Resources:


Azure Synapse ML Docs overview: https://aka.ms/synapseMLDocs


Azure Synapse Automated ML tutorial: https://aka.ms/SynapseMLDocs_AutoML_tutorial


Azure Synapse model scoring tutorial: https://aka.ms/SynapseMLDocs_Scoring_Tutorial


Azure Synapse Cognitive Services tutorial: https://aka.ms/SynapseML_Docs_Cognitive_Services


Link Azure Synapse workspace to Azure ML workspace: https://aka.ms/SynapseMLDocs_link_AML


Azure Synapse TechCommunity: Check this blog daily to see a roundup of all the new tutorial blogs that will be posted for the next two weeks.

johnmac_MS_0-1607376512541.png