This article is contributed. See the original author and article here.

The Develop Hub in Azure Synapse Analytics enables you to write code and define business logic using a combination of notebooks, SQL scripts, and data flows. The result is a development experience that lets us query, analyze, and model data in multiple languages, complete with Intellisense support for each of them. In this post, we will see how we can use the Knowledge Center to jump-start our development experience in this rich authoring interface.


The first thing we will do is open our Synapse workspace. From there, choose the Develop option to open the Develop Hub.


[Figure: The Develop Hub option is selected]


Once inside the Develop Hub, we can create new SQL scripts, notebooks, data flows, and more. In the list of options is the Browse gallery menu item. Select this to open the Knowledge Center gallery.


[Figure: The new Develop item option is selected first, followed by the Browse gallery option]


Load a notebook from the Knowledge Center


In the Knowledge Center gallery, choose the Notebooks tab. This will bring up the set of pre-created notebooks in the gallery. These notebooks cover a variety of languages, including PySpark, Scala, and C# (with Spark.NET). For this post, we will use the Getting Started with Delta Lake example in Spark.NET C#.


[Figure: The Getting Started with Delta Lake notebook is selected]


This brings up a preview, giving us context around what the notebook has to offer.


[Figure: The Getting Started with Delta Lake notebook preview tells us what to expect in the notebook]


If we do not have an Apache Spark pool already, we will be prompted to create one. Apache Spark pool creation may take a few minutes, and we will receive a notification when it is ready to go.


[Figure: Create an Apache Spark pool named SampleSpark]


From here, we can execute the cells in the notebook and try out some of the capabilities of the Apache Spark pools. We can make use of Intellisense to simplify our development experience, as well.
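To give a sense of what those cells contain, here is a minimal sketch in C# using the .NET for Apache Spark (Microsoft.Spark) API, along the lines of the Delta Lake sample: write a small dataset out in Delta format, then read it back. The table path is a hypothetical placeholder, not necessarily the path the gallery notebook uses.

using Microsoft.Spark.Sql;

// In a Synapse notebook the Spark session already exists; GetOrCreate() simply reuses it.
SparkSession spark = SparkSession.Builder().GetOrCreate();

// Create a tiny DataFrame of ids 0 through 4 and save it as a Delta table.
DataFrame data = spark.Range(0, 5);
data.Write().Format("delta").Mode("overwrite").Save("/tmp/delta-table");

// Read the Delta table back and display its contents.
DataFrame df = spark.Read().Format("delta").Load("/tmp/delta-table");
df.Show();

Because Delta Lake stores data as Parquet files plus a transaction log, the same table can be overwritten, appended to, and queried reliably from subsequent cells.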


[Figure: A C# notebook execution is in progress]


Notebooks are not the only element of the Knowledge Center gallery, however. We can also work with serverless SQL pools and dedicated SQL pools by way of SQL scripts.


Generate a SQL script from the Knowledge Center


Returning to the Knowledge Center gallery, we can choose the SQL scripts tab, which brings up a set of samples and templates. We will take a look at the sample SQL script entitled Generate your COPY statement with dynamic SQL. Just as with the notebook above, we select the script and then Continue.


[Figure: The SQL script entitled Generate your COPY statement with dynamic SQL is selected]


This brings us to a preview page for the script, which enables us to create a new dedicated SQL pool or choose an existing one. We will create a new dedicated SQL pool named SampleSQL and select Open script to continue.


[Figure: The Generate your COPY statement with dynamic SQL preview tells us what to expect in the SQL script]


Dedicated SQL pool deployment may take a few minutes, and we will receive a notification when it is ready to go. At this point, we can run the query and see the results. Running the query may take several minutes depending on the size of your dedicated SQL pool, as the script loads 170,261,325 records into the newly created dbo.Trip table.


[Figure: The SQL script is ready to be executed]


This particular script shows how to use the COPY command in T-SQL to load data from Azure Data Lake Storage into the dedicated SQL pool. Just as with notebooks, the Develop Hub editor for SQL scripts includes Intellisense.
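As a rough illustration of the pattern (a hedged sketch, not the sample's exact code), the idea is to assemble the COPY statement as a string and then execute it, so the source path and file options can be swapped without rewriting the query by hand. The storage URL below is a placeholder.

-- Build a COPY statement dynamically; the source URL here is a placeholder.
DECLARE @source nvarchar(400) =
    N'https://<storage-account>.blob.core.windows.net/<container>/trips/*.csv';

DECLARE @sql nvarchar(max) =
    N'COPY INTO dbo.Trip FROM ''' + @source + N'''
      WITH (FILE_TYPE = ''CSV'', FIELDTERMINATOR = '','', FIRSTROW = 2);';

-- Execute the generated statement against the dedicated SQL pool.
EXEC (@sql);

-- Sanity check after the load completes.
SELECT COUNT_BIG(*) AS LoadedRows FROM dbo.Trip;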


[Figure: Intellisense provides a list of column names and operators, simplifying the development process]


Clean up generated resources


When we’ve finished learning about the Develop Hub, teardown is easy. In the Develop Hub, find the SQL scripts and notebooks we created and, for each one, open its menu and choose the Delete option. We will receive a confirmation prompt before deletion, which helps us avoid accidentally removing the wrong script.


[Figure: Delete a notebook or a SQL script]


Then we navigate to the Manage Hub to tear down the dedicated SQL pool and the Apache Spark pool we created. Under the SQL pools option, open the pool's menu and select Delete. We will need to type in the name of the dedicated SQL pool (SampleSQL) as a confirmation.


[Figure: Delete the dedicated SQL pool]


After that, navigate to the Apache Spark pools option, open the pool's menu, and select Delete. We will again need to type in the name of the Apache Spark pool (SampleSpark) to confirm our intent.


[Figure: Delete the Apache Spark pool]


That takes care of all of the resources we created in this blog post.


Conclusion


In this blog post, we learned how we can use the Knowledge Center to gain an understanding of what is possible in the Develop Hub.


To get started quickly with Azure Synapse and try this tutorial yourself, check out these resources:

[Figure: Resources for getting started with Azure Synapse]

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.