Championing Cloud Computing Curricula: Developing a Course for Data Analytics


Introduction


The role of the Data Analyst is to help businesses maximize their competitiveness by gaining insights from their data assets. Data Analysts are responsible for designing and building scalable data models, cleaning and transforming data, and enabling advanced analytic capabilities, the outcomes of which provide meaningful insights that can drive key business decisions and processes. As well as preparing and querying data, Data Analysts also design and implement data-driven visualisations and dashboards that add business value and are designed for a range of stakeholders.


 


Microsoft Azure and Power BI are powerful tools that let businesses derive insights and provide large-scale analytics capability across a range of data stores. Microsoft’s DA-100 Analyzing Data with Microsoft Power BI course and exam give students the opportunity to develop data analytics skills using Azure and Power BI. The course is suitable for students at both undergraduate and postgraduate levels and can be run either as an extra-curricular course or integrated as part of a degree programme for academic credit. Below is an example of the types of cloud data stores that Power BI can connect to in order to drive visualisations. It can also connect to local data stores and files, such as Excel and CSV files, offering flexibility for students.


 


 


[Figure: Examples of cloud data stores that Power BI can connect to]


 


This blog post will discuss case studies of how DA-100 is being delivered at two institutions: University of Lincoln (UK) and Nanyang Polytechnic (Singapore), covering:


 



  • Teaching approaches for DA-100 learning materials and official exam

  • Academic assessment/capstone strategy


 


This will cover teaching at both undergraduate and postgraduate levels and as part of different types of programmes, demonstrating the flexibility that implementing DA-100 offers. The DA-100 course includes a Microsoft certification exam; however, it is recognised that many educators will also want to develop an academic assessment strategy that complements the DA-100 exam. For example, a typical course/module in the UK normally runs over a 12-week period. A Data Analytics module could integrate the DA-100 learning materials and exam in the first 6 weeks of the module, with the remaining 6 weeks dedicated to an academic assessment/capstone project.


 


The use of both the DA-100 certification exam and academic capstone assessment gives students the opportunity to demonstrate data analytics competencies across the Azure data stack and Power BI. In particular, the capstone provides the space for students to design and deploy a Power BI solution of their own design to meet the requirements of a given scenario.


 


Overview of DA-100: Analyzing Data with Microsoft Power BI


For those of you who are unfamiliar, this learning path and exam are at the heart of building fundamental skills for preparing, modelling, visualising, and analysing data. These skills are desirable for data analysts, but they equally serve those looking to build new foundational knowledge or wanting hands-on experience with Azure’s suite of data tools, including Power BI. To call these skills merely desirable would be an understatement. Combined with machine learning, automation, and cloud technologies, it has never been more accessible to build dynamic, enterprise-grade data and analytics solutions. Even in everyday use, organisations are increasingly democratising access to analytics and dashboards to instil data-driven cultures and capabilities in their teams. As recognised in Forbes, this empowers frontline workers, regardless of industry and background, to translate insights into business impact and promotes data-driven decision-making more broadly.


 


Data analytics tops McKinsey’s list of business areas with the greatest need to address potential skills gaps, and it features prominently in Gartner’s TalentNeuron analysis of digital skills in demand beyond tech companies and outside of IT. Not only are data science degrees on the rise, but apprenticeship standards in these areas also continue to be developed, underlining how much industry needs these skills right now. Where big data processed through sophisticated techniques has the potential to create value, there is no denying the ability of business analytics to illuminate deep data insights that can lead to more confident decision-making and real-time action.


 









Microsoft Learn for Educators – Access to DA-100 Course Materials


All Microsoft Learn Educators now have access to select advanced role-based Microsoft curriculum and teaching materials, including DA-100. Courses with advanced role-based content enable your students to deepen their technical skills and enhance their readiness for employability with industry-recognised certifications.

Members of the Microsoft Learn Educator Ambassador community can find the advanced role-based curriculum in the Learning Download Center (LDC). Additional teaching materials, like course datasheets, educator guides, and assessment guides, are also available.



 


Case Study – University of Lincoln and DA-100


The University of Lincoln (UoL) recently developed a new MSc Cloud Computing programme that aims to provide students with cutting-edge cloud skills and certification opportunities across a range of cloud competencies, including cloud data, networking, compute, and development. Below is a programme map for the MSc. Of interest for this blog post is the cloud data component of the programme, for which a 12-week module named Cloud Data Platforms and Tools was developed:


 


[Figure: MSc Cloud Computing programme map]


Cloud Data Platforms and Tools – Module Overview


The CDPT (Cloud Data Platforms & Tools) module includes the DA-100 course materials and exam as part of the module learning materials. The module synopsis is as follows:


“Data storage is a fundamental component of IT systems, applications, and devices, with the volume of data generated increasing significantly every year. Cloud storage platforms provide a means to store and process vast volumes of structured and unstructured data that enable large scale analytics for insights. This module will explore and critique cloud-based relational and non-relational databases, and how data is managed, analysed, and processed, such as transactional processing, batch, and streaming data processing. Students will have the opportunity to design and deploy a cloud data storage and analytics solution.”


The module complements the cloud technology pillars of cloud compute and cloud networking that are embedded in the MSc programme, so that cloud data, compute, and networking are robustly covered. Students will learn about a range of cloud data storage approaches, processing, analytics, and visualisation:



  • Database management systems

  • On-premises vs. cloud data platforms and tools

  • Cloud data storage – blobs/buckets

  • Relational data

  • Non-relational data

  • High-availability and disaster recovery

  • Big data concepts

  • Data encryption and security

  • Data policies and governance

  • Transactional, batch, and stream processing

  • Data analytics

  • Data visualisation


The CDPT module runs for 12 weeks, with the first six weeks dedicated to delivering the DA-100 course materials, which consist of 13 modules. Each week comprises block delivery of DA-100 module slides and associated labs. Students can then take the DA-100 certification exam as part of the CDPT module assessment strategy for academic credit; essentially, this is the first of two assessments for the module.



The next six weeks of delivery for the module will be focused on academic concepts of data storage, distributed systems, and security, building on and complementing the DA-100 materials. The overall approach is that students will have developed cloud data knowledge and understanding, as well as skills competencies for designing and deploying a data solution that includes an analytics and visualisation component. This is where the proposed DA-100 capstone project fits into the module.



The second academic assessment of the module will be a version of the DA-100 capstone project that is presented and attached to this blog. As the capstone allows students to design and develop their own data solution, it satisfies the academic learning outcomes of the CDPT module, as the data solution will be submitted with a technical report. The report will require students to present a critical discussion on aspects of their data solution, including decisions made and future work developments.



Overall, the combination of the DA-100 course materials/certification exam delivered alongside academic materials and the capstone project provides a compelling structure for the CDPT module for students to learn from.


Nanyang Polytechnic and DA-100


NYP Professional Competency Model (NYP-PCM) is a competency-based model that mirrors workplace practices and has been endorsed by our industry partners. A distinctive feature of NYP-PCM is that it allows related Competency Units (CmUs) to be grouped, with a Work-Integration-Unit (WIU), to form Competency Canvases (CCVs).  While CmUs develop specific competencies, WIUs enable learners to synthesise competencies developed through related CmUs within a CCV to perform significant tasks expected by the industry.




 


Competency Learning Map of Diploma in Business Intelligence & Analytics


In NYP-PCM, industry partners may collaborate with NYP to co-certify learners in recognition of the competencies they have acquired in one or more CmUs, WIUs or CCVs. Microsoft is NYP’s long-standing partner and has collaborated with the School of Information Technology in various initiatives. As a global Tech Leader, Microsoft enjoys a strong community presence in the area of Analytics, with its data visualization and business intelligence tool, Microsoft Power BI.


NYP’s Visual Analytics Canvas aims to develop Visual Analytics competencies in our learners. Upon the completion of this canvas, learners are expected to perform data retrieval and data manipulation tasks using Extract-Transform-Load (ETL) techniques, analyse data using statistical techniques, and develop dashboards to communicate business data insights effectively to key stakeholders.


There are 4 CmUs and 1 WIU in the Visual Analytics Canvas (see Table 1).

Type | Name
CmU | Business Needs Analysis
CmU | Data Visualisation
CmU | Data Storage Administration
CmU | Decision Analysis
WIU | Visual Analytics Project

Table 1 – CmUs and WIU in Visual Analytics Canvas


 


NYP will co-develop the Visual Analytics Canvas with Microsoft by infusing the Microsoft Certified: Data Analyst Associate curriculum into it. Details of the canvas co-development are shown in Table 2.

CmU/WIU | Co-development Details
Data Visualisation | Infuse with the Microsoft Certified: Data Analyst Associate curriculum. Microsoft’s professional certification exam, DA-100: Analyzing Data with Microsoft Power BI, forms part of the Data Visualisation CmU’s In-Course Assessment (ICA); refer to Skills Measured in the DA-100: Analyzing Data with Microsoft Power BI exam.
Visual Analytics Project | Infuse with the Microsoft Certified: Data Analyst Associate curriculum and Capstone Project. Leverage the Microsoft Power BI tool for dashboard development in the project.

Table 2 – NYP-Microsoft Visual Analytics Canvas Collaboration


To be awarded the Certificate of Competency in Visual Analytics, jointly issued by NYP and Microsoft, learners are expected to pass all the CmUs and the WIU in the Visual Analytics Canvas. In addition, learners are also expected to pass Microsoft’s professional certification exam (DA-100: Analyzing Data with Microsoft Power BI), which constitutes part of the Data Visualisation CmU’s In-Course Assessment.


Learners who are awarded the co-certificate would also receive the Microsoft professional certificate (Microsoft Certified: Data Analyst Associate).


Developing a Capstone Project


Capstone projects are designed to consolidate learning and to apply skills. They can easily be used as part of formal assessment or as an enrichment opportunity, e.g., as the theme for a hackathon or competition. Unlike an exam, the skills developed through a capstone project ensure students gain practical experience and are able to think more widely about the real-world problems they will face. It is no surprise that when we came together as academics, we wanted to develop a capstone project that puts new knowledge into practice. As much as demonstrating learning and applied skills is invaluable, capstone projects also offer students a wide range of opportunities to develop critical thinking, problem solving, and the presentation of their findings. It is through these experiences that students can build confidence, have a safe space for asking questions, and curate a series of activities for discussion at interview and/or for adding to their CV or resume.



Our Capstone project for DA-100 has been developed to assess learners’ skills in designing a Power BI solution, creating/implementing a cloud infrastructure diagram using Azure services, and presenting their own cloud data solution. To achieve this, practitioners and learners are provided with a scenario in which they take on the role of a city planner responsible for the design and development of urban areas, supporting community needs and developing short- and long-term plans. Their task is to develop a dashboard that is readily accessible and usable by a variety of different audiences. Learners are encouraged to go one step further and identify ways of improving city services, using data to provide insights into how the city’s population could become more sustainable and to give a better understanding of the city’s overall performance.









Microsoft Learn for Educators – Access to DA-100 Capstone Project


Check out the Learning Download Center (LDC) for full access to the Educator and Student guide (with access to recommended activities/tasks, client scenario and rubric) for assessment.



Conclusion


Community of practice, or simply the coming together of like-minded academics, whatever you choose to call us, this blog and the newly designed Capstone Project serve as examples of the different types of output that can be achieved by bringing together experience and course development with a handful of determined practitioners. With the help of Microsoft’s technical experts, we have been able to draw on our strengths to develop something that each of us and our institutions can use in practice. Not even geography could divide us, as we worked across three time zones and put aside each of our institutional requirements to create a global version of a project that took the best of DA-100 and Microsoft’s learning philosophy.



Once we collectively realised what we wanted to achieve, we gave ourselves license to operate and grow. Assessments were not new to any of us, but designing a tangible tool, with practitioners and students in mind, that we wanted to make accessible to those of all backgrounds was no mean feat. We sourced globally accessible data sets, developed a scenario that could add value to real-world problems, and built a series of activities/tasks that could be adapted to allow all learners to truly demonstrate their new knowledge and skills in a purposeful way. From the ground up, our Capstone project choices have been intended to serve the wider community of educators.



Educators are increasingly looking for support on how to achieve these benefits in their own practice and, like us, are keen to stretch and challenge learners to ensure they have the skills they need to achieve long-term success. Each of us is delighted with the work produced, and we could not recommend more highly the opportunities we have had to network and develop our own craft through the sharing of good practice as members of the Microsoft Educator Ambassador community.

We would like to invite you to the following Microsoft Reactor Session where we will discuss these resources in more detail.


Webinar – Please join us on Dec 3rd 9.30am GMT


Date: December 03, 2021


Time: 09:30 AM – 10:30 AM (GMT)


Format: Livestream


Topic: Data Science and Machine Learning


Register




Interested in running a Data Analytics Capstone Project?


Capstone Overview


This guide features a single, end-of-course Capstone project in which each student creates and delivers a Power BI data solution and presentation based on client requirements. The project draws upon the understanding students have gained from all modules in the course and presents an opportunity to demonstrate their intellectual curiosity around the design, deployment, and presentation of a scenario-based Power BI data solution. It is designed to be flexible enough to be used in a variety of contexts for students of various technical levels.


The Capstone Project Student Guide can be copied and redistributed to students as standalone content, as it provides an example of a baseline project that can be tailored to fit the needs of various classes. You may choose to modify some of the content depending on which parts of the Capstone are to be assessed and whether a real-world client or pre-defined scenario is used.



The Capstone Project Educator Guide section provides additional guidance for modifications that can be made to the capstone project, as well as comprehensive information on best practices in the preparation and development of the project.



The Capstone Project Rubric section provides additional information for students and teachers in using the project as a summative assessment. Students may use this document to guide them as they complete the activities, ensuring that their work reflects what they have learned in the course.



Overview and Objectives


This is the final activity for the course. The primary objective is to create an interactive Power BI data solution and dashboard with an accompanying presentation that summarizes and justifies an industry client’s needs and goals for a data-driven Power BI solution, along with a diagrammatic representation of the proposed solution. The presentation should also document the steps taken to prepare, model, and visualise the data, and state how the deliverables will be deployed and maintained. This is an opportunity for students to reflect on what they have learned throughout the course, both about Microsoft Power BI and about the client. There is a specific focus on the Microsoft Dataverse, Data Analysis Expressions (DAX), and analytics services for the Capstone.



Capstone Modification Options


As you plan your course, you may choose to tailor the Capstone project to the unique needs of your students. This may be in response to particular learning goals or logistical constraints. Depending on the student cohort and the focus of their study program, you may want to consider changes to:



  • The parts of the Capstone students will be assessed on.

  • The use of either a real-world client or pre-defined client scenario.

  • The rubric criteria


The Capstone can be completed individually or in groups and is modular to suit different student cohorts. You can select which parts of the Capstone project to deploy for assessment purposes.



To align with industry expectations and real-world application of cloud-driven data skills, the Capstone is primarily designed as a scenario-based project. In the baseline version of the project, students interact with a real-world industry client, allowing them to experience the requirements gathering phase and carry out research around configuring suitable data services and associated components. If this is not possible, you can develop a pre-defined industry scenario where students will make assumptions based upon the scenario content to develop an informed set of requirements. You can also ask students to research an existing company and develop a set of recommendations based on publicly available information.



The rubric criteria can easily be modified to reflect the priorities of your learners. You may choose to add or remove rows from the rubric or change the details of the scoring criteria.



Capstone Parts


The Capstone is split into three parts that offer some flexibility to adapt to student needs and their program of study:



  • Part 1 – Design the requirements for the real-world client or pre-defined scenario, with a proposal that frames how Power BI offers the capability to design a solution to meet the client’s needs and goals.

  • Part 2 – Create/implement a cloud infrastructure diagram for the selected Azure cloud data services. For more technical programs, you may also encourage the students to include a data model diagram as part of this activity.

  • Part 3 – Present the cloud data solution utilising a choice of selected Azure services/technologies from the following list:















Azure Services/Technologies (select): Azure SQL, Azure Cosmos DB, Azure NoSQL, Azure Storage, Microsoft Power BI, Azure Web Services, Microsoft SharePoint, Microsoft Teams, Microsoft Active Directory, DAX, Power Platform

Data Formats (recommended): .CSV, .TXT, .XLS, SQL, NoSQL






It may be more appropriate to assess students on less-technical programs against Part 1 only. Students on more technical programs may be assessed across Parts 1, 2 and 3. This is a judgement that you, as the instructor, are best equipped to make.


Part 1 – Design the requirements for the real-world client or pre-defined scenario, with a proposal that frames how Power BI offers the capability to design a solution to meet the client’s needs and goals.


Depending on how you deliver the course, students can use class sessions or independent study time to complete their Capstone work. This can also be facilitated through remote drop-in/support sessions.


Remind students to review the Capstone rubric so they understand the required tasks for Part 1 as listed below:



  • A summary description of the client’s current situation, needs, and goals.

  • Proposal of how Azure cloud data services provide the capabilities to design a data solution to meet the client goals.

  • Expected costs – use the online Azure Pricing Calculator for this.

  • Consideration of data security and access control to various data levels.

  • Benefits to the client.


Part 2 – Create/implement a cloud infrastructure diagram for the selected Azure cloud data services. For more technical programs, you may also encourage the students to include a data model diagram as part of this activity.


This part of the Capstone project is designed specifically for students in more technical programs. It involves taking the cloud services identified in Part 1 and integrating them into an interconnected architectural diagram. The diagram can be created in Microsoft PowerPoint or Microsoft Visio using the Microsoft Azure Icon Set.


Alternatively, there are several free-to-use web-based diagramming tools that are suitable for the task. An example diagram that addresses the pre-defined client scenario, for a company focused on a need for cloud data services, is presented in Appendix B.


Remind students to review the Capstone rubric so they understand the required tasks for Part 2 as listed below:



  • Diagram: Visualize Azure cloud data services from the identified list of cloud services.


Part 3 – Present the cloud data solution utilising the following selected services (as listed/provided).


Students should select appropriate Azure services/technologies and recommended data format suggestions to prepare their cloud data solution.  


Students can deliver their presentations either in-class or remotely. In terms of timings, approximately 10 minutes for the presentation followed by 5 minutes for questions should suffice. Consider the following options to decide the best format for your students to present their work to an audience:



  • Whole-class presentations: Students share their presentations with the whole class, taking turns and following any time limits – this is only suitable for small class sizes.

  • Small group presentations: Divide the class into small groups. Allow each student a set amount of time to present to the other students in the group.

  • Invite the clients: If possible, inviting the clients to the presentations will provide the students with further feedback opportunities.


Students should be encouraged to discuss and justify why they selected specific cloud services to meet the client’s needs. You may want to ask students to respond to some or all of these reflection questions:



  1. Why did you choose a specific service over another?

  2. What benefits might the client customers expect?

  3. What was the most challenging part of the project?

  4. If you had more time, what improvements would you make?

  5. How are you defining and displaying business information in the dashboard?

  6. How are you using appropriate visualisations for data storytelling?

  7. Does the data provide meaningful insights to the business?


Client Scenarios


In the baseline project as outlined in the student guide, students are assumed to have access to a real-world client. However, you may also choose to have students research information about a client or provide them with a pre-defined client scenario.


Real-world client


The benefits of using a real-world client are clear: it provides a measure of authenticity for the students and aligns with industry expectations. Generally, the instructor would be responsible for sourcing one or more clients with whom the students can have scheduled time to understand their current business needs and goals. Ideally, a session would be arranged with the client in the first instance where they can communicate their needs and goals to all students, followed by a Q&A session.



The instructor should guide the client to communicate the following information so that students have everything they need to begin their preparatory work:



  • Company type

  • Products

  • Customer expectations

  • Current technology resources

  • Projected needs and goals


Researched client


If direct access to real-world clients is not available, you may choose to have students research a client using publicly available information. This allows for a sense of authenticity without the logistical challenges of managing client interactions. In this case, you may have students choose from a list of appropriate clients that you have pre-selected or allow each student to submit their own potential client for approval. Be sure to confirm that students can find the same information listed in the previous section from publicly available information when creating a list of appropriate clients and/or approving clients submitted by students.


Pre-defined client



If a client is not available, or if there are time constraints in delivering the course, you can also choose to give students a pre-defined client scenario. See below for a data analytics scenario that can be used as the pre-defined client scenario. You should feel free to modify the scenario or develop one of your own.



The client city planner is responsible for the design and development of urban areas, including supporting community needs and developing short- and long-term plans. Urban planning is a valuable force for city leaders to achieve sustainable development and to make a difference to the communities it serves. The client city planner is looking for ways to improve the quality, performance, and safety of its existing ecosystem. However, their primary focus is on finding ways to ensure optimal efficiency and increasing the use of technologies to inform their smarter cities strategic policy and agenda. Much of the data gathered by the client city planner is made available publicly, including but not limited to transportation, environment, health, education, infrastructure, jobs, and the economy. This data is stored in a variety of different formats that can be downloaded from:



The client city planner believes that a combination of both historical (meaning persistent data storage) and real-time (meaning streamed from the source) data could be used to identify ways of improving city services and/or provide insights into how the city’s population could become more sustainable. They would also like to understand how different data sources come together to give a better understanding of the city’s overall performance. They require a data dashboard capable of visualising multiple sources of data across the city’s ecosystem, and the dashboard needs to be readily accessible and usable by a variety of different audiences.


The following information can be derived from the client scenario above, which is similar to the basic information that can be derived from a real-world client:



  • Company type – Specializes in city planning

  • Their products – Service-led, to develop and set policies and agenda for transport, the local economy, jobs, green infrastructure, renewable energy, climate change, etc. 

  • Their customers’ expectations – A city that is purposefully designed and built to serve its community.

  • Their current situation and how they do things now – Currently, they collect data locally and then categorise data sets based on topics, trends, or themes for members of the public and/or organisations to use. They have access to the data but don’t have the capability to process, analyse and visualise it.

  • Their projected needs and goals – The client city planner would like to find ways to bring data sources together and visualise data to create meaningful insights.
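As a hint of the kind of insight students might surface from such open data, the sketch below aggregates a hypothetical public-transport extract into a dashboard-ready summary using Python and pandas; the file and column names are invented for illustration only.

```python
import pandas as pd

# Hypothetical open-data extract: one row per bus journey.
trips = pd.read_csv("bus_journeys.csv", parse_dates=["departure_time"])

# Candidate dashboard visual: average delay per route and hour of day.
trips["hour"] = trips["departure_time"].dt.hour
summary = (
    trips.groupby(["route", "hour"])["delay_minutes"]
    .mean()
    .reset_index()
    .sort_values("delay_minutes", ascending=False)
)
print(summary.head())
```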


Preparing and Supporting Students


We recommend that you introduce the Capstone after Module 2 to allow students to start thinking about how they can apply what they are learning each week to the project. The end-of-module questions, particularly those with an open-ended format, lend themselves well to preparing your students to think independently about the processes needed to design a suitable cloud solution for the project. See the introduction to the Module Questions section for suggestions on how to encourage high student engagement with the Module Questions to help scaffold the process of designing a full-scale solution for the Capstone project.


If students have questions or are facing problems while completing the tasks, direct them to the following problem-solving strategies before asking you for help:



  • Review the content in the Student Capstone Guide.

  • Research the official Microsoft Azure Documentation.

  • Ask peers for help.

  • Review their client notes/pre-defined scenario document.


You may consider adding extra sessions as needed for the Capstone preparation and presentation delivery. This will give students dedicated collaborative time to work in groups and allow you to informally monitor their progress as they move through the various tasks.


Capstone Project – Student Guide


Overview


In this Capstone, you’ll create a presentation that summarizes your client’s current situation, needs, and goals; lists the services you’d recommend for your client; documents the expected costs; outlines an architectural diagram; and states the benefits for your client. This is an opportunity for you to reflect on what you have learned throughout the course about Microsoft Azure data services and your client.


Preparing for the Capstone


To best prepare for the Capstone, consider the following:



  • Start preparing early. Read this Capstone guide early in the course.

  • Meet with your client and take notes. Take and keep thorough notes about your client, and remember it’s better to get too much information from your client than not enough.

  • Think about your client in each module. As you learn new concepts in each module, consider how they might apply to your client.

  • Work on the presentation early and complete the end-of-module questions. Consider building your presentation throughout the course. The end-of-module questions are designed to help support your understanding of the Capstone tasks, so think about your project as you complete them.

  • Refer to the rubric as you work. Use the Capstone rubric document to ensure you fully understand the requirements for the tasks presented below.


Capstone Tasks


Task 1 – Design a proposal that frames how Power BI offers the capability to design a solution to meet a client’s needs and goals.


In the first phase of this project, you will be working to understand your client’s needs and using what you know about Azure data services to design a solution. By the end of this task, you should have a clear idea of what your client’s needs are and how your solution meets those needs. You will be using this information to create a presentation in Task 3, so be sure that you have all the following documented:



  • A summary description of your client’s current situation, their needs, and their goals.

  • Proposal of how Azure cloud data services provide the capabilities to design a data solution to meet the client goals.

  • Expected costs – use the online Azure Pricing Calculator for this.

  • Consideration of data security and access control to various data levels.  

  • Benefits to your client.


Task 2 – Create/implement a cloud infrastructure diagram and/or data model diagram for the selected Azure cloud data services.


Use the list of Azure services identified in Task 1 and integrate them into an interconnected architectural diagram. The diagram can be created in Microsoft PowerPoint or Microsoft Visio using the Microsoft Azure Icon Set. Alternatively, there are several free-to-use web-based diagramming tools that are suitable for the task.


Be sure to include the following in your diagram:



  • Visualize core cloud data services from the identified list of cloud services.

  • Correctly highlight relationships between Cloud data services and other services through connections.

  • Identify each of the customer endpoints.


Your solution should demonstrate the following:



  • Import a dataset to Power BI by carrying out clean, transform, and load tasks (see the sketch after this list).

  • Visualise a range of meaningful insights from the processed data using suitable queries.
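
To make the clean, transform, and load step concrete, here is a minimal sketch of the kind of preparation work a solution might include. It uses Python with pandas, which Power BI can run as a “Python script” data source; the file name, column names, and cleaning rules are hypothetical and should be adapted to your client’s data.

```python
import pandas as pd

# Hypothetical source file and columns - replace with your client's data.
raw = pd.read_csv("city_air_quality.csv")

# Clean: drop rows missing key fields and remove duplicate readings.
clean = raw.dropna(subset=["district", "pm25"]).drop_duplicates()

# Transform: enforce types and derive a month column for easier visualisation.
clean["reading_date"] = pd.to_datetime(clean["reading_date"])
clean["pm25"] = clean["pm25"].astype(float)
clean["month"] = clean["reading_date"].dt.to_period("M").astype(str)

# Load: in Power BI's "Python script" source, any DataFrame left in scope
# (here, `clean`) is offered as a table to import into the data model.
print(clean.head())
```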


Task 3 – Present the cloud data solution utilising a choice of selected Azure services/technologies (as listed/provided).  


In this phase, you will deliver a ten-minute presentation that describes the needs analysis, recommendations, and explanations that you documented earlier in the project. After the presentation, others will have an opportunity to ask you questions about your process and recommendations. Make sure you include clear evidence to support the decisions that you have made, based on the information that you gathered about your client, and that your presentation is organized and detailed enough for your audience to have a good understanding of the benefits of your recommendations.


Support and Resources


If you have questions or are facing problems while completing the tasks, use the following problem-solving strategies before asking your instructor for help:



  • Review the content in the Student Capstone Guide.

  • Research the official Microsoft Azure Documentation.

  • Ask peers for help.

  • Review your client notes/pre-defined scenario document.


Post-Project Reflection Questions



  1. Why did you choose a specific service over another?

  2. What benefits might the client customers expect?

  3. What was the most challenging part of the project?

  4. If you had more time, what improvements would you make?

  5. How are you defining and displaying business information in the dashboard?

  6. How are you using appropriate visualisations for data storytelling?

  7. Does the data provide meaningful insights to the business?


Appendix A – Capstone Rubric (DA-100)

Each rubric item is scored as Approaches standard, Meets standard, or Exceeds standard (Exceeds includes the items in Meets standard). A final column, “Summarize where & how you included this item”, is left blank for students to complete.

Task 1

Description of solution

  • Approaches standard: Description of solution does not specifically address the use of each tool.

  • Meets standard: Description of solution specifically addresses the use of each tool.

  • Exceeds standard: Description of solution includes an explanation of how security is administered.

Description of client’s current cloud situation, goals, and needs

  • Approaches standard: Description of the client’s situation, goals, and needs isn’t clear and relevant.

  • Meets standard: Description of the client’s situation, goals, and needs is clear and relevant to the organization.

  • Exceeds standard: Description includes direct quotations from the client that provide details about their situation, goals, and needs and that are relevant to the organization.

Recommended Microsoft Azure service solutions for the client and reasoning

  • Approaches standard: Not all recommendations fit the client, cover all aspects of the course, or demonstrate an understanding of Azure products and services.

  • Meets standard: Recommendations fit clearly with an understanding of the client, consider services and solutions from the course, and the reasoning demonstrates an understanding of Azure and cloud concepts.

  • Exceeds standard: Recommendations evidence critical thinking and explain products and services that weren’t chosen, with a rationale as to why.

Description of the costs expected for the client

  • Approaches standard: Costs aren’t clearly explained or lack detail, or the costs don’t include support or a service level agreement (SLA).

  • Meets standard: Costs are clearly explained in a detailed way that’s easy to understand. All costs are considered, including support and SLAs.

  • Exceeds standard: All costs include supporting information that relates cost to the client’s specific needs.

Description of the benefits for the client

  • Approaches standard: Client benefits aren’t clearly explained or lack detail, or a total cost of ownership (TCO) calculation isn’t included.

  • Meets standard: Client benefits are clearly explained in a detailed way that’s easy to understand and include a TCO calculation.

  • Exceeds standard: Benefits include supporting information that relates to the client’s current situation and potential needs in the future.

Description of data security and access control

  • Approaches standard: Description of solution does not specifically address the use of access control to data and general security.

  • Meets standard: Description of solution clearly explains the use of access control to data in the solution.

  • Exceeds standard: Description of solution includes a robust explanation of how security and access control are administered.

Task 2

Visualize through a diagram core cloud data services from the identified list of cloud services/solutions from Task 1

Cloud services in diagram correctly highlight relationships/connections to other services

  • Approaches standard: Some cloud services evidence interconnectivities, but the diagram lacks cohesiveness and detail.

  • Meets standard: Cloud services interconnections and relationships are presented correctly and clearly.

  • Exceeds standard: Cloud services interconnection is well considered and detailed to a good standard for all services.

Create a dashboard using Power BI

  • Approaches standard: Dashboard includes only one tile, or only one kind of visualization, or does not relate to the business described.

  • Meets standard: Dashboard includes two or more tiles, with different visualizations, and relates to the business described.

  • Exceeds standard: Dashboard includes three or more tiles with different visualizations.

Task 3

Presenter aids

  • Approaches standard: Presenter does not describe how the Microsoft Power BI solution can add value to the client business.

  • Meets standard: Presenter describes several aspects of the Microsoft Power BI solution, referencing the architecture diagram, and explains how the solution adds value to the client business.

  • Exceeds standard: Presenter describes a flow, dashboard, and app as part of the Microsoft Power BI solution, referencing each in the architecture diagram and explaining how they work together to add value to the business.

Presentation aids

  • Approaches standard: Not all materials are organized or easy to understand. Not all visual and/or audio elements help audience understanding; some might distract.

  • Meets standard: Materials are organized and clear. Visual or audio elements help audience understanding.

  • Exceeds standard: Materials are interesting, easy to understand, and include at least one way to gather audience responses beyond just asking if there are any questions.

Delivery of presentation

  • Approaches standard: Presenter isn’t prepared or doesn’t engage with the audience.

  • Meets standard: Presenter is prepared and engaged with the presentation and the audience. Communication of ideas is mostly clear and effective.

  • Exceeds standard: Presenter communicates beyond just reading the words on the presentation materials. Communication of ideas is consistently clear and effective.

Azure Synapse Analytics and Azure Purview Work Better Together


Data warehouse, data integration, and big data analytics together are continuing to grow at planetary scale in enterprises. Azure Synapse Analytics provides limitless analytics services to query data using different compute engines and programming languages. Azure Synapse workspaces allow the data engineering, machine learning, and BI projects to coexist without creating silos in experiences, processes, and tools. As data continues to explode and be used, it’s more important than ever to fully govern the data.


The Azure Purview integration in Azure Synapse provides a comprehensive data governance solution with a single pane of glass for all analytics workloads. Organizations can run a variety of analytics projects and put data to work much more quickly, productively, and securely, generating insights from all data sources. The embedded data discovery experience in Azure Synapse powered by Azure Purview further increases data agility and time to insights.


In this blog, you will learn how to govern your Azure Synapse workspace by connecting to Azure Purview for automated data discovery and classifications in the Purview Data Map. You can further use the Purview Data Catalog to search enterprise data and use contextual gestures to build analytics workloads.


Register and scan in Azure Purview


Purview data source administrators can start by registering an Azure Synapse workspace under a collection of the Purview Data Map. Admins can choose to register individual workspaces or simply register the Azure subscription containing all Azure Synapse workspaces. With a few clicks, a recurring scan can be set for automated data discovery of technical metadata and classifications. Azure Purview supports 200-plus classifications that look for sensitive and PII data while scanning. The admin can configure specific databases of a workspace, and credentials managed in Azure Key Vault, for a secured connection while scanning. Creating a private endpoint is supported for scanning Azure Synapse workspaces behind a VNET. Read more details on how to register and scan Azure Synapse Analytics workspaces.




 


Register Azure Purview from Azure Synapse workspace


Azure Synapse contributors can register an Azure Purview account by navigating to Manage > External connections > Azure Purview. With a single click, the Azure Synapse workspace is integrated with Azure Purview for data governance. Azure Purview helps discover all data across your organization, track lineage of data, and create a business glossary wherever it is stored: on-premises, across clouds, in SaaS applications, and in Microsoft Power BI. Read step-by-step documentation to Connect Synapse workspace to Azure Purview.




 


 


Select a Purview account from the dropdown or enter the resource URI manually and click apply. To connect a Purview account behind VNET, read how to Access a secured Azure Purview account.




 


Once registration is complete, the connection and integration status are shown in the details section.




 


 


Search and use enterprise data


The search box in the Azure Synapse workspace menu bar is now powered by Azure Purview for the Data, Develop, and Integrate sections. Start typing the keyword in the search bar to let Purview’s intuitive search automatically complete and match the asset by relevance on name, classification, labels, contacts, and more.




 


Search results are displayed in a dedicated tab for Purview. The familiar search result page experience of Purview is retained inside the Azure Synapse workspace.




 


With a few clicks, narrow down the search results to exact assets in Purview.




 


Time to insights with contextual gestures


In the asset details page, Azure Synapse users can use a variety of contextual gestures to take discovered data into further analytics work. Depending on the asset type discovered, users can use the following gestures (a sketch of the notebook gesture follows the list):



  1. SQL Script experiences to query top 100 rows or create external table

  2. Notebook experiences to create a Spark table or load to DataFrame

  3. Data integration experiences such as new linked service, integration dataset, and development dataflows.
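
As an illustration of the second gesture, the code that a notebook experience produces is conceptually similar to the following PySpark cell. This is a minimal sketch: the database and table names are hypothetical, and the exact code Synapse generates depends on the asset type.

```python
# Inside an Azure Synapse Spark notebook, where `spark` is the
# pre-initialised SparkSession: load a (hypothetical) Spark table
# discovered via the Purview-powered search into a DataFrame.
df = spark.sql("SELECT * FROM demo_lake.taxi_trips LIMIT 100")

# Inspect the schema and a sample of rows before further analysis.
df.printSchema()
df.show(10)
```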




 


 




 


Data producers can directly edit the asset from the Azure Synapse workspace and curate by adding business glossary, description, classifications, and contact details.




 


 


Automated data lineage from Azure Synapse


Data lineage from Azure Synapse is automatically pushed to the connected Purview account for Copy and Data Flow activities. Detailed documentation is available for Metadata and lineage from Azure Synapse Analytics.


The lineage metadata is pushed to Purview in real time at the end of each pipeline run. It includes granular details such as column lineage, pipeline run status, row count, and additional metadata.




 


Lineage status is available from the pipeline run monitoring page of the Azure Synapse workspace.




 


 


Get started with Azure Purview today 



  • Quickly and easily create an Azure Purview account to try the generally available features.

  • Read the quick start documentation on how to connect an Azure Synapse workspace to an Azure Purview account.

The 5-Minute Recap: Everything new with Security, Compliance, and Identity on Microsoft Learn


Welcome to our new monthly blog series featuring the latest Security, Compliance, and Identity content updates on Microsoft Learn! This is our first post, and we’re highlighting recently released updates, including a new learning path we launched during Cybersecurity Awareness Month. Starting in January, we’ll highlight new learning paths, modules, and other content updates we make each month to give you the skills you need on your learning journey.


 


Read on to check out some of the latest updates from our Security, Compliance, and Identity portfolio.


 


Introduction to cybersecurity


Knowing the fundamentals of cybersecurity is a first step toward protecting against cyberthreats. Our new learning path—Describe the basic concepts of cybersecurity—delivers foundational knowledge about cybersecurity concepts including cryptography, authentication, and authorization, along with exploring ways to protect yourself and your business from cyberattacks.


 


AZ-500: Microsoft Azure Security Technologies


This four-part series of learning paths will equip you with the knowledge you need to take Exam AZ-500.



This learning path will teach you how to secure Azure solutions with Azure Active Directory, implement hybrid identity, deploy Azure AD identity protection, and configure Azure AD privileged identity management.


 



This learning path will teach you how to lock down the infrastructure and network resources that are running in your Azure environment.


 



This learning path will teach you how to deploy and secure Azure Key Vault, configure application security features, implement storage security, and configure and manage SQL database security.


 



This learning path will teach you how to configure and manage Azure Monitor, enable and manage Azure Security Center, and configure and monitor Azure Sentinel.


 


You can take Exam AZ-500: Microsoft Azure Security Technologies once you have completed the learning path to earn a certification.


 


Microsoft Endpoint Configuration Manager


Microsoft Endpoint Configuration Manager—which is part of Microsoft Endpoint Manager—helps you protect the on-premises devices, apps, and data that the people at your organization use to stay productive. Our newest module, Understand co-management using Microsoft Endpoint Configuration Manager, provides an in-depth look at how to enable co-management based on the implementation path that best suits your organization. You’ll also:



  • Learn about the benefits of co-management

  • Understand the co-management prerequisites

  • Learn about paths to implement co-management


 


We’re excited to hear how you use these updated resources on your journey to certification!

Evaluation Lab: Expanded OS support & Atomic Red Team simulations


Microsoft Defender for Endpoint’s Evaluation Lab is an environment that allows security teams to seamlessly test their defense against threats. We are excited to share that the Evaluation Lab now supports adding Windows 11, Windows Server 2016, and Linux devices. In addition, we’d also like to announce a new partnership with Red Canary’s open-source simulation library, Atomic Red Team! 


 


NOTE: Both updates are only available in the Microsoft 365 Defender portal at security.microsoft.com.


 


Expanded OS support


The Evaluation Lab now supports the following operating systems: Windows 10, Windows 11, Windows Server 2019, Windows Server 2016, and Linux (Ubuntu). To create a new device, simply select it within the “Add device” wizard. The new device will automatically be onboarded, with no additional steps required.


 




 


Once created, you can connect to the device via RDP (Windows) or SSH (Linux). You can connect to a Linux device using any SSH client.


 




 


Atomic Red Team simulations


Powered by Red Canary, Atomic Red Team is an open-source library of tests that security teams can use to simulate adversarial activity in their environments. Atomic tests are simple – each test is mapped to a single MITRE ATT&CK® technique or sub-technique, most of them have no prerequisites, and many come with easy-to-use configuration and cleanup commands.


Evaluation Lab users can now use Atomic Red Team simulations to evaluate Microsoft Defender for Endpoint’s detection capabilities against both Windows and Linux threats. The simulations are provided as script files, so security teams can choose to run them in the Evaluation Lab or any other testing environment of their choice.


 




 


The first simulation, 2021 Threat Detection Report, executes tests according to Red Canary’s latest report of top Windows techniques associated with confirmed threats, as compiled from roughly 20,000 confirmed threats detected across customer environments.


 


The second simulation, Linux techniques, is a collection of simple tests compiled to allow security teams to evaluate Microsoft Defender for Endpoint’s detection capabilities against common Linux persistence, discovery, and defense evasion techniques.


 


We’re looking forward to you trying out the Evaluation Lab updates. Let us know your thoughts and feedback in the comments below or through the feedback tool in the portal!

10 shades of public API hosting on Azure


APIs are everywhere, and there are many ways to host them in Azure! Let us see the different possibilities, with the pros and cons of each. I am not going to discuss the bits and bytes of each possibility. The purpose of this post is to give you a rough idea of what is possible for a simple scenario (single region; high availability and disaster recovery are out of scope). I will provide small diagrams for the more advanced scenarios.


 


1) Function App – Consumption tier


 


Function Apps ship with HTTP-triggered functions. These can be suitable for exposing tiny APIs.
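
To give a sense of what “tiny API” means here, a single HTTP-triggered function is just a handler like the sketch below. This assumes the Azure Functions Python programming model with a matching function.json HTTP binding; the greeting logic is purely illustrative.

```python
import json

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Read an optional query-string parameter from the HTTP request.
    name = req.params.get("name", "world")

    # Return a small JSON payload - the whole "API" is this handler.
    body = json.dumps({"message": f"Hello, {name}!"})
    return func.HttpResponse(body, mimetype="application/json", status_code=200)
```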


 


Pros: Cost-friendly (economies of scale), Easy to deploy, Fully elastic with built-in auto-scaling from 0 to n instances.


Cons: Limited security controls. Network ACLs are the only way to limit public exposure. Data accessed by such functions must be public from a connectivity perspective. Cold start due to serverless tier. Limited execution time as well as per-execution resource consumption. No WAF (Web Application Firewall) features.


 


Use cases: Lab, PoC, Prototyping, Limited budgets, Basic API needs (i.e., no catalog, no versioning, etc.), Asynchronous APIs, Synchronous APIs that can live with the cold start, No strong compliance requirements.


 


2) Multi-Tenant App Service – Standard tier


Like functions, Web Apps are pretty neat and easy to get started with. Microsoft is managing everything for you under the hood.


 


Pros: Cost-friendly (economies of scale), though a fixed cost is incurred (unlike functions on the consumption tier). Easy to deploy, with auto-scaling plans. Resources are limited to the capacity you are willing to pay for. No cold start!


Cons: Limited security controls. Network ACLs are the only way to limit public exposure. Data accessed by such apps must be public from a network perspective. No WAF.


Use cases: Lab, PoC, Prototyping, Limited budgets, Basic API needs (i.e., no catalog, no versioning, etc.), No strong compliance requirements.


 


3) Azure Container Instances (ACI)


While Azure Container Instances can be used to host long-running services, I would advise against this idea and keep ACIs for asynchronous job operations, short-lived executions, and the serverless (virtual kubelets) part of Azure Kubernetes Service.


 


Pros: Cost-friendly (pay per second of execution), provided the API is not constantly up and running.


Cons: Limited security controls with Windows containers; better with Linux, as Linux-based ACIs can integrate with virtual networks.


Use cases: Lab, PoC, Prototyping, Limited budgets, Basic API needs (i.e., no catalog, no versioning, etc.), No strong compliance requirements. Lift & shift of plain old legacy Windows-based backend services.


4) Function Apps Consumption tier or App Service Standard + Azure API Management (APIM) Consumption tier


In this setup, you intend to publish APIs through Azure API Management. The pros & cons of the underlying hosting option (app service or function apps) remain as explained earlier and are not repeated below.


 


Pros: Cost-friendly, because the serverless flavor of APIM has no fixed cost. It will auto-scale with the actual demand. You can add features to your APIs, such as enforcing policies (JWT validation, header checks, etc.), as well as version them.


Cons: More security controls, but there are still a few major caveats: network ACLs remain the only way to limit public exposure of the backend, and traffic cannot be forced through APIM because the consumption tier has no static IP, so APIM can’t be used as a network ACL on the backend side. Data accessed by such apps must still be public from a network perspective. Still no WAF, because APIM is a PEP (Policy Enforcement Point) but not a WAF.


Use cases: Lab, PoC, Prototyping, Limited budgets, More advanced API needs (catalog, versioning, consistent way of exposing APIs etc.), No strong compliance requirements.


 


5) Function Apps Consumption tier or App Service Standard + Azure API Management (APIM) Basic or Standard tier


In this setup, you intend to publish APIs (and enforce routing) through Azure API Management. 


 


Pros: You benefit from APIM capabilities AND you can restrict traffic to the backend to your APIM instance because as of the basic tier, APIM comes with a static IP. 


Cons: A bit more expensive (fixed cost for APIM). Manual scaling for the Basic tier (plans possible as of Standard). Data stores accessed by the backends must still be public from a network perspective. Still no WAF, because APIM is a PEP (Policy Enforcement Point) but not a WAF.


Use cases: Limited budgets, More advanced API needs (catalog, versioning, consistent way of exposing APIs etc.), No strong compliance requirements.


 


6) App Service (or Functions) on Premium tier + Private Endpoint + VNET Integration + WAF


In this setup, you want to isolate your backend services totally from the internet and make them accessible only through a web application firewall (WAF). Because it is a little more complex, here is a small diagram showing the different blocks and their interactions.


[Diagram: A WAF fronting an App Service that uses a Private Endpoint for inbound traffic and VNET integration for outbound traffic]


The traffic flows from a caller (here a mobile device) to a WAF, which has a public IP. The WAF has a backend pool targeting the endpoints defined in the corresponding private endpoint subnet. The app service is integrated with Azure Private Link (and a private DNS zone) for the INBOUND traffic. VNET integration for the App Service (or function app) is enabled to handle the OUTBOUND traffic through another VNET subnet.


 


Pros: This hosting option is more secure than the preceding ones because the data stores can be firewalled, thanks to the control over the outbound traffic of the API. The backend services are isolated from the internet and proxied by a WAF.


Cons: This architecture is a bit convoluted and is not the best one to run at scale.


Use cases: Stronger focus on security. Basic API needs (no more APIM in the picture). 


 


7) App Service (or Functions) on Premium tier + Private Endpoint + VNET Integration + WAF + APIM Premium


The purpose of this setup is the same as the previous one, but you want to combine both WAF and APIM (as it should be) before hitting backend services.


 


[Diagram: A WAF and APIM Premium in front of an App Service with Private Endpoint and VNET integration]


 


Pros: Inbound traffic is more secure because it traverses a WAF and a PEP.  Network ACLs can be set at backend level to only let the API gateway (which has a static IP) call the backend. Outbound traffic of the API gateway can be controlled by a NVA or Azure Firewall.


Cons: This architecture is a bit convoluted and is not the best one to run at scale. APIM premium is expensive but is required because at the time of writing (11/2021), only the Premium tier integrates with Virtual Networks. 


Use cases: Stronger focus on security, advanced API needs and possible geo-distributed APIs setup.


 


8) WAF + APIM Premium + App Service Environment (ASE)


Before ASE v3, ILB ASEs had a rather bad reputation because of their cost (flat fees) and their complexity. It was indeed quite easy to break them with improperly configured firewall rules. ASE v3 is a breeze to set up and is less expensive (no more flat fee). Therefore, the ILB ASE comes back as a very interesting option because it offers best-in-class security at an affordable price, at least from a backend hosting perspective.


 


[Diagram: A WAF and APIM Premium in front of an ILB App Service Environment]


 


 


Pros: Inbound and outbound traffic can be fully controlled by an NVA or Azure Firewall. Intra-VNET traffic can be controlled with Network Security Groups. Backends are totally isolated from the internet. This setup is scalable because the ASE can host tons of backends and functions. The underlying compute is based on a single-tenant architecture (Isolated tier).


Cons: Costs (incurred by the Isolated tiers and APIM Premium) and complexity. Although ASE v3 is a breeze compared to its predecessors, this setup is often part of a larger Hub & Spoke architecture, which involves a lot of networking and firewalling work. You do not get started with it overnight!


Use cases: Stronger compliance requirements, advanced API needs and possible geo-distributed APIs setup. This setup is perfectly suitable as a Web Landing Zone that hosts tons of web apps and APIs. 


 


9) WAF + APIM Premium + AKS


Kubernetes has become a first-class citizen everywhere and AKS is the Microsoft-managed K8s offering on Azure (By the way, Azure Arc also has a ton of handy features to manage K8s clusters at scale wherever they are hosted). So, with this in mind, I could not skip it. Here is a very simplified diagram showing the different building blocks:


 


[Diagram: A WAF and APIM Premium in front of an AKS cluster]


 


Pros: Very similar to the previous architecture with regards to inbound and outbound traffic, Hub & Spoke integration, etc., although AKS adds serious bits of extra complexity network-wise. AKS allows you to host nearly anything and has a very rich ecosystem. When I think AKS, I think all the benefits of VMs combined with all the benefits of cloud-native architectures (infrastructure as code, increased resilience, zero downtime, releases during business hours, polyglot apps, etc.).


Cons: Costs incurred by APIM Premium and the AKS node pools, which should involve at least 3 nodes but ideally 5 for a minimal production-grade setup. Another potential deal-breaker for some organizations is the complexity of K8s (AKS). App Services and Function Apps are way easier to work with, and it is a Kubernetes lover who tells you this!


Use cases: Stronger compliance requirements, advanced API needs and possible geo-distributed APIs setup. This setup is perfectly suitable as a Web Landing Zone that hosts tons of web apps and APIs. Microservices architectures (K8s and its ecosystem, including service meshes, are very supportive of microservices architectures).


 


10) Container Apps


This new service (public preview in 11/2021) is very promising because it comes with some of the AKS promises without the complexity, because Microsoft manages nearly everything for you. Container Apps remind me to some extent of Service Fabric Mesh; let’s hope they have a better future. However, at the time of writing, it is in no way in line with typical enterprise needs (Hub & Spoke), but Microsoft is working on a BYO VNET feature. It is still a little early to come up with pros and cons, but here are a few of them.


 


Pros: Cost friendly since it scales from 0 to n, like Azure Functions. Easy to deploy and manage.


Cons: N/A (too early)


Use cases: Right now, PoCs and prototyping only. In the future, microservices architectures, which is why this service has been built from the ground up.