A Comprehensive Guide to Getting Started with Data API Builder for Azure SQL Database or SQL Server


This article is contributed. See the original author and article here.

Getting started with Data API builder for Azure SQL Database or SQL Server


 




 


What is Data API builder?


Data API builder for Azure Databases provides modern REST and GraphQL endpoints for your Azure databases. Database objects can be safely exposed via REST or GraphQL endpoints with Data API builder, allowing your data to be accessed using modern approaches on any platform, language, or device. Granular security is ensured by an integrated and configurable policy engine; integration with Azure SQL, SQL Server, PostgreSQL, MySQL, and Azure Cosmos DB provides developers with unprecedented productivity.


Data API Builder Features



  1. REST operations: CRUD (POST, GET, PUT, PATCH, DELETE), filtering, sorting, and pagination.

  2. GraphQL operations: queries and mutations.

  3. Support for authentication via OAuth2/JWT.

  4. Role-based authorization using received claims.

  5. Collections, tables, views, and stored procedures can be accessed via REST and GraphQL.

  6. Full integration with Static Web Apps via the Database Connections feature when running in Azure.

  7. Easy development via a dedicated CLI.

  8. Open source, so you can always contribute.


Our Focus.


In this blog, I’m going to walk you through a simple process to get started with Data API builder with a Microsoft SQL database deployed on Azure. The same can also be done with Azure Cosmos DB, Azure Database for PostgreSQL, and Azure Database for MySQL.


Prerequisites.



  • Basic understanding of REST CRUD operations like POST and GET

  • Basic understanding of a relational database like Microsoft SQL Server or MySQL

  • Willingness to follow along and learn; the two prerequisites above are not a must.


Requirements to get started.


Download .NET 6 and the VS Code editor, then search for the Live Preview and Thunder Client extensions in the VS Code Extensions tab and install them.



Procedure



  1. Provision and deploy Microsoft SQL on Azure

  2. Create a Data API builder

  3. Test our API with Thunder Client & Insomnia

  4. Link our Azure SQL DB via Data API builder to our simple HTML page to view our data




 


Provision and deploy Microsoft SQL on Azure


 



 




 



  • Search for SQL databases


 




 



  • Click create to start provisioning your service.


 




 



  • Select your subscription, create a resource group, enter a database name, and select No for the elastic pool option, since this is just a demo project.


 




 



  • Create your SQL server and select a location.


 




 



  • For authentication I have selected both SQL and Azure Active Directory authentication, but SQL alone is enough for this project. Add a server admin login and password and keep them safe, because we will need them later.


 




 


 



  • After creating the SQL server, select locally-redundant backup storage for demo purposes.


 




 



  • For these options, let’s go with the defaults.


 




 


 





 




 



  • Create tags to identify your services in your resource group, then select Review + create.


 




 


 



  • Click Go to resource after deployment is complete.


 




 


 



  • On the Overview page you will see all the essential details of your SQL database and SQL server.


 




 


 


 



  • Click on the Query editor and sign in with either SQL Server authentication or Azure Active Directory.


 




 



  • You can view your database name below


 




 


 



  • Click New Query and add the code below to create two tables (Brand and Category), then click Run to execute these queries.
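The table-creation script in the original screenshot is not reproduced here; a minimal T-SQL sketch, assuming the CategoryID/CategoryName and BrandID/BrandName columns referenced later in this walkthrough, might look like this:

    -- Hypothetical columns; the original screenshot may define more
    CREATE TABLE dbo.Category (
        CategoryID   INT NOT NULL PRIMARY KEY,
        CategoryName NVARCHAR(100) NOT NULL
    );

    CREATE TABLE dbo.Brand (
        BrandID   INT NOT NULL PRIMARY KEY,
        BrandName NVARCHAR(100) NOT NULL
    );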


 




 


 


 



  • Add these INSERT statements to load data into our database.
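The insert script itself was shown as a screenshot; a minimal sketch with made-up sample rows could be:

    -- Sample data; the original values are not known
    INSERT INTO dbo.Category (CategoryID, CategoryName)
    VALUES (1, 'Category 1'), (2, 'Category 2');

    INSERT INTO dbo.Brand (BrandID, BrandName)
    VALUES (1, 'Brand 1'), (2, 'Brand 2');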


 




 


 



  • Click Run to execute, and check for the ‘Query succeeded’ message below the editor.


 




 


 



  • Refresh to view the newly created tables.


 




 



  • You can easily view tables available in our database


 




 


 



  • Run the SQL statement below to confirm the data inside our tables.
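For example, a simple check could be:

    SELECT * FROM dbo.Category;
    SELECT * FROM dbo.Brand;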


 




 



  • From the Connection strings blade, copy the ADO.NET (SQL authentication) connection string; we will use it later. (NB: do not expose a connection string that contains a password; I am doing so here only for demo purposes.)


 




 


Create a Data API builder.


 



  • Back on our Windows machine, open Windows PowerShell and type dotnet; if a usage message doesn’t appear, install .NET 6.0.




 



  • Run the below command to install Data API Builder locally.
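The command in the screenshot is not reproduced; Data API builder ships as a .NET global tool, so the install command should look like this (assuming the Microsoft.DataApiBuilder package name):

    dotnet tool install --global Microsoft.DataApiBuilder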


 




 



  • Confirm installation was successful by running the below command.
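For example, printing the CLI version is a quick check:

    dab --version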


 




 


Test our API with Thunder client & Insomnia


 


 



  • Open any folder you want to work in with VS Code, then open your terminal. Run dab init --database-type "mssql" --connection-string "enter-the-connection-string-we-copied-in-the-steps-above". It is good practice to put the connection string in a .env file in your root folder and reference it in the connection string, i.e. dab init --database-type "mssql" --connection-string "@env('my-connection-string')".
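As a sketch (the variable name my-connection-string and the placeholder values are only illustrations), the .env file in the project root holds the secret and the init command references it:

    # .env (do not commit this file)
    my-connection-string=Server=tcp:<your-server>.database.windows.net,1433;Initial Catalog=<your-database>;User ID=<admin>;Password=<password>;Encrypt=True;

    # initialize the config using the environment variable
    dab init --database-type "mssql" --connection-string "@env('my-connection-string')"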


 




 



  • The command generates a config file called dab-config.json looking like this:
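The generated file is shown as a screenshot in the original post; a rough sketch of its shape (field names from the dab CLI as I recall them, so your generated file may differ slightly) is:

    {
      "data-source": {
        "database-type": "mssql",
        "connection-string": "@env('my-connection-string')"
      },
      "runtime": {
        "rest": { "path": "/api" },
        "graphql": { "path": "/graphql" },
        "host": {
          "mode": "development",
          "cors": { "origins": [], "allow-credentials": false },
          "authentication": { "provider": "StaticWebApps" }
        }
      },
      "entities": {}
    }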


 




 



  • The data-source section specifies which database type we are using and the connection string. The runtime section specifies how our Data API is consumed: the REST endpoint is made available at the path `/api/` and the GraphQL endpoint at `/graphql`. Under host we specify the mode of our Data API, either production or development; cors lets us set allowed origins, for example if yourdomain.com is the client app that will send requests to your API, you should add it under origins. Under authentication you can specify how you want requests to be authenticated.


 




 


 



  • Add the configuration below for the brand and category entities. These entities map directly to your tables in the Azure SQL database; we do this by specifying the source. Permissions let us specify the allowed actions: "*" means all operations (create, read, update, delete) are enabled. With role you can control who may perform those actions; anonymous lets anyone perform all of them.
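The entity configuration from the screenshot is not reproduced; a hedged sketch that maps the two tables created earlier (entity and column names are assumptions) looks like this, and the same result can be produced with the dab add CLI command (for example, dab add category --source dbo.Category --permissions "anonymous:*"):

    "entities": {
      "category": {
        "source": "dbo.Category",
        "permissions": [
          { "role": "anonymous", "actions": [ "*" ] }
        ]
      },
      "brand": {
        "source": "dbo.Brand",
        "permissions": [
          { "role": "anonymous", "actions": [ "*" ] }
        ]
      }
    }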


 




 



  • To start the Data API builder engine, run the command below.
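Presumably the screenshot showed the start command:

    dab start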


 




 



  • Launch the Thunder Client VS Code extension (on Windows, CTRL + SHIFT + R). Create a new request, select GET, and enter https://localhost:5001/api/category. Click Send and you should get a response from our deployed Azure SQL database.
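A sketch of the request and the kind of response to expect (Data API builder wraps results in a value array; the rows shown are the hypothetical sample data from earlier):

    GET https://localhost:5001/api/category

    {
      "value": [
        { "CategoryID": 1, "CategoryName": "Category 1" },
        { "CategoryID": 2, "CategoryName": "Category 2" }
      ]
    }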


 




 



  • The GET verb also supports several query parameters (these are case sensitive) that allow you to manipulate and refine the requested data; example requests follow this list:

    • $orderby: return items in the specified order

    • $first: the top n items to return

    • $filter: expression to filter the returned items

    • $select: list of field names to be returned
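For example (URLs are sketches against the hypothetical category entity):

    GET https://localhost:5001/api/category?$select=CategoryID
    GET https://localhost:5001/api/category?$filter=CategoryName eq 'Category 2'
    GET https://localhost:5001/api/category?$orderby=CategoryName desc
    GET https://localhost:5001/api/category?$first=2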




 




 


 



  • The $select query parameter will return only the CategoryID field after you click Send.


 




 



  • The $filter query parameter with the eq operator will return the rows whose CategoryName equals ‘Category x’ after you click Send.


 




 



  • Data API builder offers validation; for example, you can’t send CategoryName as the only field, because our table requires both CategoryName and CategoryID. An error will be returned.


 




 


 



  • REST endpoint POST: creates a new category after you add a JSON body like the one below and click Send.
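A sketch of such a request (the body values are illustrative):

    POST https://localhost:5001/api/category
    Content-Type: application/json

    {
      "CategoryID": 3,
      "CategoryName": "Category 3"
    }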


 




 



  • REST endpoint PUT: updates a category after you add a JSON body like the one below and click Send.
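A sketch, assuming the item-by-key route format /api/&lt;entity&gt;/&lt;pk-column&gt;/&lt;value&gt;:

    PUT https://localhost:5001/api/category/CategoryID/3
    Content-Type: application/json

    {
      "CategoryName": "Category 3 (updated)"
    }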


 




 


 



  • Whenever you need to access a single item, you can get the item you want through a GET request by specifying its primary key, for example CategoryID.
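For example:

    GET https://localhost:5001/api/category/CategoryID/2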


 




 



  • To delete a record, specify its primary key (for example CategoryID), set the HTTP method to DELETE, and click Send.
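For example:

    DELETE https://localhost:5001/api/category/CategoryID/3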


 




 



  • The GraphQL endpoint is available at `/graphql`. Set the HTTP method to POST, provide a query like the one below, and click Send. For this example we need a client such as Postman or Insomnia; I have used Insomnia.
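The query in the screenshot is not reproduced; a sketch, assuming Data API builder pluralizes the category entity to a categories field and wraps results in items, would be:

    query {
      categories {
        items {
          CategoryID
          CategoryName
        }
      }
    }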


 




 


 


 



  • This is how we filter using the filter argument.
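A sketch of a filtered query (same assumptions as above):

    query {
      categories(filter: { CategoryName: { eq: "Category 2" } }) {
        items {
          CategoryID
          CategoryName
        }
      }
    }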




 



  • This is how we order items descending or ascending using orderBy.
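A sketch of an ordered query (same assumptions as above):

    query {
      categories(orderBy: { CategoryName: DESC }) {
        items {
          CategoryID
          CategoryName
        }
      }
    }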


 




 


 


 


Link our Azure SQL DB via Data API Builder to our simple HTML Page to view our Data.


 For this demo I am using a simple HTML page with jQuery and Bootstrap, which helps me make the HTTP request to our API server using fetch, get the response, and append it to our HTML page dynamically.



  • Step one: add the code below to index.html, then right-click the file and open it with the live server. Copy the URL displayed after the HTML page loads.
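The original index.html is only shown as a screenshot; a minimal self-contained sketch that fetches the category endpoint and appends the rows to the page (plain fetch and a Bootstrap CDN link are used purely as illustrations) could be:

    <!DOCTYPE html>
    <html>
    <head>
      <title>Categories</title>
      <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css">
    </head>
    <body>
      <ul id="categories" class="list-group m-4"></ul>
      <script>
        // Call the Data API builder REST endpoint and render each category
        fetch("https://localhost:5001/api/category")
          .then(response => response.json())
          .then(data => {
            data.value.forEach(c => {
              const li = document.createElement("li");
              li.className = "list-group-item";
              li.textContent = c.CategoryID + " - " + c.CategoryName;
              document.getElementById("categories").appendChild(li);
            });
          });
      </script>
    </body>
    </html>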




 


 



  • Step two: paste that URL into dab-config.json under origins. Save your code. In your terminal, press `CTRL + C` to terminate the Data API server and start it again using `dab start`.
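For example, assuming the page is served at http://127.0.0.1:3000 (your live server URL may differ), the relevant part of dab-config.json would be:

    "host": {
      "cors": {
        "origins": [ "http://127.0.0.1:3000" ]
      }
    }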




 



  • There you go: you have successfully connected your static web app with the Data API builder for Azure Databases engine.




 


 


Read more:



 


 

Maratona de Inteligência Artificial: live classes and a free study track

This article is contributed. See the original author and article here.


Artificial Intelligence is more present in our daily lives every day. According to the Future of Jobs Report published by the World Economic Forum, AI skills are the third-highest priority for companies' training strategies, alongside analytical and creative thinking.

 

To help you gain new knowledge, we are running the Maratona #AISkills, a free online immersion aimed at people who want to learn how to use AI in practice, covering topics such as AI Ethics, Prompt Engineering, GitHub Copilot, Data Analysis, Machine Learning, Azure OpenAI, and much more.

 


July 18, 12:30 PM
Speaker: Danielle Monteiro
Microsoft Responsible AI principles in practice
Responsible AI is the application of ethical and social principles to the creation, development, and use of Artificial Intelligence (AI) technologies. These principles help ensure that AI systems are used responsibly and safely, while also helping to protect human rights.

July 19, 12:30 PM
Speaker: Henrique Eduardo Souza
Gain productivity with Prompt Engineering
Prompt engineering is the practice of giving an AI model specific instructions to produce the desired results. From ChatGPT to GitHub Copilot, you will learn how to write efficient prompts to generate text and even lines of code.

August 1, 12:30 PM
Speaker: Cynthia Zanoni
How to build a rock, paper, scissors game with GitHub Copilot
GitHub Copilot is transforming the productivity and learning journey of developers. In this talk, we will explore Codespaces and GitHub Copilot to build a rock, paper, scissors game!

August 3, 12:30 PM
Speaker: Livia Regina Bijelli Ferreira
Microsoft Fabric: data analytics for the era of AI
Today's world is flooded with data, always flowing from the devices we use, the applications we build, and the interactions we have. Microsoft Fabric is a unified, end-to-end analytics platform that brings together all the data and analytics tools organizations need. Fabric integrates technologies such as Azure Data Factory, Azure Synapse Analytics, and Power BI into a single unified product, empowering data and business professionals to unlock the potential of their data and lay the foundations for the era of AI.

August 15, 12:30 PM
Speaker: Pablo Lopes
Building an assistant with the Azure OpenAI Service
Build a natural assistant with Azure OpenAI: take notes, capture the most important points, and get expert-style feedback on your ideas.

August 16, 7 PM
Speaker: Beatriz Matsui
Introduction to MLOps: concepts and practice
Machine learning is at the core of artificial intelligence, and many modern applications and services depend on predictive machine learning models. In this session we will cover MLOps (Machine Learning Operations) concepts and how to work with these practices using Microsoft DevOps and machine learning tools.

 


The Cloud Skills Challenge is a platform integrated with Microsoft Learn, Microsoft's global portal of free courses, available 24 hours a day, 7 days a week. Besides attending the free Maratona #AISkills sessions, you can join our Artificial Intelligence study group; just register for the Cloud Skills Challenge.


 


After completing a challenge, you will receive a Microsoft Learn AI Skills Challenge badge and a certificate of completion. In the Maratona #AISkills, there are 4 study challenge tracks. You can choose one or more tracks!


 


Machine Learning Challenge

Machine learning is at the center of artificial intelligence, and many modern services depend on predictive machine learning models. Learn how to use Azure Machine Learning to create and publish models without writing code. You will also explore the various developer tools you can use to interact with the workspace.


 


Cognitive Services Challenge

Azure Cognitive Services are building blocks of AI functionality that can be integrated into applications. You will learn how to provision, secure, and deploy cognitive services. Using this tool, you can create intelligent solutions that extract semantic meaning from text and support common computer vision scenarios.


 


Machine Learning Operations (MLOps) Challenge

Machine learning operations (MLOps) applies DevOps principles to machine learning projects. You will learn how to implement key concepts such as source control, automation, and CI/CD to build an end-to-end MLOps solution while using Python to train, save, and use a machine learning model.


 


AI Builder Challenge

This challenge introduces AI Builder, teaches you how to create models, and explains how you can use them in Power Apps and Power Automate. You will learn how to create topics, custom entities, and variables to capture, extract, and store information in a bot.


 


The challenges start on July 17; if you want to register early, visit the Cloud Skills Challenge page.


 




BONUS


Free benefits for students




GitHub tutorials for beginners in programming


I want to highlight these resources because all the tutorials have been translated into Portuguese with the help of our Microsoft Learn Student Ambassadors community. So check out the tutorials and don't forget to leave a star!

 


Microsoft Learn courses with certificates


Grant users access to data assets in your enterprise through the Microsoft Purview policies API

This article is contributed. See the original author and article here.

Microsoft Purview Data owner policies is a cloud-based capability that helps you provision access to data sources and datasets securely and at scale. Data owner policies expose a REST API through which you can grant any Azure AD identity (user, group, or service principal) Read or Modify access to a dataset or data resource. The scope of the access can range from fine-grained (e.g., a table or file) to broad (e.g., an entire Azure resource group or subscription). This API provides a consistent interface that abstracts the complexity of permissions for each type of data source.


 


More about Microsoft Purview Data policy app and the Data owner policies at these links:



 


If you would like to test drive the API, sign-up here to join the private preview.

5 ways accounts payable automation drives digital transformation


This article is contributed. See the original author and article here.

Finance teams’ roles have evolved—and expanded—into new realms. As economic pressures demand that everyone deliver more with fewer resources, finance professionals’ plates are increasingly crowded with analysis, strategy, even supplier relationships—plus all the traditional finance processes they’ve long been responsible for.  

It’s not just moving invoices along—finance teams are being asked by upper management to bring cost-saving, time-optimizing, and value-adding insights to the table when business model evolution and digital transformation are discussed. And they’re looking for any solution that will give their overstretched employees time back in their day to think bigger, by tightening up tedious processes that drain human energy and douse the spark of innovation. 


The Future of Finance

Streamline your accounts payable process and free up resources to fund your business transformation

For finance teams, incorporating automation into accounts payable (AP) is a great place to start. After all, the work of capturing invoices, processing and verifying them, then paying vendors can be complex—with plenty of sub-processes to support the end goal of on-time payments and stronger customer relationships. Automating parts of the accounts payable process lets finance teams spend less time on repetitive tasks and more time on higher-value work that builds agility in finance operations and across your business.  

5 ways accounts payable automation elevates finance operations

To get started on your AP automation journey, check out this webinar, The Future of Finance: Unlocking the Benefits of Accounts Payable Automation, and learn how automating accounts payable data can help you:

1. Focus on strategy—not tedious data tasks

Allow your finance teams to put more energy where it matters—fulfilling, strategic work that keeps them engaged—not cumbersome accounts payable processes that waste time and drag down productivity.  

2. Rein in costs and unnecessary fees

Understand payment trends, analyze vendor performance, and improve processing time to drive significant savings. Cut paper costs by automating manual processes, while helping avoid handwritten errors that lead to mistakes and late payment fees downstream.

3. Stop small errors from becoming big problems

Improve accuracy with automated accounts payable software that limits the unavoidable errors inherent with manual data entry—so you sidestep risk now while dialing in what you need for compliance later.

4. Get to know your vendors

Strengthen vendor relationships by automating AP processes to pay them on time, every time. And over time, analyze accounts payable data to reveal valuable cost-saving insights that put you on more solid footing for future negotiations.

5. Gain more visibility

Put more eyes on your cashflow to help people across your business focus more on cost savings. Empower everyone—not just data specialists or finance operations teams—to support larger initiatives with more readily available data.

When should you automate accounts payable?

To keep pace with competitors, you’re probably looking at ways your finance team can integrate new technology into traditional processes like accounts payable that have long kept business’ bottom lines in order. Because the pace of business isn’t slowing, nor are potential disruptions retreating. To respond to changing business conditions and guard against long-term risk, finance leaders are starting to recognize how the latest AI-powered automation can help them take on what’s currently taking up too much of their employees’ time: 

  • Complex processes: The more complex the process, the more challenging it can be to manage effectively. Inefficient processes with too many steps are not only a drag on people’s time and energy, they’re often a barrier to digital transformation. When that leads to a delayed payment, it can damage customer relationships that have taken years to build—putting the wrong kind of spotlight on finance teams. 
  • Strained IT: When organizations are facing flat (or shrinking) IT budgets and people don’t have the tools they need, it’s hard to build consistent processes. A lack of IT budget is more than annoying; unreliable systems can overwhelm employees, get in the way of strategic work they’d rather be doing, and make it tougher to follow processes that finance regulations demand. 
  • Data overload: Invoices, payment records, information on multiple vendors—all that data is difficult to manage on its own. When people naturally turn to manual processes to make sense of it all, it’s often time intensive and susceptible to errors, fraud, and missed opportunities for insights. 
  • People power: Freeing up resources to hire new talent is tough enough, and retaining that talent is even tougher when mundane tasks fill up their plate. Engaging them in their current role with opportunities to be creative and offer strategic insights is often a better investment than trying to find, onboard, and retain a new hire. 

If these obstacles sound familiar, automating accounts payable processes is a cost-efficient way to start moving your organization past them. When your company is ready to shift from managing cumbersome accounts payable operations to supporting strategic initiatives, Microsoft Dynamics 365 Finance can help. 

How Dynamics 365 Finance modernizes accounts payable operations

Dynamics 365 Finance modernizes accounts payable operations by capturing invoices in multiple formats (digital and manual), processing them while coding and resolving errors with full automation, then automatically paying vendors on time—with analytics around payment scenarios and compliance gathered in real time. Be sure to watch the webinar to learn more, including how one Microsoft customer cut invoice costs in half and reduced overall costs by 25 percent by streamlining procedures across their business, including invoice processing, with accounts payable automation. It also features a demo of how to use AI-powered automation to accelerate digital transformation with Dynamics 365 Finance.  

The post 5 ways accounts payable automation drives digital transformation appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Public Preview: Exchange Online RBAC management in Microsoft Graph using Unified RBAC Schema


This article is contributed. See the original author and article here.

Today, we’re excited to announce the public preview of Exchange Online Role Based Access Control (RBAC) management in Microsoft Graph. The preview is designed for admins who want a consistent management interface, and for developers who want to programmatically control RBAC.


The public preview supports create, read, update, and delete APIs in Microsoft Graph which conform to a Microsoft-wide RBAC schema. Exchange Online RBAC role assignments, role definitions, and management scopes are supported through this new API.


With this preview, Exchange Online joins other RBAC systems in the Microsoft Graph Beta API, namely, Cloud PC, Intune, and Azure AD directory roles and entitlement management.


How Unified RBAC for Exchange Online works


Admins assigned the appropriate RBAC role in Exchange Online can access Unified RBAC using the Microsoft Graph beta endpoint or by using Microsoft Graph PowerShell. RBAC data remains stored in Exchange Online and can be configured using Exchange Online PowerShell.


In addition to Exchange RBAC permissions, you will also need one of these permissions:



  • RoleManagement.Read.All

  • RoleManagement.ReadWrite.All

  • RoleManagement.Read.Exchange

  • RoleManagement.ReadWrite.Exchange


Actions and entities supported in this preview:

Entity                     | Endpoint                                                          | Read | Create | Update | Delete
Roles                      | graph.microsoft.com/beta/roleManagement/exchange/roleDefinitions | Yes  | No     | No     | Yes
Assignments                | graph.microsoft.com/beta/roleManagement/exchange/roleAssignments | Yes  | Yes    | Yes    | Yes
Scopes                     | graph.microsoft.com/beta/roleManagement/exchange/customAppScopes | Yes  | Yes    | Yes    | Yes
Role Groups                | Not supported                                                     | No   | No     | No     | No
Transitive Role Assignment | Not supported                                                     | No   | No     | No     | No
Reading the list of role assignments assigned with a management scope:




Reading the list of Management Scopes:




List roles using Microsoft Graph PowerShell:
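The screenshots are not reproduced here; hedged sketches of these three reads, using the endpoints from the table above (the PowerShell call uses the generic Invoke-MgGraphRequest cmdlet rather than any resource-specific cmdlet), might look like this:

    GET https://graph.microsoft.com/beta/roleManagement/exchange/roleAssignments
    GET https://graph.microsoft.com/beta/roleManagement/exchange/customAppScopes

    # Microsoft Graph PowerShell, after Connect-MgGraph with one of the RoleManagement permissions listed above
    Invoke-MgGraphRequest -Method GET -Uri "https://graph.microsoft.com/beta/roleManagement/exchange/roleDefinitions"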




Try the Public Preview Today


Unified RBAC is available to all tenants today as a part of the public preview. See Use the Microsoft Graph SDKs with the beta API and roleManagement resource type for more information.


We’d love your feedback on the preview. You can leave a comment here or share it with us at exourbacpreview@microsoft.com.


FAQs


Does this API support app-only access?
Not yet. This will be added to the preview later.


Exchange Online Team

You can now save and edit your survey responses


This article is contributed. See the original author and article here.

People sometimes wish to review and change their form or quiz responses after submission, even days later. We’re happy to share that you can now review and edit responses when needed.


 



  • First, save your response


To save your response, ensure that the form creator has selected “Allow respondents to save their responses” option in the Forms settings.


 


Forms setting – save response


Once the setting is enabled, and you submit a form, you will have the option to save your response from the thank you page.


 


“Save my response” in Thank you page


The response will be saved in “Filled forms” in Microsoft Forms.


 


Filled forms



  • If enabled, you can then edit your response


 The form’s creator must select “Allow respondents to edit their responses” in the form’s setting.


 


Forms setting – edit response


If enabled, you will have the option to “Save my response to edit” on the thank you page.


 


“Save my response to edit” in Thank you page


As long as the form is open, you have the flexibility to revisit the form at any time to edit your answers. However, edits cannot be made once a form has been closed or deleted.


 


Edit response


FAQ


 


Where is the data saved?


As with all forms, the response data will be stored in the form creator’s repository only. Respondents can only view their own responses and cannot make any changes unless the form creator allows them to edit their response. If the form or the response is removed, the respondent will no longer have access to the response.


 


What’s the limitation of the feature?


Updating a response does not trigger a Power Automate flow. This capability will be enabled soon.


 


We hope you find these new features helpful! We will continue working on enabling more options with Microsoft Forms. Let us know if you have questions or if there’s anything else we can help you with!

Logic Apps Aviators Newsletter – July 2023


This article is contributed. See the original author and article here.

In this issue:








 


Ace Aviator of the Month


 


July’s Ace Aviator: Anitha Eswaran




 


What is your role and title? What are your responsibilities associated with your position?


I am a Senior Digital Architect with Sonata Software Europe Limited. I am responsible for delivering the right integration technologies to the customers. I interact with my team members to share the latest updates on various integrations and brainstorm the best fit for the client requirement. My manager provides me additional support in exploring new concepts and encourages us to do POCs with latest updates and share the knowledge to the wider forum.


 


Can you provide some insights into your day-to-day activities and what a typical day in your role looks like?


I plan my day beforehand, but it takes a different direction as it progresses.  This is what my role demands. I am involved in multiple assignments and learn everyday with various challenges. More complex it is, the more growth is my thought, and this keeps me motivated throughout the day. I also offer guidance to other teams, and I get excited when the discussion or the task is for Logic Apps integration.


 


What motivates and inspires you to be an active member of the Aviators/Microsoft community?


My spouse, Santosh Ramamurthy, always motivates me in upgrading my skills. He supported my learning path and noticed my passion, he advised me to write blogs, sharing techniques via forums.


I started my career in 2006 and have been travelling with Microsoft Technologies for close to 17 years. With the limited resources those days, we had challenges in getting proper guidance. We referred to books and often learnt from our mistakes. I thought our future generation should have a strong foundation on technologies which they can pass on. So, I started writing blogs, attending forums, giving sessions which will motivate the people who are new to the technology or planning to switch the domain. Resources are abundant nowadays. How to use and where to apply is our skill. Logic Apps Aviators is a community which binds the people at all levels – be it beginner, intermediate or expert.


 


Looking back, what advice do you wish you would have been told earlier on that you would give to individuals looking to become involved in STEM/technology?


As I said earlier, resources are abundant. Learn to use, apply, make mistakes, correct them, and keep moving. Gone are the days where a high-speed PC or a classroom is needed to start your learning path. Most of the technologies are giving free trial and lab environments to encourage the individuals and give an attempt on the new skill. It is only the interest and the passion which keeps their learning spirit on.


 


What has helped you grow professionally?


Passion for learning new skills, handling complexities are my best teachers.  Expand the network and time management is an important skill as we have numerous distractions around us. Tiny drops make a mighty ocean – so learning something new every day (be it simple concept or complex) and we are sure to trace the growth in the learning graph.


 


Imagine you had a magic wand that could create a feature in Logic Apps. What would this feature be and why?


Logic apps can create wonders in integration when some of the minor features can be incorporated to make them more friendly to the beginners.


For instance, when suggesting recurring API integration with D365FO via logic apps, there comes a question of creating a zipped package file. As most connectors are missing this action and even though azure functions / ADF / third-party tool comes to rescue, integration becomes simpler if available readymade.  


Also, a feature to track the state of the previous run – this is needed for integrations to safeguard the data overlap and thus to cancel the run automatically if the previous execution is still in progress.  




News from our product group:

 



.NET Framework Custom Code for Azure Logic Apps 


We are excited to announce the public preview release for .NET Framework Custom Code. Read this article to learn more about what’s new. 



 



Cargolux Optimizes Workflow and Improves Customer Service with Microsoft Azure


Read about how Cargolux leveraged Azure’s capabilities to streamline their operations, improve efficiency, and deliver a better experience to their customers.



 



Azure Logic Apps Capabilities for Mainframe and Midranges Integration


Learn about the ways how Azure Logic Apps integrates with Mainframes and Midranges and why it is important for our Mission Critical customers!



 



Data Mapper Patterns: Regular Expressions  


In this post you’ll learn how we can use Regular Expression functions in the new Data Mapper that helps us validate and transform data to ensure of data consistency.



 


 



Expose your data from Azure Cosmos DB or Azure SQL through a GraphQL API with Azure API Management  


This article outlines the process of providing secure and controlled access and exposing data from Azure Cosmos DB or Azure SQL by utilizing Azure API Management.






New Azure Integration Services Blog! 


We are excited to announce the consolidation of our BizTalk Server, Host Integration Server and Azure Logic Apps Blogs into the Azure Integration Services Blog



 



.NET Framework Custom Code in Azure Logic Apps (Standard) – Troubleshooting Strategies


In this post, we are going to discuss some troubleshooting strategies that you can use when developing custom code solutions in Azure Logic Apps (Standard).



 



Azure Logic Apps Community Day – On Demand Resources


If you missed Aviators Day or simply want to watch some sessions again, head on over to this post with links to the full day’s recordings. 



 



.NET Framework Custom Code – Calling Existing Assemblies


In the .NET Framework Custom Code post, we discussed how to author code and debug it. We now want to expand upon that scenario and discuss how we can call an existing assembly from our custom code project.



 




News from our community:


Microsoft Previews .NET Framework Custom Code for Azure Logic Apps Standard


Post by Steef-Jan Wiggers


 


Read more about the public preview for .NET Framework Custom Code from Aviator’s own Steef-Jan!


 


Resolving 401 “Forbidden” Error When Deploying Logic Apps ARM Template


Post by Harris Kristanto


 


Harris discusses the issue of encountering a 401 Forbidden error during the deployment of Logic Apps ARM templates and provides steps to resolve it. Learn about the potential causes of the error and suggested troubleshooting methods, including adjusting authentication settings and ensuring proper access permissions are set, to successfully deploy the Logic Apps template.


 


Compare Azure Messaging Services | How to Chose | Azure Service Bus vs Event Hub vs Event Grid


Post by Srikanth Gunnala


 


In this video Srikanth discusses Azure Messaging Services’ three most utilized components: Azure Service Bus, Azure Event Grid, and Azure Event Hub. See real-world applications and demonstrations on how these services ensure smooth communication between different parts of a software program.


 


Mastering GraphQL Resolvers for Cosmos DB


Post by Ryan Langford


 


Developers looking for more cloud skills should read Ryan’s post on mastering GraphQL Resolvers in Azure Cosmos DB. He covers everything from setting up your API Manager Instance to querying and mutating the graph in Azure Cosmos DB.

Boosting Productivity: Unleashing the Power of Dynamics 365 for Prospect-to-Cash Efficiency


This article is contributed. See the original author and article here.

Introduction

Companies seek efficiency in their customer engagement activities. Front office salespeople, traveling sales representatives, account managers, and others need to engage efficiently with customers while using Dynamics 365 Sales without spending time to ensure that data flows efficiently between their front-office work environment and the Dynamics 365 Supply Chain Management back-office environment. True end-to-end process integration must work seamlessly across applications, using an integrated process flow from quotation to invoice, to help businesses drive efficiencies in their sales and fulfilment processes, improve accuracy, and reduce lead times.

We are excited to announce the general availability of a set of new features and capabilities that will enhance the efficiency of the prospect-to-cash integration between Dynamics 365 Sales and Dynamics 365 Supply Chain Management. These additions to the prospect-to-cash integration aim to improve efficiency and cover several new features in Dynamics 365 Supply Chain Management 10.0.34, as well as a new Dual-write Supply Chain solution version 2.3.4.203.

This new feature set enables businesses to achieve true end-to-end process support and unlock various benefits. In this blog post, we will briefly showcase and explain the advantages that companies can gain by leveraging this feature set in an integrated scenario between Dynamics 365 Sales and Dynamics 365 Supply Chain Management.

Integrate Sales Quotation Lifecycle

Sales quotations can be created and processed throughout their lifecycle in both Dynamics 365 Sales and Dynamics 365 SCM. It is crucial that when a sales quotation is processed in one application, it is accurately reflected in the other application. Let’s consider a scenario where Dynamics 365 Sales serves as the CRM application. Salespeople utilize Dynamics 365 Sales to create, edit, collaborate on, and communicate sales quotations with customers. From a back-office perspective, it is essential for these sales quotations to be visible in Dynamics 365 SCM. This visibility allows for insights into expected demand, supporting back-office supply planning.

Furthermore, it is equally important for these sales quotations to be accessible in Dynamics 365 SCM to enable collaboration between back-office staff and front-office salespeople. This collaboration enhances sales quotations by incorporating necessary information that only back-office staff possess insights into. Lastly, but certainly not least, it is critical that when a sales quotation is activated and communicated to the customer by the front-office, this event triggers the appropriate quotation update in Dynamics 365 SCM.

With our new feature set, we now allow for such an integrated scenario with fewer touch points, better efficiency, and improved transparency.

Key benefits of this feature include:

End-to-end quotation lifecycle integration:

The sales quotation process can be initiated in either Dynamics 365 Sales or Dynamics 365 SCM and completed in either application, ensuring that changes and lifecycle updates seamlessly flow between both applications. This eliminates the need for manual duplication of sales quotation data and processing.

Unambiguous and transparent quotation lifecycle processing:

With the introduction of new concepts of origin and ownership, it is always clear and transparent which application is responsible for processing the sales quotation. This eliminates human errors in the quotation process.

Reduced cost of ownership:

The end-to-end lifecycle integration is supported without the need for customizations, resulting in reduced costs of ownership.

The integrated sales quotation lifecycle is supported by Dual-write Supply Chain solution version 2.3.4.203 and the following Dynamics 365 SCM 10.0.34 features: Integrate Sales Quotation lifecycle with Dynamics 365 Sales, Copy Supply Chain Management sales quotation data to sales orders synced from Dynamics 365 Sales, Set default ownership for sales quotations when integrated with Dynamics 365 Sales.

To learn more about these features, follow the link:

Add efficiency in quote-to-cash with Dynamics 365 Sales – Finance & Operations | Dynamics 365 | Microsoft Learn

Integrate Pricing

Before the release of Dual-write Supply Chain solution version 2.3.4.203, in conjunction with Dynamics 365 SCM 10.0.34, the recommended method for integrating pricing between the two applications was to configure Dynamics 365 Sales to utilize the system price calculation. This setup, coupled with the synchronization of totals and subtotals from Dynamics 365 SCM to Dynamics 365 Sales, along with the utilization of price quote and price order actions in Dynamics 365 Sales, as well as the implementation of manual discounts in Dynamics 365 Sales, could lead to a loss of transparency. This lack of transparency pertains to identifying which application controls and calculates the monetary values associated with sales quotations and sales orders, including prices, discounts, subtotals, and totals.

In Dynamics 365 SCM 10.0.34, we introduce two features to simplify and enhance transparency in calculations related to price, discount, subtotal, and total when integrating with Dynamics 365 Sales. The first feature is to designate Supply Chain Management as the price master when integrated with Dynamics 365 Sales. The second feature enables the calculation and pushing of prices, discounts, and totals specifically for selective sales orders and sales quotations when integrated with Dynamics 365 Sales.

Key benefits of these features include:

  • Calculations for extended amounts, summary amounts, subtotals, and totals for sales quotations and sales orders are not performed in Dynamics 365 Sales; All calculated monetary fields are calculated in and synchronized from Supply Chain Management.
  • Front-office salespeople can now, if authorized, manually apply a discount from Dynamics 365 Sales which is fully integrated with discounts in Dynamics 365 SCM.
  • Back-office staff can now calculate and push, whenever needed, all price and discount related updates for one or more sales quotation and sales orders from Dynamics 365 SCM to Dynamics 365 Sales.

To learn more about these features, follow the link: 

Work with added efficiency in quote-to-cash with Dynamics 365 Sales – Finance & Operations | Dynamics 365 | Microsoft Learn

Asynchronous or synchronous processing of events

Front-office salespeople using Dynamics 365 Sales need to work efficiently on quotations and sales orders without unnecessary wait times. They also need to maintain efficiency when Dynamics 365 Sales is integrated with Dynamics 365 SCM. The same applies to back-office staff working on quotations and sales orders in Dynamics 365 SCM. Achieving a smooth user experience and efficiency in these tasks heavily relies on asynchronous processing of events.

Asynchronous processing of Sales-integrated events allows events to be processed asynchronously in Dynamics 365 SCM using the message processor framework. This approach significantly enhances the user experience and performance of sales order and sales quotation integration in various use cases.

  • Front-office salespeople in Dynamics 365 Sales activates a quotation. This event will update the Sales Quotation in Dynamics 365 SCM to Sent and create a quotation journal.
  • Front-office salespeople in Dynamics 365 Sales creates an order from a sales quotation won. This event will update the Sales Quotation in Dynamics 365 SCM to Won, create a quotation confirmation journal, link the resulting sales order with the sales quotation, and, if setup, copy sales quotation data from the Dynamics 365 SCM sales quotation to the Dynamics 365 SCM sales order, and synchronize the changes to the sales order in Dynamics 365 Sales.
  • Back-office staff in Dynamics 365 SCM recalculates and pushes prices and totals for one or more sales quotations and sales orders to Dynamics 365 Sales

Key benefits of this feature include:

  • The user experience of front-office salespeople in Dynamics 365 Sales will not be impacted by any additional time it may take to process integration-related events.
  • Companies can flexibly decide which events to be processed synchronously and which to be processed asynchronously to provide the optimum user experience.
  • Improved system performance which will have a positive impact on user experience in both applications.

To learn more about this feature and Supply Chain at Microsoft, click below:

Feature Insights:

Work with added efficiency in quote-to-cash with Dynamics 365 Sales – Finance & Operations | Dynamics 365 | Microsoft Learn

Supply Chain at Microsoft

Take a tour – Supply Chain Management | Microsoft Dynamics 365

We’re excited to launch the Microsoft Supply Chain Center Preview free trial, which harnesses generative AI to assist supply chain managers in real-time communication with suppliers regarding specific news.

The post Boosting Productivity: Unleashing the Power of Dynamics 365 for Prospect-to-Cash Efficiency appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Lesson Learned #388:Retrying Execution in case of Connection Drops/Command Timeouts using ODBC API

This article is contributed. See the original author and article here.

Based on Lesson Learned #368: Connection Retry-Logic using ODBC API code – Microsoft Community Hub, I would like to share another lesson learned, this time about executing a TSQL command. When executing a TSQL command, connectivity issues or command timeouts can occur, leading to failed executions. To overcome these challenges, it is essential to implement robust error-handling mechanisms. In this article, we will explore how to leverage the ODBC API to retry the execution of TSQL commands when faced with connection drops or command timeouts.


 


Implementing a Retry Mechanism: The following steps outline how to implement a retry mechanism using the ODBC API.


 




  1. Catch the error: Surround the TSQL command execution with a try-catch block or equivalent error-handling mechanism. In the catch block, examine the error code or exception to identify connection drops or command timeouts.




  2. Determine the retry conditions: Define the conditions under which a retry should occur. For example, you might retry when the error code corresponds to a dropped connection (e.g., SQLSTATE 08S01) or a command timeout (e.g., SQLSTATE HYT00).




  3. Set a maximum retry limit: To prevent infinite retries, set a maximum retry limit. It is essential to strike a balance between allowing enough retries to handle temporary issues and avoiding prolonged execution times.




  4. Introduce a delay between retries: To avoid overwhelming the database server, introduce a delay between retries. Exponential backoff is a popular technique where the delay increases exponentially after each retry, allowing the server time to recover.




  5. Retry the execution: Once the retry conditions are met, re-execute the TSQL command using the same connection. Remember to handle any additional exceptions that may arise during the retry process.




  6. Track retries: Keep track of the number of retries attempted to monitor the effectiveness of the retry mechanism. This information can be useful for troubleshooting and optimizing the system.




Code


 


 


 

        public void MainRetry()
        {
            // Initialize ODBC environment handle
            IntPtr environmentHandle = IntPtr.Zero;
            String sErrorMsg = "";
            Boolean bExecution = false;
            SQLAllocHandle(1, IntPtr.Zero, out environmentHandle);
            //SQLSetEnvAttr(environmentHandle, 201, (IntPtr)2, 0);
            SQLSetEnvAttr(environmentHandle, 200, (IntPtr)380, 0);

            bExecution = bMainRetryExecution(environmentHandle, ref sErrorMsg,"WAITFOR DELAY '00:02:50'",4 );
            if(!bExecution)
            {
                Console.WriteLine("Error: " + sErrorMsg);
            }
            else
            {
                Console.WriteLine("Execution correctly");
            }
            SQLFreeHandle(1, environmentHandle);
        }

        public Boolean bMainRetryExecution(IntPtr environmentHandle, 
                                           ref string sErrorMsg, 
                                           string sQuery = "SELECT 1", 
                                           int iRetryCount = 1)
        {
            // Initialize ODBC connection and statement handles
            Boolean bExecute = false;
            int retryIntervalSeconds = 2;

            for (int i = 1; i <= iRetryCount; i++)
            {
                try
                {
                    IntPtr connectionHandle = IntPtr.Zero;
                    IntPtr statementHandle = IntPtr.Zero;
                    int retcode;

                    Console.WriteLine("Try to execute {0} of {1} Query: {2}", i, iRetryCount, sQuery); 
                    retcode = SQLAllocHandle(2, environmentHandle, out connectionHandle);
                    if (retcode == -1)
                    {
                        sErrorMsg = "Not possible to obtain the environment Handle";
                    }
                    else
                    {
                      if (RetryLogicUsingODBCAPI(connectionHandle) == -1)
                      {
                            sErrorMsg = "Connection was not possible after the retries";
                      }
                      else
                      {
                            retcode = SQLAllocHandle(3, connectionHandle, out statementHandle);
                            if (retcode == -1)
                            {
                                sErrorMsg = "Not possible to obtain the statementHandle";
                            }
                            else
                            {
                                SQLSetStmtAttr(statementHandle, SQL_ATTR_QUERY_TIMEOUT, (IntPtr)(30*(i)), 0);
                                retcode = SQLExecDirect(statementHandle, sQuery, sQuery.Length);
                                if (retcode == -1)
                                {
                                    GetODBCErrorDetails(statementHandle, 3);
                                    sErrorMsg = "Error: not possible to execute the query.";
                                    System.Threading.Thread.Sleep(1000 * retryIntervalSeconds);
                                    retryIntervalSeconds = Convert.ToInt32(retryIntervalSeconds * 1.5);
                                }
                                else
                                {
                                    SQLDisconnect(connectionHandle);
                                    SQLFreeHandle(3, statementHandle);
                                    SQLFreeHandle(2, connectionHandle);
                                    sErrorMsg = "Command executed correctly";
                                    bExecute = true;
                                    break;
                                }
                            }
                        }
                    }
                }
                catch (Exception ex)
                {
                  Console.WriteLine("Error: " + ex.Message);
                  sErrorMsg = "Error: " + ex.Message;
                }
            }
            return bExecute;
        }

 


 


 


Explanation:


 












The script demonstrates a retry mechanism for executing TSQL commands using the ODBC API. Let’s go through the code step by step:




  1. MainRetry() is the entry point method. It initializes the ODBC environment handle (environmentHandle) using the SQLAllocHandle() function. The SQLSetEnvAttr() function is called to set an attribute related to query execution time. Then, it calls the bMainRetryExecution() method to perform the actual execution.




  2. bMainRetryExecution() is the method responsible for executing the TSQL command and handling retries. It takes the ODBC environment handle (environmentHandle), an error message string (sErrorMsg), the TSQL query string (sQuery), and the number of retry attempts (iRetryCount) as parameters.




  3. Inside the method, a loop is set up to attempt the execution multiple times based on the specified iRetryCount. The loop starts with i set to 1 and continues until it reaches iRetryCount.




  4. Within each iteration of the loop, the method attempts to execute the TSQL command. It first initializes the ODBC connection handle (connectionHandle) using SQLAllocHandle().




  5. If obtaining the connection handle fails (retcode == -1), an error message is set, indicating the inability to obtain the environment handle.




  6. If obtaining the connection handle is successful, the method calls RetryLogicUsingODBCAPI() to handle the retry logic for the connection. The details of this method are not provided in the code snippet, but it likely includes connection establishment and retry mechanisms specific to the application. You could find more information here: Lesson Learned #368: Connection Retry-Logic using ODBC API code – Microsoft Community Hub




  7. Once the retry logic completes, the method proceeds to allocate the ODBC statement handle (statementHandle) using SQLAllocHandle(). If the allocation fails, an error message is set.




  8. If the statement handle is successfully allocated, the method sets the query timeout attribute using SQLSetStmtAttr(), adjusting the timeout value based on the current retry attempt (30*(i)). 




  9. The TSQL command is then executed using SQLExecDirect() with the statement handle and the provided query string. If the execution fails (retcode == -1), the GetODBCErrorDetails() method is called to retrieve specific error information. The code sets an appropriate error message in the sErrorMsg variable, waits for a specified interval using Thread.Sleep(), and increases the retry interval by multiplying it by 1.5. Of course, we could capture the specific error and react differently depending on it; also, remember that the connection would be re-established – Lesson Learned #381: The connection is broken and recovery is not possible message using API ODBC – Microsoft Community Hub




  10. If the execution succeeds, the code disconnects from the database, frees the statement and connection handles using SQLDisconnect(), SQLFreeHandle(), and SQLFreeHandle(), respectively. It sets a success message in sErrorMsg, sets bExecute to true, and breaks out of the retry loop. 




  11. Finally, the method catches any exceptions that occur during the execution and sets the error message accordingly.




  12. The method returns the bExecute flag to indicate whether the execution was successful (true) or not (false).




The provided code showcases a basic retry mechanism utilizing the ODBC API for executing TSQL commands.











Study Guide: AZ-500 Microsoft Azure Security Technologies

This article is contributed. See the original author and article here.

As organizações estão cada vez mais dependendo de tecnologias em nuvem para melhorar a eficiência e otimizar as operações no ambiente empresarial acelerado de hoje. À medida que a adoção da nuvem cresce, também aumenta a demanda por medidas de segurança robustas para proteger dados e aplicativos sensíveis. A certificação AZ-500 Microsoft Azure Security Technologies tem o objetivo de fornecer aos profissionais as habilidades e o conhecimento necessários para proteger a infraestrutura, os serviços e os dados do Azure.


 


A abordagem de Segurança Zero Trust, que parte do pressuposto de que todos os usuários, dispositivos e redes são não confiáveis e requerem verificação constante, é uma das metodologias de segurança mais críticas da indústria atualmente. À medida que as empresas adotam a tecnologia de Inteligência Artificial (IA), surgem novas preocupações de segurança, tornando crucial que as organizações se mantenham atualizadas nas práticas de segurança mais recentes.


 


This study guide provides an overview of the AZ-500 exam objectives, which include security controls, identity and access management, platform protection, data and application protection, and governance and compliance capabilities in Azure.


 


What to expect in the exam


 


The AZ-500 exam measures the candidate's knowledge of implementing, managing, and monitoring security for resources in Azure, as well as in multi-cloud and hybrid environments. This includes recommending security components and configurations to protect identity and access, data, applications, and networks.


 


The exam consists of 40 to 60 questions and lasts 180 minutes. You may encounter multiple-choice questions as well as drag-and-drop and hot-area questions.


 


Design sem nome.jpg


Additional learning resources


If you are not yet familiar with cloud computing, I recommend studying the Azure Fundamentals learning path first:



The percentages indicated for each exam topic refer to the weight, that is, the share of questions you can expect to find in the exam.


 


AZ-500 Microsoft Azure Security Technologies


 


Manage identity and access (25-30%)


To manage identity and access effectively, candidates must be able to design and implement secure access solutions, such as multi-factor authentication and conditional access policies. They must also have a good understanding of Azure Active Directory and be able to manage user accounts, groups, and roles.


 


Topics covered:



 


Secure networking (20-25%)


For network security, candidates must be able to design and implement secure networking solutions, such as virtual private networks (VPNs), Azure ExpressRoute, and Azure Firewall. They must also understand network security groups (NSGs) and Azure DDoS protection.


 


Topics covered:



 


Secure compute, storage, and databases (20-25%)


Candidates must be familiar with Azure security features, such as Azure Security Center and Azure Key Vault, to secure compute, storage, and databases. In addition, they must be able to design and implement secure storage solutions, such as Azure Storage encryption and Azure Backup. They must also be able to use database security features, such as Azure SQL Database auditing and Transparent Data Encryption (TDE).


 


Topics covered:



 


Manage security operations (25-30%)


Finally, candidates must be able to manage security operations effectively. This includes monitoring security logs and alerts, responding to security incidents, and implementing security policies and procedures. They must also have a good understanding of compliance requirements, such as GDPR and HIPAA.



 


Topics covered:





  • Plan, implement, and manage governance for security

  • Manage security posture by using Microsoft Defender for Cloud

  • Configure and manage threat protection by using Microsoft Defender for Cloud

  • Configure and manage security monitoring and automation solutions


 














Captura de tela 2023-05-25 201358.jpg

Technical documentation




  • Azure Active Directory: Manage user identities and control access to your apps, data, and resources with Microsoft Azure Active Directory (Azure AD), a component of Microsoft Entra.




  • Azure Firewall: Learn how to deploy and configure Azure Firewall, a cloud-based network security service.




  • Azure Firewall Manager: Discover how to configure Azure Firewall Manager, a global security management service.




  • Azure Application Gateway: Learn how to create application gateways. This documentation helps you plan, deploy, and manage web traffic to your Azure resources.




  • Azure Front Door and CDN: Azure Front Door is a scalable and secure entry point for the fast delivery of global web applications.




  • Web Application Firewall: Web Application Firewall (WAF) protects your web applications against common exploits and vulnerabilities. WAF can be deployed with Azure Application Gateway or Azure Front Door Service.




  • Azure Key Vault: Learn how to use Key Vault to create and manage keys that access and encrypt your cloud resources, apps, and solutions. Tutorials, API references, and more are available.




  • Azure virtual network service endpoint policies: Virtual network (VNet) service endpoint policies filter egress virtual network traffic to Azure Storage accounts over the service endpoint and allow data exfiltration only to specific accounts. Service endpoint connections to Azure Storage allow granular access control for virtual network traffic.




  • Manage Azure Private Endpoints – Azure Private Link: Azure private endpoint configuration and deployment are flexible. Private Link queries reveal GroupId and MemberName. The GroupId and MemberName values are required to configure a static IP address for a private endpoint during creation. The static IP address and network interface name are private endpoint properties; create the private endpoint with these properties. A service provider and a consumer must approve a Private Link service connection.




  • Create a Private Link service by using the Azure portal: Start by creating a Private Link service that refers to your service. Grant Private Link access to your service or resource fronted by Azure Standard Load Balancer. Users of your service have private access from their virtual network.




  • Azure DDoS Protection Standard: Learn how Azure DDoS Protection, when combined with application design best practices, provides defense against DDoS attacks.




  • Endpoint Protection on Windows VMs in Azure: Learn how to install and configure the Symantec Endpoint Protection client on an existing Windows Server virtual machine (VM). This full client includes virus and spyware protection, a firewall, and intrusion prevention. The client is installed as a security extension by using the VM Agent.




  • Security and usage policies – Azure Virtual Machines: It is essential to keep your virtual machine (VM) secure in order to run applications. Securing your VMs can involve one or more Azure services and features that cover secure access to the VM and secure storage of data. This article teaches you how to secure your virtual machine and its applications.




  • Security – Azure App Service: Find out how Azure App Service helps you secure your web app, mobile app back end, API app, and function app. It also shows how to further secure your app by using the built-in App Service features. The App Service platform components, such as Azure virtual machines, storage, network connections, web frameworks, and management and integration features, are actively secured and hardened.




  • Azure Policy: With policy definitions that enforce rules and effects for your resources, Azure Policy helps you manage and prevent IT issues.




  • Microsoft Defender for Servers overview: Microsoft Defender for Servers protects your Windows and Linux servers running on Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), and on-premises. Endpoint detection and response (EDR) and other threat protection capabilities are provided through the integration of Defender for Servers with Microsoft Defender for Endpoint. Learn how to design and plan a successful Defender for Servers deployment.




  • Microsoft Defender for Cloud: Microsoft Defender for Cloud protects hybrid cloud workloads with unified security management and advanced threat protection.




  • Overview – Microsoft Threat Modeling Tool: The Microsoft Security Development Lifecycle (SDL) relies on the Threat Modeling Tool. It lets software architects identify and mitigate potential security issues early, when they are relatively simple and inexpensive to fix, significantly reducing development costs. The tool was designed with non-security experts in mind, making threat modeling easier for all developers by providing clear guidance on creating and analyzing threat models.




  • Azure Monitor: Monitoring services in Azure and on-premises. Metrics, logs, and traces can be aggregated and analyzed. Send alerts and notifications, or use automated solutions.




  • Microsoft Sentinel: Learn how to get started with Microsoft Sentinel through use cases. With SIEM reinvented for the modern world, you can see and stop threats before they cause harm. Microsoft Sentinel gives you a bird's-eye view across the enterprise.




  • Azure Storage: Azure Storage provides storage for objects, files, disks, queues, and tables. There are also services for hybrid storage solutions, as well as services for transferring, sharing, and backing up data.




  • Azure Files: Simple, secure, and serverless enterprise-grade cloud file shares.




  • Azure SQL: Find documentation for the Azure SQL family of SQL database engine products in the cloud, including Azure SQL Database, Azure SQL Managed Instance, and SQL Server on Azure VMs.




 


I hope this guide has been helpful in your preparation for the AZ-500 certification exam, and I wish you the best of luck on your certification journey!