This article is contributed. See the original author and article here.
Based on Lesson Learned #368: Connection Retry-Logic using ODBC API code – Microsoft Community Hub, I would like to share a related lesson learned about executing a TSQL command. When executing a TSQL command, connectivity issues or command timeouts can occur, leading to failed executions. To overcome these challenges, it is essential to implement robust error-handling mechanisms. In this article, we will explore how to leverage the ODBC API to retry the execution of TSQL commands when faced with connection drops or command timeouts.
Implementing a Retry Mechanism: The following steps outline how to implement a retry mechanism using the ODBC API.
Catch the error: Surround the TSQL command execution with a try-catch block or equivalent error-handling mechanism. In the catch block, examine the error code or exception to identify connection drops or command timeouts.
Determine the retry conditions: Define the conditions under which a retry should occur. For example, you might retry when the error code corresponds to a dropped connection (e.g., SQLSTATE 08S01) or a command timeout (e.g., SQLSTATE HYT00).
Set a maximum retry limit: To prevent infinite retries, set a maximum retry limit. It is essential to strike a balance between allowing enough retries to handle temporary issues and avoiding prolonged execution times.
Introduce a delay between retries: To avoid overwhelming the database server, introduce a delay between retries. Exponential backoff is a popular technique where the delay increases exponentially after each retry, allowing the server time to recover.
Retry the execution: Once the retry conditions are met, re-execute the TSQL command using the same connection. Remember to handle any additional exceptions that may arise during the retry process.
Track retries: Keep track of the number of retries attempted to monitor the effectiveness of the retry mechanism. This information can be useful for troubleshooting and optimizing the system.
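Before looking at the ODBC API sample below, the steps above can be sketched in a language-agnostic way. The following Python sketch is illustrative only: the `OdbcError` class and the `run_command` callable are stand-ins for a real ODBC execution path, but the retry conditions (SQLSTATE 08S01 and HYT00), maximum retry limit, and exponential backoff follow the steps just described:

```python
import time

# SQLSTATE values the article treats as retryable:
# 08S01 = communication link failure, HYT00 = command timeout
RETRYABLE_SQLSTATES = {"08S01", "HYT00"}

class OdbcError(Exception):
    """Illustrative stand-in for an ODBC driver error carrying a SQLSTATE."""
    def __init__(self, sqlstate, message):
        super().__init__(message)
        self.sqlstate = sqlstate

def execute_with_retry(run_command, max_retries=4, base_delay=2.0,
                       sleep=time.sleep):
    """Run `run_command`, retrying on retryable SQLSTATEs with
    exponential backoff. Returns (success, attempts_used)."""
    delay = base_delay
    for attempt in range(1, max_retries + 1):
        try:
            run_command()
            return True, attempt
        except OdbcError as err:
            if err.sqlstate not in RETRYABLE_SQLSTATES:
                raise                     # non-transient error: do not retry
            if attempt == max_retries:
                return False, attempt     # retry budget exhausted
            sleep(delay)                  # give the server time to recover
            delay *= 1.5                  # same backoff factor as the sample
    return False, max_retries
```

Injecting the `sleep` function makes the backoff testable without real waits; in production you would leave the default `time.sleep`.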
Code
public void MainRetry()
{
    // Initialize the ODBC environment handle (1 = SQL_HANDLE_ENV)
    IntPtr environmentHandle = IntPtr.Zero;
    String sErrorMsg = "";
    Boolean bExecution = false;
    SQLAllocHandle(1, IntPtr.Zero, out environmentHandle);
    //SQLSetEnvAttr(environmentHandle, 201, (IntPtr)2, 0);  // 201 = SQL_ATTR_CONNECTION_POOLING
    SQLSetEnvAttr(environmentHandle, 200, (IntPtr)380, 0);  // 200 = SQL_ATTR_ODBC_VERSION, 380 = ODBC 3.80
    bExecution = bMainRetryExecution(environmentHandle, ref sErrorMsg, "WAITFOR DELAY '00:02:50'", 4);
    if (!bExecution)
    {
        Console.WriteLine("Error: " + sErrorMsg);
    }
    else
    {
        Console.WriteLine("Executed correctly");
    }
    SQLFreeHandle(1, environmentHandle);
}

public Boolean bMainRetryExecution(IntPtr environmentHandle,
                                   ref string sErrorMsg,
                                   string sQuery = "SELECT 1",
                                   int iRetryCount = 1)
{
    // Initialize the ODBC connection and statement handles
    Boolean bExecute = false;
    int retryIntervalSeconds = 2;
    for (int i = 1; i <= iRetryCount; i++)
    {
        try
        {
            IntPtr connectionHandle = IntPtr.Zero;
            IntPtr statementHandle = IntPtr.Zero;
            int retcode;
            Console.WriteLine("Try to execute {0} of {1} Query: {2}", i, iRetryCount, sQuery);
            retcode = SQLAllocHandle(2, environmentHandle, out connectionHandle);  // 2 = SQL_HANDLE_DBC
            if (retcode == -1)
            {
                sErrorMsg = "Not possible to obtain the connection handle";
            }
            else
            {
                if (RetryLogicUsingODBCAPI(connectionHandle) == -1)
                {
                    sErrorMsg = "Connection was not possible after the retries";
                }
                else
                {
                    retcode = SQLAllocHandle(3, connectionHandle, out statementHandle);  // 3 = SQL_HANDLE_STMT
                    if (retcode == -1)
                    {
                        sErrorMsg = "Not possible to obtain the statement handle";
                    }
                    else
                    {
                        // Grow the query timeout with every attempt: 30s, 60s, 90s, ...
                        SQLSetStmtAttr(statementHandle, SQL_ATTR_QUERY_TIMEOUT, (IntPtr)(30 * i), 0);
                        retcode = SQLExecDirect(statementHandle, sQuery, sQuery.Length);
                        if (retcode == -1)
                        {
                            GetODBCErrorDetails(statementHandle, 3);
                            sErrorMsg = "Error: not possible to execute the query.";
                            // Release the handles before retrying to avoid leaking them
                            SQLFreeHandle(3, statementHandle);
                            SQLDisconnect(connectionHandle);
                            SQLFreeHandle(2, connectionHandle);
                            // Exponential backoff before the next attempt
                            System.Threading.Thread.Sleep(1000 * retryIntervalSeconds);
                            retryIntervalSeconds = Convert.ToInt32(retryIntervalSeconds * 1.5);
                        }
                        else
                        {
                            SQLDisconnect(connectionHandle);
                            SQLFreeHandle(3, statementHandle);
                            SQLFreeHandle(2, connectionHandle);
                            sErrorMsg = "Command executed correctly";
                            bExecute = true;
                            break;
                        }
                    }
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: " + ex.Message);
            sErrorMsg = "Error: " + ex.Message;
        }
    }
    return bExecute;
}
Explanation:
The script demonstrates a retry mechanism for executing TSQL commands using the ODBC API. Let’s go through the code step by step:
MainRetry() is the entry point method. It initializes the ODBC environment handle (environmentHandle) using the SQLAllocHandle() function. The SQLSetEnvAttr() function is then called to set the SQL_ATTR_ODBC_VERSION environment attribute to ODBC 3.80. Then, it calls the bMainRetryExecution() method to perform the actual execution.
bMainRetryExecution() is the method responsible for executing the TSQL command and handling retries. It takes the ODBC environment handle (environmentHandle), an error message string (sErrorMsg), the TSQL query string (sQuery), and the number of retry attempts (iRetryCount) as parameters.
Inside the method, a loop is set up to attempt the execution multiple times based on the specified iRetryCount. The loop starts with i set to 1 and continues until it reaches iRetryCount.
Within each iteration of the loop, the method attempts to execute the TSQL command. It first initializes the ODBC connection handle (connectionHandle) using SQLAllocHandle().
If obtaining the connection handle fails (retcode == -1), an error message is set, indicating the inability to obtain the connection handle.
If obtaining the connection handle is successful, the method calls RetryLogicUsingODBCAPI() to handle the retry logic for the connection. The details of this method are not provided in the code snippet, but it likely includes connection establishment and retry mechanisms specific to the application. You can find more information here: Lesson Learned #368: Connection Retry-Logic using ODBC API code – Microsoft Community Hub
Once the retry logic completes, the method proceeds to allocate the ODBC statement handle (statementHandle) using SQLAllocHandle(). If the allocation fails, an error message is set.
If the statement handle is successfully allocated, the method sets the query timeout attribute using SQLSetStmtAttr(), adjusting the timeout value based on the current retry attempt (30*(i)).
The TSQL command is then executed using SQLExecDirect() with the statement handle and the provided query string. If the execution fails (retcode == -1), the GetODBCErrorDetails() method is called to retrieve specific error information. The code sets an appropriate error message in the sErrorMsg variable, waits for a specified interval using Thread.Sleep(), and increases the retry interval by multiplying it by 1.5. Of course, we could capture the error and react in different ways depending on the result of the execution; also, remember that the connection would be re-established – Lesson Learned #381: The connection is broken and recovery is not possible message using API ODBC – Microsoft Community Hub
If the execution succeeds, the code disconnects from the database using SQLDisconnect() and frees the statement and connection handles using SQLFreeHandle(). It sets a success message in sErrorMsg, sets bExecute to true, and breaks out of the retry loop.
Finally, the method catches any exceptions that occur during the execution and sets the error message accordingly.
The method returns the bExecute flag to indicate whether the execution was successful (true) or not (false).
The provided code showcases a basic retry mechanism utilizing the ODBC API for executing TSQL commands.
Organizations are increasingly relying on cloud technologies to improve efficiency and streamline operations in today's fast-paced business environment. As cloud adoption grows, so does the demand for robust security measures to protect sensitive data and applications. The AZ-500 Microsoft Azure Security Technologies certification aims to equip professionals with the skills and knowledge needed to protect Azure infrastructure, services, and data.
The Zero Trust security approach, which assumes that all users, devices, and networks are untrusted and require constant verification, is one of the most critical security methodologies in the industry today. As companies adopt Artificial Intelligence (AI) technology, new security concerns emerge, making it crucial for organizations to stay current with the latest security practices.
This study guide provides an overview of the AZ-500 exam objectives, which include security controls, identity and access management, platform protection, data and application protection, and governance and compliance capabilities in Azure.
What to expect on the exam
The AZ-500 exam measures the candidate's knowledge of implementing, managing, and monitoring the security of Azure resources in multi-cloud and hybrid environments. This includes recommending security components and configurations to protect identity and access, data, applications, and networks.
The exam consists of 40 to 60 questions and lasts 180 minutes. You may encounter multiple-choice questions, as well as drag-and-drop and hot area questions.
Additional learning resources
If you are not very familiar with cloud computing, I recommend studying the Azure Fundamentals learning path:
The percentages shown for each exam topic refer to the weight/volume of questions you may encounter on the exam.
AZ-500 Microsoft Azure Security Technologies
Manage identity and access (25-30%)
To manage identity and access effectively, candidates must be able to design and implement secure access solutions, such as multi-factor authentication and conditional access policies. They should also have a good understanding of Azure Active Directory and be able to manage user accounts, groups, and roles.
With regard to network security, candidates must be able to design and implement secure networking solutions, such as virtual private networks (VPNs), Azure ExpressRoute, and Azure Firewall. They should also understand network security groups (NSGs) and Azure DDoS protection.
Plan and implement security for public access to Azure resources
Secure compute, storage, and databases (20-25%)
Candidates should be familiar with Azure security features, such as Azure Security Center and Azure Key Vault, to ensure the security of compute, storage, and databases. In addition, they must be able to design and implement secure storage solutions, such as Azure Storage encryption and Azure Backup. They should also be able to use database security features, such as Azure SQL Database auditing and Transparent Data Encryption (TDE).
Topics covered:
Plan and implement advanced security for compute
Plan and implement security for storage
Finally, candidates must be able to manage security operations effectively. This includes monitoring security logs and alerts, responding to security incidents, and implementing security policies and procedures. They should also have a good understanding of compliance requirements, such as GDPR and HIPAA.
Topics covered:
Plan, implement, and manage governance for security
Manage security posture by using Microsoft Defender for Cloud
Configure and manage threat protection by using Microsoft Defender for Cloud
Configure and manage security monitoring and automation solutions
Technical documentation
Azure Active Directory: Manage user identities and control access to your apps, data, and resources with Microsoft Azure Active Directory (Azure AD), a component of Microsoft Entra.
Azure Firewall: Learn how to install and configure Azure Firewall, a cloud-based network security service.
Azure Firewall Manager: Learn how to configure Azure Firewall Manager, a global security management service.
Azure Application Gateway: Learn how to create application gateways. This documentation helps you plan, deploy, and manage web traffic to your Azure resources.
Azure Front Door and CDN: Azure Front Door is a scalable and secure entry point for fast delivery of global web applications.
Web Application Firewall: The Web Application Firewall (WAF) protects your web applications from common exploits and vulnerabilities. WAF can be deployed on Azure Application Gateway or Azure Front Door Service.
Azure Key Vault: Learn how to use Key Vault to create and manage keys that let you access and encrypt cloud resources, apps, and solutions. Tutorials, API references, and more are available.
Azure virtual network service endpoint policies: Virtual network (VNet) service endpoint policies filter egress virtual network traffic to Azure Storage accounts over the service endpoint and allow data exfiltration only to specific accounts. Service endpoint connections to Azure Storage allow granular access control for virtual network traffic.
Manage Azure Private Endpoints – Azure Private Link: Azure private endpoint configuration and deployment are adaptable. Private link queries reveal GroupId and MemberName. The GroupId and MemberName values are required to configure a static IP address for a private endpoint during creation. The static IP address and network interface name are private endpoint properties. Create the private endpoint with these properties. A service provider and a consumer must approve a private link service connection.
Create a Private Link service by using the Azure portal: Start by developing a Private Link service that refers to your service. Allow Private Link access to your service or resource protected by an Azure Standard Load Balancer. Users of your service have private access from their virtual network.
Azure DDoS Protection Standard: Learn how Azure DDoS Protection, combined with application design best practices, provides defense against DDoS attacks.
Endpoint Protection on Windows VMs in Azure: Learn how to install and configure the Symantec Endpoint Protection client on an existing Windows Server virtual machine (VM). This full client includes virus and spyware protection, a firewall, and intrusion prevention. The client is installed as a security extension by using the VM Agent.
Security features and policies – Azure Virtual Machines: Keeping your virtual machine (VM) secure is essential for running applications. Securing your VMs can involve one or more Azure services and features that cover secure VM access and data storage. This article teaches you how to secure your virtual machine and its applications.
Security – Azure App Service: Learn how Azure App Service helps you secure your web app, mobile app back end, API app, and function app. It also shows how to further secure your app by using the built-in App Service features. App Service platform components, including Azure virtual machines, storage, network connections, web frameworks, and management and integration features, are actively secured and hardened.
Azure Policy: With policy definitions that enforce rules and effects for your resources, Azure Policy helps you manage and prevent IT issues.
Overview of Microsoft Defender for Servers: Microsoft Defender for Servers protects your Windows and Linux servers running in Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), and on-premises. Endpoint detection and response (EDR) and other threat protection capabilities are provided through the integration of Defender for Servers with Microsoft Defender for Endpoint. Learn how to design and plan a successful Defender for Servers deployment.
Microsoft Defender for Cloud: Microsoft Defender for Cloud protects hybrid cloud workloads with unified security management and advanced threat protection.
Overview – Microsoft Threat Modeling Tool: The Microsoft Security Development Lifecycle (SDL) relies on the Threat Modeling Tool. It lets software architects identify and mitigate potential security issues early, when they are relatively simple and inexpensive to fix, which significantly reduces development costs. The tool was designed with non-security experts in mind, making threat modeling easier for all developers by providing clear guidance on creating and analyzing threat models.
Azure Monitor: Monitoring services in Azure and on-premises. Aggregate and analyze metrics, logs, and traces. Send alerts and notifications, or use automated solutions.
Microsoft Sentinel: Learn how to get started with Microsoft Sentinel through use cases. With SIEM reinvented for the modern world, you can see and stop threats before they cause harm. Microsoft Sentinel provides a bird's-eye view across the enterprise.
Azure Storage: Azure Storage provides storage for objects, files, disks, queues, and tables. There are also services for hybrid storage solutions, as well as services for transferring, sharing, and backing up data.
Azure Files: Simple, secure, serverless enterprise-grade cloud file shares.
Azure SQL: Find documentation for the Azure SQL database engine products in the cloud, including Azure SQL Database, Azure SQL Managed Instance, and SQL Server on Azure VM.
I hope this guide has been helpful in preparing for the AZ-500 certification exam, and I wish you good luck on your certification journey!
One of the easiest ways to curate metadata is to pull all the information you need into a CSV file so you can work quickly in a spreadsheet, then make updates in bulk by importing the information. You can now do this in Microsoft Purview.
Right now, you can only export one asset type at a time, and this only works for business assets. We plan to offer import and export for data assets in the future.
Exporting assets is easy. Just go to any collection, select business assets, and export:
Select the asset type you want to export. Reminder: we only support one asset type at a time:
You’ll get a CSV of your assets, their GUIDs, and the fields you chose to export. To import, just update the file with the information you want, and import it into the right collection.
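The bulk-edit step between export and import can happen in any spreadsheet tool, but it can also be scripted. The following Python sketch is illustrative only: the column names in the sample data ("Name", "Description") are hypothetical, since the actual columns in a Purview export depend on the asset type and the fields you chose:

```python
import csv
import io

def bulk_update(csv_text, column, transform):
    """Read an exported CSV, apply `transform` to one column in every
    row, and return the updated CSV text, ready for re-import."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for row in rows:
        row[column] = transform(row[column])
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

# Hypothetical export: fill in every empty Description in bulk
exported = "Name,Description\nsales_db,\nhr_db,\n"
updated = bulk_update(exported, "Description",
                      lambda v: v or "TODO: describe")
```

The same pattern scales to any per-column cleanup (normalizing glossary terms, fixing owners, and so on) before importing the file back into the collection.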
Introduction:
Have you ever found yourself in a situation where you needed to stream Microsoft Defender for Cloud data to another system? Microsoft Defender for Cloud provides the option of streaming data like recommendations and security alerts, to a Log Analytics workspace, event hub, or another SIEM solution. This capability is called continuous export.
Imagine that the system you want to stream Microsoft Defender for Cloud data to is located behind a firewall. How would you go about doing that? This article teaches you how to accomplish this scenario by configuring continuous export as a trusted service.
To configure Continuous export as a trusted service, you need to perform the following steps in sequence:
Identify the destination event hub.
Add the relevant role assignments on the destination event hub.
Configure continuous export as a trusted service to use the destination event hub.
Verify data is being exported to the destination event hub.
The first step is identifying the event hub used to stream data from Defender for Cloud to the system located behind the firewall.
Identify the destination event hub
Event Hubs provides a way to ingest data and integrate with other Azure services, like Defender for Cloud. For the purposes of configuring continuous export to stream data to a system located behind a firewall, you can either use an existing event hub or create a new one.
After you identify the event hub to be used as the destination for your Defender for Cloud data, you need to grant the continuous export service the necessary permissions.
Add the relevant role assignment on the destination event hub
To add the necessary permissions, perform the following actions:
Navigate to the Event Hubs dashboard.
Click the destination Event Hub.
Select Access Control > Add role assignment > Azure Event Hubs Data Sender.
Click + Select members > Windows Azure Security Resource Provider (as shown in figure 1).
Select > Review + assign.
Figure 1. Adding the relevant role assignment on the destination event hub
After you add the relevant permissions to the event hub, you can proceed to the next step of configuring continuous export.
Configure continuous export as a trusted service to use the destination event hub
To configure continuous export, you need to have write permissions on the event hub policy. Imagine you wanted to stream data related to recommendations and security alerts in near real-time, to a system located behind a firewall. To achieve this scenario, perform the following actions:
Navigate to the Defender for Cloud dashboard.
Select Environment settings.
Click the desired subscription.
On the left, select Continuous export.
Select Event hub.
Select Security recommendations and Security alerts.
Under Export frequency, select streaming updates.
Ensure Export as a trusted service is selected (as shown in figure 2).
Choose the destination event hub.
Figure 2. Ensure that Export as a trusted service is selected
If you need further guidance on how to configure continuous export as a trusted service you can start here.
After you perform these actions, you can optionally verify that data is being sent to the destination event hub.
Conclusion:
Configuring continuous export as a trusted service to an event hub allows you to stream Defender for Cloud data to a system located behind a firewall. For the purposes of this article, I focus on teaching you how to configure continuous export through the portal. However, for large organizations it’s recommended to use something like Azure Policy to configure this scenario at scale. To configure continuous export as a trusted service to an event hub, you can use the following Azure policy: Deploy export to Event Hub as a trusted service for Microsoft Defender for Cloud data. The respective policy definition ID is af9f6c70-eb74-4189-8d15-e4f11a7ebfd4.
Reviewers:
Arik Noyman, Principal Group Software Engineering Manager,
In this blog post, we will show you how to get started on your journey with generative AI for Microsoft Dynamics 365 Business Central. Microsoft’s unique partnership with OpenAI allows us to bring the innovative power of large-scale language models (LLM) to the Business Central ecosystem in a unique, complete, and responsible way. Microsoft Azure OpenAI Service provides access to OpenAI’s advanced models such as GPT-4, GPT-3, Codex, and DALL-E with the security and enterprise features of Azure.
Over the last few months, we’ve listened to many community ideas on how generative AI can enrich existing features, or deliver entirely new customer value, and the follow-up question is always the same: how do I get started with AI in AL and Visual Studio Code? We’ve collected a few tips, including sample code, to make it easy for you to start exploring Azure OpenAI and share your learnings with the community.
New to generative AI?
We appreciate that there is much to learn about this new and exciting technology. If you’re new to this branch of machine learning and Azure OpenAI, these links are a great introduction to understand the basics.
Explore key Responsible AI guidelines and principles at https://aka.ms/RAI
How to get started with your AL code
We’ve shared some sample code as an extension that enables you to explore the possibilities of LLM. This code is designed to simplify the process of setting up and running LLM experiments, starting with easily configuring and testing your connection to the Azure OpenAI service. After that, you can extend your AL logic to do more exciting things with this code.
The sample code uses Azure OpenAI to suggest an item category based on the item description field.
The extension’s source code is available at the Business Central BCTech repository on GitHub. You can get to it directly at https://aka.ms/BCStartCodingWithAI. You can either download or install the sample extension to your sandbox environment or clone the source code for your own projects.
How to get an Azure OpenAI key
To use Azure OpenAI Service, you need to have an Azure subscription and apply for access to the service. Azure OpenAI is generally available with a limited access policy to promote responsible use and limit the impact of high-risk use cases. Once you apply and are approved, you will receive an email with instructions on how to create an Azure OpenAI resource and get your API key.
How to use the Azure OpenAI playground
Azure AI Studio offers an Azure OpenAI playground: a web-based interface that allows you to explore the capabilities of generative AI models and try them out with your own prompts and data. You can access the playground from the Azure portal or from this link: Azure OpenAI Studio.
If you’re not a developer but have played around with ChatGPT or similar, you will find the playground to be a convenient place to experiment and assess whether ChatGPT is a suitable tool to solve the problem at hand. It’s perfect for product managers, designers and consultants looking to get their feet wet without having to write code or without having to build a deep understanding of the underlying technology.
And if you’re ready to dive into the more technical side of things, the playground lets you choose a model, a scenario, and a few shot learning examples to generate outputs. You can also modify the parameters such as temperature, top-p, frequency penalty, and presence penalty to control the randomness and diversity of the outputs.
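As a sketch of how those playground parameters map onto an API call, the following Python snippet builds the JSON body for an Azure OpenAI chat completions request. The system message, prompt, and parameter values here are illustrative assumptions, not service defaults, and the endpoint URL in the comment uses placeholders for your resource, deployment, and API version:

```python
def build_chat_request(prompt, temperature=0.7, top_p=0.95,
                       frequency_penalty=0.0, presence_penalty=0.0,
                       max_tokens=200):
    """Build the JSON body for an Azure OpenAI chat completions call.
    These are the same parameters exposed in the playground."""
    return {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,            # randomness of sampling
        "top_p": top_p,                        # nucleus-sampling cutoff
        "frequency_penalty": frequency_penalty,  # discourage repetition
        "presence_penalty": presence_penalty,    # encourage new topics
        "max_tokens": max_tokens,
    }

# The body would be POSTed with an "api-key" header to (placeholders):
# https://<resource>.openai.azure.com/openai/deployments/<deployment>/chat/completions?api-version=<version>
body = build_chat_request("Suggest an item category for: 'ATHENS Desk'",
                          temperature=0.2)
```

A low temperature, as in the usage example, is a reasonable choice for classification-style prompts such as suggesting an item category, where you want consistent rather than creative answers.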
How to get started with prompt engineering
Prompt engineering is the art of crafting effective inputs for generative models to produce desired outputs. Prompt engineering involves understanding the model’s capabilities and limitations, choosing the right format and tone for the input, providing relevant examples and instructions, and evaluating the output quality and reliability. Prompt engineering is a crucial skill for using Azure OpenAI Service effectively and responsibly. Here are some short links that can help you learn more about prompt engineering:
While Microsoft is hard at work bringing more generative AI to Business Central, we hope that these simple tips will get you started on your AI journey with Azure OpenAI. The product team is eager to hear your feedback on how we can support your use cases and help you design, build, and deliver AI solutions quickly and responsibly in the AI era: community partners are invited to join us at the Copilot and AI Innovation group on our Yammer partner community network. And if you’re looking for inspiration on how to enrich your features with generative AI, check out our 30-minute video where we present the details of our first generative AI feature in Business Central: Marketing Text Suggestions.
Microsoft is delighted to announce that the new Dynamics 365 Community is now live, marking a significant milestone in our journey of empowering users, fostering collaboration, and driving innovation.
With its fresh new look, streamlined experience, and a suite of powerful features, the new Dynamics 365 Community sets a new standard for user engagement and knowledge sharing. We have listened to your feedback, studied your needs, and made significant enhancements to ensure a seamless and immersive experience. Our aim is to create a platform that not only meets your present requirements but also inspires you to explore new possibilities and accelerate your success.
Here are the highlights of some of the new and future features of our new community:
Enhanced User Experience: Navigate through the community effortlessly and find the answers you need quickly. With intuitive search functionality, personalized recommendations, and a modern interface, your journey within the community has never been smoother.
Achievements Elements: Get ready for a fun and rewarding experience! Engage in community activities, earn badges, and unlock new levels as you contribute and grow. We believe that recognizing your valuable contributions is vital to building a thriving community.
AI-Powered Assistance: Our AI-assisted moderation ensures a safe and inclusive environment for all community members. By leveraging intelligent algorithms, we can maintain the quality and relevance of discussions while fostering a sense of belonging and respect.
This is just the beginning of an incredible journey! We are committed to continuous improvement and will be rolling out regular updates and new features to address your evolving needs. Your feedback and suggestions are invaluable to us, and we encourage you to share your thoughts to help shape the future of the Dynamics 365 Community.
As we embark on this exciting chapter together, we are excited to see the positive impact that this community will have on your professional growth, collaboration, and innovation. The Dynamics 365 Community is more than just a platform; it’s a catalyst for driving positive change in the world of business applications.
Microsoft would like to express our sincere gratitude to our incredible community members, MVPs, and User Group leaders who have played an instrumental role in shaping the Dynamics 365 Community. Your passion, expertise, and dedication continue to inspire us as we strive to create an inclusive and thriving ecosystem.
Thank you for being a part of this remarkable community. We invite you to explore the new Dynamics 365 Community at http://community.dynamics.com/ and embark on a journey of learning, collaboration, and success. Together, let’s unleash the full potential of Dynamics 365 and shape the future of business applications.