Generative AI in ERP means more accurate planning across the organization  

This article is contributed. See the original author and article here.

Enterprise Resource Planning (ERP) is about knowing today the best way to approach tomorrow. It’s about collecting accurate snapshots of various business functions at any point in time, so leaders can make clear, careful decisions that poise the organization to thrive in the future.    

ERP sprang from systems designed to help manufacturers track inventory and raw material procurement. As businesses have become more complex and computing more ubiquitous, ERP platforms have grown into aggregated tech stacks or suites with vertical extensions that track data from supply chain, logistics, asset management, HR, finance, and virtually every aspect of the business. But adding all those facets—and their attendant data streams—to the picture can clutter the frame, hampering the agility of an ERP platform. 

Generative AI helps restore clarity. One of the animating features of AI lies in its ability to process data that lives outside the ERP—all the data an organization can access, in fact—to output efficient, error-free information and insights. AI-enabled ERP systems increase business intelligence by aggregating comprehensive data sets, even data stored in multiple clouds, in seconds, then delivering information from them securely, wherever and whenever they may be needed.  

Today we’ll examine a few of the many ways AI elevates ERP functionality. 


Transform your business

Perform better with AI-powered ERP.

AI tailored to modern business needs

ERPs began as ways to plan material flow to ensure smooth manufacturing runs. Today's supply chains remain as vulnerable as ever to price fluctuations, political turmoil, and natural disasters. In many firms, buyers and procurement teams must handle fluctuations and respond to changes across large volumes of purchase orders involving quantities and delivery dates. They frequently have to examine these orders individually and assess the risk to plans and downstream processes.

Now, ERPs can use AI to quickly assess and rank high- and low-impact changes, allowing teams to rapidly take action to address risk. AI-enabled ERPs like Microsoft Dynamics 365 allow users to handle purchase order changes at scale and quickly communicate with internal and external stakeholders.  

Using natural language, an AI assistant can bring relevant information into communications apps, keeping all parties apprised of, say, unexpected interruptions in supply due to extreme climate events or local market economics—and able to collaborate to find a rapid solution.

Planners can proactively stress-test supply chains by simply prompting the AI assistant with "what-if?" scenarios. Risk managers might ask, "If shipping traffic in the Persian Gulf is interrupted, what are our next fastest supply routes for [material] from [country]?" Empowered with AI's ability to reason over large volumes of data, make connections, and then deliver recommendations in clear natural language, the right ERP system can surface alternatives that help planners anticipate looming issues and reduce risk.

Learn more about Copilot for Microsoft Dynamics 365 Supply Chain Management.

AI enables better project management 

Supply chain may be where ERPs were born, but we've come a long way. ERPs now contribute to the run of business across the organization—and AI can make each of those functions more powerful.

Whether you call them project managers or not, every organization has people whose job it is to manage projects. The chief obstacles for project managers typically involve completing projects on time and on budget. An AI-enabled ERP can cut the time project managers spend compiling status reports, planning tasks, and running risk assessments.  

Take Microsoft Copilot for Dynamics 365 Project Operations, for instance. With Copilot, creating new project plans—a task that used to take managers hours to research and write—now takes minutes. Managers can type in a simple description of the project along with details of the timeline, budget, and staff availability. Then, Copilot generates a plan. Managers can fine-tune it as necessary and launch the project. Along the way, Copilot automatically produces real-time status reports in minutes, identifies risks, and suggests mitigation plans—all of which are updated and adjusted continuously as the project progresses and new data becomes available.

Learn more about Copilot for Dynamics 365 Project Operations.

Follow the money: AI streamlines financial processes  

Timely payments, healthy cash flow, accurate credit information, successful collections—all of these functions are important for competitive vigor. All are part of a robust ERP, and all can be optimized by AI.  

At the top of the organization, real-time, comprehensive snapshots of the company’s financial positions enable leadership agility. But at an ongoing, operational level, AI can improve financial assessments for every department as well. By accessing data streams from across the organization—supply chain, HR, sales, accounts payable, service—AI provides financial planners with the ability to make decisions about budgets, operations planning, cash flow forecasting, or workforce provisioning based on more accurate forecasts and outcomes.  

AI can help planners collaboratively align budgets with business strategy and engage predictive analytics to sharpen forecasts. Anywhere within an ERP that enhanced visibility is an advantage, AI provides it—and more visibility enables greater agility. AI can, for instance, mine processes to help optimize operations and find anomalies the human eye might fail to catch.

An AI-enabled ERP system also elevates the business by closing talent gaps across the organization.  

Learn more about Microsoft Dynamics 365 ERP.

The future of smarter: Finding new workflows with generative AI  

These are just a few of the ways AI eases current workflows. The untapped strength of AI in an ERP lies in companies finding new workflows enabled by AI that add value—like predictive maintenance algorithms for machines on a factory floor, or recommendation engines to find new suppliers or partners, or modules that aid in new product design enhanced by customer feedback. 

The future rarely looks simpler than the past. When faced with increasing complexity, a common human adaptation is to compartmentalize, pack information into silos that we can shuffle around in our minds. In a business context, ERP platforms were conceived to integrate those silos with software, so people can manage the individual streams of information the way a conductor brings the pieces of a symphony together, each instrument at the right pitch and volume, in the right time.   

Generative AI helps us to do just that, collecting all the potential inputs and presenting them in a relationship to each other. This frees planners to focus on the big picture and how it all comes together, so we can decide which elements to adjust and where it all goes next.   

Learn more about Microsoft Copilot in Dynamics 365.


Using generative AI responsibly is critical to the continued acceptance of the technology and to maximizing its potential. Microsoft has recently released its Responsible AI Transparency Report for 2024. This report details the criteria for how we build applications responsibly, decide when to release generative applications, support our customers in building responsibly, and learn, grow, and evolve our generative AI offerings.

The post Generative AI in ERP means more accurate planning across the organization   appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

General Purpose to Business Critical Azure SQL database upgrade

This article is contributed. See the original author and article here.

Recently, we faced a requirement to upgrade a large number of Azure SQL databases from the General Purpose to the Business Critical service tier.


 


As you're aware, this scale-up operation can be executed via PowerShell, the CLI, or the Azure portal, following the guidance mentioned here – Failover groups overview & best practices – Azure SQL Managed Instance | Microsoft Learn.
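
For a single database, the same operation is one cmdlet call. A minimal sketch is shown below; the resource names are placeholders, and the service objective (for example, BC_Gen5_2) should be set to whatever target you need:

# Scale one database from General Purpose to Business Critical (placeholder names)
Set-AzContext -SubscriptionId "<subscription-id>"
Set-AzSqlDatabase -ResourceGroupName "<resource-group>" `
    -ServerName "<server-name>" `
    -DatabaseName "<database-name>" `
    -Edition "BusinessCritical" `
    -RequestedServiceObjectiveName "BC_Gen5_2"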


 


Given the need to perform this task across a large number of databases, running the commands individually for each server is not practical. Hence, I have created a PowerShell script to facilitate such large-scale migrations.




# Scenarios tested:
# 1) Jobs will be executed in parallel.
# 2) The script will upgrade secondary databases first, then the primary.
# 3) The database is upgraded based on the primary listed in the database info list.
# 4) Prior to the migration, the script checks whether the role of a database listed in the database info list has changed from primary to secondary.
# 5) Individual databases are upgraded when no primary/secondary is found for a given database.
# 6) If the secondary has been upgraded but the primary has not, running the script again will skip the secondary and upgrade the primary database.
#    In other words, an SLO mismatch is handled based on the SKU defined in the database info list.
# 7) The script tracks each database upgrade and displays the progress in the console.


 


Important considerations:

# This script performs an upgrade of Azure SQL databases to a specified SKU.
# The script also handles geo-replicated databases by upgrading the secondary first, then the primary, and finally any other databases without replication links.
# The script logs the progress and outcome of each database upgrade to the console and a log file.
# Disclaimer: This script is provided as-is, without any warranty or support. Use it at your own risk.
# Before running this script, make sure to test it in a non-production environment and review the impact of the upgrade on your databases and applications.
# The script may take a long time to complete, depending on the number and size of the databases to be upgraded.
# The script may incur additional charges for the upgraded databases, depending on the target SKU and the duration of the upgrade process.
# The script requires the Az PowerShell module and the appropriate permissions to access and modify the Azure SQL databases.


 





 
# Define the list of database information
$DatabaseInfoList = @(
  #@{ DatabaseName = '{DatabaseName}'; PartnerResourceGroupName = '{PartnerResourcegroupName}'; ServerName = '{ServerName}'  ; ResourceGroupName = '{ResourceGroupName}'; RequestedServiceObjectiveName =  '{SLODetails}'; subscriptionID = '{SubscriptionID}' }
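  # Example entry with hypothetical values for illustration only; replace every field with your own environment details:
  # @{ DatabaseName = 'SalesDb'; PartnerResourceGroupName = 'rg-sales-dr'; ServerName = 'sql-sales-prod'; ResourceGroupName = 'rg-sales-prod'; RequestedServiceObjectiveName = 'BC_Gen5_8'; subscriptionID = '00000000-0000-0000-0000-000000000000' }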
   )

# Define the script block that performs the update
$ScriptBlock = {
    param (
        $DatabaseInfo
    )

    Set-AzContext -subscriptionId $DatabaseInfo.subscriptionID

        ###store output in txt file
        $OutputFilePath = "C:\temp\$($DatabaseInfo.DatabaseName)_$($env:USERNAME)_Output.txt"
        $OutputCapture = @()
        $OutputCapture += "Database: $($DatabaseInfo.DatabaseName)"

    
    $ReplicationLink = Get-AzSqlDatabaseReplicationLink -DatabaseName $DatabaseInfo.DatabaseName -PartnerResourceGroupName $DatabaseInfo.PartnerResourceGroupName -ServerName $DatabaseInfo.ServerName -ResourceGroupName $DatabaseInfo.ResourceGroupName
    $PrimaryServerRole = $ReplicationLink.Role
    $PrimaryResourceGroupName = $ReplicationLink.ResourceGroupName
    $PrimaryServerName = $ReplicationLink.ServerName
    $PrimaryDatabaseName = $ReplicationLink.DatabaseName

    $PartnerRole = $ReplicationLink.PartnerRole
    $PartnerServerName = $ReplicationLink.PartnerServerName
    $PartnerDatabaseName = $ReplicationLink.PartnerDatabaseName
    $PartnerResourceGroupName = $ReplicationLink.PartnerResourceGroupName


    $UpdateSecondary = $false
    $UpdatePrimary = $false

    if ($PartnerRole -eq "Secondary" -and $PrimaryServerRole -eq "Primary") {
        $UpdateSecondary = $true
        $UpdatePrimary = $true
    }
    #For Failover Scenarios only
    elseif ($PartnerRole -eq "Primary" -and $PrimaryServerRole -eq "Secondary") {
        $UpdateSecondary = $true
        $UpdatePrimary = $true

        $PartnerRole = $ReplicationLink.Role
        $PartnerServerName = $ReplicationLink.ServerName
        $PartnerDatabaseName = $ReplicationLink.DatabaseName
        $PartnerResourceGroupName = $ReplicationLink.ResourceGroupName
        
        $PrimaryServerRole = $ReplicationLink.PartnerRole
        $PrimaryResourceGroupName = $ReplicationLink.PartnerResourceGroupName
        $PrimaryServerName = $ReplicationLink.PartnerServerName
        $PrimaryDatabaseName = $ReplicationLink.PartnerDatabaseName
    }

    Try
    {
        if ($UpdateSecondary) {
            $DatabaseProperties = Get-AzSqlDatabase -ResourceGroupName $PartnerResourceGroupName -ServerName $PartnerServerName -DatabaseName $PartnerDatabaseName
            #$DatabaseEdition = $DatabaseProperties.Edition
            $DatabaseSKU = $DatabaseProperties.RequestedServiceObjectiveName
            if ($DatabaseSKU -ne $DatabaseInfo.RequestedServiceObjectiveName)  {
                Write-host "Secondary upgrade started at $(Get-Date) for database $PartnerDatabaseName"
                $OutputCapture += "Secondary upgrade started at $(Get-Date) for database $PartnerDatabaseName"
               
                Set-AzSqlDatabase -ResourceGroupName $PartnerResourceGroupName -DatabaseName $PartnerDatabaseName -ServerName $PartnerServerName -Edition "BusinessCritical"  -RequestedServiceObjectiveName $DatabaseInfo.RequestedServiceObjectiveName
                Write-host "Secondary end at $(Get-Date)"
                $OutputCapture += "Secondary end at $(Get-Date)"
        
                
                # Start Track Progress
                $activities = Get-AzSqlDatabaseActivity -ResourceGroupName $PartnerResourceGroupName -ServerName $PartnerServerName -DatabaseName $PartnerDatabaseName |
                Where-Object {$_.State -eq "InProgress" -or $_.State -eq "Succeeded" -or $_.State -eq "Failed"} |  Sort-Object -Property StartTime -Descending | Select-Object -First 1

                if ($activities.Count -gt 0) {
                    Write-Host "Operations in progress or completed for $($PartnerDatabaseName):"
                    $OutputCapture += "Operations in progress or completed for $($PartnerDatabaseName):"
                    foreach ($activity in $activities) {
                    Write-Host "Activity Start Time: $($activity.StartTime) , Activity Estimated Completed Time: $($activity.EstimatedCompletionTime) , Activity ID: $($activity.OperationId), Server Name: $($activity.ServerName), Database Name: $($activity.DatabaseName), Status: $($activity.State), Percent Complete: $($activity.PercentComplete)%, Description: $($activity.Description)"
                    $OutputCapture += "Activity Start Time: $($activity.StartTime) , Activity Estimated Completed Time: $($activity.EstimatedCompletionTime) , Activity ID: $($activity.OperationId), Server Name: $($activity.ServerName), Database Name: $($activity.DatabaseName), Status: $($activity.State), Percent Complete: $($activity.PercentComplete)%, Description: $($activity.Description)"
                    }
                    Write-Host  "$PartnerDatabaseName Upgrade Successfully Completed!"
                    $OutputCapture += "$PartnerDatabaseName Upgrade Successfully Completed!"
                } else {
                    Write-Host "No operations in progress or completed for $($PartnerDatabaseName)"
                    $OutputCapture += "No operations in progress or completed for $($PartnerDatabaseName)"
                }
                # End Track Progress
               # 
            }
            else {
                Write-host "Database $PartnerDatabaseName is already upgraded."
                $OutputCapture += "Database $PartnerDatabaseName is already upgraded."
            }
        }

        if ($UpdatePrimary) {
            $DatabaseProperties = Get-AzSqlDatabase -ResourceGroupName $PrimaryResourceGroupName -ServerName $PrimaryServerName -DatabaseName $PrimaryDatabaseName
           # $DatabaseEdition = $DatabaseProperties.Edition
            $DatabaseSKU = $DatabaseProperties.RequestedServiceObjectiveName
            if ($DatabaseSKU -ne $DatabaseInfo.RequestedServiceObjectiveName){
            Write-host "Primary upgrade started at $(Get-Date) for database $PrimaryDatabaseName"
            $OutputCapture += "Primary upgrade started at $(Get-Date) for database $PrimaryDatabaseName"
            Set-AzSqlDatabase -ResourceGroupName $PrimaryResourceGroupName -DatabaseName $PrimaryDatabaseName -ServerName $PrimaryServerName -Edition "BusinessCritical"  -RequestedServiceObjectiveName $DatabaseInfo.RequestedServiceObjectiveName
            Write-host "Primary end at $(Get-Date)" 
            $OutputCapture += "Primary end at $(Get-Date)"
            

            # Start Track Progress
            $activities = Get-AzSqlDatabaseActivity -ResourceGroupName $PrimaryResourceGroupName -ServerName $PrimaryServerName -DatabaseName $PrimaryDatabaseName |
            Where-Object {$_.State -eq "InProgress" -or $_.State -eq "Succeeded" -or $_.State -eq "Failed"} |  Sort-Object -Property StartTime -Descending | Select-Object -First 1

            if ($activities.Count -gt 0) {
                Write-Host "Operations in progress or completed for $($PrimaryDatabaseName):"
                $OutputCapture += "Operations in progress or completed for $($PrimaryDatabaseName):"
                foreach ($activity in $activities) {
                Write-Host "Activity Start Time: $($activity.StartTime) , Activity Estimated Completed Time: $($activity.EstimatedCompletionTime) , Activity ID: $($activity.OperationId), Server Name: $($activity.ServerName), Database Name: $($activity.DatabaseName), Status: $($activity.State), Percent Complete: $($activity.PercentComplete)%, Description: $($activity.Description)"
                $OutputCapture += "Activity Start Time: $($activity.StartTime) , Activity Estimated Completed Time: $($activity.EstimatedCompletionTime) , Activity ID: $($activity.OperationId), Server Name: $($activity.ServerName), Database Name: $($activity.DatabaseName), Status: $($activity.State), Percent Complete: $($activity.PercentComplete)%, Description: $($activity.Description)"
                }
                Write-Host  "$PrimaryDatabaseName Upgrade Successfully Completed!" 
                $OutputCapture += "$PrimaryDatabaseName Upgrade Successfully Completed!"
            } else {
                Write-Host "No operations in progress or completed for $($PrimaryDatabaseName)"
                $OutputCapture += "No operations in progress or completed for $($PrimaryDatabaseName)"
            }
            # End Track Progress
           #           
            }
            else {
                Write-host "Database $PrimaryDatabaseName is already upgraded."
                $OutputCapture += "Database $PrimaryDatabaseName is already upgraded."
            }
        }
        
        if (!$UpdateSecondary -and !$UpdatePrimary) {
            $DatabaseProperties = Get-AzSqlDatabase -ResourceGroupName $DatabaseInfo.ResourceGroupName -ServerName $DatabaseInfo.ServerName -DatabaseName $DatabaseInfo.DatabaseName
            # $DatabaseEdition = $DatabaseProperties.Edition
             $DatabaseSKU = $DatabaseProperties.RequestedServiceObjectiveName
        If ($DatabaseSKU -ne $DatabaseInfo.RequestedServiceObjectiveName)  {
            Write-Host "No Replica Found."
            $OutputCapture += "No Replica Found."
            Write-host "Upgrade started at $(Get-Date)"
            $OutputCapture += "Upgrade started at $(Get-Date)"
            Set-AzSqlDatabase -ResourceGroupName $DatabaseInfo.ResourceGroupName -DatabaseName $DatabaseInfo.DatabaseName -ServerName $DatabaseInfo.ServerName -Edition "BusinessCritical"  -RequestedServiceObjectiveName $DatabaseInfo.RequestedServiceObjectiveName
            Write-host "Upgrade completed at $(Get-Date)"
            $OutputCapture += "Upgrade completed at $(Get-Date)"

            # Start Track Progress
            $activities = Get-AzSqlDatabaseActivity -ResourceGroupName $DatabaseInfo.ResourceGroupName -ServerName $DatabaseInfo.ServerName -DatabaseName $DatabaseInfo.DatabaseName |
            Where-Object {$_.State -eq "InProgress" -or $_.State -eq "Succeeded" -or $_.State -eq "Failed"} |  Sort-Object -Property StartTime -Descending | Select-Object -First 1

            if ($activities.Count -gt 0) {
                Write-Host "Operations in progress or completed for $($DatabaseInfo.DatabaseName):"
                $OutputCapture += "Operations in progress or completed for $($DatabaseInfo.DatabaseName):"
                foreach ($activity in $activities) {
                Write-Host "Activity Start Time: $($activity.StartTime) , Activity Estimated Completed Time: $($activity.EstimatedCompletionTime) , Activity ID: $($activity.OperationId), Server Name: $($activity.ServerName), Database Name: $($activity.DatabaseName), Status: $($activity.State), Percent Complete: $($activity.PercentComplete)%, Description: $($activity.Description)"
                $OutputCapture += "Activity Start Time: $($activity.StartTime) , Activity Estimated Completed Time: $($activity.EstimatedCompletionTime) , Activity ID: $($activity.OperationId), Server Name: $($activity.ServerName), Database Name: $($activity.DatabaseName), Status: $($activity.State), Percent Complete: $($activity.PercentComplete)%, Description: $($activity.Description)"
                }
                Write-Host "$($DatabaseInfo.DatabaseName) Upgrade Successfully Completed!"
                $OutputCapture += "$($DatabaseInfo.DatabaseName) Upgrade Successfully Completed!"
            } else {
                Write-Host "No operations in progress or completed for $($DatabaseInfo.DatabaseName)"
                $OutputCapture += "No operations in progress or completed for $($DatabaseInfo.DatabaseName)"
            }
            # End Track Progress
           # Write-Host  " "$DatabaseInfo.DatabaseName" Upgrade Successfully Completed!"
        }
        else {
            Write-host "Database $($DatabaseInfo.DatabaseName) is already upgraded."
            $OutputCapture += "Database $($DatabaseInfo.DatabaseName) is already upgraded."
        }
    }
    }
    Catch
    {
        # Catch any error
        Write-Output "Error occurred: $_"
        $OutputCapture += "Error occurred: $_"
    }
    Finally
    {
        Write-Host "Upgrade Successfully Completed!"
        $OutputCapture += "Upgrade Successfully Completed!"
            # Output the captured messages to the file
            $OutputCapture | Out-File -FilePath $OutputFilePath
    }
  
}

# Loop through each database and start a background job
foreach ($DatabaseInfo in $DatabaseInfoList) {
    Start-Job -ScriptBlock $ScriptBlock -ArgumentList $DatabaseInfo
}

# Wait for all background jobs to complete
Get-Job | Wait-Job

# Retrieve and display job results
#Get-Job | Receive-Job
Get-Job | ForEach-Object {
    $Job = $_
    $OutputFilePath = "C:\temp\$($Job.Id)_Output.txt"
    Receive-Job -Job $Job | Out-File -FilePath $OutputFilePath  # Append job output to the text file
}

# Clean up background jobs
Get-Job | Remove-Job -Force
Write-Host "Execution Completed successfully."

 

Este Mês no Azure Static Web Apps | 07/2024

This article is contributed. See the original author and article here.



 


Welcome to the first edition of the Azure Static Web Apps Community! Every month, we will share the content that the technical community has created about Azure Static Web Apps, whether in the form of articles, videos, or podcasts.

Want your content shared on TechCommunity in #ThisMonthInAzureStaticWebApps? Here's how!

  • 1 – Create an article, video, podcast, or even an open source project that talks about or relates to Azure Static Web Apps.

  • 2 – Share your content on Twitter, LinkedIn, or Instagram with the hashtag #AzureStaticWebApps.

  • 3 – Also share it in our official Azure Static Web Apps repository on GitHub, under the Discussions tab. There you will find a topic called This Month In Azure Static Web Apps. Post the link to your content there under the month in which you would like it to be shared.

  • 4 – That's it! We will share your content on the Microsoft TechCommunity the following month!

Regardless of the language you write in, whether Portuguese, English, Spanish, French, German, or any other, we want to share your content!

Likewise, if you are using Azure Static Web Apps with another service or technology, feel free to share your content. The same goes for whatever programming language you are using: JavaScript, TypeScript, Python, Java, C#, Go, Rust, or anything else, we want to share your content!

One more detail: you don't need to be an expert in Azure Static Web Apps to share your content. If you are learning about the service and want to share your journey, feel free to share it!

Now, let's get to the content for the month of July!

Acknowledgments!

Before we get to the content, we would like to thank everyone who shared their content during the month of July! You are amazing!

Shared Content | July 2024

Now let's look at the content shared in July 2024!

Adding an API to an Azure hosted React Static Web App

  • Author: Janek Fellien

The article explains how to add an API to a React Static Web App hosted on Azure, setting up a development environment with the SWA CLI and VS Code, creating an HTTP function in C#, and integrating the API with the React app to display data on the site.

Want to learn how to connect your React application to an API on Azure? Read the full article and discover how to add dynamic functionality to your project.

Link: Adding an API to an Azure hosted React Static Web App

Hosting Next.JS Static Websites on Azure Static Web App

  • Author: Parveen Singh

The article explains how to host static sites built with Next.js using Azure Static Web Apps, covering everything from GitHub repository setup to continuous deployment on Azure. It is an ideal solution for developers looking for simplicity, security, and scalability for their sites.

Want to learn how to host your Next.js site on Azure simply and efficiently? Read the full article and discover how to take advantage of Azure Static Web Apps!

Link: Hosting Next.JS Static Websites on Azure Static Web App

How to Deploy a React PWA to Azure Static Web Apps

  • Author: Petkir

This article shows how to implement and automate the deployment of a React PWA application to Azure Static Web Apps using GitHub Actions and Azure DevOps, as well as how to generate the necessary resources with Bicep.

Want to learn how to simplify the deployment of your React PWA applications? Read the full article and discover how to automate everything using GitHub Actions and Azure DevOps!

Link: How to Deploy a React PWA to Azure Static Web Apps

Azure Static Web App: Seamless Microsoft Entra (Azure AD) Integration with Angular

  • Author: Sparrow Note YouTube Channel | Althaf Moideen Konnola

This video shows how to integrate Microsoft Entra (Azure AD) with an Azure Static Web App using Angular, including SSO configuration, app registration, and displaying user information.

Want to learn how to integrate Microsoft Entra authentication into your Angular applications? Watch now and master this essential integration for a unified sign-in experience!

Link: Azure Static Web App: Seamless Microsoft Entra (Azure AD) Integration with Angular

Trimble Connect Workspace API 007 – Deployment

  • Author: LetsConstructIT YouTube Channel

The video demonstrates how to deploy a local application to the cloud using Azure Static Web Apps, making it accessible on the web and integrated with Trimble Connect, including the configuration of custom extensions.

Want to learn how to deploy your applications to the cloud simply and integrate them with Trimble Connect? Watch the full video and find out how!

Link: Trimble Connect Workspace API 007 – Deployment

Blazor WASM Publishing to Azure Static Web Apps

  • Author: Abdul Rahman | Regina Sharon

The article shows how to publish Blazor WebAssembly (WASM) applications to Azure Static Web Apps, covering everything from initial project setup to resolving common problems, such as 404 errors on page refresh. It also explains how to customize the build process and configure custom domains.

Want to learn how to publish your Blazor WASM applications to Azure simply and effectively? Read the full article and discover how to configure everything step by step, ensuring your application works perfectly!

Link: Blazor WASM Publishing to Azure Static Web Apps

Azure Static Web Apps Community Standup: Create a RAG App with App Spaces

  • Author: Skyler Hartle | Dheeraj Bandaru

The video introduces App Spaces, a new Azure tool that simplifies the creation and management of intelligent applications, especially when integrating Azure Static Web Apps and Azure Container Apps. During the session, it demonstrates how to create and deploy a Retrieval-Augmented Generation (RAG) application using a simple interface that connects GitHub repositories and automates the CI/CD process.

Discover how to simplify building intelligent applications with Azure App Spaces! Watch the full video and learn how to deploy a RAG application in minutes. Don't miss this opportunity to level up your cloud development skills!

Link: Azure Static Web Apps Community Standup: Create a RAG App with App Spaces

Serverless Single Page Application (Vue.js) mit Azure Static Web Apps

  • Author: Florian Lenz

  • Language: German

This article shows how to create and deploy a single-page application (SPA) using Vue.js and Azure Static Web Apps. It guides the reader from creating the project to adding a serverless backend with Azure Functions, highlighting the ease of use and the advantages of the serverless model for full-stack applications.

Want to learn how to deploy your Vue.js application to the cloud with Azure Static Web Apps and take advantage of serverless? Read the full article and discover how to create and manage a full-stack application simply and efficiently!

Link: Serverless Single Page Application (Vue.js) mit Azure Static Web Apps

Conclusion

If you want to be featured in the next #ThisMonthInAzureStaticWebApps article, share your content on social media with the hashtag #AzureStaticWebApps and also in our official GitHub repository. We look forward to sharing your content next month!

Remember, you don't need to be an expert in Azure Static Web Apps to share your content. If you are learning about the service and want to share your journey, feel free to share it!

See you in the next edition!

Simplify development with Dev Container templates for Azure SQL Database

This article is contributed. See the original author and article here.

What are Dev Containers?


A development container essentially packages up your project’s development environment using the Development Container Specification (devcontainer.json). This specification enriches your container with metadata and content necessary to enable development from inside a container.


Workspace files are mounted from the local file system or copied or cloned into the container. Extensions are installed and run inside the container, where they have full access to the tools, platform, and file system. This means that you can seamlessly switch your entire development environment just by connecting to a different container.




 


Dev Container Templates are source files packaged together that encode configuration for a complete development environment, while Dev Container Features allow us to add runtimes, tools, and libraries inside a container. As a result, all this put together ensures a consistent and reproducible development environment from any tool that supports the Development Container Specification.
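
To make that concrete, here is a minimal, hypothetical devcontainer.json sketch; the image, feature, and extension identifiers below are illustrative assumptions rather than the exact contents of the Azure SQL Database templates:

{
  "name": "my-app (example)",
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
  "features": {
    "ghcr.io/devcontainers/features/azure-cli:1": {}
  },
  "customizations": {
    "vscode": {
      "extensions": [ "ms-mssql.mssql" ]
    }
  },
  "forwardPorts": [ 1433 ]
}

A template packages files like this (along with any Dockerfile or docker-compose assets) so that everyone who reopens the project in a container gets the same tools and extensions.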


When you open your project in the dev container, your code will just work without downloading anything on your local machine. Furthermore, the best part is that when connected to a dev container, your developer experience is exactly the same as if you opened the project locally in VS Code.


 


Introducing Dev Container Templates for Azure SQL Database


We are excited to introduce new Dev Container templates specifically designed for Azure SQL Database. These templates support multiple programming languages, including .NET 8, .NET Aspire, Python, and Node.js, making it easier for developers to get started quickly and focus on building their applications.


Dev Containers streamline the development process by providing an out-of-the-box environment configured for Azure SQL Database. This eliminates the need for developers to spend time searching for and setting up VS Code extensions to interact with their database and preferred programming language. With these templates, you can dive straight into coding, boosting productivity and reducing setup friction.


 




 


Included with the templates is a pre-built demo database called Library, which serves as a practical example to help developers get started quickly. While these Dev Containers use the Azure SQL Edge container image, which offers a surface area close to Azure SQL Database, using SQL Database Projects ensures that your database code remains compatible with Azure SQL Database. With this demo project, you can easily use the dacpac artifact created by SQL Database Projects and deploy it to Azure SQL Database using the Azure SQL Action for GitHub Actions. This process streamlines your workflow and ensures seamless integration with your production environment.


Whether working locally or in the cloud, dev containers ensure consistency across development environments, making it easier to collaborate and maintain high standards across your team. With the inclusion of essential tools like SQLCMD, SqlPackage, Azure Command-Line Interface (CLI) and Azure Developer CLI (AZD), these templates offer a comprehensive solution for enhancing your development workflow with Azure SQL Database.


 


Benefits of Using Dev Containers


Dev Containers ensure a consistent and seamless experience, promoting smooth collaboration across teams and workflows, and facilitating an easy transition to Azure environments. Key benefits include:



  • Preconfigured environments: These come with all necessary tools and dependencies.

  • Consistency: Maintain uniformity across different development setups.

  • Simplified setup: Reduce time spent on configuration.

  • Enhanced collaboration: Improve teamwork within development teams.

  • Seamless transition to Azure: Leverage the scalability and reliability of Azure SQL Database for production deployments.

  • Accelerated time-to-market: Streamline development workflows and integrate seamlessly with existing toolsets, giving businesses a competitive edge.

  • Cost-efficient development: Reduce dependencies on cloud resources during the development and testing phases.


By using dev containers, developers can avoid the hassle of setting up and configuring their local development environment manually.


 


Prerequisites


Before you begin, make sure you have the following tools installed on your local machine. To set up your environment, follow these steps:



  1. First, ensure you have Git installed for version control.

  2. Then, install Docker, which is necessary for running containers.

  3. After that, download and install Visual Studio Code, as it will be your primary IDE for using Dev Containers.

  4. Lastly, add the Dev Containers extension to Visual Studio Code to enable seamless containerized development.


 


Setting up the Dev Container template for Azure SQL Database


Creating a Dev Container



Begin by either opening a local folder containing your application project or cloning an existing repository into Visual Studio Code. This initial step sets the stage for integrating your project with a development container, whether you're starting from scratch or working on an existing application. In Visual Studio Code, open the command palette (press F1, or Ctrl+Shift+P on Windows and Cmd+Shift+P on macOS) and select the Dev Containers: Add Dev Container Configuration Files command.



Select the Add configuration file to workspace option if you want to add the dev container configuration file to your current local repository. Alternatively, choose the Add configuration file to user data folder option. For this quickstart, select the Add configuration file to workspace option.



Visual Studio Code prompts you to select a Dev Container template. The available templates are based on the tools and dependencies required for the specific development environment. Select Show All Definitions to view all available templates.



Next, select the desired Dev Container template for Azure SQL Database by typing Azure SQL into the command palette. This action displays a list of available templates designed for Azure SQL Database development.




Building the Container


Upon selection, Visual Studio Code automatically generates the necessary configuration files tailored to the chosen template. These files include settings for the development environment, extensions to install, and Docker configuration details. They’re stored in a .devcontainer folder within your project directory, ensuring a consistent and reproducible development environment.




Following the configuration file generation, Visual Studio Code prompts you to transition your project into the newly created Dev Container environment. You can do it by selecting Reopen in Container. This step is crucial as it moves your development inside the container, applying the predefined environment settings for Azure SQL development.



If you haven't already, you can also initiate this transition manually at any time using the Dev Containers extension. Use the Reopen in Container command from the command palette, or select the blue icon at the bottom left corner of Visual Studio Code and choose Reopen in Container.



This action initiates the setup process, where Visual Studio Code generates the necessary configuration files and builds the development container based on the selected template. The process ensures that your development environment is precisely configured for Azure SQL Database development.



Visual Studio Code builds the container based on the selected configuration. The build process might take a few minutes the first time.



Exploring and verifying the Dev Container


After you build the dev container, start exploring and verifying the setup. Open a terminal within Visual Studio Code to check that all necessary tools are installed and working correctly.
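
For example, assuming the container includes the tools listed earlier (SQLCMD, SqlPackage, the Azure CLI, and the Azure Developer CLI), a few quick checks in the integrated terminal might look like this; exact flags can vary by tool version:

az --version          # Azure CLI
azd version           # Azure Developer CLI
sqlcmd -?             # SQLCMD help/usage
sqlpackage /version   # SqlPackage build number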




As an optional step, you can also run predefined tasks directly from the command palette, streamlining your development workflow and allowing you to focus on writing code.




For more detailed information about specific templates, visit Azure SQL Database Dev Container templates.


Conclusion


Dev Containers for Azure SQL Database offer a powerful and efficient way to streamline your development process. By providing a consistent, portable environment, they help you focus on writing code and building features rather than configuring your setup. We encourage you to explore these templates and see how they can enhance your development workflow for Azure SQL Database.


Looking ahead, we will delve into more advanced topics like integrating Azure services with Dev Containers to further optimize your development process. Stay tuned for more insights and practical guides to help you get the most out of Azure SQL Database and Dev Containers.


 


More about Dev Container templates for Azure SQL Database.

New cloud-ready application templates for accelerating your development

This article is contributed. See the original author and article here.



New extensible blueprint templates are available to help accelerate your app development. Each of these templates is fully working and ready to deploy reusable infrastructure and proof-of-concept code through the Azure Developer CLI to GitHub Codespaces or VS Code.


 



 


Explore the entire template library: browse for fully working, cloud-ready applications to deploy with Azure


 


If you are using app advisor, the self-guided experience that surfaces the latest resources and recommendations based on your current development stage, you will also be presented with templates relevant to that stage.


 


*Note that this list of templates is subject to change as new templates become available.


 

Grow your Business with Copilot for Microsoft 365 – August 2024

This article is contributed. See the original author and article here.



 


Welcome back to Grow Your Business with Copilot for Microsoft 365, a monthly series designed to empower small and midsized businesses to harness the power of AI at work. 


 


My team works with a wide range of small and midsized businesses. And while each is unique in its own way, we've found that regardless of size, industry, or market, they all want the same thing: to grow. To attract more customers. To boost revenue. To scale efficiently.


 


Make sure to also check out our weekly Copilot productivity series that just launched, as well as the new Copilot Success Kit, your one-stop shop for getting ready and implementing Copilot.


 


PKSHA Technology – Embracing AI


Staying on the cutting edge – PKSHA Technology is doing just that by using the power of Copilot for Microsoft 365 to grow its business and evangelize AI to its customers so they can do the same.


 


PKSHA Technology is a midsized company based in Tokyo, Japan. PKSHA develops algorithmic solutions and AI technologies that help companies become more efficient and improve their processes – they believe algorithms can solve some of the world's biggest challenges. With effective rollout techniques, PKSHA leveraged Copilot to create new-hire shortcuts, improve their customer management, and shorten the process from product roadmap to feature enhancements.


An image of PKSHA's business sign


Onboarding Shortcuts with AI 


As PKSHA experienced rapid growth and hired new employees, like most businesses they found pain points in the onboarding process. It was difficult to ensure new hires had access to, or could find, the information they needed. Onboarding new employees and getting them up to speed can also be a very demanding process for your current employees.

With the help of Copilot, PKSHA employees now task Copilot with searching for the information they need. This ultimately shortens the time between a new hire's first day on the job and making a true impact! It also frees up time for those tasked with onboarding them into the role, taking advantage of the fact that much of the company's internal knowledge is now at their fingertips with Copilot.

There are many ways that Copilot can help accelerate onboarding. For example, while attending a team meeting, new hires can use Copilot to ask clarifying questions. The "personal chat" with Copilot allows you to ask questions about the meeting without interrupting its flow. As a new hire, creating documents, proposals, or papers can be hard while you are still learning the tone, voice, and preferred format of your new company. Using Copilot in Word, you can reference other documents to get to your first draft faster. Managers can also use Copilot to create onboarding documents and processes much faster, helping employees orient themselves to their new organization.


A screenshot of Copilot in Teams Meeting personal chat


 


Customer Management 


High-touch customer service can be a very time-consuming task that requires thorough preparation and detailed follow-up communications. Prior to Copilot, PKSHA Customer Success specialist Ms. Takeuchi would spend hours preparing information before calls and afterwards transcribing notes and documenting follow-up actions. Now, she uses Copilot to quickly assemble materials in advance, organize to-dos, and share action items with customers immediately after meetings. With her administrative workload considerably reduced by Copilot in Teams, Ms. Takeuchi is able to dedicate more time to her customers and the activities that matter most, maximizing care, attention, and service quality.


A screenshot of Copilot in Teams


 


Product Development 


A streamlined customer feedback loop that feeds into an issues list and, ultimately, product enhancements… sounds like an operational dream. With Copilot, PKSHA is bringing that dream closer to reality. The PKSHA team leverages Copilot in Teams and Excel to gather customer intel and feedback. The team uses Copilot in Teams to summarize and organize the product feedback they receive, making it easy to surface product needs and create a centralized log of possible product improvements. This process creates a shared knowledge base that team members across their product groups can reference, instead of disparate information silos, resulting in greater coordination and faster delivery of product enhancements. In parallel, the customer success team also uses Copilot in Excel to identify trends in customer data. These trends help the team create meaningful recommendations for their customers. With Copilot, the team saves up to 4 hours of time spent on data analysis overall.


A screenshot of Copilot in Excel


 


Creating AI Champions 


When introducing any new technology tools in the workplace, it's crucial to have the right adoption plan in place. Often a pilot group is part of any successful rollout plan. The pilot approach is baked into PKSHA's vision for their company: PKSHA utilizes new AI solutions internally first to better evaluate how those solutions can address client needs. To both test and drive the internal adoption of AI, PKSHA created their Future Work Black Belt Team. Creating an AI leadership team is a best practice that Microsoft has witnessed across its Copilot customer base. Read more details about how to stand up your own AI Council here.


A quote from Mr. Kensuke Yamamoto, Executive Office & Head of Development at PKSHA Technology: “Copilot is a tool that supports our main mission of defining and shaping the software of the future. Copilot is part of our own future way of working so we can lead our customers to the right future workplace.”


 


Accelerating AI innovation with Copilot 


The productivity and collaboration benefits of Copilot enable the team at PKSHA to focus more on their core mission of creating better AI solutions and technologies. Just like PKSHA is all about harnessing the power of algorithms to solve some of the world’s biggest challenges, Copilot gives them the power to fuel their innovation, creativity and efficiency amidst their AI development. 


 


We are so excited to see PKSHA and other small and medium companies harness the power of Copilot to grow! Tune in next month for another example of how Copilot helps unlock more value and opportunity. If your company has used Copilot for Microsoft 365 to grow and you’d like to share your story, we’d love to feature you! Comment below to let us know you’re interested and a member from our team will get in touch!


 




Want to try out some of the ways PKSHA used Copilot for Microsoft 365? Check out the following resources: 





  • Check out the new SMB Success Kit and accelerate your Copilot adoption today 



For adoption content visit Microsoft 365 Adoption – Get Started 


For the latest SMB AI insights follow Microsoft 365 blog 


 


Angela Byers


Microsoft


Senior Director, Copilot & Growth Marketing for SMB


An image of the SMB Copilot team at Microsoft, with Angela Byers, Elif Algedik, Kayla Patterson, Briana Taylor, and Gabe Ho


Meet the team 


The monthly series, Grow Your Business with Copilot for Microsoft 365, is brought to you by the SMB Copilot marketing team at Microsoft. From entrepreneurs to coffee connoisseurs, they work passionately behind the scenes, sharing the magic of Copilot products with small and medium businesses everywhere. Always ready with a smile, a helping hand, and a clever campaign, they’re passionate about helping YOUR business grow!  


 

Enhancing Supply Chain Integrity: introducing quality control for goods in-transit orders in Dynamics 365 SCM 

This article is contributed. See the original author and article here.

Introduction 

In fast-paced, complex supply chain environments, ensuring product quality throughout the journey from supplier to customer is more critical than ever. We’re excited to address this with a powerful new feature in Microsoft Dynamics 365 Supply Chain Management Landed Cost module, enabling quality control for goods in-transit orders. 

Addressing a Critical Gap in Supply Chain Management 

Traditionally, quality control measures in supply chain management focus on the initial stages of production and the receipt of purchase orders at their final destination. We see a growing need for more comprehensive quality assurance processes covering all phases, including the in-transit phase. Now, businesses can conduct quality checks on goods while they are in transit. This new feature ensures product integrity throughout the entire supply chain journey.

How it works 

The quality control for goods in-transit feature is seamlessly integrated into the Dynamics 365 SCM framework. Here’s how it enhances the supply chain process: 

Setup of goods in-transit orders in quality associations: Businesses can now define the goods in-transit order as a new quality association type with a pre-defined event-blocking approach. This proactive measure ensures any potential quality issues can be identified and addressed before the goods reach their final destination.
View of Quality Control menu in Dynamics 365

Automatic quality order creation: During the goods in-transit order registration/receive operation, depending on the previous step's configuration, the corresponding quality order is created automatically to reflect the quality control.

Order control and release: Depending on the configuration, an incomplete or failed quality order will block the downstream business operation. This control makes it easy for businesses to adopt and implement the feature without significant changes to their current quality control processes for goods in-transit orders.

Benefits of Quality Control for Goods In-Transit


Implementing quality control for goods in-transit offers several significant advantages:

  • Enhanced Supply Chain Reliability: By ensuring quality at every stage, businesses can significantly reduce the risk of receiving defective or non-compliant goods.
  • Cost Efficiency: Early detection of quality issues minimizes the need for costly rework or returns, leading to substantial cost savings.
  • Regulatory Compliance: The feature supports compliance with various regulatory standards, ensuring that products meet all necessary legal requirements.
  • Improved Customer Satisfaction: Delivering high-quality products consistently enhances customer trust and satisfaction, ultimately driving business growth.
Conclusion

The introduction of quality control for goods in-transit orders in Microsoft Dynamics 365 SCM represents a significant advancement in supply chain management. It empowers businesses to ensure product quality at every stage of the supply chain, from production to final delivery. By adopting this feature, companies can enhance their supply chain integrity, reduce costs, comply with regulatory standards, and deliver superior products to their customers.

Stay tuned for more updates as we continue to innovate and expand the capabilities of Microsoft Dynamics 365 SCM to meet the evolving needs of the global supply chain.

Call to action

Enable the feature: Turn on the Landed cost module and related features for your system – Supply Chain Management | Dynamics 365 | Microsoft Learn

Learning Article:   Quality orders – Supply Chain Management | Dynamics 365 | Microsoft Learn

The post Enhancing Supply Chain Integrity: introducing quality control for goods in-transit orders in Dynamics 365 SCM  appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Announcing SharePoint Embedded Fall Tour Events and Schedule

This article is contributed. See the original author and article here.

Kick off your journey with SharePoint Embedded. At the SharePoint Embedded for Enterprise Apps events, you’ll explore best practices for your projects, glimpse the future of SharePoint Embedded, and learn to integrate Copilot into document-centric apps. We’re eager for your feedback and experiences; your creations shape ours.


 


The SharePoint Embedded product team is coming to New York City and London in September! Come join us for an all-day event to learn how SharePoint Embedded can deliver Copilot, Collaboration, Compliance, and Core Enterprise Storage for your document-centric apps.

Specifically, you’ll have the opportunity to do the following:



  • Learn about SharePoint Embedded, a new way to build file and document centric apps.

  • Get hands-on coding experience with this new technology and learn how to build your own custom app.

  • Take a deep dive into critical features, like compliance, collaboration, and Copilot.

  • Hear from others who have implemented SharePoint Embedded solutions.

  • Get insight into the SharePoint Embedded roadmap



New York City, US
Date: Thursday, September 12th, 9AM-7PM (times are approximate, including social hour)
Where: Microsoft Offices NYC Times Square 

London, UK
Date: Thursday, September 26th, 9AM-7PM (times are approximate, including social hour)
Where: Central London, UK (Exact location TBD)

RSVP Details (Please note that this event is only open to certain countries and the following will not be accepted: Russia, Belarus)

  • 21+, free event, no registration fees
  • First come, first served (limited seats)
    • 1 RSVP = 1 person
  • NDA required (if your company does not have an NDA on record, one will be sent)
    • NDA must be signed to attend the event
  • Event will be IN PERSON ONLY and will not be recorded
  • Bring your own device for coding portions (tablets and smartphones will not work)

To register for one or more of these events visit Microsoft SharePoint Embedded for Enterprise Apps (office.com).

Your Board, Your Way – Optimize schedule board navigation patterns in Universal Resource Scheduling

This article is contributed. See the original author and article here.

Leaving the schedule board today can be cumbersome because you have to re-enter your preferred settings every time you come back. You may also find it frustrating that your admin has the power to override your choices and reset the board to the default settings. Wouldn’t it be nice if you could save your personal preferences and have them ready when you need them? The schedule board now boasts improved navigation patterns to help YOU manage your schedules more efficiently!

Remember my board

The schedule board now works with your computer’s local cache to reload with the last accessed parameters as chosen by you, no configuration necessary! That means you can leave the schedule board to check on your resources, update requirements, or even grab a hot cuppa, all while your board stays the way you left it.

The cache will save and reload the following parameters automatically (a rough illustration of this caching pattern follows the list):

  1. Last accessed tab: save time by not having to reload the tab, its relevant resources, and bookings
  2. Map panel open/closed: the map remains in the state you left it in
  3. View type: Gantt or list view
  4. View mode: hourly, daily, or weekly
  5. Board start date: continue with the last accessed date range; resets to today's date after 15 minutes
  6. Column width: the zoom level of the board stays the way you want it
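
Under the hood, this behaves like the familiar pattern of persisting UI state to the browser's local storage and restoring it on reload. The sketch below is purely illustrative of that pattern; the interface, storage key, and field names are assumptions, not the schedule board's actual implementation.

  // Illustrative only: a hypothetical shape for locally cached schedule board settings.
  // None of these key or field names reflect the product's actual implementation.
  interface BoardSettings {
    lastTabId: string;                         // last accessed tab
    mapPanelOpen: boolean;                     // map panel open/closed
    viewType: "gantt" | "list";                // view type
    viewMode: "hourly" | "daily" | "weekly";   // view mode
    boardStartDate: string;                    // ISO date; reset to today after 15 minutes
    columnWidth: number;                       // zoom level
    savedAt: number;                           // timestamp used for the 15-minute reset
  }

  const STORAGE_KEY = "scheduleBoardSettings"; // hypothetical storage key

  function saveSettings(settings: BoardSettings): void {
    localStorage.setItem(STORAGE_KEY, JSON.stringify({ ...settings, savedAt: Date.now() }));
  }

  function loadSettings(): BoardSettings | null {
    const raw = localStorage.getItem(STORAGE_KEY);
    if (!raw) return null;
    const settings = JSON.parse(raw) as BoardSettings;
    if (Date.now() - settings.savedAt > 15 * 60 * 1000) {
      // Older than 15 minutes: fall back to today's date range.
      settings.boardStartDate = new Date().toISOString().slice(0, 10);
    }
    return settings;
  }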

Many of our users have told us about their struggles trying to return to today's date when switching between date ranges. We've thus added a new "Today" button next to the date range control that helps you quickly return to today's date range, wherever you may be.

What if you want to share your settings with others or add a bookmark of your settings to your browser? We've added a new one-click button that generates a URL capturing all of the following schedule board parameters:

  1. Last accessed tab
  2. Map panel open/closed
  3. View type: Gantt or list view
  4. View mode: hourly, daily, or weekly
  5. Column width: the zoom level of the board

Saving and sharing your favorite board setup has never been easier! 

Step 1: Click on the “…” more button at the top right of the schedule board 

Step 2: Click on “Copy link” button 

Step 3: The generated link is copied to your clipboard.

The use cases are numerous, for example:

  1. Add the copied link to a bookmark in your browser. Whenever you click this bookmarked link, the browser will launch the board with your preferred parameters.
  2. Share the link with your colleagues or team to pass along a setup that works for you, and help them optimize their workflow.

You can also configure Schedule Board URLs manually: Open the schedule board from a URL | Microsoft Learn 
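
As a rough sketch of the idea, a bookmarkable schedule board URL is just the board's address plus query parameters for each setting. The parameter names and org URL below are placeholders, not the documented ones; refer to the Microsoft Learn article linked above for the actual parameter reference.

  // Illustrative only: parameter names and the org URL are placeholders.
  const base = "https://yourorg.crm.dynamics.com/main.aspx"; // hypothetical org URL

  const params = new URLSearchParams({
    tab: "Field Service Team",  // last accessed tab
    viewType: "gantt",          // Gantt or list view
    viewMode: "daily",          // hourly/daily/weekly
    mapPanel: "open",           // map panel state
    columnWidth: "120",         // zoom level
  });

  const bookmarkUrl = `${base}?${params.toString()}`;
  console.log(bookmarkUrl); // add to a browser bookmark or share with a colleague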

More details on schedule board caching and URL addressability can be found here: Learn more about schedule board

The post Your Board, Your Way – Optimize schedule board navigation patterns in Universal Resource Scheduling appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

eDiscovery launches a modern, intuitive user experience

This article is contributed. See the original author and article here.

This month, we have launched a redesigned Microsoft Purview eDiscovery product experience in public preview. This improved user experience revolutionizes your data search, review, and export tasks within eDiscovery. Our new user-friendly and feature-rich eDiscovery experience is not just about finding and preserving data; it's about doing it with unprecedented efficiency and ease. The modern user experience of eDiscovery addresses some long-standing customer requests, such as enhanced search capabilities with MessageID, Sensitive Information Types (SITs), and sensitivity labels. It also introduces innovative features like draft query with Copilot and search using the audit log. These changes, driven by customer feedback and our commitment to innovation, offer tangible value by saving time and reducing costs in the eDiscovery process.

The new eDiscovery experience is exclusively available in the Microsoft Purview portal. The new Microsoft Purview portal is a unified platform that streamlines data governance, data security, and data compliance across your entire data estate. It offers a more intuitive experience, allowing users to easily navigate and manage their compliance needs.  


 


Unified experience 


One of the benefits of the new, improved eDiscovery is a unified, consistent, and intuitive experience across different licensing tiers. Whether your license includes eDiscovery (Standard) or eDiscovery (Premium), you can use the same workflow to create cases, conduct searches, apply holds, and export data. This simplifies the training and education process for organizations that upgrade their license and want to access premium eDiscovery features. Unlike the previous experience, where Content Search, eDiscovery (Standard), and eDiscovery (Premium) had different workflows and behaviors, the new experience lets you access eDiscovery capabilities seamlessly regardless of your license level. E5 license holders have the option to use premium features such as exporting cloud attachments and Teams conversation threading at the appropriate steps in the workflow. Moreover, users still have access to all existing Content Searches and both Standard and Premium eDiscovery cases on the unified eDiscovery case list page in the Microsoft Purview portal.


 


The new experience also strengthens the security controls for Content Search by placing them in an eDiscovery case. This allows eDiscovery administrators to control who can access and use existing Content Searches and generated exports. Administrators can add or remove users from the Content Search case as needed. This way, they can prevent unauthorized access to sensitive search data and stop Content Search when it is no longer required. Moreover, this helps maintain the integrity and confidentiality of the investigation process. The new security controls ensure that only authorized personnel can access sensitive data, reducing the risk of data breaches and complying with legal and regulatory standards.


 


Enhanced data source management 


Efficient litigation and investigation workflows hinge on the ability to precisely select data sources and locations in the eDiscovery process. This enables legal teams to swiftly preserve relevant information and minimize the risk of missing critical evidence. The improved data source picking capability allows for a more targeted and effective search, which is essential in responding to legal matters or internal investigations. It enables users to apply holds and conduct searches with greater accuracy, ensuring that all pertinent information is captured without unnecessary data proliferation. This improvement not only enhances the quality of the review, but also reduces the overall costs associated with data storage and management. 


 


The new eDiscovery experience also improves data source location mapping and management. You can now perform a user or group search with different identifiers and see their data hierarchy tree, including their mailbox and OneDrive. For example, eDiscovery users can use any of the following identifiers: name, user principal name (UPN), SMTP address, or OneDrive URL. The data source picker streamlines the eDiscovery workflow by displaying all potential matches and their locations, along with related sources such as frequent collaborators, group memberships, and direct reports. This allows for the addition of these sources to search or hold scope without relying on external teams for information on collaboration patterns, Teams/Group memberships, or organizational hierarchies.
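
For context, the same kind of identifier-to-location mapping can be seen by querying Microsoft Graph directly. The sketch below uses the standard /users and /users/{id}/drive endpoints and assumes an access token with appropriate permissions (for example, User.Read.All and Files.Read.All); it illustrates the general lookup, not the internals of the eDiscovery data source picker.

  // Resolve a person by UPN (or object ID) and fetch their OneDrive URL via Microsoft Graph.
  async function lookUpDataSource(upn: string, token: string) {
    const headers = { Authorization: `Bearer ${token}` };

    const userResp = await fetch(
      `https://graph.microsoft.com/v1.0/users/${encodeURIComponent(upn)}`,
      { headers }
    );
    const user = await userResp.json(); // displayName, mail, userPrincipalName, ...

    const driveResp = await fetch(
      `https://graph.microsoft.com/v1.0/users/${encodeURIComponent(upn)}/drive`,
      { headers }
    );
    const drive = await driveResp.json(); // webUrl is the person's OneDrive URL

    return { displayName: user.displayName, mailbox: user.mail, oneDriveUrl: drive.webUrl };
  }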


 


Figure 1: New data source view with the ability to associate a person's mailbox and OneDrive, explore a person's frequent collaborators, and query for data source updates.


The “sync” capability in the new data source management flow is a significant addition that ensures eDiscovery users are always informed about the latest changes in data locations. With this feature, users can now query whether a specific data source has newly provisioned data locations or if any have been removed. For example, if a private channel is created for a Teams group, this feature alerts eDiscovery users to the new site’s existence, allowing them to quickly and easily include it in their search scope, ensuring no new data slips through the cracks. This real-time update capability empowers users to make informed decisions about including or excluding additional data locations in their investigations. This capability ensures that their eDiscovery process remains accurate and up-to-date with the latest data landscape changes. It is a proactive approach to data management that enhances the efficiency and effectiveness of eDiscovery operations, providing users with the agility to adapt to changes swiftly. 


 


Improved integration with Microsoft Information Protection 


The new eDiscovery experience now supports querying by Sensitive Information Types (SITs) and sensitivity labels. Labeling, classifying, and encrypting your organization’s data is a best practice that serves multiple essential purposes. It helps to ensure that sensitive information is handled appropriately, reducing the risk of unauthorized access and data breaches. By classifying data, organizations can apply the right level of protection to different types of information, which is crucial for compliance with various regulations and standards. Moreover, encryption adds a layer of security that keeps data safe even if it falls into the wrong hands. It ensures that only authorized users can access and read the information, protecting it from external threats and internal leaks.  


 


The new eDiscovery search functionality supports searches for emails and documents classified by SITs or specific sensitivity labels, facilitating the collection and review of data aligned with its classification for thorough investigations. This capability reduces the volume of evidence required for review, significantly cutting both the time and cost of the process. Targeting specific sensitivity labels also lets organizations validate and understand how those labels are being used: for example, by collecting across locations or the entire tenant for a particular label and using the review set to assess how the label was applied. Combining this with SIT searches helps verify correct data classification. For example, reviewing items that contain credit card data but are not marked as highly confidential helps ensure that all credit card data carries the appropriate label, streamlining compliance and adherence to security policies.
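
As a hedged illustration of what such a query can look like, the snippet below pairs a sensitive information type condition with a sensitivity label check in KQL-style syntax. SensitiveType is an existing search property; the label property name and GUID here are placeholders, so confirm the exact syntax against the product documentation.

  // Illustrative KQL-style condition: items containing credit card numbers
  // that do NOT carry the "Highly Confidential" sensitivity label.
  const highlyConfidentialLabelId = "00000000-0000-0000-0000-000000000000"; // placeholder GUID

  const validationQuery =
    `SensitiveType:"Credit Card Number" ` +
    `AND NOT(InformationProtectionLabelId:${highlyConfidentialLabelId})`; // label property name is an assumption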


 


Figures 2 and 3: Better integration with Microsoft Information Protection means the ability to search labeled and protected data by SIT and sensitivity label.


Enhanced investigation capabilities 


The new eDiscovery experience introduces a powerful capability to expedite security investigations, particularly in scenarios involving a potentially compromised account. By leveraging the ability to search by audit log, investigators can swiftly assess the account's activities and pinpoint impacted files. As part of the investigative feature, eDiscovery search can also use an evidence file as search input, enabling rapid analysis of file content patterns or signatures. This feature is crucial for identifying similar or related content, providing a streamlined approach to discovering whether sensitive files have been copied or moved, thereby enhancing the efficiency and effectiveness of the security response.


 


The enhanced search capability by identifier in the new eDiscovery UX is a game-changer for customers, offering a direct route to the exact message or file needed. With the ability to search using a messageID for mailbox items or a path for SharePoint items, users can quickly locate and retrieve the specific item they require. This precision not only streamlines evidence collection but also accelerates the process of purging leaked data for spillage cleanup. It’s a significant time-saver that simplifies the workflow, allowing customers to focus on what matters most – securing and managing their digital environment efficiently, while targeting relevant data. 
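
To make that concrete, targeted conditions of this kind might look like the following sketch; the property names are illustrative assumptions, so check the search condition builder or documentation for the exact fields.

  // Illustrative only: property names are assumptions, values are made up.
  const mailItemQuery = `messageid:"<CAF5j1=example@mail.contoso.com>"`; // Internet Message-ID of one email
  const fileItemQuery =
    `path:"https://contoso.sharepoint.com/sites/finance/Shared Documents/q3-forecast.xlsx"`; // one SharePoint file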


 


Building on the data spillage scenario, our search and purge tool for mailbox items, including Teams messages, also received a significant 10x enhancement. Where previously administrators could only purge 10 items per mailbox location, they can now purge up to 100 items per mailbox location. This enhancement benefits administrators tasked with responding to data spills or remediating data within Teams or Exchange, allowing for a more comprehensive and efficient purge process. With all these investigative capability updates, security operations teams are ready to embrace the expanded functionality and take their eDiscovery operations to the next level.


 


Microsoft Security Copilot capabilities 


The recently released Microsoft Security Copilot’s capabilities in eDiscovery are transformative, particularly in generating KeyQL from natural language and providing contextual summarization and answering abilities in review sets. These features significantly lower the learning curve for KeyQL, enabling users to construct complex queries with ease. Instead of mastering the intricacies of KeyQL, users can simply describe what they are looking for using natural language, and Copilot translates that into a precise KeyQL statement. This not only saves time but also makes the power of eDiscovery accessible to a broader range of users, regardless of their technical expertise. 
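
For example, the flow looks roughly like this; the drafted query is a plausible illustration of the kind of KeyQL Copilot can produce, not actual product output.

  // Natural-language request a user might type:
  const prompt =
    "Find emails between Alex and Dana from Q4 2023 that mention 'Project Falcon' and have attachments";

  // A KeyQL statement Copilot might draft in response (illustration only):
  const draftedKeyQL =
    `participants:"alex@contoso.com" AND participants:"dana@contoso.com" ` +
    `AND sent>=2023-10-01 AND sent<=2023-12-31 ` +
    `AND "Project Falcon" AND hasattachment:true`;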



Figure 4: Draft query faster with Copilot's N2KeyQL capability.


Moreover, Copilot’s summarization skills streamline the review process by distilling key insights from extensive datasets. Users can quickly grasp the essence of large volumes of data, which accelerates the review process and aids in identifying the most pertinent information. This is particularly beneficial in legal and compliance contexts, where time is often of the essence, and the ability to rapidly process and understand information can have significant implications. 



Figure 5: Copilot's summarization skill in a review set helps reviewers assess content by summarizing each item, even when the conversation is not in English.


Additional export options 


The new eDiscovery experience introduces a highly anticipated suite of export setting enhancements. The contextual conversation setting is now distinct from the conversation transcript setting, offering greater flexibility in how Teams conversations are exported. The ability to export into a single PST allows for the consolidation of files and items from multiple locations, simplifying the post-export workflow. Exports can now give each item a friendly name, eliminating the need for users to decipher item GUIDs and making identification straightforward. Truncation in export addresses the challenges of zip file path character limits. Additionally, the expanded versioning options empower users to include all versions or select the latest 10 or 100, providing tailored control over the data. These improvements not only meet user expectations but also significantly benefit customers by streamlining the eDiscovery process and enhancing overall efficiency.
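
Purely as an illustration of the choices involved (these are not the product's actual setting names), an export configuration along those lines could be expressed as:

  // Hypothetical shape only; setting names are illustrative, not the product's.
  interface ExportSettings {
    contextualConversations: boolean;   // include surrounding Teams messages
    conversationTranscripts: boolean;   // include full transcripts (now a separate toggle)
    consolidateIntoSinglePst: boolean;  // combine items from multiple locations into one PST
    friendlyItemNames: boolean;         // readable names instead of item GUIDs
    truncateLongPaths: boolean;         // avoid zip file path character limits
    versions: "latest" | "latest10" | "latest100" | "all";
  }

  const exportSettings: ExportSettings = {
    contextualConversations: true,
    conversationTranscripts: false,
    consolidateIntoSinglePst: true,
    friendlyItemNames: true,
    truncateLongPaths: true,
    versions: "latest10",
  };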


 


Additional enhancements 


As part of the new experience, we are introducing the review set query report, which generates a hit-by-term report based on a KQL query. This query report allows users to quickly see the count and volume of items that hit on a particular keyword or a list of compound queries, and it can optionally be downloaded. By providing a detailed breakdown of where and how often each term appears, it streamlines the review by focusing on the most relevant documents, reduces the volume of data that needs to be manually reviewed, and offers a better understanding of which terms may be too broad or too narrow.
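
As a small, self-contained illustration of the hit-by-term idea (the keyword terms, documents, and counting logic below are invented for the example, not the product's implementation):

  // Toy hit-by-term count over an in-memory set of documents (illustration only).
  const terms = [`"Project Falcon"`, `acquisition`, `"Q3 forecast"`];
  const documents = [
    "Notes on Project Falcon acquisition timeline",
    "Q3 forecast draft attached",
    "Lunch menu for Friday",
  ];

  const hitsByTerm = terms.map((term) => {
    const needle = term.replace(/"/g, "").toLowerCase();
    const count = documents.filter((d) => d.toLowerCase().includes(needle)).length;
    return { term, count };
  });

  console.table(hitsByTerm); // each of the three terms matches exactly one document here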


 


As part of the improved user experience, all long-running processes now show a transparent and informative progress bar. This progress bar provides users with real-time visibility into the status of their searches and exports, allowing eDiscovery practitioners to better plan their workflow and manage their time effectively. This feature is particularly beneficial in the context of legal investigations, where timing is often critical, and users need to anticipate when they can proceed to the next steps. This level of process transparency allows users to stay informed and make decisions accordingly. 


 


Figure 6: Transparent progress bar for all long-running processes detailing the scope of the process and the estimated time to complete.


In addition to progress transparency, all processes in the new eDiscovery experience will include a full report detailing the information related to completed processes. The defensibility of eDiscovery cases and investigations is paramount. The full reporting capabilities for processes such as exports, searches, and holds provide critical transparency. For example, it allows for a comprehensive audit of what was searched or exported, the specific timing, and the settings used. For customers, this means a significant increase in trust and defensibility of the eDiscovery process. This enhancement not only bolsters the integrity of the eDiscovery process but also reinforces the commitment to delivering customer-centric solutions that meet the rigorous demands of legal compliance and data management. 


 


The hold policy detail view also received an upgrade as part of this new eDiscovery release. Customers can now access the hold policy view with detailed information on all locations and their respective hold status. This detailed view is instrumental in providing a transparent audit of which locations are on hold, ensuring that all relevant data is preserved and that no inadvertent destruction of evidence occurs during the process. Customers can download and analyze the full detailed hold location report, ensuring that all necessary content is accounted for and that legal obligations are met.


 


As we conclude this exploration of the modernized Microsoft Purview eDiscovery (preview) experience, it's clear that the transformative enhancements are set to redefine the landscape of legal compliance and security investigations. The new experience, with its intuitive design and comprehensive set of new capabilities, streamlines the eDiscovery process, making it more efficient and accessible than ever before. The new eDiscovery experience is currently in public preview and is expected to be Generally Available by the end of 2024. Thank you for joining us on this journey through the latest advancements in eDiscovery.

Learn more


We are excited to see how these changes will empower legal and compliance teams to achieve new levels of efficiency and effectiveness in their important work. Check out our interactive guide at https://aka.ms/eDiscoverynewUX to better understand the changes in eDiscovery. As always, we are eager to hear your feedback and continue innovating to improve your experience. We welcome your thoughts via the Microsoft Purview portal’s feedback button.  


 


To learn more about eDiscovery, visit our Microsoft documentation at http://aka.ms/eDiscoveryPremium, or our “Become an eDiscovery Ninja” page at https://aka.ms/ediscoveryninja. If you have yet to try Microsoft Purview solutions, we are happy to share that there is an easy way for eligible customers to begin a free trial within the Microsoft Purview compliance portal. By enabling the trial in the compliance portal, you can quickly start using all capabilities of Microsoft Purview, including Insider Risk Management, Records Management, Audit, eDiscovery, Communication Compliance, Information Protection, Data Lifecycle Management, Data Loss Prevention, and Compliance Manager.