This article is contributed. See the original author and article here.
In this episode of MidDay Café, hosts Tyrelle Barnes and Michael Gannotti discuss human/AI partnership. Many organizations are trying to figure out their AI strategy but seem to be taking a tech- and product-first approach. Tyrelle and Michael discuss how to anchor on people and employees first, with AI and technology in a supporting role.
Introduction
Azure OpenAI models provide a secure and robust solution for tasks like creating content, summarizing information, and various other applications that involve working with human language. Now you can operate these models in the context of your own data. Try Azure OpenAI Studio today to naturally interact with your data and publish it as an app from within the studio.
Getting Started
Follow this quickstart tutorial for pre-requisites and setting up your Azure OpenAI environment.
To try the capabilities of the Azure OpenAI model on private data, I am uploading an e-book to the Azure OpenAI chat model. The e-book is “Serverless Apps: Architecture, Patterns, and Azure Implementation” by Jeremy Likness and Cecil Phillip. You can download the e-book here.
Before uploading your own data
Prior to uploading this particular e-book, the model’s response to a question on serverless design patterns is depicted below. While this response is relevant, let’s examine whether the model is able to pick up the e-book’s content during the next iteration.
After uploading your own data
This e-book has an exclusive section that discusses design patterns such as scheduling, CQRS, and event-based processing in detail.
After grounding the model on this PDF data, I asked a few questions, and the following responses were nearly accurate. I also limited the model to supply only information from the uploaded content. Here’s what I found.
Now when I asked about the contributors to this e-book, it listed everyone correctly.
Read more
Because enterprise data can run to large volumes, it is not practical to supply it all in the context of a prompt to these models. Therefore, the setup leverages Azure services to create a repository of your knowledge base and uses Azure OpenAI models to interact naturally with it.
The Azure OpenAI Service on your own data uses the Azure Cognitive Search service in the background to rank and index your custom data, and a storage account to host your content (.txt, .md, .html, .pdf, .docx, .pptx). Your data source is used to help ground the model with specific data. You can select an existing Azure Cognitive Search index or Azure Storage container, or upload local files as the source from which the grounding data is built. Your data is stored securely in your Azure subscription.
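At the time of writing, this grounding capability is exposed through a preview “extensions” chat-completions endpoint that accepts a dataSources section alongside the usual messages. The sketch below only builds the request body; the search endpoint, key, and index name are placeholders, and the parameter names reflect the preview API shape, so check the current reference before relying on them:

```python
import json

def build_grounded_chat_body(question: str,
                             search_endpoint: str,
                             search_key: str,
                             index_name: str,
                             in_scope: bool = True) -> dict:
    """Build a chat-completions request body that grounds the model
    on an Azure Cognitive Search index (preview 'extensions' API shape)."""
    return {
        "dataSources": [
            {
                "type": "AzureCognitiveSearch",
                "parameters": {
                    "endpoint": search_endpoint,
                    "key": search_key,
                    "indexName": index_name,
                    # Restrict answers to the uploaded content only
                    "inScope": in_scope,
                },
            }
        ],
        "messages": [{"role": "user", "content": question}],
    }

body = build_grounded_chat_body(
    "What serverless design patterns does the e-book cover?",
    "https://<your-search-service>.search.windows.net",  # placeholder
    "<search-admin-key>",                                # placeholder
    "serverless-ebook-index",                            # placeholder
)
print(json.dumps(body, indent=2))
```

The inScope flag corresponds to the studio option that limits the model to answering only from your uploaded content.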
We also have another Enterprise GPT demo that lets you piece together all the Azure building blocks yourself. An in-depth blog written by Pablo Castro details the steps here.
Getting started directly from Azure OpenAI Studio allows you to iterate on your ideas quickly. At the time of writing this blog, the completions playground offers 23 different use cases that take advantage of different models under Azure OpenAI:
Summarize issue resolution from conversation
Summarize key points from financial report (extractive)
Summarize an article (abstractive)
Generate product name ideas
Generate an email
Generate a product description (bullet points)
Generate a listicle-style blog
Generate a job description
Generate a quiz
Classify text
Classify and detect intent
Cluster into undefined categories
Analyze sentiment with aspects
Extract entities from text
Parse unstructured data
Translate text
Natural language to SQL
Natural language to Python
Explain a SQL query
Question answering
Generate insights
Chain of thought reasoning
Chatbot
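Each of these playground use cases ultimately reduces to a prompt template plus a completion call. As a minimal sketch (not the studio's exact templates), here is how the two summarization cases listed above might be composed:

```python
def build_summary_prompt(article: str, style: str = "abstractive") -> str:
    """Compose a playground-style prompt for the summarization use cases."""
    if style == "extractive":
        # Pull key points verbatim, as in the financial-report use case
        instruction = "Extract the key points from the following text as bullet points."
    else:
        # Rephrase the content in new words, as in the article use case
        instruction = "Summarize the following text in a few sentences."
    return f"{instruction}\n\n{article}\n\nSummary:"

prompt = build_summary_prompt(
    "Serverless apps scale on demand and bill per execution.")
print(prompt)
```

The resulting string would be sent as the prompt of a completions request against your deployed model; the instruction wording here is illustrative only.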
Resources
There are different resources to get you started on Azure OpenAI. Here are a few:
Busy sales account managers prioritize their activities by mining information from their accounts. But manually making sense of all that unstructured data takes time and can lead to inaccurate assumptions. They can end up focusing on the wrong activities, which results in a lower impact on business outcomes. The most productive and successful account managers are the ones who focus on the right customers with the right priority. Dynamics 365 Sales account-based seller insights can help.
Account-based seller insights help drive priorities
Account-based seller insights help you set priorities and formulate the best engagement plan for your customers. These are automated, actionable insights that are derived from multiple sources of unstructured data and presented to you in the right context. For instance, you might be shown an upsell insight for an account based on past won opportunities for similar accounts, along with guidance on the next best action to take. Seller insights help you proactively manage the customer journey, from the first engagement to the final sale.
Behind the scenes with seller insights
Account-based seller insights can be generated in three ways:
Bring your own model. Use your own AI model, trained on your data, to generate insights, and work with them in the Dynamics 365 sales accelerator.
Use out-of-the-box models. The account-based seller insights solution comes with its own models, which mine the data in Dynamics 365 Sales to generate insights.
Build a back-end rule framework. You can build your own rule framework that uses Power Automate flows to generate insights when certain conditions are met.
The sales accelerator in Dynamics 365 Sales, with seller insights and next actions highlighted.
How seller insights boost productivity
How can seller insights help you be a more effective sales account manager? Let’s look.
Insight list and actions
First, you get curated insights for all your accounts:
You only see insights that are relevant to you, not your team members.
The insights have expiration dates so that you know the information is fresh and relevant.
You can see the reasons an insight appears in the list.
And after you acknowledge an insight, you’re guided through the next best steps to act on it, optimizing the sales workflow for better results. You can also collaborate with team members while you’re working on your insights.
The sales accelerator in Dynamics 365 Sales, with a seller insight and sequence highlighted.
Insight assignment and distribution
Second, although your insights are curated, that doesn’t mean they’re siloed. Insights are assigned to the account owner. If the owner of an entity is a team, an insight can be automatically assigned to the appropriate salesperson on the team, based on role, through the flexible rule framework. Ownership can be transferred from one seller to another, and multiple sellers can work on a single insight.
Insight action history
Finally, you can find all the insights that have been generated for an account on the account’s Insights tab. The list includes status, type, due date, and other helpful information. Filter and sort it to focus on what’s most important. You can easily identify all seller activities for the insights on the timeline view of the account.
By helping you identify your most important and profitable accounts, understand their needs and preferences, tailor your messages and offers, and nurture long-term relationships with them, account-based seller insights can lead to higher revenues, shorter sales cycles, and better customer satisfaction.
It is often necessary to copy data from on-premises to Azure SQL Database, Azure SQL Managed Instance, or another data store for data analytics purposes. You may simply want to migrate data from on-premises data sources to Azure database services. You will most likely want to do this data movement at scale, with minimal coding and complexity, and will need an automated, simple approach to handle such scenarios.
In the following example, I am copying two tables from an on-premises SQL Server 2019 database to Azure SQL Database using Microsoft Fabric. The entire migration is driven by a metadata table that holds information about the tables to copy from the source, so the copy pipeline is simple and easy to deploy. We have used this approach to copy hundreds of tables from one database to another efficiently. The monitoring UI provides the flexibility and convenience to track progress and rerun the data migration in case of failures.
Architecture diagram
This architectural diagram shows the components of the solution from SQL Server on-premises to Microsoft Fabric.
I intend to copy two tables – Customer and Sales – from the source to the target. Let us insert these entries into the metadata table. Insert one row per table.
Ensure that the table is populated. The data pipelines will use this table to drive the migration.
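Conceptually, the pipeline's metadata-driven loop does the following (a Python sketch; the column names mirror the metadata table above, and the actual copying is performed by the pipeline's copy activity, not this code):

```python
# One row per table to copy, mirroring the metadata table.
metadata = [
    {"SourceSchemaName": "dbo", "SourceTableName": "Customer",
     "TargetSchemaName": "dbo", "TargetTableName": "Customer"},
    {"SourceSchemaName": "dbo", "SourceTableName": "Sales",
     "TargetSchemaName": "dbo", "TargetTableName": "Sales"},
]

def plan_copy_activities(rows):
    """Mimic the ForEach activity: emit one copy task per metadata row."""
    return [
        {"source": f"{r['SourceSchemaName']}.{r['SourceTableName']}",
         "target": f"{r['TargetSchemaName']}.{r['TargetTableName']}"}
        for r in rows
    ]

for task in plan_copy_activities(metadata):
    print(f"copy {task['source']} -> {task['target']}")
```

Adding a table to the migration then means inserting one more row into the metadata table; the pipeline itself never changes.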
Create Data Pipelines:
Open Microsoft Fabric and click the Create button to see the items you can create with Microsoft Fabric.
Click on “Data pipeline” to start creating a new data pipeline.
Let us name the pipeline “Copy_Multiple_Tables”.
Click on “Add pipeline activity” to add a new activity.
Choose Azure SQL Database from the list. We will create the table to hold metadata in the target.
Ensure that the settings are as shown in the screenshot.
Click the preview data button and check if you can view the data from the table.
Let us now create a new connection to the source. From the list of available connections, choose SQL Server, as we intend to copy data from SQL Server 2019 on-premises. Ensure that the gateway cluster and connection are already configured and available.
Add a ForEach activity and set the batch count to copy tables in parallel.
We now need to set the Items property, which is evaluated dynamically at runtime. To set it, click the button shown in the screenshot and set the value to:
@activity('Get_Table_List').output.value
Add a copy activity to the activity container.
Set the source table attributes in the copy activity as shown in the screenshot. Click the edit button, and then click the “Add dynamic content” button. Ensure that you paste the text only after clicking the “Add dynamic content” button; otherwise, the text will not be evaluated dynamically at runtime.
Set the Table schema name to:
@item().SourceSchemaName
Set the Table name to:
@item().SourceTableName
Click on the destination tab and set the destination attributes as in the screenshot.
Set the Table schema name to:
@item().TargetSchemaName
Set the Table name to:
@item().TargetTableName
We have configured the pipeline. Now click on save to publish the pipeline.
Run pipeline:
Click the Run button from the top menu to execute the pipeline. Ensure the pipeline runs successfully. This will copy both tables from source to target.
Summary:
In the above example, we used Microsoft Fabric pipelines to copy data from an on-premises SQL Server 2019 database to Azure SQL Database. You can modify the sink/destination in this pipeline to copy to other destinations such as Azure SQL Managed Instance or Azure Database for PostgreSQL. If you are interested in copying data from a mainframe z/OS database, you will also find this blog post from our team very helpful.
Dynamics 365 Customer Service is a powerful tool for managing your contact center. Its built-in analytic dashboards, such as the recently launched Omnichannel real-time analytics dashboard, provide a wealth of industry standard KPIs and metrics to help you monitor and improve performance. These dashboards are built on Power BI with two components: a data model (or data set) that houses the KPIs, and reports that visualize the data for viewers. Dynamics 365 Customer Service reads the data from Dataverse, performs transformation logic for each of the KPIs, and makes these KPIs available for you within the data model. You can customize data models using data within Dynamics or external data to view metrics tailored to your needs.
Every Dynamics organization that has analytics enabled gets its own copy of this solution, deployed and available only to it. While the data model is not editable, the reports are fully customizable through visual customization. This way, you can see and use the data in ways that make sense for your organization. You can view metrics beyond what’s in the out-of-box reports. You can also create additional pivots and dimensions to slice the data as needed.
We have received a lot of feedback from you about the need for customization. You want to modify the data or logic used to calculate metrics in the data set. You also want to create your own metrics in addition to the out-of-box metrics available in the data model. Additionally, you want to create variants of existing metrics or calculate metrics differently based on your organization’s unique processes. Another frequent request has been guidance on building custom dashboards that combine KPIs from Dynamics 365 Customer Service with other applications.
To address these scenarios, Dynamics 365 Customer Service launched model customization. This feature deploys a copy of the data set used by the out-of-box reports into your organization’s Power BI workspace. Therefore, you can build composite models that connect to the Dynamics data model.
By leveraging the out-of-box model and only creating the metrics that are unique to your organization, you can reduce the risk of metric definitions going stale as Dynamics updates its capabilities. This also saves you valuable time and development effort. Furthermore, by using model customization, you can build custom reports and dashboards that combine data from multiple applications. This gives you a more complete picture of your contact center’s performance.
Overall, Dynamics 365 Customer Service provides a powerful set of tools for managing your contact center. Its built-in analytic dashboards offer the specific insights you need to improve contact center performance. And with model customization, you can tailor these to your specific needs.
Since 2019, Microsoft has had a long-standing partnership with OpenAI. The goal: to accelerate breakthroughs in AI and ensure the benefits are broadly shared with the world. A significant milestone in this collaboration occurred in January 2023 with the introduction of Microsoft’s Copilot offerings. Referred to as an “AI collaborator,” Copilot functions as a valuable assistant that works alongside you, enhancing your productivity and efficiency. Our thought leaders across the organization have shared many perspectives on its use, ranging from approaches to responsible AI to low-code app development. In the last few months, we have also shared several copilot announcements regarding model improvements. AI brings the potential of innovation to every line of business, and in this post we focus on our journey with Copilot in Dynamics 365 Marketing.
The Dynamics 365 Marketing journey began when we launched our first previews of content ideas in May 2022 and query assist in October 2022. The real-world feedback we have received has been instrumental in our learning process. It has allowed us to gain valuable insights, but we acknowledge that there is still much more to learn and accomplish to truly revolutionize the marketing domain. With the technology changing so rapidly, one thing has become very clear: the use cases for Copilot in customer experience (CX) are endless. Let’s delve into how these copilot features came to be and highlight our generative AI roadmap moving forward.
Dynamics 365 Marketing
Transform your customer experience with generative AI
Query assist, using Copilot to democratize segmentation
Marketing segmentation enables targeting: choosing an audience based on their characteristics, whether demographics, psychographics, behaviors, preferences, or a combination thereof. Effective segmentation has traditionally required a thorough understanding of complex data models, database management, and SQL (Structured Query Language). Consequently, marketing organizations frequently rely on a single individual to create segments for the entire team, leading to a bottleneck in the process. The Copilot feature query assist is built on OpenAI’s GPT foundational model to empower marketers to easily target the right audience, using everyday language to describe and quickly build segments without having to learn data models or construct queries. Directly within their workflow, marketers provide a segment description, and query assist creates a new segment and automatically generates segment rules that marketers can add to the segment definition before publishing the segment.
“Our goal was to develop powerful and user-friendly features to enable everyone to create segments with confidence, while still providing flexibility and power for advanced users. OpenAI was the foundation, but customer needs were the central focus.”
Gaju Gatera, Dynamics 365 Marketing Product Manager for query assist
Ultimately, marketers who use query assist spend less time on segmentation and more time on creating campaigns that truly connect with their customers.
Query assist was initially released with our own AI model. We then experimented with OpenAI GPT and found that query assist could understand different syntaxes from formal language to more conversational language. For instance, users could type “Show me our Seattle based contacts who attended the Contoso coffee tasting event last year” rather than the prior version’s requirement for more structured or SQL-like inputs such as “contacts who live in Seattle city and attended the Contoso coffee tasting event 2022.” This was very encouraging.
We collaborated with customers to improve the UX and embedded query assist seamlessly into marketers’ workflows. Originally, marketers would input a description powered by query assist, land on Copilot, and get stuck without a way to transition back to manual segmentation. So we made sure to highlight query assist in the entry card, and we introduced a button that lets you switch between Copilot and manual segmentation in the canvas, allowing marketers with different levels of expertise to choose the experience they like most.
We also changed the interaction design and streamlined the UX so that complex segments (requiring relationship traversals) are now created automatically without human intervention; what required three to four manual steps before is now one click, as Copilot can create the relationships automatically.
Since the launch of the enhanced version of query assist in March, we’ve seen substantial results: 66 percent of marketers now create segments with query assist. We also saw accuracy (as measured by thumbs up or down) improve from 15 percent to 86 percent within a month of the release.
We are particularly excited to witness how Copilot is driving positive change and contributing to the success of our customers. Chris Barnhart, Marketing Director at NC Fusion, a professional soccer team and sports organization serving the Triad region of North Carolina, is an early adopter of Copilot. As a SQL expert, Chris knows how to design relationships among tables to create segments that engage a wide range of audiences, from fans, to parents, to donors. “When I saw how you just type in common phrases and query assist will connect everything for you, I was blown away!” He was thrilled to discover how query assist could simplify his work and make his team more efficient, especially since many of his interns haven’t learned about data structures in school.
Sidney Fernandes, Chief Information Officer and Vice President for Technology at the University of South Florida, shared with us his excitement for the potential of using copilot to empower stakeholders across the organization.
“Once we share this with our teams, it has the potential to be quite transformational. Students and staff won’t need to be educated on the nuances of student data or copywriting to engage the right audiences impactfully.”
Sidney Fernandes, Chief Information Officer and Vice President for Technology at the University of South Florida
Tyler McClain, Senior Solution Consultant with Connected Experience partner Coffee + Dunn tried the feature and shared their experience, saying “Using query assist was quick and easy. I just typed ‘anniversary within the past five years,’ and it gave me exactly what I was looking for. This feature can further empower our clients to engage their customers with ease, and I’m excited to share it with them.”
Encouraged by these positive results, we continue to iterate on query assist. We are excited to announce that Copilot now supports segments that include marketing interactions or specific customer behaviors, such as “customers who opened an email recently,” enabling marketers to optimize segment creation and drive more effective campaign engagement. We can’t wait to see how marketers use it to drive better results and achieve their business goals.
Content ideas, simplifying email creation using Copilot to generate compelling content
Email marketing can be a highly effective way to connect with customers and drive business growth, but it also comes with its challenges. Over time, content can begin to feel stale or repetitive. Producing engaging content for email campaigns is time-consuming and requires significant creative energy.
To assist marketers in their email creation, we’ve used OpenAI GPT to develop content ideas. Given a few short key points, Copilot generates engaging email content ideas within seconds that marketers can select from and tweak to perfectly match their needs.
“Content ideas was developed with one goal in mind–saving hours of copywriting, by helping marketers kick-start their email creation and create compelling content within minutes.”
Aga Miskowiec, Principal Product Manager for content ideas
With this innovative tool, writer’s block is becoming a thing of the past, allowing marketers to accelerate the delivery of engaging email campaigns that resonate with their audiences.
After releasing the first iteration of content ideas, we quickly realized its potential and have been refining the approach for over a year. This includes tuning we haven’t seen in any other product: we tune based on a library of public emails, not for any particular branding but for the type of tone and content a marketer would use. We also enable companies to use content they’ve created in the past to tune the generation of new content.
In March, we took content ideas to the next level by upgrading the language model to OpenAI GPT-3.5, and this had a significant impact on the quality of ideas generated.
Our customers started to see a positive impact on their business. Chris Barnhart, Marketing Director at NC Fusion, shared with us: “Normal engagement runs about 20 to 30 percent and there’s a lot of factors that go into the email itself, but with the email written with content ideas the engagement was about 70 percent.”
But that’s not all, we’ve also added a range of different tone of voice options to help marketers tailor the content to their brand and audience. Whether they’re looking for an engaging, adventurous, casual, luxury, or formal tone, content ideas has got marketers covered.
“Setting the tone allows Campari Group, a global company, to tailor the tone of voice for each brand and region’s communication style saving us hours of copywriting” according to Liam Barnes, Global IT Director – Marketing Technologies from Campari Group, a worldwide leader in the spirits industry.
To level up usability, we put content ideas right in the email creation flow where marketers add or edit text, as our data showed that switching to the Copilot tab caused a significant drop-off in discoverability and usage.
We’ve also gamified our UX to better guide marketers and help them craft higher quality key points used to generate content. “It’s pretty impressive. I was very happy using the feature. Our team has been very happy using the feature–it’s making delivering great content faster and easier.” shared Keith Perfect, Director of Technology and Intelligence from Northrop & Johnson, a luxury yacht brokerage.
As a result, the number of daily users who opened the feature doubled, and the percentage of users who generated ideas (out of those who opened the feature) increased by 75 percent. Furthermore, the number of organizations using content ideas daily increased more than fivefold.
Receiving early feedback from our customers and partners who tried the latest version of content ideas was a rewarding experience. Their input helped us prioritize our focus and further enhance customer satisfaction.
“It’s awesome, content ideas helps us tremendously to efficiently produce emails. In fact, I can tell you it was normally at least an hour, and I did it in 15 minutes, it takes me ~25 percent of the time it took before to create email content!”
Chris Barnhart, Marketing Director at NC Fusion
“The results are very creative now, and tone of voice capability is great,” said Steve Smith, Principal Solutions Architect – Data & Artificial Intelligence at TTEC Digital.
And we’re just getting started! We are working on adding personalization, journey creation, brand style creation, and conversation help capabilities, just to name a few.
Our goal is to empower marketers in delivering exceptional customer experiences that drive business success with less effort and time than ever before. With the limitless potential of AI, we eagerly anticipate the future of customer experience.
Get started with Dynamics 365 Marketing
Start using content ideas and query assist today, and see how they can help you boost your productivity while creating deeply personalized experiences.
Welcome to our June Terraform on Azure bimonthly update! We hope the first update was helpful in giving you insight into what the product team has been working on. This update is our first bimonthly update produced in collaboration between Microsoft and HashiCorp. We are aiming for the next update in August!
AzureRM provider
The resources exposed by the AzureRM provider are what most customers think of and include in their configurations when managing Azure infrastructure with Terraform. Azure is always adding new features and services so we work hard to ensure that you can manage these when they are generally available (GA).
Latest Updates
A new version of the provider is released weekly, including bug fixes, enhancements, and net-new resources and data sources. Here are some notable updates since our previous blog post:
Azure Export for Terraform is a tool that eases the translation between Azure and Terraform concepts. Whether you’re exporting your code into a new environment or creating repeatable code from an existing environment, we believe the tool simplifies otherwise tough processes.
Latest Updates
The team has published comprehensive documentation for a variety of Azure Export for Terraform scenarios. We’re excited to have you test this exciting tool and provide feedback, both on the product and on our documentation for it. Read the overview of the tool here: https://aka.ms/tf/exportdocs
We’ve also recently merged a PR that supports import blocks for Terraform 1.5 onward: https://github.com/Azure/aztfexport/pull/398. To read up on import blocks, check out the HashiCorp documentation here, and if you’re curious about the difference between Azure Export for Terraform and import blocks, we also have a pinned issue detailing this: https://github.com/Azure/aztfexport/issues/406
Last, but certainly not least, we’ve released a video for Azure Export for Terraform! Make sure to give it a watch, as it includes benefits, scenarios, and demos:
Verified Modules
Have you ever encountered any of the following problems with modules?
Modules are out of date, not actively supported, and no longer functional
Cannot override some module logic without modifying the source code
Get confused when you see multiple modules with similar functions
When calling various modules, inconsistencies exist that cause instability to existing infrastructure
To help tackle the above problems and more, the Azure Terraform team has established a verified module testing pipeline; only modules that pass this pipeline are marked as “verified”. This pipeline ensures consistency and best practices across multiple verified modules, reduces breaking changes, and avoids duplication to uphold the “DRY” principle.
Latest Updates
We have now released nine Azure verified modules. We prioritized these modules based on customer research and telemetry analysis. Meanwhile, we have continuously updated our verified modules for bug fixes and feature enhancements. For instance, for the AKS verified module, we have added support for the linux_os_config block in default_node_pool and default node pool’s node taints. For a full list of updates to each module, please refer to the changelog: Azure/terraform-azurerm-aks: Terraform Module for deploying an AKS cluster (github.com).
For our next modules, we are planning on releasing modules for hub networking, firewalls and key vaults, with close collaboration with the broader developer community. We hope you become one of the proactive contributors to the Azure Terraform verified modules community as well!
Community
The Terraform on Azure community is a key investment for our team in bringing the latest product updates, connecting you with other Terraform on Azure users, and enabling you to engage in ongoing feedback as we aim to improve your Terraform experience on Azure. This section will consistently speak on community related feedback or engagements. As always, register to join the community at https://aka.ms/AzureTerraform!
Community Calls
Our latest community call was on April 6th! The recording of the event is at https://youtu.be/Zrr-GXN6snQ and we hope you give it a watch. Ned Bellavance talks in depth about Azure Active Directory and OIDC authentication, and we spend some time exploring GitHub Copilot with Terraform.
We also announced our new slack channel, which you can join at https://aka.ms/joinaztfslack. Not only will you get access to fellow Azure Terraform community members, but also the product team.
Our next community call is June 22nd at 9 am PT. Make sure to register here. It’ll be a time of open discussion with the team on Terraform, Azure, and the future of AI. Come with your thoughts and opinions!
We are also taking applications to co-present with us at our community calls! Our only prerequisite is that you are a member of the community. If you are interested, fill out our form at https://aka.ms/aztfccspeakers and we will reach out if we like your topic! Don’t worry if you don’t get picked for the next one; we will keep your talk on file and may reach out later.
Docs
It’s been a busy couple of months in Azure Terraform documentation!
A key goal we’re making progress on is bringing the Terraform Azure-service documentation into parity with ARM templates and Bicep. The objective is to make it easier to find and compare Azure infrastructure-provisioning solutions across the various IaC options.
To that end, we’ve published 15 new Terraform articles covering many different Azure-service topics.
Second, Terraform Cloud has announced dynamic provider credentials, which enables OIDC with Azure in TFC. If you want a video explaining the benefits of dynamic credentials, check out a great presentation here.
Upcoming Events
Make sure to sign up for the Terraform on Azure June 22nd 9 am PT community call here! We’ll hold an open panel discussion with the team about the future of Terraform on Azure, especially regarding the latest developments in AI.
We’ll aim for our next blog post in August. See you then!
This article is contributed. See the original author and article here.
Microsoft invites you to our Microsoft Operations: Community Q&A calls for CSP Partners. These sessions are dedicated to assisting CSP Direct Bill and Indirect Providers with questions related to CSP launches and upcoming changes. Our goal is to help drive a smoother business relationship with Microsoft. We offer sessions in English, Chinese, Japanese, and Korean.
Register Today to join a live webinar with Subject Matter Experts or listen back to past sessions.
This article is contributed. See the original author and article here.
In my conversations with customers about the future of business applications and low-code development, generative AI innovations come up frequently as the next transformative platform shift. This is especially true at Microsoft. Over the last several months, we have introduced new AI-powered copilot experiences in Microsoft Dynamics 365 and Microsoft Power Platform that revolutionize how businesses and people work.
Of all the questions that customers have, one of the most common is how Microsoft continually delivers new AI innovations at a rapid pace. This includes the major enhancements covered in blog posts and videos from Microsoft Build 2023. Our customers also want to understand how they can infuse AI into their own applications and products.
If you are still running systems on-premises, it starts with application modernization in the cloud. To help customers understand what such a journey might look like and how they might go about it, we documented our work to modernize Microsoft Dynamics 365 on Microsoft Azure. The key takeaway is that moving to the public cloud is more of a journey than a destination. But if you approach it right, you can realize substantial new value at every step, making incremental and concrete progress as you move to modern public cloud applications on Azure.
Dynamics 365
A portfolio of intelligent business applications that delivers superior operational efficiency and breakthrough customer experiences
Up through 2016, before the move to Azure, Dynamics 365 ran as a traditional, on-premises or hosted software solution. Customers could either install and run it on their own, or they could access it via a software as a service (SaaS) model, where Microsoft ran and supported it in its private datacenters.
Under the hood, both options were the same—the major difference being that the SaaS offering ran in Microsoft data centers at a much larger scale. It was powered at the lowest technical layer by thousands of bare-metal servers running Windows Server 2012 R2, with a frontend based on Internet Information Services (IIS) and backend based on Microsoft SQL Server 2012 R2, as illustrated in the left half of the following diagram.
Compared to the modern-day Dynamics 365 SaaS offering, both 2016-era deployment options suffered from limited availability, scalability, and flexibility. Product release cycles were measured in months, major customizations by customers required professional developers or partner assistance, and infrastructure upgrades required significant planning and downtime. Customers who chose the on-premises model also had to invest in expensive infrastructure up-front, which made it more difficult for smaller businesses with limited IT budgets (and expertise) to adopt.
Lift-and-shift isn’t always exciting, but it buys time and frees up cash to modernize
The first step in our journey was to lift-and-shift the existing Dynamics 365 solutions to Azure. An as-is migration allowed us to focus on operational simplicity and delivery of a SaaS capability in a timely manner, begin transitioning from siloed areas of ownership to an integrated DevOps culture, develop experience operating a public cloud service at scale, and start migrating our existing customers.
Envisioning the future of Dynamics 365 solutions
In parallel to lifting existing Dynamics 365 solutions into the cloud, we started to envision the future—more specifically, how we wanted to go to market and have customers experience our offerings. Over the next several years, we proceeded to focus on several key areas.
Implement support for a modular set of offerings, instead of a monolith, from which customers could adopt only the functionality they wanted instead of having to go all-in when adopting Dynamics 365. From an application modernization perspective, this meant projecting the surface of Dynamics 365 as modules while evolving the existing codebase.
Empower customers and partners to customize and extend their Dynamics 365 applications through Microsoft Power Platform, a low-code/no-code platform for rapidly building customized end-to-end business solutions. A major enabler for all this was a Common Data Service (CDS)—now called Microsoft Dataverse—which today supports both transactions and analytics across all Dynamics 365 and Microsoft Power Platform solutions.
Build AI-driven insights directly into Dynamics 365 applications to help drive actions and further optimize experiences and interactions. We are also continuing to infuse generative AI through copilot experiences across Dynamics 365 and Microsoft Power Platform, enabling users to describe what they want to create using natural language—for example, “Build a site-inspection mobile application” and then have an app created automatically.
Looking back: Substantial new value at every step of the journey
Our multiyear work to modernize Dynamics 365 on Azure has empowered enterprises in many new ways, at every step of our journey. Immediate benefits of our initial lift-and-shift to Azure were primarily technical, resulting in better scalability, performance, and reliability. The larger benefit, however, was that the move to Azure set the stage for reimagining the customer experience and bringing that vision to life.
Once on Azure, we began harnessing the power of managed services in the cloud to accelerate value delivery. A modular approach gave customers greater choice, enabling incremental and rapid time-to-business outcomes when adopting Dynamics 365. Microsoft Power Platform expanded extensibility, enabling customers and partners to extend their Dynamics 365 solutions the way they wish, including easy version control, packaging, and distribution. Dynamics 365 Customer Insights let business users extract the insights needed to personalize customer experiences and orchestrate optimized, real-time customer journeys. The infusion of AI across Dynamics 365 and Microsoft Power Platform user experiences has further empowered our customers, enabling them to describe what they need using natural language and have it done for them.
A stronger business today
The modernization of Dynamics 365 on Azure has had an increasingly profound impact on the business, making Dynamics 365 and Microsoft Power Platform some of the fastest growing software categories for Microsoft. We’ve been able to launch new products, new features, and new user experiences at a faster pace, and with more precision, due to our cloud migration. We better understand our customers and can quickly solve unmet needs. The DNA of the team has fundamentally changed, and it shows throughout our products.
The journey continues
Looking forward, we are working to adopt Azure Kubernetes Service (AKS) as our main compute infrastructure, and to deliver further improvements in scalability and resiliency through increased use of Availability Zones. The real game changer, however, is generative AI, which we will continue infusing into Dataverse and Microsoft Power Platform, enabling virtually anyone to build and deploy (and automatically scale and govern) apps, workflows, and chatbots using natural language.
Although the future is bright, it all started with that initial cloud migration. Across Dynamics 365 and Microsoft Power Platform, the Business Applications and Platforms group is now innovating at the fastest pace in our history, and we were ready for the generative AI opportunity because of the investments we made back in 2017. Our biggest lesson learned is that we should have started the move to Azure earlier.
Take the next step
Read an in-depth case study covering our Dynamics 365 application modernization journey. Enterprise customers and independent software vendors (ISVs) can utilize the patterns and lessons it covers to help plan for their own modernization journeys, as they make the move from on-premises systems to modern applications in the cloud. In addition, customers and ISVs may be able to further accelerate their journeys by building atop Dataverse and Microsoft Power Platform, as a means of accelerating time-to-market while reducing their initial and ongoing engineering investments.
AI Builder is a powerful capability within the Microsoft Power Platform that enables you to automate processes and predict outcomes, improving your business performance. By harnessing AI Builder, you can seamlessly integrate AI into your applications and workflows, connecting them to your business data stored in Microsoft Dataverse or various cloud data sources like SharePoint, OneDrive, or Azure.
Building AI Models Made Easy:
One of the key advantages of AI Builder is its user-friendly approach, making AI creation accessible to individuals with varying levels of technical expertise. With AI Builder, you can create custom AI models tailored to your business requirements or choose from a range of prebuilt models. The best part is that you can leverage these models to enhance your apps and workflows without the need to write complex code. AI Builder even offers prebuilt AI models that you can immediately utilize without the hassle of data collection and model training.
Unlocking the Potential of AI Builder
Let’s explore some of the remarkable capabilities you can unlock with AI Builder:
1. Text Analysis: AI Builder empowers you to analyze text for classification, key phrases, language, and sentiment. By harnessing this capability, you can gain valuable insights from customer feedback, product reviews, survey responses, and support emails. Identify negative sentiment or key phrases to take informed actions and improve your business strategies.
2. Predictive Analytics: AI Builder enables you to predict future outcomes based on historical data. By leveraging this capability, you can make data-driven decisions and anticipate trends, empowering you to stay one step ahead in various aspects of your business.
3. Business Card Processing: Say goodbye to manual data entry. With AI Builder’s business card processing feature, you can automatically extract information from business cards and streamline your contact management process. Simply capture an image of a business card, and let AI Builder handle the rest.
4. Image Text Extraction: Extracting text from images is a breeze with AI Builder. Whether you need to process text from documents, images, or any visual content, AI Builder offers the tools to quickly and accurately extract information, saving you valuable time and effort.
5. Object Detection: AI Builder’s object detection capability allows you to recognize and count items in images. This can be particularly useful in scenarios such as inventory management, quality control, or any situation where you need to identify and quantify objects within images.
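To make the text analysis capability above more concrete: AI Builder’s sentiment analysis is a prebuilt model that requires no code, but the following toy Python sketch illustrates the kind of classification it performs. The keyword lists and function name here are hypothetical, purely for illustration, and are in no way the actual AI Builder implementation.

```python
# Illustrative only: AI Builder's sentiment model is prebuilt and needs no
# code. This toy scorer just demonstrates the idea of labeling text as
# positive, negative, or neutral.

POSITIVE = {"great", "love", "excellent", "fast", "helpful"}
NEGATIVE = {"slow", "broken", "refund", "disappointed", "bug"}

def classify_sentiment(text: str) -> str:
    """Label text by comparing counts of positive and negative keywords."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

reviews = [
    "Love the new dashboard, support was fast and helpful!",
    "The export feature is broken and I want a refund.",
]
for review in reviews:
    print(classify_sentiment(review), "-", review)
```

In practice, AI Builder replaces hand-written rules like these with a trained language model, which is exactly why no data collection or coding is needed on your side.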
Real-World Applications of AI Builder
Let’s explore a few real-world applications of AI Builder to ignite your creativity:
1. Invoice Processing: Automate the tedious task of processing invoices by leveraging AI Builder. Extract text, key/value pairs, and tables from invoices and effortlessly integrate them into your database. Create workflows that automatically handle the information, streamlining your invoice processing workflow.
2. Text Analysis for Insights: Uncover hidden insights from large volumes of text data. Whether it’s customer feedback, support emails, or product reviews, AI Builder’s text analysis capabilities can help you identify key phrases, sentiment, and trends. Use these insights to drive improvements, make informed decisions, and enhance customer satisfaction.
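As a rough illustration of the invoice-processing scenario above: AI Builder uses a trained document model to pull fields from scanned invoices, but the core key/value idea can be sketched with plain regular expressions over invoice-like text. The field names and sample text below are hypothetical, not AI Builder output.

```python
import re

# Illustrative only: AI Builder's invoice processing uses a trained document
# model on real scans. This sketch shows the key/value-extraction idea on
# plain text with a made-up invoice.

INVOICE_TEXT = """
Invoice Number: INV-1042
Invoice Date: 2023-06-01
Total Due: $1,250.00
"""

def extract_fields(text: str) -> dict:
    """Collect 'Key: value' pairs from each line into a dictionary."""
    fields = {}
    for line in text.splitlines():
        match = re.match(r"\s*([A-Za-z ]+):\s*(.+)", line)
        if match:
            key, value = match.groups()
            fields[key.strip()] = value.strip()
    return fields

print(extract_fields(INVOICE_TEXT))
```

The advantage of AI Builder’s model over a rule-based sketch like this is that it handles varied invoice layouts, tables, and scanned images rather than one fixed text format.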
Where to Access AI Builder
AI Builder is conveniently accessible within Power Apps and Power Automate. In Power Apps, you can find AI Builder on the Build tab, where you can refine existing models or utilize tools like the business card reader. The Models section provides a dedicated space for your created and shared models, ensuring easy management and collaboration.
Next Steps
Now that you have gained an overview of AI Builder’s potential, it’s time to dive deeper into its available AI capabilities. Explore the Microsoft Learning Pathway for AI Builder, where you will find comprehensive resources to expand your knowledge and master the art of leveraging AI in your business.
Conclusion
AI Builder opens up a world of possibilities for students and educators alike. By harnessing the power of AI Builder, you can streamline processes, gain valuable insights, and make data-driven decisions. Whether you’re an aspiring developer or a business professional, AI Builder empowers you to integrate AI into your applications and workflows without the need for extensive coding. Embrace this powerful tool and unlock the true potential of AI in your educational journey and professional endeavors.
Stay tuned for our upcoming blogs, where we will explore AI Builder’s features and use cases in more detail.