Taking your machine learning (ML) models from local development into production can be challenging and time consuming. It requires creating an HTTP layer above your model to process incoming requests, integrate with logging services, and handle errors safely. What’s more, the code required for pre- and post-processing, model loading, and model inference varies across models and must integrate smoothly with the HTTP layer.
Today, we are excited to announce the General Availability (GA) and open sourcing of the Azure Machine Learning Inference Server. This easy-to-use Python package provides an extensible HTTP layer that enables you to rapidly prepare your ML models for production scenarios at scale. The package takes care of request processing, logging, and error handling. It also provides a score script interface that allows for custom, user-defined pre- and post-processing, model loading, and inference code for any model.
Summary of the AzureML Inference HTTP Server
The Azure Machine Learning Inference Server is a Python package that exposes your ML model as an HTTP endpoint. The package contains a Flask-based server run via Gunicorn and is designed to handle production-scale requests. It is currently the default server used in the Azure Machine Learning prebuilt Docker images for inference. And, while it is built for production, it is also designed to support rapid local development.
Figure 1: How the Azure Machine Learning Inference Server Handles Incoming Requests
Score Script
The score script (sometimes referred to as the “scoring script” or “user code”) is how you provide your model to the server. It consists of two parts: an init() function, executed on server startup, and a run() function, executed when the server receives a request to the “/score” route.
On server startup…
The init() function is designed to hold the code for loading the model from the filesystem. It is only run once.
On request to “/score” route…
The run() function is designed to hold the code to handle inference requests. The code written here can be simple: passing raw JSON input to the model loaded in the init() function and returning the output. Or it can be complex: running several pre-processing functions defined across multiple files, delegating inference to a GPU, and running content moderation on the model output before returning results to the user.
The score script is designed for maximum extensibility. Any code can be placed into init() or run(), and it will be run when those functions are called as described above.
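As an illustration, here is a minimal score script sketch. The model format (a scikit-learn model serialized with joblib), the file name model.pkl, and the input shape are assumptions made for this example, not requirements of the server:

import json
import os

import joblib  # assumes a scikit-learn model serialized with joblib

model = None

def init():
    # Runs once at server startup: load the model from disk.
    # AZUREML_MODEL_DIR is set in Azure ML deployments; fall back to the local folder for local testing.
    global model
    model_dir = os.getenv("AZUREML_MODEL_DIR", ".")
    model = joblib.load(os.path.join(model_dir, "model.pkl"))

def run(raw_data):
    # Runs on every request to the "/score" route.
    data = json.loads(raw_data)["data"]           # assumed input format: {"data": [[...], ...]}
    predictions = model.predict(data)             # delegate inference to the loaded model
    return {"predictions": predictions.tolist()}  # JSON-serializable response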
Developing a complex score script may require iterative debugging, and it is often not feasible to redeploy an online endpoint several times to debug potential issues. The AzureML Inference Server allows you to run a score script locally to test both model loading and inference request handling. It integrates easily with the VS Code debugger and allows you to step through potentially complex processing or inference steps.
The Azure Machine Learning Inference Server can also be used to create validation gates in a continuous integration and continuous deployment (CI/CD) pipeline. For example, you can start the server with a candidate score script and run a test suite against this local instance directly in the pipeline, enabling a safe, efficient, and automatable deployment process.
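Here is a minimal sketch of such a validation gate. It assumes the server has already been started locally against the candidate score script (for example, via the azmlinfsrv command that ships with the package), that it is listening on the default local port 5001, and that the score script accepts the example payload from the sketch above:

import requests

# Assumed local server address; adjust if the server was started on a different port.
SCORING_URI = "http://127.0.0.1:5001/score"

def test_score_endpoint_returns_predictions():
    # Example payload matching the assumed input format of the score script.
    payload = {"data": [[0.1, 0.2, 0.3, 0.4]]}
    response = requests.post(SCORING_URI, json=payload, timeout=10)

    # Fail the pipeline if the server errors out or the response shape is unexpected.
    assert response.status_code == 200
    assert "predictions" in response.json()

if __name__ == "__main__":
    test_score_endpoint_returns_predictions()
    print("Score endpoint validation passed.")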
Production Deployments
The Azure Machine Learning Inference Server is designed to support production-scale inference. Once local testing is complete, you can feel confident using the score script you developed alongside the Azure Machine Learning prebuilt inference images to deploy your model as an AzureML Managed Online Endpoint.
Safely bring your models into production using the Azure Machine Learning Inference Server and AzureML Managed Inference by referencing the resources below.
We are excited and honored that Gartner has recognized Microsoft as a Leader in their 2023 Magic Quadrant™ for Cloud ERP for Product-Centric Enterprises.* This evaluation of Microsoft was based on specific criteria that analyzed our overall Completeness of Vision and Ability to Execute. This is the third year in a row that we’ve been recognized as a Leader.
Agile enterprise resource planning (ERP) system for new ways of working
The way we do business has fundamentally changed. New business models are disrupting the way companies sell products and services, blurring industry lines and transforming customer experiences. ERP systems need to evolve from mere systems of transaction to systems of reasoning, offering their users prescriptive actions that they can take in their functional areas to accelerate growth.
Microsoft Dynamics 365 has already been helping thousands of organizations optimize finance and supply chains to create a connected enterprise by infusing automation and analytics powered by AI into the various ERP processes. Now, with Dynamics 365 Copilot in our ERP portfolio included in Microsoft Dynamics 365 Supply Chain Management, Microsoft Dynamics 365 Finance, and Microsoft Dynamics 365 Project Operations, we can enable every person in every organization to be more productive and collaborative, and to deliver high-performance results.
For instance, with Copilot, organizations can supercharge the productivity of procurement professionals and collections agents. Procurement professionals can efficiently handle purchase order changes at scale and assess the downstream impact of changes on production and distribution before making the right decision. Copilot enables quick collaboration with internal and external stakeholders, bringing relevant information into Outlook and Microsoft Teams using natural language to meet customer and partner needs.
Collections managers with quick access to credit and payment history can prioritize and personalize customer communication and increase successful collection rates while proactively keeping customers in good standing. With Copilot, project managers can rapidly create project plans for new engagements in minutes, automate status reports, identify risks, and suggest mitigation plans on a continuous basis, saving a significant amount of time and preventing project delays and budget overruns.
At Microsoft, we are fully committed to revolutionizing the future of ERP systems by harnessing the power of intelligent, composable technologies. The ERP portfolio from Dynamics 365, powered by generative AI technology, can speed time to insight, intelligently automate processes, and foster productivity, ensuring that organizations can stay ahead of their competition in an increasingly complex business landscape.
Cloud-native ERP systems on a composable platform
One of the key strengths of Dynamics 365 Supply Chain Management and Dynamics 365 Finance is their extensibility. The ERP portfolio is built on a composable platform, making it easy to extend the solution with Microsoft Power Platform, providing low-code tools like Microsoft Power Apps and Microsoft Power Automate.
Where ERP customizations were once a heavy, time-consuming task, these tools empower businesses to customize their solutions and build apps with a modern user experience so that they can adapt to their bespoke, industry-specific needs and end users can work the way they want. Furthermore, companies and users can leverage prebuilt customizations and industry-specialized solutions from our ISV partner network to help speed development even further.
One of our customers, Nestlé, chose Dynamics 365 as the preferred platform for agile and speedy business system requests for mergers and acquisitions (M&A) activities. Nestlé needed business applications that would provide the flexibility to adapt to different business models across geographies and that could be reused multiple times. The company needed rich out-of-the-box features that could be extended with low-code/no-code capabilities. With Dynamics 365, Nestlé was able to create reusable strategies and blueprints for migrating business data and operations, enabling faster and more efficient acquisitions and divestitures with limited disruption to customers and employees. This also helped them adhere to compliance, security, and data privacy regulations effectively. In just four short months after the project kicked off, Nestlé went live with Dynamics 365 Finance, Supply Chain Management, and Commerce.
AIM for the future with Microsoft today
In conclusion, running a business on Dynamics 365 offers numerous benefits for organizations. From seamless integration and enhanced productivity to real-time analysis and smart decision-making capabilities, Dynamics 365 empowers businesses to thrive in today’s dynamic market. Microsoft is committed to empowering customers to take advantage of AI capabilities in every line of business.
Organizations relying on on-premises applications will struggle to compete with peers embracing these AI-powered technologies in the cloud. It is paramount for companies to migrate their critical business processes to the cloud now. That is why we introduced AIM (Accelerate, Innovate, Move) earlier. AIM offers organizations a tailored path to move critical processes to the cloud with confidence. It provides qualified customers with access to a dedicated team of migration advisors, expert assessments, investment offers, tools, and migration support.
Magic Quadrant reports are a culmination of rigorous, fact-based research in specific markets, providing a wide-angle view of the relative positions of the providers in markets where growth is high and provider differentiation is distinct. Providers are positioned into four quadrants: Leaders, Challengers, Visionaries, and Niche Players. The research enables you to get the most from market analysis in alignment with your unique business and technology needs. View a complimentary copy of the Magic Quadrant report to learn more.
*Gartner is a registered trademark and service mark and Magic Quadrant is a registered trademark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
**This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Microsoft.
Source: Gartner, “Magic Quadrant for Cloud ERP for Product-Centric Enterprises,” Greg Leiter, Robert Anderson, Dixie John, Tomas Kienast, David Penny, September 26, 2023.
Issue
We recently encountered a support case where a customer using in-memory tables in an Azure SQL DB received an error message while trying to insert data into a table that also has a clustered columnstore index. The customer then deleted all data from the in-memory tables (with the clustered columnstore index); however, the Index Unused Memory was still not released. Here’s the memory allocation the customer could see:
Error
In addition to the error above, here is the error text:
Msg 41823, Level 16, State 109, Line 1
Could not perform the operation because the database has reached its quota for in-memory tables. This error may be transient. Please retry the operation. See 'http://go.microsoft.com/fwlink/?LinkID=623028' for more information.
Workaround
To reproduce the issue, we created two tables in our Premium-tier Azure SQL DB: one with a clustered columnstore index, while the other had just a regular clustered index. The table with the columnstore index was also created with the option MEMORY_OPTIMIZED = ON.
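For illustration, here is a minimal sketch of how such a setup could be defined. The table and column names are hypothetical and are not taken from the original support case:

-- Hypothetical memory-optimized table with a clustered columnstore index.
CREATE TABLE dbo.tblInMemoryCCI
(
    Id      INT IDENTITY(1,1) NOT NULL PRIMARY KEY NONCLUSTERED,
    Payload NVARCHAR(400)     NOT NULL,
    INDEX   cci_tblInMemoryCCI CLUSTERED COLUMNSTORE
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

-- Hypothetical comparison table with just a regular clustered index.
CREATE TABLE dbo.tblRegularClustered
(
    Id      INT IDENTITY(1,1) NOT NULL,
    Payload NVARCHAR(400)     NOT NULL,
    INDEX   cix_tblRegularClustered CLUSTERED (Id)
);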
Then we went ahead and inserted data into both tables and ran the script below to find the memory consumption of the indexes (notice the 97 MB reported by the Index Unused Memory column for the table containing the columnstore index):
-- Report used and unused memory for memory-optimized tables and their indexes (requires a memory-optimized filegroup).
IF (SELECT COUNT(1) FROM sys.data_spaces WHERE type = 'FX') > 0
BEGIN
    SELECT OBJECT_NAME(object_id) AS tblName,
        CAST(memory_used_by_table_kb / 1024.00 AS DECIMAL(10, 2)) [Total used Memory MB],
        CAST(memory_allocated_for_table_kb / 1024.00 AS DECIMAL(10, 2)) - CAST(memory_used_by_table_kb / 1024.00 AS DECIMAL(10, 2)) [Total Unused Memory MB],
        CAST(memory_used_by_indexes_kb / 1024.00 AS DECIMAL(10, 2)) [Index used Memory MB],
        CAST(memory_allocated_for_indexes_kb / 1024.00 AS DECIMAL(10, 2)) - CAST(memory_used_by_indexes_kb / 1024.00 AS DECIMAL(10, 2)) [Index Unused Memory MB]
    FROM sys.dm_db_xtp_table_memory_stats
    ORDER BY 2 DESC;
END;
Now we went ahead and deleted all data from the table (with the columnstore Index) and ran the same query above:
The test above proves that it is not the data contained in an in-memory table that consumes the memory; rather, it is the columnstore index that consumes the memory and holds it for as long as the index stays on the table. Even if we delete the data from the table, the memory will still show up as Index Unused Memory. The only option to release the Index Unused Memory is to drop the clustered columnstore index.
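For example, using the hypothetical table from the sketch above, the clustered columnstore index could be dropped as follows; on memory-optimized tables, indexes are added and dropped through ALTER TABLE:

-- Dropping the clustered columnstore index releases the memory reported as Index Unused Memory.
ALTER TABLE dbo.tblInMemoryCCI DROP INDEX cci_tblInMemoryCCI;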
Moreover, it is recommended to use a columnstore index only for tables with a lot of data (millions or even billions of rows), and only if using it helps achieve the overall performance levels expected.
This is the next segment of our blog series highlighting Microsoft Learn Student Ambassadors who achieved the Gold milestone, the highest level attainable, and have recently graduated from university. Each blog in the series features a different student and highlights their accomplishments, their experience with the Student Ambassador community, and what they’re up to now.
Today we meet Vidushi Gupta, who recently graduated with a bachelor’s degree in computer science from SRM Institute of Science and Technology in India. Responses have been edited for clarity and length.
When did you join the Student Ambassadors community?
I joined the Student Ambassadors community in January 2021. This was the time when I started to learn about tech communities, and MLSA was my first.
What was being a Student Ambassador like?
Being a Microsoft Learn Student Ambassador was a transformative experience in my tech journey. It provided me with a supportive community and an exceptional program team, creating a safe space for me to learn and grow. Through this opportunity, I not only expanded my knowledge of new technologies but also made significant advancements in my existing tech skills. The program encouraged me to participate in hackathons, where I not only utilized my skills, but also emerged as a winner in some instances. Along the way, I had the privilege of meeting exceptional individuals who shared my passion for technology. Overall, being a Student Ambassador has been an incredible journey, filled with continuous learning, personal growth, and the development of unwavering confidence.
Was there a specific experience you had while you were in the program that had a profound impact on you and why?
During my time as a Microsoft Learn Student Ambassador, there were three experiences that had a profound impact on me. In 2021, I was awarded the Microsoft advocacy sponsorship to attend the Grace Hopper Celebration (GHC). This experience highlighted the importance of diversity and inclusion, and witnessing the safe space provided to women and gender minorities at the conference was inspiring. Since then, I have maintained my association with GHC, attending the conference in 2021 and serving as a mentor in 2022. I am currently aiming to attend the conference again this year.
Vidushi, Gold Student Ambassador, in Amsterdam during her student exchange program experience where she learned how to improve business by using data to drive decisions.
Tell us about a technology you had the chance to gain a skillset as a Student Ambassador. How has this skill you acquired helped you in your post-university journey?
As a Student Ambassador, I collaborated with Jasleen, a fellow Ambassador, on Microsoft’s data science and machine learning curriculums. This experience enhanced my skills in R, a language not commonly used in many projects. Acquiring proficiency in R has been invaluable in developing my data science portfolio and giving me a head start in my career. It has equipped me with the confidence and practical knowledge to tackle data-driven challenges and extract insights from complex datasets.
What is something you want all students, globally, to know about the Microsoft Learn Student Ambassador Program?
The MLSA program is an inclusive community with an amazing and supportive program team. It emphasizes the power of community and peer-to-peer learning, providing a safe space for diverse voices to be heard. Through MLSA, I learned the value of collaborating with fellow ambassadors, gaining support, guidance, and lifelong connections. I encourage all students worldwide to join this program and experience the transformative impact it can have on their tech journey.
What advice would you give to new Student Ambassadors, who are just starting in the program?
Five words – Trust the process and learn.
Look beyond the swag, look at the network you’re going to build, and grow!
Share a favorite quote with us! It can be from a movie, a song, a book, or someone you know personally. Tell us why you chose this. What does it mean for you?
“You v/s You.” This has always been my favorite quote. It always reiterates for me that I am my only competition. This helps me to work on being a little better than what I was yesterday. This quote also helps me to stay away from the comparison loop because yesterday’s Vidushi is my only baseline and no one else is!
Vidushi with fellow Student Ambassadors at the Microsoft office.
Tell us something interesting about you, about your journey.
When I joined, I had already experienced gender discrimination in tech. That experience led me to believe that women have to put in a lot of work to stay at the table. I was disheartened but wanted to get involved with a global community like MLSA to understand the importance of women and gender minorities in tech. I started off being doubtful about tech, even though I enjoyed it. Through my experience in the MLSA program, I became a confident public speaker, a mentor, a tech enthusiast, a data storyteller, a diversity and inclusion evangelist, and so much more!
We are excited to announce the availability of the Azure AI Language Summarization Container! It comes with both Disconnected and Connected options, along with Commitment Tier pricing. Summarization in Azure AI Language provides ready-to-use solutions with task-oriented and task-optimized LLM-powered models to summarize documents and conversation transcripts.
All resources are now LIVE and ready for use. Customers interested in the Disconnected container should go through the gating process to get approved.
This release is a significant step toward democratizing Generative AI and Large Language Models, offering key benefits to our customers:
With the disconnected container, customers with high demands for data security and confidentiality can bring the value of summarization to scenarios in a fully disconnected, secure environment.
It is ideal for sensitive use cases where data isolation is critical, such as the defense, legal, healthcare, and financial industries, as well as intelligence agencies.
Customers have full control over their environment, minimizing data exposure.
Organizations are empowered to harness cloud summarization capabilities in secure and confidential settings.
With the container, in both its disconnected and connected options, customers can now use summarization AI in more regions and countries, beyond what is supported by the cloud offering today.
With Commitment Tier pricing, customers will benefit from:
Cost savings based on their commitment level, making it a cost-effective choice for long-term usage.
Predictability in pricing, making budget planning more straightforward.
Flexible commitment tiers, accommodating specific needs and usage patterns.
These benefits cater to a wide range of customer needs, ensuring that customers can choose the option that best aligns with their requirements and preferences.
Please see below for more details and resources about the launch:
Hi, I’m Jaime Gonzales and I lead the Viva People Science R&D team. Our goal is to enrich the Viva journey with the science of employee happiness and success, to deliver exceptional and impactful experiences for humans at work.
As People Science experts, we are consultants, researchers, analysts, content authors, product consultants, and customer advocates with deep expertise in engagement and employee experience. I’ve worked in HR and OD roles for many years and now love sitting in a product team where we have the scale and reach to improve the lives of millions of people at work.
What is People Science?
I’ve always been fascinated with the idea that businesses win or lose based on the strength of their people. The better the connection between the two, the greater likelihood of success. As organizations seek any advantage to overcome strong economic headwinds, I think this notion is more important than ever. Those who prioritize what makes people feel happy, successful, and motivated to do their best work will find a competitive leg up. This is what People Science is all about – finding the intersection between people’s engagement and business performance.
People Science: our definition at Microsoft Viva
Let’s start with how we at Microsoft Viva think about People Science, and what it means for our customers. People Science is a research-backed and people-centric approach to the study and practice of happiness and success at work. It integrates fields of study like industrial-organizational psychology, organizational development, and occupational psychology with increasingly relevant fields like data science, product management, and design to reimagine an employee experience that drives better individual and organizational outcomes.
How Viva People Science creates value
Viva People Science transforms how people and organizations succeed by building new habits and mindsets. We help people bring their best selves to work so that they can do their best work. First, we invest in building a team of intellectually curious People Scientists who can transform themselves and their customers into People Science pros. We produce industry-leading research that challenges conventional HR practices, leveraging our unique data and expertise. We weave this research into our product and marketing outreach to design a human-centric Viva experience that improves employee engagement and ultimately business performance. As a team, we work to continuously evolve our skills, procedures, services, and toolkits to delight more and more customers and make People Science accessible to everyone.
For customers, we offer professional services to deliver connected insights and enable change in three key ways:
1. We help customers shape and get buy-in on a tailored strategy that measures and improves aspects of their employee experience most related to engagement and business performance.
2. We help customers become People Science pros, elevating them in their roles and ensuring the partnership’s success.
3. We alleviate customer pain points by infusing our product with deep user empathy and our experience as People Scientists.
My colleagues and I will continue to explore the foundations of People Science with you over the coming months through this blog. But in the meantime, take a minute to watch this video that looks at People Science working with Viva Glint customers. I think it captures the essence of People Science at Microsoft.
Interested in how Viva People Science can support your organization’s success? Learn more here.
Introducing the first AI copilot experience for Dynamics 365 Commerce
For Merchandisers tasked with managing large product catalogs, the creation of high-quality “enriched” marketing content for their digital commerce channels can be a daunting and labor-intensive process, especially if the Merchandiser is untrained in writing marketing copy. Yet it is incredibly important to have “enriched” content at the product level, where it can increase customer engagement, improve customer understanding, and drive natural search engine ranking, all of which lead to higher conversion rates.
It is for this reason that Dynamics 365 Commerce is thrilled to announce the preview of ‘Commerce Copilot’, which provides a fast and efficient way of authoring product enrichment content for your B2B and B2C digital commerce websites.
Use Commerce Copilot to jump-start the creative process by first selecting a tone, configured by you, that aligns with your brand, like “adventurous”, “luxurious” or “bold”. Then select an audience from choices managed by you, like “sports enthusiast” or “college graduate”. Copilot will then use these prompts to craft compelling and engaging content that makes use of your existing product information like name, description, attributes, price and more!
You can further augment your newly enriched content with key product highlights or tune it to be optimized for search engines. But you always remain in control and have the ability to review and modify any content before it is published!
The Commerce Copilot for enriched product content is now available as a public preview for digital commerce customers based in the United States. Additional market availability will be enabled soon.
In today’s digital age, collaboration platforms like Microsoft Teams have become the norm for businesses to communicate, stay connected, and share information. However, as organizations increasingly rely on these platforms, there has also been a surge in regulatory compliance and business conduct violations occurring on them. Collaboration and messaging apps are one of the top three sources of such violations, and that’s concerning. Many organizations, particularly financial institutions, healthcare providers, and other regulated industries, face significant legal and reputational risks if non-compliant messages are shared during meetings.
Across Microsoft Teams, Outlook, and third-party apps like Instant Bloomberg, Microsoft Purview Communication Compliance provides the tools to help organizations detect regulatory (e.g. SEC or FINRA) and business conduct compliance violations, such as sensitive or confidential information, harassing or threatening language, and sharing of adult content. Built with privacy by design, usernames are pseudonymized by default, role-based access controls are built in, investigators are opted in by an admin, and audit logs are in place to help ensure user-level privacy.
We continue to invest in Communication Compliance and Microsoft Teams integrations to ensure compliant collaboration, including our existing capabilities like the ability to detect regulatory compliance or business conduct violations in Teams chats, channels, and more.
Live transcription of Microsoft Teams meetings not only makes your meetings more productive and inclusive for participants; transcripts are also official documentation, making the detection of potential regulatory and business conduct violations all the more critical. With Teams supporting up to 34 language options for transcription, Communication Compliance is able to detect potential violations in all 34 languages.
This feature is only available for Teams meetings that attendees have opted in to record.
Communication Compliance Investigators with designated role-based access control permissions can then review the policy match in the meeting transcript alongside the video snippet feature that will be rolled out in October.
Additionally, Advanced eDiscovery can collect and review critical metadata associated with Teams meeting recordings and video files in OneDrive and SharePoint, including transcripts/captions, chapters, and custom thumbnails. This metadata can be used to identify the critical data within a Teams meeting more efficiently, especially with the ability to view the transcripts right within the review set.
Figure 1: View of meeting transcript shown alongside video snippet of regulatory compliance violation that occurred
Learn more about detecting communication in Microsoft Teams
Watch our latest video to learn how Microsoft Purview Communication Compliance can help safeguard Microsoft Teams data and ensure communications meet regulatory and business conduct requirements.
We are also happy to share that there is an easier way for you to try Microsoft Purview solutions directly in the Microsoft Purview compliance portal with a free trial (an active Microsoft 365 E3 subscription is required as a prerequisite). By enabling the trial in the compliance portal, you can quickly start using all capabilities of Microsoft Purview, including Insider Risk Management, Communication Compliance, Records Management, Audit, eDiscovery, Information Protection, Data Lifecycle Management, Data Loss Prevention, and Compliance Manager.
Visit your Microsoft Purview compliance portal for more details or check out the Microsoft Purview solutions trial (an active Microsoft 365 E3 subscription is required as a prerequisite).
If you are a current Communication Compliance customer and are interested in learning more about how Communication Compliance can help safeguard sensitive information and detect potential regulatory or business conduct violations, check out the resources available on our “Become a Communication Compliance Ninja” resource page.
AI is ready to support work—the question is: What will your business do with it?
Join us at the Microsoft Business Applications Launch Event on October 25 and explore how to create an AI-powered business that helps people and teams be more productive, solve problems quickly, and focus more energy on building revenue.
This will mark our tenth Business Applications Launch Event—a milestone in our commitment to business app innovation, our customers, and our partners. But we’re just getting started. To see what’s next—and what it means for your organization—you’ll hear directly from Microsoft leaders about their vision for AI, customer service, enterprise resource planning (ERP), and low-code solutions. Highlights include:
The latest AI in Microsoft Dynamics 365 Sales that provides relevant recommendations, summarizes data, retrieves information, and performs actions within the flow of work.
Advanced Copilot capabilities in Microsoft Dynamics 365 Customer Service that streamline agents’ workspaces, let them see transcripts of live chats and voice calls in their inboxes, and allow them to respond to customers quickly with the right information.
New Microsoft Sales Copilot features that help compose emails, update customer relationship management (CRM) records, recap meetings, and offer real-time tips to help sales teams close more deals.
Enhanced automation in Microsoft Dynamics 365 Finance to help handle accounts payable and bank statements, complex tax scenarios, and e-invoicing requirements in more markets.
New Copilot features in Dynamics 365 Customer Insights to help marketers quickly create memorable customer experiences using whole new levels of personalization for emails, images, and layouts.
Plus, learn about updates for Microsoft Dynamics 365 Supply Chain Management, Microsoft Dynamics 365 Commerce, and Microsoft Dynamics 365 Human Resources—all ready to help your employees be more productive, build customer loyalty, and drive meaningful growth. You’ll also hear from Charles Lamanna, Microsoft Corporate Vice President of Business Applications and Platforms, about what’s driving innovation today. Then catch some of the newest AI capabilities in action, with demos led by the people behind the 2023 release wave 2, offering expert guidance on how these updates will help you:
Improve insights, save time, and fuel creativity with the latest AI-powered solutions.
Empower your employees to focus on revenue-generating work and avoid repetitive tasks with automation.
Connect people, data, and processes across your organization using modern, AI-enhanced collaboration tools.
Insights on putting AI to work for you
The Business Applications Launch Event is more than our chance to showcase hundreds of new features and updates. It’s also a great opportunity for you to learn expert tips on how to apply these new technologies to some of your business’ biggest challenges. And if you have questions about new features, the role of AI at work, the evolution of copilots, or what’s ahead for business apps, get them answered by experts—we’ll be hosting a live Q&A chat at the end of the event, so be sure to stick around.
It’s been inspiring to see all the new features that the Dynamics 365 team has been working on, and we’re looking forward to celebrating 10 events’ worth of advancing business applications ahead.
The use of SaaS applications has become widespread in businesses of all sizes. With more SaaS apps in use, there are more potential targets for attackers, who frequently exploit centralized user authentication systems by targeting unaware users with phishing attacks. Attackers can take advantage of this lack of awareness to trick users into authorizing malicious apps, steal credentials, and gain access to multiple services. Attack techniques are getting more sophisticated, and exploits of poorly designed SaaS applications are increasingly frequent.
In this blog, we’ll demonstrate how SOC teams can benefit from App governance and its integration with Advanced Hunting to better secure SaaS apps.
Why use advanced hunting?
Advanced hunting uses a powerful query language called Kusto Query Language (KQL). KQL allows security analysts to create complex queries that can filter, aggregate, and analyze large volumes of data collected from endpoints, such as security events, process data, network activity, and more. However, this can be challenging for new security analysts who may not be familiar with writing queries in KQL. By using the pre-defined KQL queries and app signals collected in Microsoft 365 Defender, security analysts can immediately benefit from hunting capabilities to investigate app alert insights without having to use any KQL.
A real-life example of threat investigation
Let’s investigate a real-life incident triggered by a built-in threat detection policy in App governance. In our case, the “App impersonating a Microsoft logo” alert was triggered. Using our unified XDR platform, Microsoft 365 Defender, a SOC analyst can access all Defender alerts in one place via the incidents view. The SOC analyst can filter on status, severity, incident assignment, service source, and other categories. In Figure 1, the filter Service source = App governance, Status = New, Severity = High was applied to help with incident detection and prioritization.
Note: To learn more about App governance built in policies, check out our documentation.
Figure 1. Selecting incidents.
The incident (Figure 1) consists of four alerts that the SOC analyst can review to verify if they are true positives (TP) or false positives (FP) and act accordingly. The SOC analyst can click on the incident and access the attack story (Figure 2), where the alerts can be reviewed in chronological order. They can also view additional information in the “What happened” and “Recommended actions” sections, which give the analyst a much better understanding of why the alert was triggered in the first place, along with a path forward to remediate.
Figure 2. Reviewing the attack story.
Let’s learn more about the application, by selecting view app details (Figure 3).
Figure 3. Selecting View app details.
Usually, malicious apps will not have any certification or publisher verification, and given the nature of such apps, community verification would be rare. The combination of all those attributes (highlighted in Figure 4) raises red flags.
Because the app is registered in Azure AD, the SOC team can easily access additional information available in the Azure portal, which can provide additional context to help with incident resolution.
Figure 4. The malicious O365 Outlook Application card, highlighted red flags, and links to Azure AD and app activities in hunting.
In Figure 5, we can see why the machine learning algorithm highlighted the app as malicious: the logo impersonates the original Outlook logo, but the publisher domain does not match the Microsoft domain. The SOC analyst can now follow their company guidelines to disable the app (this can be completed directly in AAD or in the App governance app details window, shown in Figure 4).
Figure 5. View of app details in Azure Portal.
Use of advanced hunting as part of incident investigation
After disabling the malicious app, the SOC analyst should further investigate the app’s activity by selecting “View app activities” (option highlighted in Figure 4), which will generate Query 1, also visible in Figure 6. The results, visible in Figures 7 and 8, will include all Graph API activities the app performed on the SharePoint Online, Exchange Online, OneDrive for Business, and Teams workloads.
Figure 6. Advanced hunting query.
Query 1:
// Find all the activities involving the cloud app in the last 30 days
CloudAppEvents
| where ((RawEventData.Workload == "SharePoint" or RawEventData.Workload == "OneDrive") and (ActionType == "FileUploaded" or ActionType == "FileDownloaded")) or (RawEventData.Workload == "Exchange" and (ActionType == "Send" or ActionType == "MailItemsAccessed")) or (RawEventData.Workload == "MicrosoftTeams" and (ActionType == "MessagesListed" or ActionType == "MessageRead" or ActionType == "MessagesExported" or ActionType == "MessageSent"))
| extend AppId = appid(RawEventData)
| where AppId == "Paste your app Id"
| where Timestamp between (datetime(2023-08-08 00:00:00Z) .. 30d)
| extend RawEventData_Id = tostring(RawEventData.Id)
| summarize arg_max(Timestamp, *) by RawEventData_Id
In the query results, the analyst can see the IP address, which could be an indicator of malicious activity; attackers frequently use IP addresses with a bad reputation, blacklisted addresses, or Tor exit nodes. Analyzing historical data can reveal patterns of malicious behavior associated with specific IP addresses, which can be useful for threat intelligence and proactive threat hunting. The analyst can also see the impacted workloads and action types, which are crucial for understanding the attacker’s actions.
By analyzing these actions, security analysts can trace the steps of the attacker to determine the scope of the breach, how the attacker gained access, and what data or systems may have been compromised. The MailItemsAccessed action suggests that an unauthorized user or hacker has accessed the contents of one or more email messages within an email account, and UpdateInboxRules can be a sign of an attacker attempting to manipulate email traffic by diverting, filtering, or forwarding messages to their advantage.
Figure 7. Advanced hunting query results.
The analyst may want to create a detection rule (option visible in Figure 6) to proactively identify and alert on similar suspicious activities in the future. This is essential for enhancing an organization’s ability to detect and respond to security threats effectively, automate alerts, reduce false positives, and stay ahead of evolving cyber threats. Learn more about custom detection rules and how to create them here.
By selecting one of the records (Figure 8), the SOC analyst can get more information about the impacted user and act accordingly to “stop the bleeding.” They can take immediate action to halt or mitigate the security breach and prevent further access (changing passwords, revoking access privileges, or even disabling the compromised account), all of which minimize the damage. After the bleeding has stopped, the data helps security teams conduct a thorough investigation to determine the root cause of the incident. Understanding how the breach occurred is essential for preventing similar incidents in the future.
Figure 8. Advanced hunting inspected record details.
The app impersonation security incident shows the benefits of App governance machine learning in detecting malicious applications, which offers an additional layer of protection for your users and organization. The integration of App governance with advanced hunting capabilities provides SOC teams with the tools and insights needed to proactively detect, respond to, and mitigate security threats in SaaS OAuth applications. It allows for a more comprehensive and data-driven approach to SaaS app security, helping organizations protect their critical data and assets.