What is your role and title? What are your responsibilities associated with your position?
I am an Integration Developer. My key responsibilities involve working with my team and alongside clients to make the transition and integration of their products and services smoother.
Can you provide some insights into your day-to-day activities and what a typical day in your role looks like?
Sure. Summarizing a portion of my activities, a typical day looks like this: I start by checking for any issues in our clients’ production environments to ensure everything’s running smoothly, and then my main activity is implementing cloud integration solutions with Azure Integration Services. Occasionally, I also help the team on on-premises projects using BizTalk Server.
Also, one of my big activities is going deep into Enterprise Integration features and crafting new ways to achieve specific tasks: doing proofs of concept for new features, exploring existing or new services, testing those solutions, and finding alternatives, for example, creating Azure Functions as an alternative to the Integration Account and testing those functions inside Logic App flows.
I’m always on the hunt for new solutions to any problems we face, and in doing so, there’s a lot of documenting everything we do. This documentation is more than just busy work; it streamlines our processes and guides our team and the community through troubleshooting. Because knowledge sharing is so important, I actively produce informative content for our blog and YouTube channel. This includes writing posts and creating videos that share our experiences, solutions, and insights with a broader audience.
I also contribute to enhancing our team’s productivity by creating tools tailored to address specific issues or streamline processes that are later shared with the community.
What motivates and inspires you to be an active member of the Aviators/Microsoft community?
What really drives me to engage with the Aviators/Microsoft community is my passion for tackling challenges and finding solutions. There’s something incredibly rewarding about cracking a tough problem and then being able to pass on that knowledge to others. I believe we’ve all had that moment of gratitude towards someone who’s taken the time to document and share a solution to the exact issue we were facing. That cycle of giving and receiving is what motivates me the most. It’s about contributing to a community that has been so important in my own learning and problem-solving journey, and I’m inspired to give back and assist others in the same way.
Looking back, what advice do you wish you would have been told earlier on that you would give to individuals looking to become involved in STEM/technology?
I could say something about always having a passion for new technologies and staying up to date with whatever you are pursuing. There would be nothing wrong with that, but those sound like ready-made phrases exchanged without considering each individual’s current situation.
For a moment, in a world where mental health is so important, let me share a simple tale that resonates with anyone at a crossroads in their career, whether they are new and confused about what to do, just starting out, or contemplating a shift in direction. It’s a gentle reminder that venturing into new territories can be daunting but immensely rewarding, and that, at times, we may not even realize that our current paths could be detrimental to our well-being, professional growth, and personal relationships.
“There was once a man who went into the wilds of Africa, having believed himself to be a hunter for many years. Despite his efforts, he found himself unable to catch any game. Overwhelmed by frustration and feeling lost, he sought the guidance of a Shaman from a nearby tribe.
Confessing to the Shaman, he said, “Hunting is what I live for, but I’m hitting a wall. There’s simply nothing out there for me to hunt, and I don’t know what to do.”
The Shaman, who had seen many seasons and had a kind of wisdom you don’t come across every day, simply put his arm on the hunter’s shoulder, looked him in the eyes, and said, “Really? Nothing to hunt? This land has fed us for generations. There is plenty to hunt out there, and yet you cannot see it? Maybe the problem isn’t the land… allow me to ask you something very important: do you genuinely desire to be a hunter?”
This narrative goes much deeper than the act of hunting. It’s a reflection on our passions, how we confront our challenges, and the realization that our perspective might need a shift.
If our passions no longer ignite us, or if our efforts to chase them lead nowhere, it might be a sign to let go, not in defeat, but in liberation, because, in the end, I want everyone to be happy with the career path they have chosen. So that would be my advice: read this simple tale, apply it to your current situation, and ask yourself, “Do I really want to keep doing what I am doing right now?” And if you find that your current path is not worth pursuing, if your mental health is not in shape, or if you are hitting a wall, then yes, it is time to take the step!
Imagine you had a magic wand that could create a feature in Logic Apps. What would this feature be and why?
In a world where AI is moving at such a fast pace, one feature that I would personally like to have in Logic Apps is prompt-based AI-generated Logic App flows. What that would mean is that you give the designer a prompt describing what you intend to build, and it would generate the most efficient flow for what you have described. Of course, you would still need to configure some things, but I think AI-generated flows could outline and cover many scenarios, making our processes faster and more efficient.
AI is here to stay, whether we like it or not; it isn’t going away, so we can either take advantage of it to create better, faster, and more efficient products or fall behind while we watch others do it.
What are some of the most important lessons you’ve learned throughout your career that surprised you?
One of the most surprising yet vital lessons from my career is the central role of relationships in keeping the ship sailing smoothly. Open communication and a positive work environment are crucial elements that empower a team to deliver top-notch results, remain driven, and maximize their daily potential. A car has four tires, and you need them all to get home safely.
Check out this customer success story on how Microsoft is helping to keep Slovenia’s lights on by improving and modernizing ELES’ operations. In 2012, ELES turned to Microsoft when they needed a new enterprise resource planning (ERP) solution. Today, ELES uses Azure Logic Apps to connect their ERP with other systems, improving collaboration between departments, streamlining operations, and better managing their energy resources and infrastructure.
For those using the IBM MQ Built-in (In-App) connector available in Logic Apps Standard, check out this article, which explains more about Handles and how to calculate the maximum value to set on your IBM MQ server.
Learn about various issue scenarios related to the Azure Automation connector in both Logic Apps Consumption and Standard, along with their causes and resolutions.
V1 Actions/Triggers of the SQL Connector for Logic Apps will be deprecated by the end of March 2024. In this article, learn how to use a PowerShell Script to identify the Logic Apps still using the deprecated SQL Connectors so that you can change them to the V2 equivalent.
ISE’s retirement date is August 31, 2024, so make sure you migrate any Logic Apps running on ISE to Logic Apps Standard. Check out this video guide from our FastTrack team that walks you through the whole process!
Check out this recording from the February 2024 meetup of the Houston Azure User Group, where Azure customers dive into their journey from on-premises BizTalk to Azure Logic Apps hosted in an Integration Service Environment (ISE).
Watch this recording from a webinar hosted by Derek and Tim as they talk about the benefits of Azure’s ecosystem and a step-by-step strategy for a smooth transition from MuleSoft to AIS.
Performance evaluation has been revolutionized by technology, extending its reach to the individual level. Consider health apps on your smartphone. They gather data breadcrumbs from your daily activities, providing analysis of your movement patterns. This isn’t a generic data compilation, but a near-accurate reflection of your physical activity during a specific period.
In the future, it’s conceivable that these apps might be equipped with an AI companion, or Copilot, to guide your next steps based on your past activities. It could suggest rest days or additional exercise to help you achieve your personal health goals.
This concept of performance evaluation based on collected data is the bedrock of process mining and process comparison. Our Copilot functionality adds a layer of assistance, enabling you to make informed decisions about your warehouse operations.
In this context, Copilot can help you optimize warehouse processes. It can identify bottlenecks in certain processes or compare different methods to achieve the same goal, empowering you to choose the most optimal method for your specific case.
In this blog, we’ll explore the essence of this feature, its intended audience, and how and why you should leverage it for your warehousing operations.
Process Mining Insights:
At first glance, using Process Advisor for material movement analysis is easy. The setup process is straightforward:
1. Go to Warehouse Management > Setup > Process Mining > Warehouse material movement configuration. In the taskbar, select Deploy New Process.
2. The configuration wizard will open. Press Next, then enter the name of the process in the Process Name field, choose the company, choose the number of months to load (12 months = data from the latest 12 months), and choose the appropriate Activity. Press Next.
3. The process is deployed.
The configuration wizard looks like this:
Image: Configuration wizard screenshot.
The easy part is now complete. We have set up a process, named it, and loaded 12 months of data to prepare for our analysis. The difficult part is making sense of our data and using it to make decisions to improve our warehouse output.
Therefore, we will provide you with some real-life examples of how to use the data analysis functionality to understand your processes, and a scenario where we evaluate two different methods and use Process Advisor to figure out which method would be preferred for our business operations.
Analysis of data
There are multiple ways to analyze your process data to understand and compare your processes.
1. Open Power Automate and go to the Process Mining tab. The report is accessible on the main page.
2. Open the report. When the report is loaded, it can look like this:
Image: Process Mining Case Summary.
3. Select Map
Select the Map tab to display the process map:
Image: Process Mining Map.
This is a screenshot of the process map from our example. On the map, there are separate locations at which actions (tasks) have taken place, as well as the time spent at each location and between locations. You can change the metric to, say, mean duration, to see how long each activity at a particular location takes on average.
4. Use Copilot to get started.
We provide you with suggestions for frequent prompts, but you can of course choose to enter whatever you want. In this case, we will use the suggested “provide the top insights” prompt.
Image: Process Mining map with Copilot.
5. Copilot Generates
Copilot generates a response based on the data in your process map. In the example, we can see that Copilot has identified “BULK” as the longest-running activity and provided us with a list of the activities with the greatest number of repetitions:
Image: Process Mining map and Copilot generated answer.
6. Copilot Follow Up
We can also ask Copilot follow-up questions. In this case, we will follow up with the suggested “How to identify my bottleneck?” and “Find my Bottleneck” prompts. Copilot generates a message explaining what the bottleneck is and its mean duration. In this instance, since we have selected the Mean duration metric, the answer will reflect that metric.
Image: Process Mining map with Copilot generated answer.
The message we receive tells us that the Variant with the highest duration is “Variant 2” with a mean duration of 2 minutes and 47 seconds. It also tells us that the activity with the highest mean duration is “BULK” with a mean duration of 15 minutes.
From this, we can draw the conclusion that “Variant 2” is the variant that takes the longest time to complete, and that the most amount of time is spent in the “BULK” location.
By using Process Advisor for warehouse material movement analysis, we can streamline warehouse operations and ensure we don’t spend more time than we need on a particular task or operation. Another way Process Advisor can be used to enhance operational fluidity in your warehouse is by comparing different methods of achieving a similar goal, to understand which method is more effective at reaching it. We will explain how to conduct such a comparison with a test case.
In our test case, we will compare two different methods of picking goods in the warehouse to figure out which picking method takes less time, so we can increase warehouse output.
Test Case: “Single order picking” vs. “Cluster picking”
In this test case, the user wants to know which picking method is faster: “Single order picking” or “Cluster picking”. To compare the two, the user goes through the following steps. First, the user creates a hypothesis for the purpose of this test case: which picking method is faster?
Second, the user decides the scope of the test. For both methods, the user will have 5 sales orders with one to five different items per order, in different quantities. Both methods will use identical sales orders for test purposes. In the Work Details screen, we can see the work details for the work that has been created. The Variants are the different variations of work, so in this instance, for work ID USMF-002327 with order number 002375 (displayed in the picture), the worker will “Pick” 1 piece of item LB0001 in 5 different variations (in this case colors), then “Put” these 5 items away in the packing area (location “PACK”).
Image: Work details screenshot. Image: Work details label(s).
With the “Single order picking” method, the worker picks one order at a time and puts it in the packing location. To clarify, the warehouse worker will go to each location where the item is located, pick and scan that item, repeat the process for each item in that order, take the order to the pack location, and then repeat with the next order.
The worker goes to 5 different locations to pick items, then proceeds to the “PACK” location to put the items away for packing. Then, the worker repeats the process for the other orders.
Image: Picking locations
After we have constructed our hypothesis and determined the scope, we can go ahead and prepare for the analysis.
First, we will have to deploy our process comparison. We head into Warehouse Management > Setup > Process Mining > Warehouse material process configuration, and in the taskbar, we select Deploy New Process. We enter a fitting description as the Process Name, then select the company and the number of months to load. In this test case, we will only be loading one month of data, since we don’t need more for this test’s purposes.
Usually, you would want as much correct data (not corrupted or faulty data, since that will affect the analysis) and as much necessary data (your scope determines how much data is necessary and what kind) as possible to get a high-quality analysis. When our process has been deployed, we can move on to the analysis and evaluate this process.
We load our process map into Power Automate, and in the beginning, it will look something like this:
Image: Process Map Starting view.
We can press the Play Animation button to get a representation of the process.
Image: Process Map Starting view.
In the Statistics tab, we can see basic information of the process.
Image: Process mining statistics tab overview.
In the Variants tab, we can view the different work-Variants. By selecting one, we can get an in-depth view of, in this case, “Variant 3”. We can see that in this variant, 6 cases occurred, the total duration was 8 minutes and 15 seconds, and the total active time was 8 minutes and 14 seconds. In this case, the attribute selected is Zone. If we look closely at the Variants, we can see that “Variant 2” has 2 cases and the others have 1.
This means that two pieces of “work” that were scheduled were so similar that they could be grouped, because from a warehouse management perspective the operations are identical: the worker goes to one location, picks item(s) 1, goes to another location and picks item(s) 2, then puts them away in “PACK”. Thus, it is two “Pick” operations and one “Put”, and therefore they are grouped in this view.
Image: Process mining variants tab zone overview.
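To make the idea of a variant more concrete, here is a small, hypothetical sketch (not Process Advisor internals) of how cases could be grouped into variants from a raw event log: cases whose ordered sequence of activities is identical fall into the same variant. The file name and column names (CaseId, Activity, Timestamp) are illustrative assumptions, not part of the product.

```powershell
# Hypothetical event log whose columns include CaseId, Activity, Timestamp (illustrative only).
$events = Import-Csv -Path .\warehouse-events.csv

$events |
    Group-Object -Property CaseId |
    ForEach-Object {
        # A case's "signature" is its ordered sequence of activities.
        $sequence = ($_.Group |
            Sort-Object { [datetime]$_.Timestamp } |
            Select-Object -ExpandProperty Activity) -join ' -> '
        [pscustomobject]@{ CaseId = $_.Name; Sequence = $sequence }
    } |
    Group-Object -Property Sequence |
    ForEach-Object {
        # Cases with an identical sequence form one variant; count how many cases it has.
        [pscustomobject]@{ Variant = $_.Name; CaseCount = $_.Count }
    } |
    Sort-Object -Property CaseCount -Descending
```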
We can also change the Variants view by changing the selected Attribute. In this case, we will change the attribute from Zone to Order number. This changes our view so that we see different Variants based on order number, which in this case shows us 5 variants and can seem confusing at first. A new variant is displayed with these settings because the view now shows Variants by order number instead of zone, which means we get one variant for each sales order we created, since all of them were different from each other.
Image: Process mining variants tab order number overview.
In this instance, we can see the order numbers in the legend on the right side. This view tells us that we have 5 different order numbers, and the boxes below Variants Overview represent the number of work operations performed per order number. Looking at the case count per order number: for “Variant 2” a total of 6 operations were performed (pick, pick, pick, pick, pick, put, as mentioned previously), and for Variants 4 and 5, a total of 3 (Pick, Pick, Put).
For this scenario, it can be helpful to see how much work we are performing per event. To get such a view, we can switch the Attribute to Work Quantity, which in this instance lets us see the quantity of work that needs to be performed for each event. In the example of “Variant 2”, the interface tells us that 6 events have taken place; in 5 of the events the quantity was 1, and in one event the quantity was 5. To put this into a warehouse perspective, this means that we have performed 5 events 1 time each, which for Variant 2 is “Pick item 1, Pick item 2, Pick item 3, Pick item 4, Pick item 5”, plus one event where we “Put” away these items 5 times. That “Put” operation is performed 5 times but counts as a single event because it is the same event occurring multiple times, whilst the other events, even though they are all “Pick” events, count as individual events because they pick different products, each in a different location. When we “Put” items away in the “PACK” location, we don’t put them in different locations, so it counts as one event.
Image: Process mining variants tab work quantity overview.
If we set the Attribute to Work type, this becomes clear:
Image: Process mining variants tab work type overview.
We might want to see the location where the events took place. To do that, we can set Attribute to Location, and the view will show us the locations of the events below the header Variants overview.
Image: Process mining variants tab work location overview.
In this image, we can see the variants based on location. To put this into context, “Variant 6” tells us that 6 events have taken place, all in different parts of the warehouse. For “Variant 10”, we can see that one event took place in “LEGOLOC301” and one in “PACK”.
Now, after we have made ourselves comfortable within the report, we can start analyzing our process. To do that, press the Process Compare button below Variants.
A view similar to this one will appear:
Image: Process compare variants tab location map overview.
In the process map displayed on the screen, we have set the Mining attribute to Location, and the Metric to Total duration. This will allow us to see the total amount of time spent in each location.
By changing the Metric to Total count, we can see the number of times an event took place in each location, as the picture below displays:
Image: Process compare variants tab location map overview.
The total amount of time spent in one location and number of cases per location might be valuable, but a more telling metric could be how much time we spent on average per location.
By switching the metric to Mean duration, we can see the average time spent per location. This gives us yet another hint about which part of the process takes the most time to manage. And if we want to see how it looks from a proportional perspective, we can toggle the percentage sign next to the Metric drop-down menu to achieve exactly that.
Image: Process compare variants tab location and mean duration map overview.
As we can see from the image above, LEGOLOC 201 is the location in which we spend the largest percentage of our time. If we want to further examine what is going on in that location, we can do so by pressing the bar. This will change the view slightly, and a card with detailed information will appear on the right of the screen.
Image: Process compare variants tab location map detailed view.
In the highlighted red box, we can see detailed performance data to further assess the performance in this location.
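Before drawing conclusions, it may help to see what these per-location metrics correspond to computationally. The following is a rough, hypothetical sketch (not how Process Advisor computes them) of deriving total count, total duration, and mean duration per location from an event log; the file name and column names are assumptions for illustration.

```powershell
# Hypothetical event log whose columns include Location and DurationSeconds (illustrative only).
$events = Import-Csv -Path .\warehouse-events.csv

$events |
    Group-Object -Property Location |
    ForEach-Object {
        $stats = $_.Group | Measure-Object -Property DurationSeconds -Sum -Average
        [pscustomobject]@{
            Location      = $_.Name
            TotalCount    = $_.Count                                   # "Total count" metric
            TotalDuration = [timespan]::FromSeconds($stats.Sum)        # "Total duration" metric
            MeanDuration  = [timespan]::FromSeconds($stats.Average)    # "Mean duration" metric
        }
    } |
    Sort-Object -Property TotalDuration -Descending |
    Format-Table -AutoSize
```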
Now, we have enough information to draw some conclusions on our own. We have identified zone LEGOLOC 201 as our “time-thief”, and we know that more than 1/3 of the time was spent on picking items in this zone. To make the analysis process easier, Microsoft’s Copilot has been built into this feature. By pressing the Copilot sign in the top-right corner, you will open the dialogue box where you can create a prompt and ask the Copilot about your process. The Copilot will suggest some common prompts, but you can of course create your own. In this case, we will ask the Copilot to summarize our process.
Image: Process compare map and Copilot dialogue. Image: Process compare map and Copilot generated answer.
As displayed in the picture, the Copilot will give us a summary of the process. Because we have selected to compare our first part of the test vs our default value (the red locations), it also summarizes the default value’s process.
We do get some information on how many events took place, etc., but we did not get the total case time, which was the value we wanted to find to confirm or deny our hypothesis. By asking Copilot what the average case duration and the total case duration were, we received the answer that the mean case duration was 4 minutes and 18 seconds, and the total duration was 21 minutes and 31 seconds.
So, our answer in this case is that the Single order picking took 21 minutes and 31 seconds to complete.
Image: Process compare map and Copilot generated answer.
Now, we will put this result up against the cluster picking method to see how the two compare.
For context, cluster picking differs from single order picking in the sense that in cluster picking, workers pick multiple orders simultaneously and not one at a time. In this case, it means the worker will pick all 5 sales orders, then put them all away in the packing station at the same time, rather than picking an order, putting them away in the packing station, and repeating for next orders.
Image: Work clusters screenshot.
In this image, we can see the main difference between these picking methods. For cluster picking, we can see that the warehouse worker is tasked with picking 8 pieces of red Lego blocks (left image), and in the second screenshot (right) we can see how many and from which specific positions items should be picked.
Image: Work clusters screenshot with illustrations.
When all items have been picked, the Work status will be updated so all Cluster positions are “In process”.
Image: Work Cluster in progress.
The next task is to put all the items in the packing station. When we have done that, all Cluster position Work statuses will be changed to Closed.
Image: Cluster Put screenshot.
As we can see in the image below, work status has been changed to Closed across the board.
Image: Work Clusters status closed.
Now, let’s jump back to the analysis. Start by creating a new process in the same way we did for single order picking and open the process map in Power Automate. In our test case, this is what we are shown on our screen.
Image: Process Compare map.
As we have already covered how choosing different metrics affects the process map and the information on display, we will not do that for this part of the test, since we know we need to compare location as the Mining attribute, and total duration as the Metric.
We will again use the help of the Copilot to evaluate the process map. Once again, we ask for a summary of the process.
Image: Process Compare map and Copilot generated insight.
Test Case Results
The summary from the Copilot tells us that this process started November 6th and ended after 8 minutes and 45 seconds.
This means we have successfully confirmed our hypothesis by using process mining and Process Advisor. Now we know for a fact that for one picker with 5 sales orders constructed in this manner, cluster picking is a much more efficient picking method than single order picking, since an identical amount of work took significantly less time to complete. Therefore, we can draw the conclusion that for all work with similar characteristics, we should prefer cluster picking over single order picking, at least if we want to increase warehouse output.
Keep in mind, harnessing the power of Process Advisor requires an analytical mindset and a structured approach. The sheer volume of headers, variants, locations, and numbers can be overwhelming. To navigate this complexity, emulate the structured methodology illustrated in this example. By having a clear understanding of your comparison and measurement objectives, and a strategy to achieve them, you’ll significantly enhance the outcomes derived from Process Advisor.
Essential skills for effective process mining:
Use a fact-based approach with warehouse data as the base.
Use a strategic and tactical approach throughout the analysis.
Consider continuous analysis, where you monitor a process over time, rather than only the kind of one-time analysis shown in this example (which process mining can also be used for).
Use quick data for immediate insights, and big data for continuous and conclusive analysis.
Master filtering to gain valuable insights and sort out what you believe is important.
Wealth of achievements made possible through process mining:
Identify areas in which processes can be improved.
Validate conformance of processes.
Do process simulation and predictive analysis.
Discover the optimal paths for automation.
Conclusion:
The power of Process Advisor extends far beyond what we’ve explored in this blog. It’s a versatile tool that can be adapted to a myriad of scenarios, and this guide merely scratches the surface of its potential. We’ve used it here to streamline warehouse operations, but the possibilities are truly limitless.
We encourage you to dive in and experiment with Process Advisor. Use the scenario we’ve outlined as a starting point, but don’t stop there. Input your own warehouse data and see firsthand how Process Advisor can illuminate opportunities for efficiency and growth. The journey towards optimizing your warehouse output begins with the Process Advisor.
1. SharePoint datasets and OneDrive
When I describe the SharePoint datasets in Microsoft Graph Data Connect to someone, I frequently get this question: do Sites and Sharing Permissions cover only SharePoint or do they include OneDrive? The short answer is that OneDrive is included, but there is much more to say here…
2. OneDrive is a type of SharePoint site
For most technical intents and purposes, a OneDrive in your Microsoft 365 tenant is a SharePoint site with a specific template and permissions. It is basically a SharePoint site collection for personal use that comes preconfigured with permissions for the owner and nobody else. After that, you can upload/create files and decide to keep them private or share with others from there.
This special type of site was initially called a “Personal Site”, later referred to as a “My Site” or “MySite”, and then a “OneDrive for Business” (commonly abbreviated to “ODfB” or simply “ODB”). These days, we usually just call it a OneDrive, and you can figure out whether we’re talking about the consumer or business variety based on context.
Along the way, the purpose has always been the same: to allow someone in a tenant to store information needed for their personal work, with the ability to share with others as necessary. As the name suggests, it’s your single drive in the cloud to store all your business-related personal files.
The personal sites for each user are typically created only when the user tries to access their OneDrive for the first time. SharePoint does offer administrators a mechanism to pre-provision accounts. You can read more about it at https://learn.microsoft.com/en-us/sharepoint/pre-provision-accounts.
But keep in mind that, when you use the Microsoft Graph Data Connect to pull the Sites dataset, you get all types of sites in the tenant and that does include OneDrives.
3. How can you tell them apart?
In the Sites dataset, you can tell a site is a OneDrive by looking at the RootWeb.WebTemplate (which is “SPSPERS” for OneDrive) or the RootWeb.WebTemplateId (which is 21 for OneDrive). Note that these are properties of the Root Web for the site (more on this later).
For the other Microsoft Graph Data Connect for SharePoint datasets, you can use the SiteId property to join with the Sites dataset and find the Template or Template Id. This is a reliable method and the recommended one.
Some of the datasets might also have a URL property which can be used to identify a OneDrive. For the Sharing Permissions dataset, for instance, an ItemURL that starts with “personal/” indicates a permission for a OneDrive. You can read more about OneDrive URLs at https://learn.microsoft.com/en-us/sharepoint/list-onedrive-urls.
Using the URL is probably OK for most tenants using OneDrive but might not work for other site types.
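As a quick illustration of both approaches, here is a minimal PowerShell sketch. It assumes you have already run a Microsoft Graph Data Connect pipeline and landed a slice of the Sites and Sharing Permissions datasets locally as JSON Lines files; the file names and the site identifier property (Id) are assumptions to adjust to your actual extract, while RootWeb.WebTemplateId, SiteId, and ItemURL are the dataset properties described above.

```powershell
# Assumed local extracts from Microsoft Graph Data Connect (one JSON object per line).
$sites = Get-Content -Path .\sites-dataset.json |
    ForEach-Object { $_ | ConvertFrom-Json }

# Recommended approach: a OneDrive is a site whose root web uses template SPSPERS (id 21).
$oneDrives = $sites | Where-Object { $_.RootWeb.WebTemplateId -eq 21 }
"Sites: $($sites.Count)  OneDrives: $($oneDrives.Count)"

# Join another dataset back to Sites on SiteId to keep only OneDrive rows.
$permissions = Get-Content -Path .\sharing-permissions-dataset.json |
    ForEach-Object { $_ | ConvertFrom-Json }
$oneDriveIds = $oneDrives.Id | Sort-Object -Unique     # 'Id' is an assumed property name
$oneDrivePermissions = $permissions | Where-Object { $oneDriveIds -contains $_.SiteId }

# URL heuristic (works for most tenants, but less reliable than the join above).
$byUrl = $permissions | Where-Object { $_.ItemURL -like 'personal/*' }
```

In practice you would do this filtering and joining in whatever engine processes your Data Connect output (Synapse, Spark, SQL, and so on), but the logic is the same.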
4. Root Web
It is good to clarify why the Template and TemplateId properties come from the RootWeb property and are not properties of the site itself.
For starters, it’s important to understand the main SharePoint entities:
There are many tenants.
Tenants have Sites, also known as Site Collections.
Sites (Site Collections) have Webs, also known as Subsites.
Webs (Subsites) have Lists, some of which are called libraries or document libraries.
Lists have List Items (document libraries have folders and documents).
As you can see, there is a hierarchy.
Image: The SharePoint entity hierarchy.
The relationship between Sites and Webs is particularly interesting. When you create a Site, you must tell SharePoint the type of Site you want. That is used to create the Site and the main Web inside, called the RootWeb.
Every Site Collection has at least one Web, and most have only one (the Root Web). The Site’s name and type (template) end up being stored in the Root Web. Most templates don’t even have an option to add more webs (subsites). I would recommend keeping things simple and having only one web per site.
Note: You will sometimes hear people refer to Webs as Sites, a term normally used for Site Collections. Since most Site Collections have only one Web, that is typically not a big issue, but it can get a little confusing at times, so you might want to stick to the unambiguous terms “Site Collections” and “Webs” to be extra clear.
5. Web Templates
When you create a Site Collection and its corresponding Root Web, you must choose a Web Template. Each Web Template comes with a few default lists and libraries.
Some of these Web Templates (like Team Sites and Communication Sites) help you get started with a new Site. Others are not meant to be created by end users but are used for specific scenarios (like the Compliance Policy Center, the Search Center or the Tenant Admin Site). As we mentioned before, one of these templates is the Personal Site or OneDrive.
Here’s a list of some common Web Templates used by SharePoint Online:
| Web Template Id | Web Template | Description |
| --- | --- | --- |
| 1 | STS | Classic Team Site |
| 16 | TENANTADMIN | Tenant Admin Site |
| 18 | APPCATALOG | App Catalog Site |
| 21 | SPSPERS | OneDrive (Personal Site) |
| 54 | SPSMSITEHOST | My Site Host |
| 56 | ENTERWIKI | Enterprise Wiki |
| 64 | GROUP | Office 365 group-connected Team Site |
| 68 | SITEPAGEPUBLISHING | Communication site |
| 69 | TEAMCHANNEL | Team Channel |
| 90 | SRCHCENTERLITE | Basic Search Center |
| 301 | REDIRECTSITE | Redirect Site |
| 3500 | POLICYCTR | Compliance Policy Center |
Note: There are many more of these templates, not only the ones listed above. You can get a list of the templates available to you using the Get-SPOWebTemplate PowerShell cmdlet:
Name : BICenterSite#0
Title : Business Intelligence Center
Name : BLANKINTERNETCONTAINER#0
Title : Publishing Portal
Name : COMMUNITY#0
Title : Community Site
Name : COMMUNITYPORTAL#0
Title : Community Portal
Name : DEV#0
Title : Developer Site
Name : EHS#1
Title : Team Site – SharePoint Online configuration
Name : ENTERWIKI#0
Title : Enterprise Wiki
Name : OFFILE#1
Title : Records Center
Name : PRODUCTCATALOG#0
Title : Product Catalog
Name : PROJECTSITE#0
Title : Project Site
Name : SITEPAGEPUBLISHING#0
Title : Communication site
Name : SRCHCEN#0
Title : Enterprise Search Center
Name : SRCHCENTERLITE#0
Title : Basic Search Center
Name : STS#0
Title : Team site (classic experience)
Name : STS#3
Title : Team site (no Microsoft 365 group)
Name : visprus#0
Title : Visio Process Repository
6. They are all in there…
So, I hope it’s clear that the Microsoft Graph Data Connect for SharePoint datasets (like Sites, Sharing Permissions and Groups) include information for all types of sites in the tenant, regardless of the Template they use. You can use the Sites dataset to understand Team Sites, OneDrives, and Communication Sites. The Sharing Permissions dataset includes permissions for all these different types of sites.
Artificial intelligence (AI) is transforming the world of work, creating new opportunities and challenges for businesses and workers alike. According to a recent report by Microsoft and PwC, AI could boost the UK economy by £232 billion by 2030, but it also requires a significant upskilling of the workforce to ensure that everyone can benefit from it.
If you are a technology student or a young professional who wants to develop AI skills and prepare for the future of work, here are some tips and resources that can help you. Also note that the Microsoft UK AI & Copilot Skills Challenge starts February 20, 2024, at 8:00 AM GMT and ends on March 31, 2024, at 11:00 PM GMT.
Learn the basics of AI and its applications. AI is a broad field that encompasses many subdomains, such as machine learning, computer vision, natural language processing, and more. To get started, you can take online courses, such as Microsoft Learn, edX, or Coursera, that cover the fundamentals of AI and how it can be used to solve real-world problems. You can also explore Microsoft Learn AI resources, learning paths, and hands-on labs for various AI scenarios and communities.
Get hands-on experience with AI tools and platforms. To apply your AI knowledge and skills, you need to familiarize yourself with the tools and platforms that enable you to build, deploy, and manage AI solutions. For example, you can use Azure AI Studio, a cloud-based service that provides a comprehensive set of AI capabilities, such as cognitive services, machine learning, and conversational AI. You can also use Power Platform, a low-code/no-code platform that allows you to create AI-powered apps, workflows, and chatbots without writing code.
Join AI communities and events. One of the best ways to learn and grow your AI skills is to connect with other AI enthusiasts and experts, who can offer you guidance, feedback, and inspiration. You can join online or local AI communities, such as the Global AI Community, where you can network, share ideas, and collaborate on projects. You can also attend AI events, where you can hear from industry leaders, discover the latest trends, and showcase your work.
Keep up with the ethical and social implications of AI. As AI becomes more pervasive and powerful, it also raises important ethical and social questions, such as how to ensure fairness, accountability, transparency, and human dignity in AI systems. To be a responsible AI practitioner, you need to be aware of these issues and how to address them in your work. You can read books, articles, and reports, such as The Future Computed, AI Ethics, or Responsible AI, that explore the ethical and social dimensions of AI. You can also take courses that teach you how to design and implement AI solutions that align with ethical principles and social values.
AI is a fast-growing and exciting field that offers many opportunities for technology students and professionals. By following these tips and resources, you can develop AI skills that will help you succeed in the future of work. Remember, AI is not only about technology, but also about people, society, and the world. So, be curious, be creative, and be ethical, and you will be ready to make a positive impact with AI.
Learn and develop essential AI and Copilot skills with the UK AI Skills Challenge
Get ahead with immersive and curated AI, Generative AI and Copilot training content across Microsoft products and services with four engaging themed challenges. Once you complete a challenge, you will receive a Microsoft UK AI & Copilot Skills Challenge badge of completion. For more info refer to the official rules.
As you progress through the challenges, you’ll have the chance to explore additional experiences tailored to your learning preferences and goals. Join the vibrant technical community in your local region, attend live sessions, build a powerful network, and develop in-demand AI skills for today’s job market.
Generative AI
This challenge focuses on understanding Generative AI and Large Language Models. Discover the fundamentals of generative AI and get started with Azure OpenAI Service. You’ll learn more about prompt engineering, generating code with Azure OpenAI Service, large language models, and prompt flow to develop large language model apps.
This challenge is tailored for IT Pro Administrators seeking to leverage Copilot for Microsoft 365 effectively in their work environments. The series of modules covers a range of topics from basic introductions to advanced management techniques, ensuring a comprehensive learning experience.
This challenge is tailored for developers who want to learn how to build apps for Microsoft Teams and get to know Microsoft Copilot Studio. It includes a series of modules that will give you practical experience and valuable knowledge about creating, launching, and improving apps on these platforms.
Machine learning is at the core of artificial intelligence, and many modern services depend on predictive machine learning models. Learn how to use Azure Machine Learning to create and publish models without writing code. You’ll also explore the various developer tools you can use to interact with the workspace.
We are pleased to announce the security review for Microsoft Edge, version 122!
We have reviewed the new settings in Microsoft Edge version 122 and determined that there are no additional security settings that require enforcement. The Microsoft Edge version 117 security baseline continues to be our recommended configuration, which can be downloaded from the Microsoft Security Compliance Toolkit.
Microsoft Edge version 122 introduced 4 new computer settings and 4 new user settings. We have included a spreadsheet listing the new settings in the release to make it easier for you to find them.
As a friendly reminder, all available settings for Microsoft Edge are documented here, and all available settings for Microsoft Edge Update are documented here.
Welcome to the first edition of What’s new in Copilot for Microsoft 365. We are continuing to enhance Copilot to provide deeper experiences for users and tighter integration with your organization’s data to unlock even more capabilities. Whether you’re a Microsoft 365 admin for a large enterprise or smaller company or someone who uses Copilot for Microsoft 365 for their daily work, every month we’ll highlight updates to let you know about new and upcoming features and where you can find more information to help make your Copilot experience a great one. In addition to these monthly posts, we’ll continue to provide updates through our usual message center posts and on our public roadmap.
Today, we are highlighting Copilot support in 17 additional languages, expanded resources and coming features in Copilot Lab, the updated Copilot experience in Teams, Copilot in the Microsoft 365 mobile app, and a new feature that provides a single entry point to help you create content from scratch. We’ll also take a look at updates to Copilot in OneDrive, Stream, and Forms plus a new feature that generates content summaries when you share files with coworkers. Finally, we’ll share a bit on what’s new in the Copilot for Microsoft 365 Usage report for admins. Let’s take a closer look at what’s new this month:
Experience Copilot support for more languages
Begin your Copilot journey and build new skills with Copilot Lab
Copilot now available in the Microsoft 365 mobile app
Introducing Copilot in Forms
Extract information quickly from your files with Copilot in OneDrive
Include quick summaries when sharing documents
Get instant video summaries and insights with Copilot in Stream
Try new ways of working with Help me create
Draft emails quicker and get coaching tips for your messages with Copilot in classic Outlook for Windows
Experience the new Copilot in Microsoft Teams
Check out the improved usage reports for Microsoft Copilot in the admin center
Catch up on the Copilot for Microsoft 365 Tech Accelerator
Experience Copilot support for more languages
We are adding support for an additional 17 languages, further expanding access to Copilot worldwide. We will start rolling out Arabic, Chinese Traditional, Czech, Danish, Dutch, Finnish, Hebrew, Hungarian, Korean, Norwegian, Polish, Portuguese (Portugal), Russian, Swedish, Thai, Turkish and Ukrainian over March and April. Copilot is already supported in the following languages: English (US, GB, AU, CA, IN), Spanish (ES, MX), Japanese, French (FR, CA), German, Portuguese (BR), Italian, and Chinese Simplified. Check the public roadmap and message center to track roll out status.
Copilot in Excel (preview) is currently supported in English (US, GB, AU, CA, IN) and will be supported in Spanish (ES, MX), Japanese, French (FR, CA), German, Portuguese (BR), Italian, and Chinese Simplified starting in March.
Begin your Copilot journey and build new skills with Copilot Lab
Copilot Lab helps users get started with the art of prompting and helps organizations with onboarding and adoption by providing a single experience that meets Copilot users where they are in their journey. Today, we’re expanding Copilot Lab by transforming the current prompts library into a comprehensive learning resource that helps everyone begin their Copilot journey with confidence and to take greater advantage of Copilot in their daily work.
Start your Copilot journey with ease. We’ve learned from our earliest Copilot adopters that working with generative AI requires new skills and habits. Copilot Lab already shows up in Copilot for Microsoft 365, Word, PowerPoint, Excel, and OneNote via the small notebook icon that suggests relevant prompts to inspire you. Now, we have consolidated our best resources, training videos, ready-made prompts, and inspiration to make Copilot Lab the single resource to help you get started. To do this, we’ve brought together our own internal best practices, insights from our earliest customers, findings from the Microsoft Research team, and thought leadership published on WorkLab.
Achieve more together by sharing your favorite prompts. With Copilot Lab, we are making it even easier to create, save, and share your favorite prompts with colleagues inside your organization. Now you can share prompts with colleagues to prepare for a customer meeting or to generate ideas for a new product launch. And leaders across your organization can showcase how they’re using Copilot by sharing their favorite prompts to save time or tackle any task at hand, to help improve personal and team productivity and encourage community-centric learning and adoption. This feature is integrated into the Copilot Lab website and in-app experiences will begin rolling out by this summer.
You can access Copilot Lab today at copilot.cloud.microsoft/prompts or directly in app by selecting the notebook icon next to the Copilot prompt window.
Copilot now available in the Microsoft 365 mobile app
We’re extending Copilot to the Microsoft 365 mobile app and to the Word and PowerPoint mobile apps. With the new Microsoft 365 app look and feel, you can easily find Copilot alongside your content, apps, and shortcuts. You can use it to:
Bring your content into Copilot to complete tasks on the go. Summarize documents, translate, explain, or ask questions, and have your answer grounded in the content you select.
Start generating content wherever you work based on your ideas and existing information, and hand over to Microsoft 365 mobile apps to continue working.
Interact with Copilot in Word mobile and PowerPoint mobile to comprehend content better and skim through only the most important slides on the go (requires a Copilot license).
The Microsoft 365 mobile app complements the Copilot mobile app rolled out earlier this month, and licensed users can continue to use the Copilot mobile app to have responses grounded in both web or work data. IT admins can easily deploy both the Microsoft 365 mobile app and the Copilot mobile app to corporate devices using Microsoft Intune or a third-party tool, or users can simply download the Microsoft 365 mobile app on any supported device and sign in.
Copilot integration in the Microsoft 365 mobile app and the Word and PowerPoint mobile apps is rolling out now. You can learn more here.
The iOS layout of the Microsoft 365 mobile app, showing Copilot available on the taskbar.
Create compelling surveys, polls, and forms with Copilot in Forms
Use Copilot to simplify the process of creating surveys, polls, and forms, saving you time and effort. Go to forms.microsoft.com, select New, and tell Copilot your topic, length, and any additional context. Copilot will provide relevant questions and suggestions, and then you can refine the draft by adding extra details, editing text, or removing content. Once you’ve created a solid draft with Copilot, you can then customize the background with one of the many Forms style options. With Copilot in Forms, you’ll effortlessly create well-crafted forms that capture your audience’s attention, leading to better response rates.
An image of a form draft with Copilot prompts displayed
Extract information quickly from your files with Copilot in OneDrive
Copilot in OneDrive gives you instant access to information contained deep within your files. Initially available from the OneDrive web experience, Copilot will provide you with smart and intuitive ways to interact with your documents, presentations, spreadsheets, and files. You can use Copilot in OneDrive to:
Get information from your files: Ask questions about your content using natural language, and Copilot will fetch the information from your files, saving you the work and time of manually searching for what you need.
Generate file summaries: Need a quick overview of a file? Copilot can summarize the contents of one or multiple files, offering you quick insights without having to even open the file.
Find files using natural language: Find files in new ways by using Copilot prompts such as “Show me all the files shared with me in the past week” or “Show files that Kat Larson has commented in.”
Video showing Copilot in OneDrive with a prompt to extract information from a collection of resumes.
Include quick summaries when sharing documents
Add Copilot-generated summaries when you share documents with your colleagues. These summaries, included in the document sharing notification, give your recipients immediate context around a document and a quick overview of its content without needing to open the file. Sharing summaries helps users prioritize work, increases engagement, and reduces cognitive burden.
Sharing summaries will be available in March 2024, starting when sharing a Word document from the web, with support in the desktop client and the mobile app later this year. Learn more here.
GIF showing AI-generated sharing summary when sharing a Microsoft Word doc.
Get instant video summaries and insights with Copilot in Stream
By using Copilot in Microsoft Stream, you can quickly get the information you need about videos in your organization, whether you’re viewing the latest Teams meeting recording, town hall, product demo, how-to, or onsite videos from frontline workers. Copilot helps you get what you need from your videos in seconds. You can use it to:
Summarize any video and identify relevant points you need to watch
Ask questions to get insights from long or detailed videos
Locate when people, teams, or topics are discussed so you can jump to that point in the video
Identify calls to action and where you can get involved to help
Copilot in Stream can quickly summarize a video or answer your questions about the content in the video. Screenshot showing Copilot in Microsoft Stream.
Try new ways of working with Help me create
In March, we’re rolling out a new Copilot capability in the Microsoft 365 web app that helps you focus on the substance of your content while Copilot suggests the best format: a white paper, a presentation, a list, an icebreaker quiz, and so on. In the Microsoft 365 app at microsoft365.com, simply tell Help me create what you want to work on and it will suggest the best app for you and give you a boost with generative AI suggestions. Learn more here.
Help me create dialog box in the foreground, with the Microsoft 365 web app create screen in the background.
Draft emails quicker and get coaching tips for your messages with Copilot in classic Outlook for Windows
Customers of the new Outlook for Windows have been enjoying Copilot features like draft, coaching, and summary which we announced last year. Since November last year, summary by Copilot has also been available in classic Outlook for Windows. Soon, draft and coaching will be coming to classic Outlook too.
Draft with Copilot helps you reduce time spent on email by drafting new emails or responses for you with just a short prompt that explains what you want to communicate. Because you are always in control with Copilot, you can choose to adjust the proposed draft in length and tone or ask Copilot to generate a new message – and you can always go back to the previous options if you prefer.
Coaching by Copilot can help you get your point across in the best possible way, coaching you on tone (for example, too aggressive, too formal, and so on), reader sentiment (how a reader might perceive your message), and clarity. Copilot can provide coaching for drafts it created or drafts you wrote yourself.
An image of a message composed in the classic Outlook for Windows with the Copilot icon being clicked to reveal options for draft and coaching.
Experience the new Copilot in Microsoft Teams
We have recently enabled a new Copilot experience in Microsoft Teams that offers better prompts, easier access, and more functionality than the previous version. Copilot in Teams will be automatically pinned above your chats, and you can use it to catch up, create, and ask anything related to Microsoft 365. Learn more about the new Copilot experience in Teams here.
An image of the Copilot experience in Microsoft Teams, responding to a question based on the user’s Graph data
Check out the improved usage reports for Microsoft Copilot in the admin center
The Microsoft 365 admin center Usage reports offer a growing set of usage insights across your Microsoft 365 cloud services. Among these reports, the Copilot for Microsoft 365 Usage report (Preview) is built to help Microsoft 365 admins plan for rollout, inform adoption strategy, and make license allocation decisions.
The report now includes usage metrics for Microsoft Copilot with Graph-grounded chat. This allows you to see how Chat compares with usage of Copilot in other apps like Teams, Outlook, Word, PowerPoint, Excel, OneNote and Loop. You can review the enabled and active user time series chart to assess how usage is trending over time. The new metric has been added retroactively dating back to late November of 2023. To access the report, navigate to Reports > Usage and select the Copilot for Microsoft 365 product report. Learn more here.
An image of the Copilot for Microsoft 365 Usage report highlighting the addition of a new metric for Microsoft Copilot with Graph-grounded chat
Learn more about the use of Copilot for Microsoft 365 in the Financial Services Industry
Today we are releasing the new white paper for the financial services industry (FSI) with information about use cases and benefits for the FSI, information about risks and regulations, guidance for managing and governing a generative AI solution, and more information about how to prepare for Copilot. Read the paper here.
Catch up on the Copilot for Microsoft 365 Tech Accelerator
In case you missed it, you can catch up on all the sessions from the Copilot for Microsoft 365 Tech Accelerator via recordings on the event page. The event covered a range of topics including how Copilot works, how to prepare your organization for Copilot, strategies for deploying, driving adoption, and measuring impact, and deep dives on how to extend Copilot with Copilot Studio and Graph connectors. Chat Q&A is open through Friday, March 1, 12:00 P.M. PT, so watch the recordings and get any questions you might have answered.
Did you know? The Microsoft 365 Roadmap is where you can get the latest updates on productivity apps and intelligent cloud services. Check out what features are in development or coming soon on the Microsoft 365 Roadmap. All future rollout dates assume the feature availability on the Current Channel. Customers should expect these features to be available on the Monthly Enterprise Channel the second Tuesday of the upcoming month.
The finance department is the heart of the organization, juggling a myriad of critical, yet complex tasks—from quote-to-cash processes like credit and collections to risk management and compliance. Financial teams are not only responsible for these mandatory, labor-intensive operations, but are increasingly tasked with real-time insights into business performance and recommendations for future growth initiatives. In fact, 80% of finance leaders and teams face challenges in taking on more strategic work beyond the operational portions of their roles.¹ On the one hand, teams are poised and ready to play a larger role in driving business growth strategy. On the other hand, however, they can’t walk away from maintaining a critical and mandatory set of responsibilities.
Microsoft is introducing a solution to help finance teams reclaim time and stay on top of the critical decisions that can impact business performance. Microsoft Copilot for Finance is a new Copilot experience for Microsoft 365 that unlocks AI-assisted competencies for financial professionals, right from within productivity applications they use every day. Now available in public preview, Copilot for Finance connects to the organization’s financial systems, including Dynamics 365 and SAP, to provide role-specific workflow automation, guided actions, and recommendations in Microsoft Outlook, Excel, Microsoft Teams and other Microsoft 365 applications—helping to save time and focus on what truly matters: navigating the company to success.
Copilot for Finance
By harnessing AI, it automates time-consuming tasks, allowing you to focus on what truly matters.
Leveraging innovation to accelerate fiscal stewardship
Finance teams play a critical role in innovating processes to improve efficiency across the organization. As teams look to evolve and improve how time is spent to support more strategic work, it’s evident there are elements of operational tasks that are mundane, repetitive, and manually intensive. Instead of spending the majority of their day on analysis or cross-team collaboration, 62% of finance professionals are stuck in the drudgery of data entry and review cycles.² While some of these tasks are critical and can’t be automated—like compliance and tax reporting—we also hear from the majority of finance leaders that they lack the automation tools and technology they need to transform these processes and free up time.¹
With the pace of business accelerating every day, becoming a disruptor requires investing in technology that will drive innovation and support the bottom line. In the next three to five years, 68% of CFOs anticipate revenue growth from generative AI (GenAI).³ By implementing next-generation AI to deliver insight and automate costly and time-intensive operational tasks, teams can reinvest that time to accelerate their impact as financial stewards and strategists.
Microsoft Copilot for Finance: Accomplish more with less
Copilot for Finance provides AI-powered assistance while working in Microsoft 365 applications, making financial processes more streamlined and automated. Copilot for Finance can streamline audits by pulling and reconciling data with a simple prompt, simplify collections by automating communication and payment plans, and accelerate financial reporting by detecting variances with ease. The potential time and cost savings are substantial, transforming not just how financial professionals work, but how they drive impact within the organization.
Users can interact with Copilot for Finance in multiple ways. It both suggests actions in the flow of work, and enables users to ask questions by typing a prompt in natural language. For example, a user can prompt Copilot to “help me understand forecast to actuals variance data.” In moments, Copilot for Finance will generate insights and pull data directly from across the ERP and financial systems, suggesting actions to take and providing a head start by generating contextualized text and attaching relevant files. Like other copilot experiences, users can easily check source data to ensure transparency before using Copilot to take any actions.
Copilot for Finance connects to existing financial systems, including Dynamics 365 and SAP, as well as thousands more with Microsoft Copilot Studio. With the ability to both pull insight from and update actions back to existing sources, Copilot for Finance empowers users to stay in the flow of work and complete tasks more efficiently.
Built for finance professionals
Copilot for Finance is well versed in the critical and often time-consuming tasks and processes across a finance professional’s workday, providing a simple way to ask questions about data, surface insights, and automate processes—helping to reduce the time spent on repetitive actions. While today’s modern finance team is responsible for a litany of tasks, let’s explore three scenarios that Copilot for Finance supports at public preview.
Audits of a company’s financial statements are critical to ensuring accuracy and mitigating risk. Traditionally, accounts receivable managers were required to pull account data from ERP records, reconcile it in Excel, and look for inaccuracies manually. With Copilot for Finance, these critical steps are done with a single prompt, allowing AR managers to act on inconsistencies and any delinquencies found, with Copilot-suggested copy and relevant invoices.
“Finance organizations need to be utilizing generative AI to help blend structured and unstructured datasets. Copilot for Finance is a solution that aggressively targets this challenge. Microsoft continues to push the boundary of business applications by providing AI-driven solutions for common business problems. Copilot for Finance is another powerful example of this effort. Copilot for Finance has potential to help finance professionals at organizations of all sizes accelerate impact and possibly even reduce financial operation costs.”
—Kevin Permenter, IDC research director, financial applications
The collections process is another critical responsibility as it affects company cash flow, profitability, and customer relationships. Collection coordinators spend their time reviewing outstanding accounts and attempting to reconcile them in a timely manner. This often means phone calls, emails, and negotiating payment plans. With Copilot for Finance, collection coordinators can focus their time on more meaningful client-facing interactions by leaving the busy work to Copilot. Copilot for Finance supports the collections process end-to-end by suggesting priority accounts, summarizing conversations to record back to ERP, and providing customized payment plans for customers.
Copilot for Finance can also help financial analysts to reduce the risk of reporting errors and missing unidentified variances. Rather than manually reviewing large financial data sets for unusual patterns, users can prompt Copilot to detect outliers and highlight variances for investigation. Copilot for Finance streamlines variance identification with reusable natural language instructions in the enterprise context. A financial analyst can direct Copilot to identify answers for variances, and Copilot will gather supporting data autonomously.
Copilot will suggest contacts with relevant financial context and provide automatic summaries for streamlined tracking of action items and follow-ups. Copilot for Finance can generate fine-tuned financial commentary, PowerPoint presentations, and emails to report to key stakeholders.
Our journey with Microsoft Finance
Microsoft employs thousands across its finance team to manage and drive countless processes and systems as well as identify opportunities for company growth and strategy. Who better to pilot the latest innovation in finance? For the first phase, we worked closely with a Treasury team focused on accounts receivable as well as a team in financial planning and analysis—who need to reconcile data as a part of their workflow before conducting further analysis. After trialing the data reconciliation capabilities in Copilot for Finance, the initial value and potential for scale for these teams were clear.
“Financial analysts today spend, on average, one to two hours reconciling data per week. With Copilot for Finance, that is down to 10 minutes. Functionality like data reconciliation will be a huge time saver for an organization as complex as Microsoft.”
—Sarper Baysal, Microsoft Commercial Revenue Planning Lead
“The accounts receivable reconciliation capabilities help us to eliminate the time it takes to compare data across sources, saving an average 20 minutes per account. Based on pilot usage, this translates to an average of 22% cost savings in average handling time.”
—Gladys Jin, Senior Director Microsoft Finance Global Treasury and Financial Services
Microsoft Copilot for Finance availability
Ready to take the next step? Microsoft Copilot for Finance is available for public preview today. Explore the public preview demo and stay tuned for additional announcements by following us on social.
This article is contributed. See the original author and article here.
Configuration analyzer in Microsoft Defender for Office 365 helps you find and fix security policies that are less secure than the recommended settings. It allows you to compare your current policies with the standard or strict preset policies, lets you apply recommendations to improve your security posture, and view historical changes to your policies.
We are excited to announce several updates to Configuration analyzer. This update includes:
New recommendations covering more scenarios.
New flyout which adds more context around the recommendations.
New export button which lets you easily export recommendations to share with your partners.
Clicking on a recommendation will now open a flyout with brief details about why we are making the recommendation, as well as targeted links to documentation where you can learn more.
Exporting the Recommendations:
A new Export button should appear when you select one or multiple recommendations. Clicking on the Export button will download the selected recommendations as a CSV file which can be shared with your external partners who might not have access to your environment.
If you have other questions or feedback about Microsoft Defender for Office 365, engage with the community and Microsoft experts in the Defender for Office 365 forum.
This article is contributed. See the original author and article here.
Azure HDInsight Spark 5.0 to HDI 5.1 Migration
HDInsight 5.1 has been released with Spark 3.3.1. This release improves join query performance via Bloom filters, increases the Pandas API coverage with support for popular Pandas features such as datetime.timedelta and merge_asof, and simplifies migration from traditional data warehouses by improving ANSI compliance and supporting dozens of new built-in functions.
In this article, we discuss the migration of user applications from HDInsight 5.0 (Spark 3.1) to HDInsight 5.1 (Spark 3.3). The sections cover:
1. Changes that can be handled with minor changes (configuration backports)
2. Changes in Spark that require application changes
Application Changes with Backport
The changes below are part of the HDI 5.1 release. If these functions are used in your applications, the steps given can be taken to avoid changes in application code.
Since Spark 3.3, the histogram_numeric function in Spark SQL returns an output type of an array of structs (x, y), where the type of the ‘x’ field in the return value is propagated from the input values consumed by the aggregate function. In Spark 3.2 or earlier, ‘x’ always had double type. Optionally, use the configuration spark.sql.legacy.histogramNumericPropagateInputType (available since Spark 3.3) to revert to the previous behavior.
Spark 3.1 (pyspark)
Spark 3.3:
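As a minimal sketch of the difference (assuming a SparkSession named spark and a hypothetical table sales with an integer column amount; the flag value below is an assumption based on the configuration name):

df = spark.sql("SELECT histogram_numeric(amount, 5) AS hist FROM sales")
df.printSchema()
# Spark 3.3: hist is array<struct<x:int, y:double>>, i.e. 'x' follows the input type
# Spark 3.2 or earlier: 'x' is always double
# Revert to the previous behavior (assumed value: false disables input-type propagation):
spark.conf.set("spark.sql.legacy.histogramNumericPropagateInputType", "false")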
In Spark 3.3, a timestamp subtraction expression such as timestamp'2021-03-31 23:48:00' - timestamp'2021-01-01 00:00:00' returns values of DayTimeIntervalType. In Spark 3.1 and earlier, the type of the same expression is CalendarIntervalType. To restore the behavior before Spark 3.3, you can set spark.sql.legacy.interval.enabled to true.
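A quick way to observe the type change (a sketch, assuming a SparkSession named spark):

diff = spark.sql("SELECT timestamp'2021-03-31 23:48:00' - timestamp'2021-01-01 00:00:00' AS diff")
diff.printSchema()
# Spark 3.3: diff is "interval day to second" (DayTimeIntervalType)
# Spark 3.1: diff is "interval" (CalendarIntervalType)
spark.conf.set("spark.sql.legacy.interval.enabled", "true")   # restore the pre-3.3 type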
Since Spark 3.3, the functions lpad and rpad have been overloaded to support byte sequences. When the first argument is a byte sequence, the optional padding pattern must also be a byte sequence and the result is a BINARY value. The default padding pattern in this case is the zero byte. To restore the legacy behavior of always returning string types, set spark.sql.legacy.lpadRpadAlwaysReturnString to true.
> SELECT hex(lpad(x'1020', 5, x'05'));
0505051020
> SELECT hex(rpad(x'1020', 5, x'05'));
1020050505
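If applications depend on lpad/rpad always returning strings, a hedged sketch of the fallback looks like this (assuming a SparkSession named spark):

spark.conf.set("spark.sql.legacy.lpadRpadAlwaysReturnString", "true")
spark.sql("SELECT lpad(x'1020', 5, x'05') AS padded").printSchema()
# With the legacy flag set, the result column is string rather than binary.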
Since Spark 3.3, Spark turns a non-nullable schema into nullable for API DataFrameReader.schema(schema: StructType).json(jsonDataset: Dataset[String]) and DataFrameReader.schema(schema: StructType).csv(csvDataset: Dataset[String]) when the schema is specified by the user and contains non-nullable fields. To restore the legacy behavior of respecting the nullability, set spark.sql.legacy.respectNullabilityInTextDatasetConversion to true.
Since Spark 3.3, nulls are written as empty strings in the CSV data source by default. In Spark 3.2 or earlier, nulls were written as quoted empty strings, "". To restore the previous behavior, set nullValue to "", or set the configuration spark.sql.legacy.nullValueWrittenAsQuotedEmptyStringCsv to true.
Sample Data:
Spark 3.1:
Spark 3.3:
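A minimal sketch of the behavior, assuming a SparkSession named spark and a hypothetical output path /tmp/null_csv_demo:

df = spark.createDataFrame([("a", None)], "c1 string, c2 string")
df.write.mode("overwrite").csv("/tmp/null_csv_demo")
# Spark 3.2 and earlier write the null in c2 as a quoted empty string: a,""
# Spark 3.3 writes it as an empty (unquoted) string: a,
# Restore the previous behavior:
spark.conf.set("spark.sql.legacy.nullValueWrittenAsQuotedEmptyStringCsv", "true")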
Since Spark 3.3, Spark will try to use built-in data source writer instead of Hive serde in INSERT OVERWRITE DIRECTORY. This behavior is effective only if spark.sql.hive.convertMetastoreParquet or spark.sql.hive.convertMetastoreOrc is enabled respectively for Parquet and ORC formats. To restore the behavior before Spark 3.3, you can set spark.sql.hive.convertMetastoreInsertDir to false.
Spark logs:
INFO ParquetOutputFormat [Executor task launch worker for task 0.0 in stage 0.0 (TID 0)]: ParquetRecordWriter [block size: 134217728b, row group padding size: 8388608b, validating: false]
INFO ParquetWriteSupport [Executor task launch worker for task 0.0 in stage 0.0 (TID 0)]: Initialized Parquet WriteSupport with Catalyst schema:
{ "type" : "struct", "fields" : [ { "name" : "fname", "type" : "string", "nullable" : true, "metadata" : { } }, {
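For reference, a hedged sketch of a statement that produces logs like the above (hypothetical path /tmp/iod_demo; assumes a SparkSession named spark with Hive support):

spark.sql("""
    INSERT OVERWRITE DIRECTORY '/tmp/iod_demo'
    STORED AS PARQUET
    SELECT 'john' AS fname
""")
# Spark 3.3 uses the built-in Parquet writer (ParquetOutputFormat above);
# fall back to the Hive serde writer with:
spark.conf.set("spark.sql.hive.convertMetastoreInsertDir", "false")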
Since Spark 3.3.1 and 3.2.3, for SELECT … GROUP BY a GROUPING SETS (b)-style SQL statements, grouping__id returns different values from Apache Spark 3.2.0, 3.2.1, 3.2.2, and 3.3.0. It is computed based on the user-given group-by expressions plus the grouping set columns. To restore the behavior before 3.3.1 and 3.2.3, you can set spark.sql.legacy.groupingIdWithAppendedUserGroupBy to true.
In Spark 3.3, spark.sql.adaptive.enabled is enabled by default. To restore the behavior before Spark 3.3, you can set spark.sql.adaptive.enabled to false.
In Spark 3.1, AQE is disabled by default.
In Spark 3.3, AQE is enabled by default.
Adaptive Query Execution (AQE) is an optimization technique in Spark SQL that makes use of runtime statistics to choose the most efficient query execution plan; it is enabled by default since Apache Spark 3.3.0. Spark SQL can turn AQE on and off via spark.sql.adaptive.enabled as an umbrella configuration. As of Spark 3.0, there are three major features in AQE: coalescing post-shuffle partitions, converting sort-merge join to broadcast join, and skew join optimization.
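Checking and toggling AQE is straightforward (a sketch, assuming a SparkSession named spark):

print(spark.conf.get("spark.sql.adaptive.enabled"))    # "true" by default on Spark 3.3, "false" on 3.1
spark.conf.set("spark.sql.adaptive.enabled", "false")  # restore the Spark 3.1 default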
In Spark 3.3, the output schema of SHOW TABLES becomes namespace: string, tableName: string, isTemporary: boolean. In Spark 3.1 or earlier, the namespace field was named database for the builtin catalog, and there is no isTemporary field for v2 catalogs. To restore the old schema with the builtin catalog, you can set spark.sql.legacy.keepCommandOutputSchema to true.
In Spark 3.1, the field is named database:
In Spark 3.3, the field is named namespace:
We can restore the old behavior by setting the property shown below.
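A minimal sketch (assuming a SparkSession named spark):

spark.sql("SHOW TABLES").printSchema()
# Spark 3.3: namespace, tableName, isTemporary
# Spark 3.1: database, tableName, isTemporary
spark.conf.set("spark.sql.legacy.keepCommandOutputSchema", "true")   # restore the old schema for the built-in catalog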
In Spark 3.3, the output schema of SHOW TABLE EXTENDED becomes namespace: string, tableName: string, isTemporary: boolean, information: string. In Spark 3.1 or earlier, the namespace field was named database for the builtin catalog, and no change for the v2 catalogs. To restore the old schema with the builtin catalog, you can set spark.sql.legacy.keepCommandOutputSchema to true.
The spark-sql shell shows similar details for both the Spark 3.1 and Spark 3.3 versions.
In Spark 3.1, the field is named database:
In Spark 3.3, the field is named namespace:
We can restore the old behavior by setting the property shown below.
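For example (assuming a SparkSession named spark):

spark.sql("SHOW TABLE EXTENDED LIKE '*'").printSchema()
# Spark 3.3: namespace, tableName, isTemporary, information
# Spark 3.1: database, tableName, isTemporary, information
spark.conf.set("spark.sql.legacy.keepCommandOutputSchema", "true")   # restore the old schema for the built-in catalog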
In Spark 3.3, CREATE TABLE AS SELECT with non-empty LOCATION will throw AnalysisException. To restore the behavior before Spark 3.2, you can set spark.sql.legacy.allowNonEmptyLocationInCTAS to true.
In Spark 3.3, after setting spark.sql.legacy.allowNonEmptyLocationInCTAS to true, we are able to run CTAS with a non-empty location, as shown below.
In Spark 3.1, we are able to create such tables without the above property change.
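A hedged sketch, assuming a SparkSession named spark and a hypothetical, already-populated location /tmp/ctas_demo:

spark.conf.set("spark.sql.legacy.allowNonEmptyLocationInCTAS", "true")
spark.sql("""
    CREATE TABLE ctas_demo
    LOCATION '/tmp/ctas_demo'
    AS SELECT 1 AS id
""")
# Without the flag, Spark 3.3 raises AnalysisException because the location is not empty.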
In Spark 3.3, special datetime values such as epoch, today, yesterday, tomorrow, and now are supported in typed literals or in cast of foldable strings only, for instance, select timestamp’now’ or select cast(‘today’ as date). In Spark 3.1 and 3.0, such special values are supported in any casts of strings to dates/timestamps. To keep these special values as dates/timestamps in Spark 3.1 and 3.0, you should replace them manually, e.g. if (c in (‘now’, ‘today’), current_date(), cast(c as date)).
In Spark 3.3 and 3.1, the code below works exactly the same.
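For example (assuming a SparkSession named spark):

spark.sql("SELECT timestamp'now' AS ts, cast('today' AS date) AS d").show()
# Typed literals and casts of foldable strings work in both Spark 3.1 and 3.3.
# Casting a string column that contains 'today' to date only works in 3.1/3.0;
# in 3.3 replace it manually, e.g. if(c IN ('now', 'today'), current_date(), cast(c AS date)).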
Application Changes Expected
There are some changes in Spark functions between HDI 5.0 and 5.1. Whether application changes are needed depends on whether the applications use the functionality and APIs below.
Since Spark 3.3, DESCRIBE FUNCTION fails if the function does not exist. In Spark 3.2 or earlier, DESCRIBE FUNCTION can still run and print “Function: func_name not found”.
Spark 3.1:
Spark 3.3:
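A minimal sketch of the difference (assuming a SparkSession named spark and that no function named no_such_func exists):

spark.sql("DESCRIBE FUNCTION no_such_func").show(truncate=False)
# Spark 3.1: returns a row such as "Function: no_such_func not found."
# Spark 3.3: raises an AnalysisException because the function does not exist.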
Since Spark 3.3, DROP FUNCTION fails if the function name matches one of the built-in functions’ name and is not qualified. In Spark 3.2 or earlier, DROP FUNCTION can still drop a persistent function even if the name is not qualified and is the same as a built-in function’s name.
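For example (a sketch, assuming a SparkSession named spark):

# Spark 3.3: the following fails because the unqualified name matches a built-in function:
# spark.sql("DROP FUNCTION IF EXISTS abs")
# Spark 3.2 and earlier: the same statement drops a persistent function named "abs" if one exists.
spark.sql("DROP FUNCTION IF EXISTS default.abs")   # a qualified name is still accepted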
Since Spark 3.3, when reading values from a JSON attribute defined as FloatType or DoubleType, the strings “+Infinity”, “+INF”, and “-INF” are now parsed to the appropriate values, in addition to the already supported “Infinity” and “-Infinity” variations. This change was made to improve consistency with Jackson’s parsing of the unquoted versions of these values. Also, the allowNonNumericNumbers option is now respected so these strings will now be considered invalid if this option is disabled.
Spark 3.3:
Spark 3.1:
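A hedged sketch of the parsing difference (assuming a SparkSession named spark):

from pyspark.sql.types import StructType, StructField, DoubleType

schema = StructType([StructField("v", DoubleType())])
rows = ['{"v": "+INF"}', '{"v": "-INF"}', '{"v": "+Infinity"}']
spark.read.schema(schema).json(spark.sparkContext.parallelize(rows)).show()
# Spark 3.3: the quoted variants parse to Infinity / -Infinity
# Spark 3.1: these variants come back as null
# With .option("allowNonNumericNumbers", "false"), Spark 3.3 treats them as invalid.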
Spark 3.3 introduced error-handling functions such as the following:
TRY_SUBTRACT – behaves like the "-" operator but returns null in case of an error.
TRY_MULTIPLY – is a safe representation of the "*" operator.
TRY_SUM – is an error-handling implementation of the sum operation.
TRY_AVG – is an error-handling implementation of the average operation.
TRY_TO_BINARY – converts an input value to a binary value, returning null instead of raising an error if the conversion fails.
Example of ‘try_to_binary’ function:
When a correct value is given for base64 decoding:
When a wrong value is given for base64 decoding, it doesn’t throw an error; it returns null.
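A hedged sketch (assuming a SparkSession named spark; the input strings are arbitrary examples):

spark.sql("SELECT try_to_binary('U3Bhcmsgc3Fs', 'base64') AS ok").show()      # valid base64 decodes to a binary value
spark.sql("SELECT try_to_binary('!!not base64!!', 'base64') AS bad").show()   # invalid input returns null instead of raising an error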
Since Spark 3.3, ADD FILE/JAR/ARCHIVE commands require each path to be enclosed by ” or ‘ if the path contains whitespaces.
In Spark 3.3:
In Spark 3.1, adding multiple JARs in a single command does not work; only one JAR can be added at a time.
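For example, with hypothetical jar paths (assuming a SparkSession named spark):

spark.sql('ADD JAR "/tmp/jars/my udfs.jar"')   # Spark 3.3: paths containing whitespace must be quoted with " or '
spark.sql("ADD JAR /tmp/jars/plain.jar")       # unquoted paths still work when there is no whitespace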
In Spark 3.3, the following meta-characters are escaped in the show() action. In Spark 3.1 or earlier, these meta-characters are output as-is.
\n (new line)
\r (carriage return)
\t (horizontal tab)
\f (form feed)
\b (backspace)
\u000B (vertical tab)
\u0007 (bell)
In Spark 3.3, these meta-characters are escaped in the show() action.
In Spark 3.1, the meta-characters are interpreted according to their defined functions.
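A minimal sketch (assuming a SparkSession named spark):

df = spark.createDataFrame([("line1\nline2\ttabbed",)], "value string")
df.show(truncate=False)
# Spark 3.3 prints the escaped form: line1\nline2\ttabbed
# Spark 3.1 interprets the characters, so the cell spans two lines and contains a real tab.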
In Spark 3.3, the output schema of DESCRIBE NAMESPACE becomes info_name: string, info_value: string. In Spark 3.1 or earlier, the info_name field was named database_description_item and the info_value field was named database_description_value for the builtin catalog. To restore the old schema with the builtin catalog, you can set spark.sql.legacy.keepCommandOutputSchema to true.
In Spark 3.1, the headers below appear by default.
In Spark 3.3, we see info_name and info_value unless the property is set to true.
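For example (assuming a SparkSession named spark):

spark.sql("DESCRIBE NAMESPACE default").printSchema()
# Spark 3.3: info_name, info_value
# Spark 3.1: database_description_item, database_description_value
spark.conf.set("spark.sql.legacy.keepCommandOutputSchema", "true")   # restore the old headers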
In Spark 3.3, DataFrameNaFunctions.replace() no longer uses exact string matching for input column names, in order to match the SQL syntax and support qualified column names. An input column name containing a dot (not nested) needs to be escaped with a backtick `. It now throws AnalysisException if the column is not found in the data frame schema, and IllegalArgumentException if the input column name refers to a nested column. In Spark 3.1 and earlier, invalid input column names and nested column names were ignored.
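A hedged sketch of the new resolution rules (assuming a SparkSession named spark):

df = spark.createDataFrame([("a",)], ["my.col"])
df.na.replace("a", "b", subset=["`my.col`"]).show()   # Spark 3.3: dotted (non-nested) names must be backticked
# df.na.replace("a", "b", subset=["missing_col"])     # Spark 3.3: raises AnalysisException; Spark 3.1 silently ignored it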
In Spark 3.3, CREATE TABLE .. LIKE .. command can not use reserved properties. You need their specific clauses to specify them, for example, CREATE TABLE test1 LIKE test LOCATION ‘some path’. You can set spark.sql.legacy.notReserveProperties to true to ignore the ParseException, in this case, these properties will be silently removed, for example: TBLPROPERTIES(‘owner’=’yao’) will have no effect. In Spark version 3.1 and below, the reserved properties can be used in CREATE TABLE .. LIKE .. command but have no side effects, for example, TBLPROPERTIES(‘location’=’/tmp’) does not change the location of the table but only creates a headless property just like ‘a’=’b’.
In Spark 3.3, we got the expected ParseException; after setting the property, we were able to create the table.
In Spark 3.1, we didn’t get any exceptions or errors:
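A hedged sketch, assuming a SparkSession named spark, an existing table named test, and a hypothetical path /tmp/test1:

spark.sql("CREATE TABLE test1 LIKE test LOCATION '/tmp/test1'")           # Spark 3.3: use the dedicated clause
# CREATE TABLE test1 LIKE test TBLPROPERTIES('location'='/tmp/test1')     # Spark 3.3: ParseException unless the flag below is set
spark.conf.set("spark.sql.legacy.notReserveProperties", "true")           # then the reserved property is silently removed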
In Spark 3.3, the unit-to-unit interval literals like INTERVAL ‘1-1’ YEAR TO MONTH and the unit list interval literals like INTERVAL ‘3’ DAYS ‘1’ HOUR are converted to ANSI interval types: YearMonthIntervalType or DayTimeIntervalType. In Spark 3.1 and earlier, such interval literals are converted to CalendarIntervalType. To restore the behavior before Spark 3.3, you can set spark.sql.legacy.interval.enabled to true.
In Spark 3.3, setting spark.sql.legacy.interval.enabled to true restores the previous behavior, so these literals are converted to CalendarIntervalType rather than the ANSI interval types YearMonthIntervalType and DayTimeIntervalType.
In Spark 3.1, changing this property has no effect.
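A minimal sketch (assuming a SparkSession named spark):

spark.sql("SELECT INTERVAL '1-1' YEAR TO MONTH AS ym, INTERVAL '3' DAY AS d").printSchema()
# Spark 3.3: ym is YearMonthIntervalType and d is DayTimeIntervalType (ANSI interval types)
# Spark 3.1: both are CalendarIntervalType
spark.conf.set("spark.sql.legacy.interval.enabled", "true")   # restore CalendarIntervalType on Spark 3.3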
In Spark 3.3, the TRANSFORM operator can’t support alias in inputs. In Spark 3.1 and earlier, we can write script transform like SELECT TRANSFORM(a AS c1, b AS c2) USING ‘cat’ FROM TBL.
In Spark 3.1, we are able to use aliases directly inside TRANSFORM, but in Spark 3.3 direct aliasing is prohibited; the workaround shown below can be used instead.
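One possible workaround (a sketch, assuming a SparkSession named spark with script transformation available and a hypothetical table tbl with columns a and b) is to alias the columns in a subquery first:

# Spark 3.1 allowed:  SELECT TRANSFORM(a AS c1, b AS c2) USING 'cat' FROM tbl
# Spark 3.3: alias in a subquery, then transform the aliased columns:
spark.sql("""
    SELECT TRANSFORM(c1, c2) USING 'cat'
    FROM (SELECT a AS c1, b AS c2 FROM tbl) t
""").show()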
Dynamics 365 Commerce is a comprehensive omnichannel solution that empowers retailers to deliver personalized, seamless, and differentiated shopping experiences across physical and digital channels. In the 2024 release Wave 1, Dynamics 365 Commerce continues to innovate and enhance its capabilities to improve store associate productivity and meet the evolving needs of customers and businesses. Here are some of the highlights of the new features coming soon:
Copilot in Site builder is going global and multi-lingual:
Copilot in Site builder is a generative AI assistant that helps users create engaging and relevant content for their e-commerce sites. Copilot uses the product information and the user’s input to generate product enrichment content that is crafted using brand tone and tailored for targeted customer segments.
Image: Copilot Site Builder
In the 2024 release wave 1, Copilot in Site builder is expanding its language support to include support for 23 additional locales including German, French, Spanish, and more. This feature demonstrates Microsoft’s commitment to making Copilot accessible globally and empowering users to create multilingual content with ease and efficiency.
Strengthening our dedication to creating a comprehensive B2B solution for Digital Commerce by supporting B2B indirect commerce
Dynamics 365 Commerce supports both B2C and B2B commerce scenarios, enabling retailers to sell directly to consumers and businesses. In the 2024 release wave 1, Dynamics 365 Commerce fortifies its B2B investments by introducing support for B2B indirect commerce, which enables manufacturers selling through a network of distributors to get complete visibility into their sales and inventory.
Image: New distributor capabilities
New distributor capabilities enable manufacturers to provide a self-service platform that simplifies distributor operations and builds meaningful, long-lasting business relationships through efficient and transparent transactions. Distributors can access product catalogs and pricing specific to their partner agreements, manufacturers can place orders on behalf of their customers with a specific distributor, and outlets can track order status and history.
Dynamics 365 Commerce also streamlines multi-outlet ordering, enabling business buyers that are associated with more than one outlet organization to buy for all of them. Commerce provides the ability to seamlessly buy for multiple organizations using the same email account, enabling buyers to be more efficient.
Image: Order for Organizations
Additionally, Dynamics 365 Commerce supports advance ordering, which is a common practice in some businesses to order products in advance to ensure they have adequate stock when needed. This feature enables customers to specify the desired delivery date and include additional order notes.
Dynamics 365 Commerce is also introducing support for a promotions page on an e-commerce site that serves as a hub to showcase the various deals and promotions that shoppers can take advantage of. The promotions page can display both active and upcoming promotions.
Image: Promotions Page
Adyen Tap to Pay is coming to Store Commerce app on iOS
The Store Commerce app is a mobile point of sale (POS) solution that enables store associates to complete transactions through a mobile device on the sales floor, pop-up store, or remote location. The Store Commerce app supports various payment methods, such as cash, card, gift card, and loyalty points.
Image: Adyen Tap to Pay
In the 2024 release wave 1, Dynamics 365 Commerce is introducing Adyen Tap to Pay capabilities into the Store Commerce app for iOS, so that retailers everywhere can accept payments directly on Apple iPhones. Adyen Tap to Pay enhances the utility and versatility of the Store Commerce app, as it eliminates the need for additional hardware or peripherals to process payments. It also enables retailers to offer a more customer-centric and engaging in-store retail experience, as store associates can interact with customers anywhere in the store and complete transactions on the spot.
Speed up your checkout process with simplified and consistent payment workflows for different payment methods on Store Commerce app
Efficiency and predictability are key to the smooth operation of a point of sale (POS) system, especially when it comes to payment processing. When store associates can process customer payments across a variety of payment types with minimal friction, customers spend less time waiting and more time shopping.
In the 2024 release wave 1, Dynamics 365 Commerce is improving the POS payment processing user experience to create more consistent workflows across payment types. The new user experience simplifies the payment selection and confirmation process, reduces the number of clicks and screens, and provides clear feedback and guidance to the store associate. The new user experience also supports split tendering, which allows customers to pay with multiple payment methods in a single transaction.
Image: Check out process
The improved POS payment processing user experience will contribute to efficiencies in the checkout process and more satisfied customers. It will also reduce the training time and effort for store associates, as they can easily learn and master the payment workflows.
Enabling retailers to effectively monitor and track inventory of Batch-controlled products via Store Commerce app
Batch-controlled products are products that are manufactured in batches and associated with unique identifiers for quality control and traceability. Batch-controlled products are commonly used in food, chemical, and electronics industries, where the quality and safety of the products are critical.
Image: Batch Control Products
In the 2024 release wave 1, Dynamics 365 Commerce enhances the Store Commerce app to support batch-controlled products. This feature enables store associates to scan or enter the batch number of the products during the sales or return transactions and validate the batch information against the inventory records. This feature also enables store associates to view the batch details of the products, such as the expiration date, manufacture date, and lot number.
With these new features, Dynamics 365 Commerce aims to provide you with the best tools and solutions to grow your business and delight your customers. Whether you want to create engaging and relevant content for your e-commerce site, automate and integrate your order management workflows, expand your B2B commerce opportunities, or improve your payment processing and inventory management, Dynamics 365 Commerce has something new for you.
To learn more about Dynamics 365 Commerce:
Learn more about additional investments and timeline for these investments here in release plans.