Microsoft named a Leader in the 2024 Gartner® Magic Quadrant™ for Unified Communications as a Service for the sixth year in a row

This article is contributed. See the original author and article here.

We’re honored to announce that Microsoft has, once again, been recognized as a Leader in the 2024 Gartner® Magic Quadrant™ for Unified Communications as a Service (UCaaS), Worldwide. This is the sixth year we’ve received this recognition, and we’re thrilled to be positioned highest on the Ability to Execute axis and furthest on the Completeness of Vision axis.

The post Microsoft named a Leader in the 2024 Gartner® Magic Quadrant™ for Unified Communications as a Service for the sixth year in a row appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Explore what’s new for Copilot and Dynamics 365 at the Business Applications Launch Event

This article is contributed. See the original author and article here.

As Microsoft Copilot features continue to roll out across Microsoft Dynamics 365 and Microsoft Power Platform, it can be easy to get overwhelmed and lose track of critical new capabilities. Thankfully, the Microsoft Business Applications Launch Event is just around the corner.  

Register today for the virtual launch event on Tuesday, October 29, 2024, a showcase of new and enhanced capabilities releasing between October 2024 and March 2025. Packed with demos and a live Q&A chat with Microsoft experts, the event gives you a sneak peek at innovation that can empower your workforce, optimize business processes, and enhance customer engagement.

Microsoft Business Applications Launch Event

Explore the future of your business.

Explore the future of business with Copilot

Microsoft product leaders and engineers will be live at the event to give you an in-depth look at the latest Copilot capabilities for Dynamics 365 and Microsoft Power Platform, including new ways to automate business processes across your organization and scale your team. Our team will also showcase organizations across industries that are using new Copilot and Dynamics 365 features to drive transformation.

Top 4 reasons to attend the launch event

Twice a year, the Business Applications Launch Event gives you a sneak peek at product news, demos and insights into upcoming features and capabilities across Dynamics 365, Microsoft Power Platform, and Copilot. Here are four top reasons to attend the October 2024 event:  

  1. Get a sneak peek at highlights from 2024 release wave 2. Discover what’s new and improved in Dynamics 365 and Microsoft Power Platform. Hear from Charles Lamanna, Microsoft Corporate Vice President, Business and Industry Copilot, and other leaders as they guide you through dozens of new Copilot and core platform capabilities releasing over the next six months.
  2. Personalize sales and service experiences. Learn how to elevate customer experiences with demonstrations of new capabilities across Microsoft Dynamics 365 Customer Service, Microsoft Dynamics 365 Contact Center, and Microsoft Dynamics 365 Sales. You’ll also discover how the Sweden-based automotive company Lynk & Co is using Dynamics 365 to drive highly personalized experiences.
  3. Transform business operations with AI-enabled enterprise resource planning (ERP) processes. Get a sneak peek at the enhancements that improve both core functionality and autonomous capabilities across ERP applications like Microsoft Dynamics 365 Finance, Microsoft Dynamics 365 Supply Chain Management, and Microsoft Dynamics 365 Business Central through the lens of our customer Lifetime Products, as well as the latest features for Business Central.  
  4. Explore the future of Microsoft Power Platform. Learn how Copilot is transforming how you build, what you build, and how you automate, and get a first-hand look at how Applied Information Sciences is innovating business solutions using the newest capabilities for Microsoft Power Apps, Microsoft Power Automate, and Microsoft Copilot Studio.

That’s not all. You’ll also hear from other Microsoft leaders about the roadmap for the future of AI, customer service, and operations, and about how to use these new technologies to take on your organization’s most time-consuming tasks.

The Business Applications Launch Event streams live on Tuesday, October 29, 2024, starting at 9:00 AM Pacific Time, and will be available on demand afterward. Be sure to register for updates and reminders as the event day approaches. We’ll see you there!

Microsoft Business Applications Launch Event 

Tuesday, October 29, 2024 
9:00 AM-10:00 AM Pacific Time (UTC-7)  

The post Explore what’s new for Copilot and Dynamics 365 at the Business Applications Launch Event appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Improve operational efficiency with Cross-Location Open Shifts

This article is contributed. See the original author and article here.

We’re excited to introduce the cross-location shifts feature in the Microsoft Shifts app, designed to enhance flexibility and efficiency for both frontline managers and workers. Currently in public preview, this feature empowers businesses to better manage staffing across multiple locations while giving employees more control over their schedules.


 


For Frontline Managers


With cross-location shifts, managers can offer open shifts across various teams and locations, helping to balance workforce needs, fill last-minute gaps, and improve customer satisfaction. By turning on the cross-location feature in Shifts settings, managers can post open shifts that employees from other locations can request, ensuring their store or site is always fully staffed.


 


Managers will be notified when employees from other locations request shifts and can easily approve or decline requests. Once approved, workers from other locations will appear in your schedule as external employees, making it seamless to track staff across locations.



For Frontline Workers


The cross-location feature provides more flexibility for employees, allowing them to pick up open shifts at different locations that suit their schedules. Workers can view open shifts at other sites and submit a request to pick up the shift. Once approved by the manager at that location, the shift will appear in their schedule.



Getting Started


For IT Admins: To enable cross-location open shifts in your organization, follow the steps outlined here: Set up open shifts across locations in Shifts for your frontline – Microsoft 365 for frontline workers | Microsoft Learn


For Managers and Workers: Learn more about using this feature here: Use open shifts across locations in Shifts – Microsoft Support


What’s coming next:



  1. An employee’s home manager will be able to opt in to approve requests when their employee works at another location, in addition to the approval of the target store’s manager

  2. An employee’s home manager will be able to view the location name in the team schedule when their employee is working at another location


This powerful new feature helps businesses optimize staffing, enhance worker flexibility, and improve overall operational efficiency. Stay tuned as we refine this feature during its public preview phase, and we encourage you to share your feedback!


Please take a moment to share your feedback/questions on this feature via this brief survey (https://aka.ms/CrossLocationShifts) and include your email for any further queries. We’re eager to connect with you!

Performing ETL in Real-Time Intelligence with Microsoft Fabric

This article is contributed. See the original author and article here.

Introduction


In today’s data-driven world, the ability to act upon data as soon as it’s generated is crucial for businesses to make informed decisions quickly. Organizations seek to harness the power of up-to-the-minute data to drive their operations, marketing strategies, and customer interactions.


 


This becomes challenging with real-time data, where it is not always possible to apply every transformation while the data is being streamed. You therefore need a flow that is fast and does not impact the data stream.


 


This is where Microsoft Fabric comes into play. Fabric offers a comprehensive suite of services including Data Engineering, Data Factory, Data Science, Real-Time Intelligence, Data Warehouse, and Databases. But today, we are going to focus on Real-Time Intelligence.


 


Use-Cases


This setup can be used in scenarios where data must be transformed before it is consumed by downstream processing or analytical workloads. An example would be enabling OneLake availability on a KQL table so that the data can be accessed by other Fabric engines, such as Notebooks and Lakehouses, for training ML models or for analytics.


 


Another example: say you have a timestamp column in your streaming data and you would like to change its format to match your standard. You can use an update policy to transform the timestamp format and store the result.
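
As a rough sketch of that second scenario, assume a hypothetical source table RawEvents with a string EventTime column and an existing destination table CleanEvents whose schema matches the query output (these names are illustrative, not part of the walkthrough later in this post). An update policy defined on the destination table could then reformat the timestamp as the data is ingested:

    // Hypothetical sketch: reformat a timestamp via an update policy.
    // RawEvents is the illustrative source table; CleanEvents is the
    // destination table that this policy is defined on.
    .alter table CleanEvents policy update ```[
      {
        "IsEnabled": true,
        "Source": "RawEvents",
        "Query": "RawEvents | extend EventTime = format_datetime(todatetime(EventTime), 'yyyy-MM-dd HH:mm:ss') | project EventTime, DeviceId, Value",
        "IsTransactional": true
      }
    ]```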


 


Overview


Fabric Real-Time Intelligence supports a KQL database as its datastore, which is designed to handle real-time data streams efficiently. After ingestion, you can use Kusto Query Language (KQL) to query the data in the database.
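
For example, once events have landed in a KQL table, a few lines of KQL are enough to explore and aggregate them. The table and column names below are placeholders used purely for illustration:

    // Hypothetical sketch: count recent events per category.
    // MyStreamTable, Timestamp, and Category are placeholder names.
    MyStreamTable
    | where Timestamp > ago(1h)
    | summarize EventCount = count() by Category
    | order by EventCount desc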


 


KQL Table is a Fabric item which is part of the KQL Database. Both these entities are housed within an Eventhouse. An Eventhouse is a workspace of databases, which might be shared across a certain project. It allows you to manage multiple databases at once, sharing capacity and resources to optimize performance and cost. Eventhouses provide unified monitoring and management across all databases and per database.


Figure 1: Hierarchy of Fabric items in an Eventhouse


 


Update policies are automated processes activated when new data is added to a table. They automatically transform the incoming data with a query and save the result in a destination table, removing the need for manual orchestration. A single table can have multiple update policies for various transformations, saving data to different tables simultaneously. These target tables can have distinct schemas, retention policies, and other configurations from the source table.
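
As a minimal sketch of that fan-out pattern, the commands below attach separate update policies to two hypothetical destination tables; both are triggered by ingestion into the same source table. All names are illustrative, and the destination tables are assumed to already exist with schemas matching each query’s output:

    // Hypothetical sketch: one source table (SourceEvents) feeding two
    // destination tables, each with its own update policy.
    .alter table SensorReadings policy update ```[
      { "IsEnabled": true, "Source": "SourceEvents",
        "Query": "SourceEvents | project Timestamp, DeviceId, Temperature",
        "IsTransactional": true }
    ]```

    .alter table SensorErrors policy update ```[
      { "IsEnabled": true, "Source": "SourceEvents",
        "Query": "SourceEvents | where Level == 'Error' | project Timestamp, DeviceId, Message",
        "IsTransactional": true }
    ]```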


 


Scope


In this blog, we walk through a scenario where we enrich the data that lands in a KQL table. In this case, we drop the columns we don’t need, but you can also apply any other transformation supported in KQL.


Here we have a real-time stream pushing data to a KQL table. Once the data is loaded into the source table, an update policy drops the unneeded columns and pushes the data of interest from the source table to a destination table.


  

Prerequisites


A Microsoft Fabric workspace with a KQL database (Eventhouse) already created; the event stream destination configured below targets this database.


Creating sample data stream



  1. In the Real-Time Intelligence experience, create a new event stream.

  2. Under source, add new source and select sample data.

  3. Continue configuring the stream. I am using the Bicycles sample data stream in this blog.

  4. Select Direct ingestion as the Data Ingestion Mode for destination.

  5. For the destination, select your workspace and the KQL database you created as a prerequisite to this exercise.

  6. You should see a pop-up to configure the database details; continue by configuring the table where the data from the stream should land.


 


Configuring KQL Table with Update Policy



  1. Open the Eventhouse page in Fabric. There you should now be able to preview the data that is being ingested from the sample data stream.


  2. Create a new destination table. I used the following KQL to create the new table (destination):

    .create table RTITableNew (
        BikepointID: string, Street: string, Neighbourhood: string, No_Bikes: int, No_Empty_Docks: int )
    


  3. Under the Database tab, click on new and select Table Update Policy.                
     


  4. You can edit the existing policy format or paste the one below that I used:


    NOTE: RTITable is the source table and RTITableNew is the destination table. The update policy is defined on the destination table and is triggered when new data is ingested into the source table.

    .alter table RTITableNew policy update ```[
      {
        "IsEnabled": true,
        "Source": "RTITable",
        "Query": "RTITable | project BikepointID, Street, Neighbourhood, No_Bikes, No_Empty_Docks",
        "IsTransactional": true,
        "PropagateIngestionProperties": false,
        "ManagedIdentity": null
      }
    ]```


    The above policy drops the Longitude and Latitude columns and stores the rest of the columns in the destination table. You can do more transformations as per your requirements, but the workflow remains the same.


  5. After running the above command, your destination table will start populating with new data as soon as the source table receives data (a quick verification query is shown after these steps). To review the update policy on the destination table, you can run the following command:

     .show table RTITableNew policy update
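
Once the policy is in place, a quick query against the destination table confirms that transformed rows are arriving; RTITableNew is the destination table created earlier in this walkthrough:

    // Spot-check the destination table populated by the update policy.
    RTITableNew
    | take 10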



 


Conclusion


To summarize, we took a real-time data stream, stored the data in a KQL database, performed data enrichment on it, and stored the result in a destination table. This flow caters to scenarios where you want to process data after it is ingested from the stream.


 


Further Reading and Resources


Common scenarios for using table update policies – Kusto | Microsoft Learn


Create a table update policy in Real-Time Intelligence – Microsoft Fabric | Microsoft Learn

The Future of AI: Oh! One! Applying o1 Models to Business Data

This article is contributed. See the original author and article here.


In mid-September 2024, OpenAI introduced a groundbreaking family of models known as o1, often referred to as “Ph.D. models” due to their advanced capabilities. Now accessible through Azure OpenAI Service, o1 represents a significant leap in artificial intelligence, particularly in reasoning and problem-solving tasks. We’ve seen the o1 models solve problems like counting the number of R’s in the word “strawberry” and logic problems – but what does this mean for businesses?

 

One of the most remarkable features of o1 is its ability to do math and perform complex data analysis. Unlike previous models, o1 can calculate aggregate statistics, detect correlations across multiple datasets, and provide deep insights that were previously unattainable. To test its mettle, I decided to run the largest of the o1 models, o1-preview, through its paces, using datasets similar to those it might see in business scenarios. Note that the data used here is entirely synthetic, but it is patterned after real datasets that businesses might use.

 


 



First, I tried a retail scenario. I took some example sales and staffing data, the kind a real store might have over a month, and fed it into o1. I wanted to see if it could help figure out what’s driving sales and how staffing levels impact performance. Well, o1 didn’t disappoint. It crunched the numbers and pointed out that our sales were lower on weekends compared to other days. The funny thing is, the data didn’t label which days were weekends, but o1 figured it out anyway. It found the correlations between the sales and staffing datasets, even though the only obvious commonality between them was the time period they covered. It suggested that maybe having fewer staff scheduled over the weekend would lead to lower sales, and even recommended upping the weekend staff or adding self-checkout kiosks. It felt like having a seasoned retail analyst giving me personalized advice.

 

Next up, I wanted to see how o1 handled financial data. I used a fictional company’s financial statements, structured just like real income statements, balance sheets, and cash flow statements, and asked o1 to create a sell-side research report. The results were impressive. It put together a detailed report, complete with an investment thesis and justification. It calculated growth rates, analyzed profit margins, and looked into financial ratios like price-to-earnings ratios. It justified its price target with solid analysis. Clearly, financial analysts can use this tool to make their jobs easier.

 

Then I decided to try something in the entertainment sector. I gave o1 some sample ticketing data from a fictional event, just like the kind of data a real concert might generate. I wanted to find out who spent the most on tickets, and who bought the highest quantity. o1 not only identified the top spenders but also analyzed their buying habits and provided suggestions on how to encourage more high-volume ticket sales in the future. It was pretty cool to see how it turned raw numbers into real marketing insights. Even though I was using fictional datasets, o1 showed me its potential to make a real impact on businesses. It can help make better decisions by uncovering deeper insights, save time by handling complex tasks, spark innovation with its creative thinking, and help understand customers better to improve engagement.

 

Lastly, just for fun, I tested o1’s coding abilities. I asked o1 to create a simple HTML page with a playable Space Invaders game written entirely in JavaScript. To my surprise, it generated all the code I needed. When I ran it in my browser, there was the game, fully functional and ready to play. It worked on the first try! It was like magic, and I didn’t have to write a single line of code myself.

 

o1 has proven to be remarkably good at these sorts of technical tasks, but it turns out that its reasoning ability also extends to the creative realm. In fact, I fed it the transcript of the YouTube video I had created and prompted it to write this blog post (at least all the paragraphs above this one) – and it did! It took me four more little prompts to adjust the output to the tone I was looking for, but in just a few minutes, I had what I needed. So, as writing this blog post is one of my own business activities, the o1 model has now made a genuine impact on my business.

 



Why not get started?