Future-Proofing AI: Strategies for Effective Model Upgrades in Azure OpenAI

This article is contributed. See the original author and article here.

TL;DR: This post navigates the intricate world of AI model upgrades, with a spotlight on Azure OpenAI’s embedding models such as text-embedding-ada-002. We emphasize the critical importance of consistent model versioning in ensuring accuracy and validity in AI applications, and address the challenges and strategies essential for effectively managing model upgrades, focusing on compatibility and performance testing.


 


Introduction


What are Embeddings?


 


Embeddings in machine learning are more than just data transformations. They are the cornerstone of how AI interprets the nuances of language, context, and semantics. By converting text into numerical vectors, embeddings allow AI models to measure similarities and differences in meaning, paving the way for advanced applications in various fields.
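For instance, the similarity between two embedding vectors is typically measured with cosine similarity. Here is a minimal sketch in plain Python, using tiny toy vectors in place of real 1536-dimensional text-embedding-ada-002 outputs:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Score how closely two embedding vectors point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional vectors; real text-embedding-ada-002 vectors have
# 1536 dimensions, but the comparison works the same way.
cat = [0.9, 0.1, 0.0]
kitten = [0.85, 0.15, 0.05]
invoice = [0.0, 0.2, 0.95]

print(cosine_similarity(cat, kitten))   # close to 1.0: related meanings
print(cosine_similarity(cat, invoice))  # close to 0.0: unrelated meanings
```

The vectors here are hand-picked for illustration; in practice both inputs would come from the same embedding model and version.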


 


Importance of Embeddings


 


In the complex world of data science and machine learning, embeddings are crucial for handling intricate data types like natural language and images. They transform these data into structured, vectorized forms, making them more manageable for computational analysis. This transformation isn’t just about simplifying data; it’s about retaining and emphasizing the essential features and relationships in the original data, which are vital for precise analysis and decision-making.


Embeddings significantly enhance data processing efficiency. They allow algorithms to swiftly navigate through large datasets, identifying patterns and nuances that are difficult to detect in raw data. This is particularly transformative in natural language processing, where comprehending context, sentiment, and semantic meaning is complex. By streamlining these tasks, embeddings enable deeper, more sophisticated analysis, thus boosting the effectiveness of machine learning models.


 


Implications of Model Version Mismatches in Embeddings


 


Let’s discuss the potential impacts and challenges that arise when different versions of embedding models are used within the same domain, focusing on Azure OpenAI embeddings. When embeddings generated by one version of a model are applied to or compared with data processed by a different version, various issues can arise. These issues are not only technical but also have practical implications for the efficiency, accuracy, and overall performance of AI-driven applications.


 


Compatibility and Consistency Issues



  • Vector Space Misalignment: Different versions of embedding models might organize their vector spaces differently. This misalignment can lead to inaccurate comparisons or analyses when embeddings from different model versions are used together.

  • Semantic Drift: Over time, models might be trained on new data or with updated techniques, causing shifts in how they interpret and represent language (semantic drift). This drift can cause inconsistencies when integrating new embeddings with those generated by older versions.
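The misalignment problem can be illustrated with a toy sketch: two hypothetical "versions" of an embedding function, each internally consistent but organizing its vector space differently. Comparing vectors across versions then yields misleading similarity scores:

```python
from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# Hypothetical stand-ins for two versions of one embedding model.
# Each version is self-consistent, but their vector spaces differ.
def embed_v1(text):
    return [float(len(text)), 1.0]   # toy "version 1" space

def embed_v2(text):
    return [1.0, float(len(text))]   # toy "version 2" space (axes swapped)

query = "running shoes"
doc = "trainers for running"

same_version = cosine(embed_v1(query), embed_v1(doc))
mixed_versions = cosine(embed_v1(query), embed_v2(doc))

print(same_version)    # high: both vectors live in the same space
print(mixed_versions)  # unreliable: the spaces are not aligned
```

Real model versions differ far more subtly than swapped axes, but the lesson carries over: only compare embeddings produced by the same model version.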


 


Impact on Performance



  • Reduced Accuracy: Inaccuracies in semantic understanding or context interpretation can occur when different model versions process the same text, leading to reduced accuracy in tasks like search, recommendation, or sentiment analysis.

  • Inefficiency in Data Processing: Mismatches in model versions can require additional computational resources to reconcile or adjust the differing embeddings, leading to inefficiencies in data processing and increased operational costs.


 


Best Practices for Upgrading Embedding Models


 


Upgrading an Embedding Model – Overview


 


Now let’s move to the process of upgrading an embedding model, focusing on the steps you should take before making a change, the important questions to consider, and the key areas for testing.


Pre-Upgrade Considerations




  • Assessing the Need for Upgrade:



    • Why is the upgrade necessary?

    • What specific improvements or new features does the new model version offer?

    • How will these changes impact the current system or process?




  • Understanding Model Changes:



    • What are the major differences between the current and new model versions?

    • How might these differences affect data processing and results?




  • Data Backup and Version Control:



    • Ensure that current data and model versions are backed up.

    • Implement version control to maintain a record of changes.
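One way to make version control concrete is to store the model name and version alongside every embedding, so later comparisons can be restricted to same-version vectors. A minimal sketch follows; the record layout and field names are illustrative, not a prescribed schema:

```python
import json
from datetime import datetime, timezone

def store_embedding(text, vector, model, model_version):
    """Persist an embedding together with the model version that produced it,
    so future comparisons can be restricted to same-version vectors."""
    record = {
        "text": text,
        "vector": vector,
        "model": model,                  # e.g. "text-embedding-ada-002"
        "model_version": model_version,  # e.g. "2"
        "created_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record)

row = json.loads(store_embedding("hello", [0.1, 0.2], "text-embedding-ada-002", "2"))
print(row["model_version"])
```

The same metadata can live in a vector database payload or a separate catalog table; what matters is that it is written at embedding time, not reconstructed later.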




Questions to Ask Before Upgrading




  • Compatibility with Existing Systems:



    • Is the new model version compatible with existing data formats and infrastructure?

    • What adjustments, if any, will be needed to integrate the new model?




  • Cost-Benefit Analysis:



    • What are the anticipated costs (monetary, time, resources) of the upgrade?

    • How do these costs compare to the expected benefits?




  • Long-Term Support and Updates:



    • Does the new model version have a roadmap for future updates and support?

    • How will these future changes impact the system?




Key Areas for Testing




  • Performance Testing:



    • Test the new model version for performance improvements or regressions.

    • Compare accuracy, speed, and resource usage against the current version.
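A simple way to compare speed is a small timing harness run against both deployments. The sketch below uses local stub functions in place of real API calls; in practice you would substitute calls to your current and candidate Azure OpenAI deployments:

```python
import statistics
import time

def benchmark(embed_fn, texts, repeats=5):
    """Return the median wall-clock time to embed a batch of texts."""
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        for t in texts:
            embed_fn(t)
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

# Hypothetical stand-ins for the current and candidate model versions.
def embed_current(text):
    return [float(ord(c)) for c in text[:8]]

def embed_candidate(text):
    return [float(ord(c)) / 2 for c in text[:8]]

texts = ["order status", "refund policy", "store hours"] * 10
print(f"current:   {benchmark(embed_current, texts):.6f}s")
print(f"candidate: {benchmark(embed_candidate, texts):.6f}s")
```

Using the median rather than the mean keeps one slow network call from skewing the comparison.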




  • Compatibility Testing:



    • Ensure that the new model works seamlessly with existing data and systems.

    • Test for any integration issues or data format mismatches.
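Basic integration issues, such as a change in vector dimensionality between versions, can be caught with a quick pre-flight check before mixing embeddings. A minimal sketch (the checks shown are illustrative, not exhaustive):

```python
def check_compatibility(stored_vectors, new_vector):
    """Flag basic integration issues before mixing embeddings from two model versions."""
    problems = []
    expected_dim = len(stored_vectors[0])
    if len(new_vector) != expected_dim:
        problems.append(
            f"dimension mismatch: stored={expected_dim}, new={len(new_vector)}"
        )
    if not all(isinstance(x, float) for x in new_vector):
        problems.append("new vector contains non-float values")
    return problems

stored = [[0.1, 0.2, 0.3], [0.0, 0.5, 0.5]]          # toy 3-dim "old" index
print(check_compatibility(stored, [0.4, 0.4, 0.2]))  # [] -> compatible
print(check_compatibility(stored, [0.4, 0.4]))       # flags a dimension mismatch
```

A dimension check alone cannot prove semantic compatibility, but it cheaply catches the failures that would otherwise surface as runtime errors deep in a pipeline.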




  • Fallback Strategies:



    • Develop and test fallback strategies in case the new model does not perform as expected.

    • Ensure the ability to revert to the previous model version if necessary.
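A fallback strategy can be as simple as a wrapper that tries the new deployment first and reverts to the previous one on failure. Here is a sketch with stub functions standing in for the two model versions:

```python
def embed_with_fallback(text, primary, fallback):
    """Try the new model first; revert to the previous version on failure."""
    try:
        vector = primary(text)
        if not vector:                 # treat an empty result as a failure too
            raise ValueError("empty embedding")
        return vector, "new-version"
    except Exception:
        return fallback(text), "previous-version"

# Hypothetical stand-ins for the two deployments.
def new_model(text):
    raise RuntimeError("new deployment unavailable")   # simulate an outage

def old_model(text):
    return [0.1] * 4

vector, source = embed_with_fallback("checkout failed", new_model, old_model)
print(source)  # "previous-version"
```

In production you would also log which path was taken, so a sustained fallback rate can trigger the rollback decision described above.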




Post-Upgrade Best Practices




  • Monitoring and Evaluation:



    • Continuously monitor the system’s performance post-upgrade.

    • Evaluate whether the upgrade meets the anticipated goals and objectives.




  • Feedback Loop:



    • Establish a feedback loop to collect user and system performance data.

    • Use this data to make informed decisions about future upgrades or changes.




Upgrading an Embedding Model – Conclusion


Upgrading an embedding model involves careful consideration, planning, and testing. By following these guidelines, customers can ensure a smooth transition to the new model version, minimizing potential risks and maximizing the benefits of the upgrade.


Use Cases in Azure OpenAI and Beyond


Embeddings can significantly enhance the performance of various AI applications by enabling more efficient data handling and processing. Here’s a list of use cases where embeddings can be effectively utilized:




  1. Enhanced Document Retrieval and Analysis: By first performing embeddings on paragraphs or sections of documents, you can store these vector representations in a vector database. This allows for rapid retrieval of semantically similar sections, streamlining the process of analyzing large volumes of text. When integrated with models like GPT, this method can reduce the computational load and improve the efficiency of generating relevant responses or insights.




  2. Semantic Search in Large Datasets: Embeddings can transform vast datasets into searchable vector spaces. In applications like eCommerce or content platforms, this can significantly improve search functionality, allowing users to find products or content based not just on keywords, but on the underlying semantic meaning of their queries.




  3. Recommendation Systems: In recommendation engines, embeddings can be used to understand user preferences and content characteristics. By embedding user profiles and product or content descriptions, systems can more accurately match users with recommendations that are relevant to their interests and past behavior.




  4. Sentiment Analysis and Customer Feedback Interpretation: Embeddings can process customer reviews or feedback by capturing the sentiment and nuanced meanings within the text. This provides businesses with deeper insights into customer sentiment, enabling them to tailor their services or products more effectively.




  5. Language Translation and Localization: Embeddings can enhance machine translation services by understanding the context and nuances of different languages. This is particularly useful in translating idiomatic expressions or culturally specific references, thereby improving the accuracy and relevancy of translations.




  6. Automated Content Moderation: By using embeddings to understand the context and nuance of user-generated content, AI models can more effectively identify and filter out inappropriate or harmful content, maintaining a safe and positive environment on digital platforms.




  7. Personalized Chatbots and Virtual Assistants: Embeddings can improve how virtual assistants and chatbots understand user queries, leading to more accurate and contextually appropriate responses and a better user experience. With similar logic, embeddings can help route natural language requests to specific APIs. See the CompactVectorSearch repository for an example.




  8. Predictive Analytics in Healthcare: In healthcare data analysis, embeddings can help in interpreting patient data, medical notes, and research papers to predict trends, treatment outcomes, and patient needs more accurately.




In all these use cases, the key advantage of using embeddings is their ability to process and interpret large and complex datasets more efficiently. This not only improves the performance of AI applications but also reduces the computational resources required, especially for high-cost models like GPT. This approach can lead to significant improvements in both the effectiveness and efficiency of AI-driven systems.
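To make the retrieval pattern concrete, here is a self-contained sketch of the embed-store-search loop. A toy bag-of-letters function stands in for a real Azure OpenAI embedding call, and a plain list stands in for a vector database:

```python
from math import sqrt

def embed(text):
    """Toy bag-of-letters embedding standing in for an Azure OpenAI embedding call."""
    vector = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vector[ord(ch) - ord("a")] += 1.0
    return vector

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# "Vector database": paragraphs stored alongside their embeddings.
corpus = [
    "return an item for a refund",
    "track the delivery of your order",
    "reset your account password",
]
index = [(text, embed(text)) for text in corpus]

def search(query, k=1):
    qv = embed(query)
    ranked = sorted(index, key=lambda item: cosine(qv, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(search("refund for a returned item"))  # ['return an item for a refund']
```

The retrieved paragraphs, rather than the whole corpus, would then be passed to a model like GPT, which is what reduces the computational load described above.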


Specific Considerations for Azure OpenAI



  • Model Update Frequency: Understanding how frequently Azure OpenAI updates its models and the nature of these updates (e.g., major vs. minor changes) is crucial.

  • Backward Compatibility: Assessing whether newer versions of Azure OpenAI’s embedding models maintain backward compatibility with previous versions is key to managing version mismatches.

  • Version-Specific Features: Identifying features or improvements specific to certain versions of the model helps in understanding the potential impact of using mixed-version embeddings.


Strategies for Mitigation



  • Version Control in Data Storage: Implementing strict version control for stored embeddings ensures that data remains consistent and compatible with the model version used for its generation.

  • Compatibility Layers: Developing compatibility layers or conversion tools to adapt older embeddings to newer model formats can help mitigate the effects of version differences.

  • Baseline Tests: Create a few simple baseline tests that can identify any drift in the embeddings.
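Such a baseline test can be a handful of stored text-to-vector pairs that are re-embedded and compared on a schedule. A minimal sketch, with a local stub in place of the deployed model:

```python
def check_drift(embed_fn, baselines, tolerance=1e-6):
    """Compare freshly generated embeddings against stored baseline vectors.
    Any deviation beyond the tolerance signals that the model (or its version)
    has changed and downstream indexes may need re-embedding."""
    drifted = []
    for text, expected in baselines.items():
        actual = embed_fn(text)
        error = max(abs(a - e) for a, e in zip(actual, expected))
        if len(actual) != len(expected) or error > tolerance:
            drifted.append(text)
    return drifted

def embed_stub(text):                     # stand-in for the deployed model
    return [float(len(text)), float(text.count(" "))]

baselines = {
    "hello world": [11.0, 1.0],           # captured when the index was built
    "azure openai": [12.0, 1.0],
}
print(check_drift(embed_stub, baselines))  # [] -> no drift detected
```

Running a check like this after every deployment change turns silent version drift into an explicit, actionable signal.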


Azure OpenAI Model Versioning: Understanding the Process


Azure OpenAI provides a systematic approach to model versioning, applicable to models like text-embedding-ada-002:




  1. Regular Model Releases:



    • Azure OpenAI periodically releases updated versions of models such as text-embedding-ada-002, each identified by its own version number.

  2. Version Update Policies:



    • Options for auto-updating to new versions or deploying specific versions.

    • Customizable update policies for flexibility.

    • Details on update options.




  3. Notifications and Version Maintenance:



    • Advance notice before older model versions are retired.

    • A defined window during which existing versions remain available and supported.

  4. Upgrade Preparation:



    • Recommendations to read the latest documentation and test applications with new versions.

    • Importance of updating code and configurations for new features.

    • Preparing for version upgrades.




Conclusion


Model version mismatches in embeddings, particularly in the context of Azure OpenAI, pose significant challenges that can impact the effectiveness of AI applications. Understanding these challenges and implementing strategies to mitigate their effects is crucial for maintaining the integrity and efficiency of AI-driven systems.


 



Validate your skills with our new certification for Microsoft Fabric Analytics Engineers


We’re looking for Microsoft Fabric Analytics Engineers to take our new beta exam. Do you have subject matter expertise in designing, creating, and deploying enterprise-scale data analytics solutions? If so, and if you know how to transform data into reusable analytics assets by using Microsoft Fabric components, such as lakehouses, data warehouses, notebooks, dataflows, data pipelines, semantic models, and reports, be sure to check out this exam. Other helpful qualifications include the ability to implement analytics best practices in Fabric, including version control and deployment.


 


If this is your skill set, we have a new certification for you. The Microsoft Certified: Fabric Analytics Engineer Associate certification validates your expertise in this area and offers you the opportunity to prove your skills. To earn this certification, pass Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric, currently in beta.


 


Is this the right certification for you?


This certification could be a great fit if you have in-depth familiarity with the Fabric solution and you have experience with data modeling, data transformation, Git-based source control, exploratory analytics, and languages, including Structured Query Language (SQL), Data Analysis Expressions (DAX), and PySpark.


Review the Exam DP-600 (beta) page for details, and check out the self-paced learning paths and instructor-led training there. The Exam DP-600 study guide alerts you to key topics covered on the exam.


 


Ready to prove your skills?


Take advantage of the discounted beta exam offer. The first 300 people who take Exam DP-600 (beta) on or before January 25, 2024, can get 80 percent off the market price.


 


To receive the discount, when you register for the exam and are prompted for payment, use code DP600Winfield. This is not a private access code. The seats are offered on a first-come, first-served basis. As noted, you must take the exam on or before January 25, 2024. Please note that this beta exam is not available in Turkey, Pakistan, India, or China.


 


The rescore process starts on the day an exam goes live—8 to 12 weeks after the beta period, and final scores for beta exams are released approximately 10 days after that. For details on the timing of beta exam rescoring and results, read my post Creating high-quality exams: The path from beta to live.


 


Get ready to take Exam DP-600 (beta)



 


Did you know that you can take any role-based exam online? Online delivered exams—taken from your home or office—can be less hassle, less stress, and even less worry than traveling to a test center, especially if you’re adequately prepared for what to expect. To find out more, check out my blog post Online proctored exams: What to expect and how to prepare.


 


Ready to get started?


Remember, the number of spots for the discounted beta exam offer is limited to the first 300 candidates taking Exam DP-600 (beta) on or before January 25, 2024.


 


Related announcements


Explore the Latest Innovations for your Retail Workers with Microsoft Teams

As we ring in the start of 2024, we’re gearing up to showcase a host of new innovations across Microsoft Teams at the annual National Retail Federation (NRF) conference, taking place January 14th – January 16th in New York City.


 


We’re announcing new solutions designed to enable store teams to efficiently meet customers’ expectations and improve the retail experience in this new era of AI.


 


Keep reading below for the latest product and feature capabilities coming to Teams to help simplify operations and enable first-class retail experiences for all retail workers – including the frontline.


 


Enhanced Store Team Communication and Collaboration


Route announcements to frontline teams by location, department, and role
Target important announcements to the right frontline employees based on location, department, and job role information. Targeted announcements will surface on the Teams home experience so your frontline employees will never miss an important communication. This feature will be generally available in March 2024. Learn more




 


Boost frontline teamwork with auto-generated role and department tagging
Reach the right person at the right time with automatic tags for your frontline teams. Tags for department and job roles can be configured and created automatically for your frontline workers in the Teams Admin Center. Frontline employees can leverage these automatic tags in their frontline teams to connect with the right person every time. This feature will be in public preview in February 2024. Learn more.


 


Bring answers to communities for easier information sharing
In Viva Engage in Teams, answers from Q&A conversations will now be available in communities, better enabling frontline workers to easily source needed information. This feature will be generally available January 2024.




 


Monitor how employee engagement drives business performance
Also coming to Viva Engage in Teams, network analytics will bring AI-powered theme extraction and employee retention metrics to users to help enhance insights into workforce dynamics and help drive informed decision making. This feature will be generally available in February 2024. Learn more.




 


Automatically hear push-to-talk transmissions from multiple channels
Frontline workers using Walkie Talkie in Teams now have the option to automatically hear incoming transmissions from any of their pinned favorite Teams channels. With this new feature, users can stay better connected to multiple channels without needing to switch channels manually. This feature will be generally available by end of month. Learn more on how to get started.




 


Use any generic wired (USB-C and 3.5mm) headset for instant team communication on Android
Frontline workers often need to instantly communicate with each other even when their phones are locked. We integrated Walkie Talkie in Teams with audio accessory partners to make this experience possible with the dedicated push-to-talk (PTT) button on headsets, which instantly brings up walkie talkie for clear and secure voice communication. In addition to select specialized headsets, we are excited to announce that Walkie Talkie in Teams will now work with any generic wired (USB-C and 3.5mm) headset on Android.


 


As long as the generic headset has a play/pause control or buttons to accept/decline calls, frontline workers can tap the play/pause button to start and stop transmissions on walkie talkie. Frontline organizations will be able to easily start using walkie talkie with these lower-cost generic headsets. This feature will be generally available starting February 2024. Learn more.


 


Streamline Retail Store Operations


Allow frontline teams to set their shift availability for specific dates
Frontline workers will now have the flexibility to set their availability preferences on specific dates, enhancing their ability to manage unique scheduling needs. This added feature complements existing options for recurring weekly availability. This feature is available in January 2024. To learn more about recent enhancements to Shifts in Teams, read the latest blog – Discover the latest enhancements in Microsoft Shifts.




 


Easily deploy shifts at scale for your frontline
Teams admins can now standardize Shifts settings across all frontline teams and manage them centrally by deploying Shifts to frontline teams at scale in the Teams admin center. You can select which capabilities to turn on or off, such as showing open shifts, swap shift requests, offer shift requests, time off requests, and time clock.


 


Admins can also identify schedule owners at the tenant level and create scheduling groups and time-off reasons that will be set uniformly across all frontline teams. Your frontline managers are able to start using Shifts straight out of the box with minimal setup required. This feature is currently in public preview and will be generally available in March 2024. Learn more.




 


Streamline Teams deployment for your frontline and manage at scale
Whether due to seasonality or the natural turnover seen on the frontline in retail, simplifying user membership is key to easing management needs. Now generally available, Microsoft has added new capabilities in the Teams Admin Center to deploy frontline dynamic teams at scale for your entire frontline workforce. Through the power of dynamic teams, team membership is automatically managed and always up to date with the right users as people enter, move within, or leave the organization using dynamic groups from Entra ID.


 


This deployment tool streamlines the admin experience to create a Teams structure that maps the frontline workforce’s real-world organization into the digital world and makes it easy to set up a consistent channel structure to optimize for strong frontline collaboration on day one. Available in February, customers can use custom user attributes in Entra ID to define frontline and location attributes, with additional enhancements that make it easier to assign team owners by adding a people picker to the setup wizard.




 


Map your operational hierarchy to frontline teams
Admins will be able to set up their frontline operational hierarchy to map their organization’s structure of frontline locations and teams to a hierarchy in the Teams Admin Center. Admins can also define attributes for their teams that range from department information to brand information. The operational hierarchy coupled with this added metadata will enable frontline apps and experiences in the future like task publishing. This feature will be in public preview in January 2024. Learn more.




 


Leverage generative AI to streamline in-store shift management
Store managers can also identify items such as open shifts, time off, and existing shifts with a new Shifts plug-in for Microsoft 365 Copilot. Microsoft 365 Copilot can now ground prompts and retrieve insights for frontline managers leveraging data from the Shifts app in addition to user and company data it has access to such as Teams chat history, SharePoint, emails, and more.




 


Automate and simplify corporate to store task publishing
With task publishing, you can now create a list of tasks and schedule them to be automatically published to your frontline teams on a regular cadence, such as every month on the 15th. Once you publish a list, the task publishing feature will handle the scheduling and ensure that the list is published at the desired cadence. This feature is useful for tasks that need to be done regularly, such as store opening and closing processes or conducting periodic inspections and compliance checks. This feature will be generally available in March 2024.




 


Publish a task that everyone in the team must complete
This new capability provides the option to create a task that every member of the recipient team must complete. Organizations can assign tasks like complete training or review a new policy to all or a specific set of frontline workers. The task will be created for each worker at the designated location. This feature will become generally available in March 2024.


 


Require additional completion requirements for submitting tasks
When you create a task within the task publishing feature, you have the option to request a form and/or photo completion. When you publish that task, each recipient team will be unable to mark the task complete until the form is submitted by a member of the team. This ensures that the task is completed properly by each team member.




 


Additionally, with approval completion requirements, organizations can hold frontline managers and their teams accountable for verifying the work was done to standard before reflecting that work as completed. This allows an organization to increase attention to detail and accountability for important tasks. These features will become generally available in March 2024.




 


Secure and Manage your Business


Simplify authentication with domain-less sign-in
Since a single device is often shared among multiple frontline workers, they need to sign in and out multiple times a day throughout a shift or across shifts. Typing out a long username with a domain is error-prone and can be time-consuming. With domain-less sign-in, frontline workers can now sign in to Teams more quickly using only the first part of their username (i.e., without the domain) and then enter their password to access Teams on shared and corporate-managed devices. For example, if the username is 123456@microsoft.com or alland@microsoft.com, users can now sign in with only “123456” or “alland”, respectively.




 


We’re excited to share more updates and new features throughout the calendar year. To learn more about how Microsoft Teams empowers frontline workers, please visit our webpage to learn how.


 

Level up your retail workforce with smart, simple solutions from Microsoft Teams

In the race to deliver engaging in-store experiences, Microsoft is uniquely positioned to equip retailers with the tech they need to transform their store team’s workdays. At the National Retail Federation (NRF) 2024, we are announcing new solutions designed to enable store teams to efficiently meet customers’ expectations and improve the retail experience in this new era of AI.

The post Level up your retail workforce with smart, simple solutions from Microsoft Teams appeared first on Microsoft 365 Blog.


Enabling security and management across all your SMB customers with Microsoft 365 Lighthouse

One of the common adoption blockers we have heard from our partners is that they cannot standardize their security and management practices on Microsoft 365 Lighthouse because they cannot manage all their customers with it. This has made it challenging to standardize procedures such as resetting passwords, identifying risky users, or simply navigating a customer admin portal with delegated access. While we made it simple to search and discover users across the SMB customers you were managing in Microsoft 365 Lighthouse, you still needed a second process for the customers you were not managing there. This was primarily due to the requirement for Microsoft 365 Business Premium. While we have expanded support for a limited set of subscriptions to manage a customer in Lighthouse over the past couple of years, it was still limited to subscriptions that offered premium security value, preventing you from having a single solution.


 


Today, we expand support for all your commercial and educational SMB customers. This enables you as a partner to create standardized processes for managing all your SMB customers in Lighthouse. Here are a few of the scenarios you can do now with all your Microsoft 365 SMB customers using Lighthouse:



  • Anticipate your customers’ needs with proactive account management made easy with Sales Advisor opportunities. Discover the best ways to add value and support business growth with AI-powered insights and recommendations.

    Learn more: Introducing Sales Advisor – unlock your customer’s potential in Microsoft 365 Lighthouse – Microsoft Community Hub
    Screenshot of Microsoft 365 Lighthouse Opportunities page with AI-powered insights and recommendations to grow a customer.

  • Simplified delegated access across all your customer tenants. Configure granular delegated access to your customers’ tenants to manage users, devices, and data quickly and easily. Reduce risk by rightsizing delegated permissions across your organization while improving your productivity with a guided wizard that helps you scale best practices from across the MSP industry to set up Granular Delegated Access Privileges (GDAP).

    Learn more: Set up GDAP (microsoft.com)

    Screenshot of Microsoft 365 Lighthouse Granular Delegated Access Privileges setup wizard.



  • Assist with everyday user management. Lighthouse enables end-to-end user management, which allows you to create new users and quickly search and modify existing user details, including managing security groups, licensing, etc., and offboarding users. In addition to basic user management, Lighthouse adds value by providing management views across your Microsoft SMB customers that allow you to quickly identify and act on inactive accounts, Global Admin accounts, risky user behavior, and multi-factor authentication.


Screenshot of Microsoft 365 Lighthouse showing how to search for a user and view the user’s details.



  • Gain visibility into any Microsoft 365 incidents or advisories affecting your customers with a multi-tenant Service health dashboard.

    Screenshot of Microsoft 365 Lighthouse Service Health page.




  • One of the challenges of managing multiple customers is that you often need to use different admin portals, such as the Microsoft 365 admin center, the Azure portal, Microsoft Intune, or Exchange, to name a few.  Lighthouse lets you quickly and securely access other Microsoft admin portals for each of your SMB customers in the context of your partner tenant credentials using GDAP. Lighthouse users can leverage our security and management scenarios and seamlessly jump to another Microsoft admin portal when necessary. 



    Learn more: Manage your customers with Microsoft 365 Lighthouse


Screenshot of Microsoft 365 Lighthouse showing how to navigate into a customer’s Microsoft Entra admin portal.


We are just getting started and will continue to expand on the capabilities we offer to manage the breadth of customers you have in the coming months. So, check back often to learn what is new with Lighthouse.  


 


Not able to manage a customer in Lighthouse?


Here are cases where you will still find that a customer has limited management capabilities in Lighthouse and how you can change it.



  • By far, the most common cause of a customer showing as “Limited” is that the customer tenant no longer has any active subscriptions and is no longer in use. If this is the case, the recommendation is to remove the reseller relationship and any GDAP relationships (Partner-led termination of a granular admin relationship – Partner Center | Microsoft Learn). It is a best practice to remove relationships that are no longer needed to reduce unnecessary exposure to your organization.

  • The second most common cause is that delegated permissions (GDAP) have not been set up. You can use the GDAP setup wizard within Lighthouse to resolve this (Set up GDAP for your customers in Microsoft 365 Lighthouse – Microsoft 365 Lighthouse | Microsoft Learn).

  • The customer tenant is in the Government Cloud. Unfortunately, we cannot support the management of this customer in Microsoft 365 Lighthouse.

  • The customer is not an SMB and has more than 2,500 licensed users.

  • You are not in the same geographic area as the customer. If you have customers in a different geographic area, you can set up Lighthouse in that region to manage them.

  • Lastly, some cases exist where tenants are used for Azure and not Microsoft 365. In that case, we recommend you check out Azure Lighthouse: What is Azure Lighthouse? – Azure Lighthouse | Microsoft Learn


To learn why a specific customer is limited, click the Tenants link in the left navigation within Lighthouse, then click the “Limited” link to bring up details on why the customer is not fully managed in Lighthouse:


Tenant list showing Contoso as “Limited” because delegated access has not been configured.


If you have a customer tenant using the Microsoft 365 services and you only have Limited management capabilities within Lighthouse, we want to know. You can leave comments below or use the feedback mechanism in Lighthouse. We want to enable you to manage all your active Microsoft 365 SMB customer tenants in Lighthouse.


If you already have Lighthouse, sign in and check out the links to other Microsoft admin centers at lighthouse.microsoft.com. If you don’t have Lighthouse, sign up for Microsoft 365 Lighthouse to get started today.