We’re thrilled to announce the public preview of the Microsoft Entra PowerShell module, a new high-quality, scenario-focused PowerShell module designed to streamline management and automation for the Microsoft Entra product family. In 2021, we announced that all our future PowerShell investments would be in the Microsoft Graph PowerShell SDK. Today, we’re taking the next major step on that journey. The Microsoft Entra PowerShell module (Microsoft.Graph.Entra) is part of our ongoing commitment and increased investment in the Microsoft Graph PowerShell SDK to improve your experience and empower automation with Microsoft Entra.
We’re grateful for the substantial feedback we’ve heard from Microsoft Entra customers about our PowerShell experiences, and we’re excited to hear your thoughts after evaluating this preview module. We plan to build on our investment in the Microsoft Entra PowerShell module going forward and expand its coverage of resources and scenarios.
What is Microsoft Entra PowerShell?
The Microsoft Entra PowerShell module is a command-line tool that allows administrators to manage and automate Microsoft Entra resources programmatically. This includes efficiently managing users, groups, applications, service principals, policies, and more. The module builds upon and is part of the Microsoft Graph PowerShell SDK. It’s fully interoperable with all cmdlets in the Microsoft Graph PowerShell SDK, enabling you to perform complex operations with simple, well-documented commands. The module also offers a backward compatibility option with the deprecated AzureAD module to accelerate migration. Microsoft Entra PowerShell supports PowerShell version 5.1 and version 7+. We recommend using PowerShell version 7 or higher with the Microsoft Entra PowerShell module on all platforms, including Windows, Linux, and macOS.
Benefits of Microsoft Entra PowerShell
Focus on usability and quality: Microsoft Entra PowerShell offers human-readable parameters, deliberate parameter set specification, inline documentation, and core PowerShell fundamentals like pipelining.
Backward compatibility with AzureAD module: Microsoft Entra PowerShell accelerates migration away from the recently deprecated AzureAD module.
Flexible and granular authorization: Consistent with Microsoft Graph PowerShell SDK, Microsoft Entra PowerShell enables administrative consent for the permissions you want to grant to the application and supports specifying your own application identity for maximum granularity in app permission assignment. You can also use certificate, Service Principal, or Managed Identity authentication patterns.
Open source: The Microsoft Entra PowerShell module is open source, allowing contributions from the community to create great PowerShell experiences and share them with everyone. Open source promotes collaboration and facilitates the development of innovative business solutions. You can view Microsoft’s customizations and adapt them to meet your needs.
Next steps
Installation: Install Microsoft Entra PowerShell, which uses the “/v1.0” API version to manage Microsoft Graph resources, from the PowerShell Gallery by running this command:
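# The -AllowPrerelease switch is needed while the module is in public preview
Install-Module -Name Microsoft.Graph.Entra -AllowPrerelease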
Authentication: Use the Connect-Entra command to sign in to Microsoft Entra ID with delegated access (interactive) or application-only access (noninteractive).
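For example, a minimal interactive sign-in that requests a delegated permission scope (the scope shown here is just an illustration):
Connect-Entra -Scopes 'User.Read.All'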
To see more examples for using your own registered application, Service Principal, Managed Identity, and other authentication methods, see the Connect-Entra command documentation.
Find all available commands: You can list all available commands in the Microsoft Entra PowerShell module by using the command:
Get-Command -Module Microsoft.Graph.Entra
Get Help: The Get-Help command shows detailed information about specific commands, such as syntax, parameters, cmdlet description, and usage examples. For example, to learn more about the Get-EntraUser command, run:
Get-Help Get-EntraUser -Full
Migrating from AzureAD PowerShell module: You can run your existing AzureAD PowerShell scripts with minimal modifications using Microsoft Entra PowerShell by using the Enable-EntraAzureADAlias command. For example:
Import-Module -Name Microsoft.Graph.Entra
Connect-Entra # Replaces Connect-AzureAD for authentication
Enable-EntraAzureADAlias # Enable AzureAD command aliases
Get-AzureADUser -Top 1
Frequently Asked Questions (FAQs)
What is the difference between the Microsoft Graph PowerShell SDK and Microsoft Entra PowerShell modules?
Microsoft Entra PowerShell is a part of our increased investment in the Microsoft Graph PowerShell SDK. It brings high-quality, scenario-optimized Entra resource management to the Microsoft Graph PowerShell SDK while keeping all the SDK’s benefits for authorization, connection management, error handling, and (low-level) API coverage. Because Microsoft Entra PowerShell builds on the Microsoft Graph PowerShell SDK, the two are completely interoperable.
Is the Microsoft Entra PowerShell module compatible with Microsoft Graph PowerShell?
Yes. You don’t need to switch if you’ve already used the Microsoft Graph PowerShell module. Both modules work well together, and whether you use Entra module cmdlets or Microsoft Graph PowerShell SDK cmdlets for Entra resources is a matter of preference.
I need to migrate from the deprecated AzureAD or MSOnline modules. Should I wait for Microsoft Entra PowerShell?
No. One of our goals with Microsoft Entra PowerShell is to help you migrate from Azure AD PowerShell more quickly by running Enable-EntraAzureADAlias. Microsoft Entra PowerShell supports simplified migration for scripts that were using AzureAD PowerShell, with over 98% compatibility. However, the legacy AzureAD and MSOnline PowerShell modules are deprecated and will be retired (stop working) after March 30, 2025. We recommend that you act now to begin migrating your MSOnline and AzureAD PowerShell scripts.
Both modules use the latest Microsoft Graph APIs. For test environments and non-production systems, you can migrate to Microsoft Entra PowerShell. We recommend migrating to this module for production systems only after it reaches general availability. If you migrate scripts to Microsoft Graph PowerShell SDK now, there is no need to update them again with Microsoft Entra PowerShell, as it enhances and will not replace Microsoft Graph PowerShell SDK.
Should I update Microsoft Graph PowerShell scripts to Microsoft Entra PowerShell?
This is not necessary but a matter of preference. Microsoft Entra PowerShell is part of the Microsoft Graph PowerShell solution, and the two modules are interoperable. You can install both modules side-by-side.
Will Microsoft Entra PowerShell add support for more resources in the future?
Yes, it is a long-term investment. We will continue to expand support for more resources and scenarios over time. Expect new cmdlets for Privileged Identity Management (PIM), Entitlement Management, Tenant Configuration settings, per-user multifactor authentication (MFA), and more. We’ll also enhance existing cmdlets with additional parameters, detailed help, and intuitive names. Check out the GitHub repo for ongoing updates.
Will Microsoft Entra PowerShell use a pre-consented app like AzureAD or MSOnline modules?
No. Microsoft Entra PowerShell permissions aren’t preauthorized, and users must request the specific app permissions needed. This ensures that the application has only the necessary permissions, giving you granular control over resource management. For maximum flexibility in application permissions, we recommend using your own application identity with Entra PowerShell. By creating different applications for different uses of PowerShell in your tenant, you can exercise exacting control over the application permissions granted for specific scenarios. To use your own application identity with Microsoft Entra PowerShell, use the Connect-Entra cmdlet:
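For example, a minimal sketch where the client ID and tenant ID values are placeholders for your own app registration:
Connect-Entra -ClientId '<your-application-client-id>' -TenantId '<your-tenant-id>'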
I am new to Microsoft Entra PowerShell; where do I start?
Explore our public documentation to learn how to install the Microsoft Entra PowerShell module, authenticate, discover which cmdlet to use for a particular scenario, read how-to guides, and more. Our best practice guide will help you start on a secure foundation.
How can I provide feedback?
You can provide feedback by visiting our GitHub repository issues section. Create a new issue with your feedback, suggestions, or any problems you’ve encountered. Our team actively monitors and responds to feedback to improve the module.
How can I contribute?
We welcome contributions from the community, whether it’s through submitting bug reports, suggesting new features, or contributing scenario and example improvements. To get started, visit the GitHub repository, check out our contribution guidelines, and create a pull request with your changes.
Learn more about Microsoft Entra PowerShell module
Explore our public documentation to learn how to install the Microsoft Entra PowerShell module, the authentication methods available, which cmdlet to use for a particular scenario, how-to guides, and more.
Try It Today
Try out the new version and let us know what you think on GitHub! Your insights are invaluable as we continue to improve and enhance the module to better meet your needs.
Thank you!
We want to thank all the community members who helped us improve this release by reporting issues on GitHub during the private preview! Please keep them coming!
We’re excited to announce that the migration tool for Active Directory Federation Services (AD FS) customers to move their apps to Microsoft Entra ID is now generally available! Customers can begin updating their identity management with more extensive monitoring and security infrastructure by quickly identifying which applications can be migrated and assessing all their AD FS applications for compatibility.
In November we announced AD FS Application Migration would be moving to public preview, and the response from our partners and customers has been overwhelmingly positive. For some, transitioning to cloud-based security is a daunting task, but the tool has proven to dramatically streamline the process of moving to Microsoft Entra ID.
A simplified workflow, reduced need for manual intervention, and minimized downtime (for applications and end users) have reduced stress for hassle-free migrations. The tool not only checks the compatibility of your applications with Entra ID, but it can also suggest how to resolve any issues. It then monitors the migration progress and reflects the latest changes in your applications. Watch the demo to see the tool in action.
Moving from AD FS to a more agile, responsive, cloud-native solution helps overcome some of the inherent limitations of the old way of managing identities.
In addition to more robust security, organizations count greater visibility and control with a centralized, intuitive admin center and reduced server costs as transformative benefits of moving to modern identity management. Moreover, Entra ID features can help organizations achieve better security and compliance with multifactor authentication (MFA) and conditional access policies—both of which provide a critical foundation for a Zero Trust strategy.
Recently, I received a question about unattended uninstall for SQL Server Express edition. This article describes how to perform this task.
Here are a few things to take into consideration before proceeding:
The user who performs the process must be a local administrator with permissions to log on as a service. You can review more information about required permissions here.
If the machine has the minimum required amount of physical memory, increase the size of the page file to two times the amount of physical memory. Insufficient virtual memory can result in an incomplete removal of SQL Server.
On a system with multiple instances of SQL Server, the SQL Server Browser service is uninstalled only once the last instance of SQL Server is removed. The SQL Server Browser service can be removed manually from Programs and Features in the Control Panel.
Uninstalling SQL Server deletes tempdb data files that were added during the install process. Files with the tempdb_mssql_*.ndf name pattern are deleted if they exist in the system database directory.
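With those considerations in mind, the unattended uninstall itself is performed by running SQL Server setup from an elevated command prompt with the uninstall action and the quiet switch. A minimal sketch, assuming a default SQL Server Express instance named SQLEXPRESS (adjust the feature list, instance name, and path to setup.exe for your environment):
# Run from the SQL Server setup media or the instance's Setup Bootstrap folder
.\setup.exe /Action=Uninstall /FEATURES=SQL /INSTANCENAME=SQLEXPRESS /Q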
As enterprises are asked to manage increasingly complex business processes and data environments, context-aware AI summarization by Copilot in Microsoft Dynamics 365 streamlines operations by synthesizing data from multiple sources across Supply Chain Management, Finance, Commerce, and Human Resources. By delivering clear, actionable insights from ERP data, this generative AI feature eliminates context-switching and allows users to make better decisions faster.
Transformative AI summarization in Dynamics 365
Copilot generative AI features are revolutionizing the user experience in Supply Chain Management, Finance, Commerce, and Human Resources. Insights that used to require literally dozens of clicks, searches, and views in multiple windows—and a lot of deep thinking about complex data—are now presented to the right user, at the right time, automatically. Let’s take a closer look at how Copilot aggregates data from multiple sources and displays it in easily digestible and context-aware summaries.
Vendor summary streamlines understanding of vendor performance and financials
What do we mean by “context-aware”? One meaning is that Copilot summarizes data based on the user’s role to deliver real-time, role-specific insights. Take the vendor summary, for example. Traditionally, procurement managers had to navigate multiple forms to understand vendor performance. Copilot summaries streamline these insights by providing quick access to crucial information, such as active contracts, purchase orders, late deliveries, and overdue payments. For accounts payable teams, however, the vendor summary presents essential financial details about a vendor. For both roles, the vendor summary enables faster, data-driven decisions for better vendor interactions.
Real-time vendor summary helps optimize supplier interactions and negotiations.
Sales order and purchase order summaries pinpoint critical items in open orders
Another perspective on “context-aware” is AI summarization based on task. Consider purchase and sales orders. Procurement and sales teams often spend significant time following up on open orders. Getting a comprehensive overview or pinpointing lines that need attention can be challenging, because the necessary data is typically spread across multiple forms. Copilot summaries consolidate the information, enabling users to easily identify critical items.
Copilot sales order summary highlights potential delivery issues, aiding in efficient order management and customer service.
It’s not just about summarizing data, though. AI summarization also facilitates quicker action on next steps. Copilot’s summary includes convenient one-click filtering options, allowing users to swiftly access the information they need to act.
One-click filtering options right in the Copilot summary help users quickly find the information they need.
Customer summary streamlines insights by role for more effective customer relationships
When it comes to customer information, “context-aware” refers to everything that creates a relationship between an organization and its customers—information that’s often found in multiple, disparate tables, reports, and modules. Copilot addresses the challenges faced by roles such as accounts receivable agents, sales order agents, and customer account managers, who need comprehensive and role-specific information about customers that’s often scattered across multiple systems. For example, while accounts receivable teams need quick access to open invoices, sales order teams require details on open orders and shipments. Copilot consolidates all relevant data into a single, context-aware summary that’s specific to each role, allowing agents and account managers to tailor their interactions with customers, strengthen relationships, and enhance operational efficiency.
Customer events, statuses, and insights are summarized in Dynamics 365 Finance.
Warehouse worker home screen brings warehouse teams up to speed quickly
“Context-aware” can also refer to a user’s surroundings and situation. Warehouse start-of-shift stand-up meetings can miss important updates, and they don’t cover changes that happen throughout the day. Copilot’s dynamic operational summary on the Warehouse Management home screen brings warehouse workers up to speed at the start of their shifts and keeps them on top of the situation as they go about their day, helping them quickly adapt to changes and ensure daily goals are met.
Warehouse workers get up to speed fast at the start of their shifts with a dynamic overview in Warehouse Management.
Workflow history summary streamlines review and approval of invoices and expense reports
AI summarization streamlines examination of workflows by providing a concise overview of recent actions and comments, allowing approvers to quickly act without navigating through separate detail screens. Copilot summaries apply to workflows in Dynamics 365 Supply Chain Management, Finance, Commerce, and Human Resources, aiding review and approval processes and supporting informed decisions for things like vendor invoices, time-off requests, and expense reports.
Workflow history summaries help stakeholders make informed decisions about future activities.
Product preview summary consolidates product details for quick consumption
Procurement managers typically must navigate multiple forms to gather product details such as name, description, dimensions, hierarchy, life cycle state, and release policy. Copilot consolidates this information and other key product attributes in a single, concise summary, making these details quick and easy to consume.
Copilot aggregates and summarizes product information based on the user’s role.
When a warehouse manager views the product detail page, Copilot’s summary focuses on relevant information that would take multiple clicks to find, such as on-hand inventory levels, purchase information like main vendor, and batch numbers that are expiring soon.
The product detail summary includes information about stock on hand and recent sales.
Employee workspace summary makes leave management easier for both HR and employees
An organization’s success relies on both employees and customers. Effective time-off management is crucial for employees to make informed decisions and for the organization to optimize time-off utilization and manage financial liabilities from unused leave. Time-off information is scattered across multiple screens in the employee self-service portal. Copilot consolidates key details like vacation and sick leave balances and potential forfeitures due to policy, and includes a link to submit leave requests, all in one summary view.
Employees can view available leave and request time off right in the Copilot summary.
Retail statement summary provides insights about risky transactions across multiple stores
Physical stores send cash-and-carry transactions to Dynamics 365 Commerce for inventory and financial updates. The store operations team must ensure proper posting, but identifying pending transactions can be difficult across multiple stores. Summaries of posted and unposted retail statements highlight stores needing attention and flag risky transactions like returns without receipts or price overrides. Brief error summaries for failed statements aid in quick resolution, enhancing store management efficiency.
Copilot summarizes retail transaction errors in Dynamics 365 Commerce.
For retail merchandisers, the challenge lies in managing complex product configurations without errors. Copilot addresses this challenge by streamlining merchandising workflows, offering a clear summary of settings, automating data validation, and providing a risk preview to anticipate issues. Here, context-aware AI summarization enhances efficiency, reduces the risk of lost sales, and drives growth.
Merchandise workflow summary aids management of retail merchandise.
More benefits of context-aware AI summarization of ERP data
Beyond the specific benefits we described earlier, Copilot summaries in Dynamics 365 Supply Chain Management, Finance, Commerce, and Human Resources enhance user experience and operational efficiency in multiple ways.
Enhanced productivity: With key data points automatically summarized, users spend less time analyzing vast datasets and can focus on strategic decision-making and core activities.
Proactive problem-solving: With real-time summaries, users can anticipate challenges and address them proactively, improving business agility and resilience.
Improved accuracy and insight: Copilot highlights critical information and trends, reducing the risk of human error in interpreting complex data. Analysis is more accurate and insightful, crucial for effective decision-making.
Customized user experiences: Each summarization feature is tailored to the specific needs of different roles within an organization, ensuring that every user receives the most relevant and actionable insights.
Seamless integration: AI features integrate seamlessly into your existing Dynamics 365 framework, providing a smooth user experience without the need for extensive setup or training.
Scalable decision support: Whether for small tasks or large-scale strategic decisions, Copilot summaries meet the needs of businesses of all sizes, scenarios, and requirements.
These benefits collectively contribute to a more streamlined, efficient, and informed ERP environment, setting the stage for more advanced AI features to come.
Introducing generative AI responsibly
Integrating generative AI into ERP products presents challenges. It requires ensuring that the AI features are reliable and robust enough for mission-critical business settings. It also requires building customer trust in the AI capabilities. Our vision is an autonomous ERP system that automates and optimizes business processes with minimal human intervention. However, this is a journey we’re embarking on together to instill confidence in the results and encourage greater adoption over time.
Our approach is to gradually introduce low-risk AI features that provide immediate benefits and time savings, gather user feedback, and build excitement. This way, we can improve the AI features based on user needs and business operations, laying the foundation for more advanced AI features in the future. We prioritize the safe deployment and continuous improvement of AI features in our ERP suite and are leading the way for responsible and impactful integration of AI in the ERP landscape.
Ensuring the ethical use of AI technology
Microsoft is committed to the ethical deployment of AI technologies. Through our Responsible AI practices, we ensure that all AI-powered features in Dynamics 365 adhere to stringent data privacy laws and ethical AI usage standards, promoting transparency, fairness, and accountability.
Learn more about AI summarization in Dynamics 365
Interested in learning more about the power of AI summarization to transform your business processes with unparalleled efficiency and insight? Here’s how you can dive deeper.
In this post, we’ll cover some details on how to track the lifecycle of a SharePoint Site in the Microsoft Graph Data Connect (MGDC), using the date columns in the SharePoint Site dataset. If you’re not familiar with MGDC for SharePoint, start with https://aka.ms/SharePointData.
All Dates in the Sites Dataset
One of the most common scenarios in MGDC for SharePoint is tracking the lifecycle of a site, which includes understanding when the site was created, how it grows over time, when it stops growing and when it becomes inactive or abandoned.
The SharePoint Sites dataset includes several columns that can be used to understand the site lifecycle in general. For instance, here are the datetime columns available:
Site Created
Creation date is straightforward. There is a column (CreatedTime) with the date when the site was created. As with all other dates, it uses the UTC time zone.
Last Modified
In the Sites dataset, you also have the date and time when any items under the root web were last modified (RootWeb.LastItemModifiedDate). This includes the last time when files were created, updated or deleted. This is a great indication that the Site is still in active use.
You also have the date the site security was last modified (LastSecurityModifiedDate). This shows when permissions were granted, updated or revoked. That includes permissions granted through the manage access interface and permissions granted through sharing links.
Last Accessed
Last access is available at the site level (LastUserAccessDate). This shows when an item in the site was last accessed (this includes simply reading the file). This is an important indicator to help understand when the site is becoming inactive or abandoned.
Note that, while there is an effort to identify only access performed directly by users, this date might also include automated actions by applications, including internal SharePoint applications.
Snapshot Date
Please note that there is one more date (SnapshotDate), but that one is not relevant to the site lifecycle. The snapshot date simply tracks when the data was retrieved by MGDC.
File Actions
Besides what’s captured in these datetime columns in the Sites dataset, you also have the option to capture detailed file activity in the site using the SharePoint File Actions dataset and accumulate those actions over time.
Keep in mind that MGDC for SharePoint only keeps actions for the last 21 days for compliance reasons. More specifically, you can get file actions between today minus 2 days and today minus 23 days. For instance, if today is June 30th, you can get file actions between June 8th and June 28th.
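For example, a quick PowerShell sketch that computes the currently valid extraction window:
# File Actions can only be extracted for dates between today-23 and today-2 (UTC)
$today = (Get-Date).ToUniversalTime().Date
"Valid File Actions window: {0:yyyy-MM-dd} to {1:yyyy-MM-dd}" -f $today.AddDays(-23), $today.AddDays(-2)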
If you query this information daily, you could build a longer history of file actions over time. For instance, you could keep the last 90 days of data from the File Actions dataset. With that you could find recent access or otherwise say “no access in the last 90 days”.
This would also let you know more details about recent file activities, like who last accessed the site, which file or extension was last accessed, what was the last action, etc. You need to decide if you can rely solely on the date columns provided in the Sites dataset or if it is useful to keep these additional details.
Please do check with the compliance team in your company to make sure there are no restrictions on keeping this information for longer periods of time in your country. There might be regulatory restrictions on how long you can keep this type of personally identifiable information.
Calculated Columns
Keep in mind that these dates use a datetime data type, so grouping by one of them can sometimes be a challenge. If you’re using Power BI, you can show them as a date hierarchy and get a summary by year, quarter, month or day.
It might also be useful to create calculated columns to help with grouping and visualization. For instance, you can create a new date column (without the time portion) for daily summaries. Here’s how to calculate that in Power BI:
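For example, a simple DAX calculated column along these lines strips the time portion; the Sites table and CreatedTime column names here are illustrative:
CreatedDate = DATE ( YEAR ( Sites[CreatedTime] ), MONTH ( Sites[CreatedTime] ), DAY ( Sites[CreatedTime] ) )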
I hope this clarifies what is available in MGDC for SharePoint to track the lifecycle of a SharePoint site.
Let us know in the comments if you think we should consider additional lifecycle information.
For further details about the schema of all SharePoint datasets in MGDC, including SharePoint Sites and SharePoint File Actions, see https://aka.ms/SharePointDatasets.
Azure SQL Managed Instance with Zone Redundancy to begin billing in Italy North, Israel Central, and West Europe
Microsoft Azure continues to enhance its services, ensuring that customers have access to the latest innovations and features. The latest update is particularly exciting for businesses operating in Italy, Israel, and West Europe: Azure SQL Managed Instance with Zone Redundancy is available in these regions and starts billing for all configurations.
What is Azure SQL Managed Instance?
Azure SQL Managed Instance is a fully managed database service that offers the best of SQL Server with the operational and financial benefits of an intelligent, fully managed service. It provides near-100% compatibility with the latest SQL Server (Enterprise Edition) database engine, which makes it easy to migrate your SQL Server databases to Azure without changing your apps.
Understanding Zone Redundancy
Zone Redundancy is a feature designed to improve the availability and resilience of your database instances. In the context of Azure SQL Managed Instance, Zone Redundancy means that your instances are replicated across multiple availability zones within a region. Each availability zone is a physically separate location with independent power, cooling, and networking. This separation ensures that even in the event of a data center outage, your database remains available and operational.
Benefits of Zone Redundancy
1. Increased Resilience
By replicating your data across multiple zones, you safeguard your applications from data center failures. This redundancy minimizes the risk of downtime and ensures that your critical business applications remain online, providing a more reliable service to your users.
2. Improved Business Continuity
With Zone Redundancy, you can achieve higher availability SLAs. For many businesses, this means meeting stringent uptime requirements and maintaining customer trust by ensuring their services are always available.
3. Cost Efficiency
While Zone Redundancy does come with additional costs, the benefits of reduced downtime and the potential financial impact of data loss often outweigh these expenses. In essence, investing in Zone Redundancy can save your business money in the long run by avoiding costly downtime and data recovery efforts.
How to Enable Zone Redundancy
Enabling Zone Redundancy for your Azure SQL Managed Instance is straightforward:
During Instance Creation:
When creating a new managed instance, you can specify Zone Redundancy in the configuration options.
For Existing Instances:
If you have an existing instance, you can modify its settings to enable Zone Redundancy from the Configure blade.
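For scripted environments, here is a minimal Azure PowerShell sketch for the existing-instance path. This is a sketch under the assumption that your installed Az.Sql module version exposes the -ZoneRedundant switch on Set-AzSqlInstance; the instance and resource group names are placeholders.
# Assumption: your Az.Sql version supports -ZoneRedundant on Set-AzSqlInstance
Set-AzSqlInstance -Name 'myManagedInstance' -ResourceGroupName 'myResourceGroup' -ZoneRedundant -Force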
Introduction
Welcome to Part 2 of our exploration into generative AI, where we delve deeper into the practical applications and creative potential of this innovative technology.
This article highlights concrete examples from student projects in the ‘Prompt Engineering’ course at Fondazione Bruno Kessler (FBK) in Trento, Italy. The aim is to showcase how students leveraged generative AI in unique ways. In particular, we’ll focus on two fascinating projects, “Generative Music” and “Personal Chef,” which exemplify the versatility and impact of generative AI in diverse fields.
A core element of these projects is the use of a structured framework known as the Card Model to define and organize generative AI tasks. In the context of this course, a card refers to a structured format or template used to define a specific task or objective for generating content or output using generative AI techniques. The flow of these cards, meaning the logical sequence and interaction between them, is crucial for the coherent generation of complex outputs. For a detailed explanation of the Card and Flow concepts, read the first part of this blog series.
Our students have been actively experimenting with generative AI, producing remarkable results in their projects. Here, we present detailed insights and experiences from their hands-on work, demonstrating the practical applications of prompt engineering with non-tech students.
Generative Music
The “Generative Music” project leverages generative AI technology to innovate the music creation process. Central to this project is the use of Generative AI Cards that define various musical parameters and guide the AI in generating unique compositions. Generative AI Cards specify key musical elements such as genre, number of chords, melody length, key, and instrumentation, including bass and guitar (Fig. 1). Each card represents a distinct aspect of the music, allowing for precise control over the generated content. By configuring these cards, the team can tailor the AI’s output to meet specific creative goals.
Card Configuration
The process begins with the selection and configuration of these cards. Initial configurations often require multiple iterations to achieve satisfactory results. Each card’s parameters are adjusted to optimize the music generation, focusing on refining the elements to create a harmonious and appealing output.
Fig 1: Example of Cards from the Music Project.
Flow Generation
Flow generation involves the structured combination of these AI Cards to produce a coherent piece of music. This stage is crucial as it dictates the sequence and interaction of different musical components defined by the cards. The project utilizes tools like Canva to aid in visualizing and organizing the flow of these components, ensuring a smooth and logical progression in the music. During the flow generation process, the team experimented with the order of the AI Cards to explore different musical outcomes. However, they found that altering the sequence did not significantly affect the final output, indicating that the cards’ individual configurations are more critical than their order.
Iterative Refinement and Human Interaction
A significant aspect of the project is the iterative refinement process, where generated music undergoes multiple evaluations and adjustments. Human intervention is essential at this stage to validate the quality of the output. Listening to the music is the primary method for assessing its adequacy, as human judgment is necessary to determine whether the AI’s creation meets the desired standards. The team continuously modifies the prompts and configurations of the AI Cards based on feedback, refining the generative process to improve the music quality. This iterative cycle of generation, evaluation, and adjustment ensures that the final product aligns with the creative vision (Fig. 2).
The “Generative Music” project demonstrates the potential of generative AI in the field of music creation. By using Generative AI Cards and structured flow generation, the project showcases a methodical approach to producing unique musical compositions. Despite the need for substantial human involvement in the refinement process, this innovative use of AI represents a significant step forward in integrating technology with artistic creativity.
Personal Chef
The “Personal Chef” project utilizes generative AI to assist individuals in planning balanced meals efficiently. The primary goal of this project is to save time and resources, enhance creativity, and provide valuable insights for meal planning. Generative AI Cards are central to this project, serving as modular components that define specific meal planning parameters. Each card encapsulates different aspects of meal creation, such as the type of dish (e.g., balanced dish, vegetarian alternative), the ingredients required, and the nutritional composition (Fig. 3). These cards help in structuring the meal planning process by providing detailed instructions and alternatives based on user preferences and dietary needs.
Card Configuration
For instance, one AI Card might focus on generating a list of high-protein foods, while another might ensure the meal components are seasonal. These cards are iteratively refined based on user feedback to ensure they deliver precise and relevant outputs. The language used in these cards is carefully chosen, as even small changes can significantly impact the results. The feedback loop is crucial here, as it allows continuous improvement and ensures that the AI provides more accurate and context-specific suggestions over time.
Fig 3: Example of Cards from the Personal Chef Project.
Flow Generation
Flow generation in this project involves the logical sequencing and combination of Generative AI Cards to create coherent and balanced meal plans. This process ensures that the output not only meets nutritional guidelines but also aligns with the user’s preferences and constraints. The flow of these cards is designed to cover various stages of meal planning, from selecting ingredients to proposing complete dishes (Fig. 4). For example, a flow might start with an AI Card that provides a balanced dish recipe, followed by another card that suggests a vegetarian alternative, and then a card that customizes the dish based on seasonal ingredients. This structured approach ensures a logical progression and maintains the relevance and coherence of the meal plans.
Fig 4: Example of Flows from the Personal Chef Project.
Iterative Refinement and Human Interaction
The project emphasizes the importance of human feedback in refining the AI-generated outputs. Users can interact with the system to customize the generated meal plans, adding or removing ingredients as needed. This iterative process ensures that the AI’s suggestions remain practical and tailored to individual preferences and dietary requirements. By continuously incorporating user feedback, the project aims to enhance the precision and utility of the Generative AI Cards, ultimately making the meal planning process more efficient and enjoyable.
Lessons Learnt
The “Personal Chef” project showcases how generative AI can be leveraged to support everyday tasks like meal planning. The use of Generative AI Cards allows for a modular and flexible approach, enabling users to create personalized and balanced meal plans. While the AI can provide valuable insights and save time, human interaction remains essential to validate and refine the outputs, ensuring they meet the users’ specific needs and preferences. This integration of AI and human expertise represents a significant advancement in making daily routines more manageable and creative.
Students Survey Results
The class consisted of 11 students (average age 23, 5 female) from various university faculties, namely Psychology, Cognitive Science, and Human-Computer Interaction.
As noted, it was a class of non-tech students. Indeed, most of them (7 out of 11) stated that they rarely (1-4 times in the last month) used tools such as ChatGPT. Only one student stated that he/she regularly (every day or almost every day) used these tools, both for study-related and unrelated purposes. One student admitted that he/she was not familiar with ChatGPT, having only heard about it but never used it.
To investigate the students’ knowledge of GenAI and its potential, and to assess whether the course increased their knowledge of and ability to use GenAI, we administered a questionnaire at the beginning and end of the course and then compared the results.
The questionnaire consists of 50 items taken from existing surveys [1,2] investigating various dimensions concerning AI in general. These included: (a) AI Literacy, based on the level of knowledge, understanding and ability to use AI, (b) Anxiety, related to the fear of not being able to learn how to use AI correctly, as well as of losing one’s reasoning and control abilities, and (c) Self-Efficacy, related to confidence both in one’s technical capabilities and in AI as good aid to learning.
As the graph below shows (Fig. 5), comparing the answers given by the students before and after the course reveals that, on average, the students increased their literacy and self-efficacy and decreased their anxiety.
Fig. 5: Average scores of students’ literacy, anxiety and self-efficacy gathered at the beginning and the end of the Prompt Engineering course for non-tech students.
Furthermore, at the end of the course we asked the students to answer 10 additional questions aimed at gathering their feedback specifically about Generative AI. In particular, we asked them to score the following statements using a 7-point Likert scale, where 1 means “strongly disagree” and 7 means “strongly agree”:
I increased my knowledge and understanding of GenAI
I can effectively use GenAI
When interacting with technology, I am now aware of the possible use of GenAI
I am aware of the ethical implications when using AI-based applications
Taking a class on Prompt Engineering for Generative AI made me anxious
I am afraid that by using AI systems I will become lazy and lose some of my reasoning skills
AI malfunctioning can cause many problems
If used appropriately, GenAI is a valuable learning support
When using GenAI, I feel comfortable
I significantly increased my technological skills
Fig. 6 presents the average score and standard deviation of each item. As evident from the graph, after the course the students recognised the value of GenAI as a valuable tool to support learning (item 8). They also showed greater awareness of the possible uses of GenAI (item 3) and the ethical implications of such uses (item 4), while reporting a low level of anxiety about attending the course (item 5) and a low level of fear of losing reasoning skills (item 6).
Fig. 6: Average scores on a 7-point Likert scale given by non-tech students at the end of the Prompt Engineering course.
Conclusions
Generative AI is widely used in higher education and skills training. Articles like [3] highlight its benefits for productivity and efficiency, alongside concerns about overdependence and superficial learning. In Part 2 of our blog series, we delved into the practical applications and creative potential of this innovative technology. Through projects like “Generative Music” and “Personal Chef,” our students demonstrated the versatility and impact of generative AI across diverse fields. Central to these projects was the structured framework known as the Card Model and the flow of the identified cards, which helped define and organize generative AI tasks.
The course significantly enhanced students’ understanding of prompt engineering, reducing their anxiety and increasing their self-efficacy. Survey results indicated improved AI literacy and decreased anxiety, with students feeling more confident in their technical abilities and recognizing the value of generative AI as a learning tool. Utilizing atomic cards to define and organize generative AI tasks facilitated the learning process. This structured approach allowed students to better grasp and control various aspects of content generation. In the “Generative Music” and “Personal Chef” projects, the cards provided a flexible and modular framework, enabling iterative refinement and improved output quality.
Looking ahead, future developments could further enhance the effectiveness of teaching generative AI. Developing specific tools and editors for configuring prompts could simplify the process, making it more intuitive for students. Establishing standard guidelines and metrics for evaluating generative outputs could provide more structured feedback, improving the learning process. Additionally, expanding the course content to include a broader range of diverse and complex case studies could help students explore more generative AI applications, deepening their understanding and innovative capabilities.
These advancements would not only improve the teaching of generative AI but also promote greater integration of technology and creativity, better preparing students for their future professional career.
Antonio Bucchiarone
Motivational Digital System (MoDiS)
Fondazione Bruno Kessler (FBK), Trento – Italy
Nadia Mana
Intelligent Interfaces and Interaction (i3)
Fondazione Bruno Kessler (FBK), Trento – Italy
References
[1] Schiavo, Gianluca and Businaro, Stefano and Zancanaro, Massimo. Comprehension, Apprehension, and Acceptance: Understanding the Influence of Literacy and Anxiety on Acceptance of Artificial Intelligence. Available at SSRN: https://ssrn.com/abstract=4668256.
[2] Wang, Y. M., Wei, C. L., Lin, H. H., Wang, S. C., & Wang, Y. S. (2022). What drives students’ AI learning behavior: a perspective of AI anxiety. Interactive Learning Environments, 1–17. https://doi.org/10.1080/10494820.2022.2153147
[3] Hadi Mogavi, Reza and Deng, Chao and Juho Kim, Justin and Zhou, Pengyuan and D. Kwon, Young and Hosny Saleh Metwally, Ahmed and Tlili, Ahmed and Bassanelli, Simone and Bucchiarone, Antonio and Gujar, Sujit and Nacke, Lennart E. and Hui, Pan. ChatGPT in education: A blessing or a curse? A qualitative study exploring early adopters’ utilization and perceptions. Computers in Human Behavior: Artificial Humans, Vol. 2, N. 1, 2024. https://doi.org/10.1016/j.chbah.2023.100027
As spring turns to summer, and sprouts appear in my garden, I think about all the preparation I’ve made for their environment—turning the soil, setting up the watering system, adding peat moss—and know that the yield will be greater and the harvest better. Such is the case with Microsoft Intune. As we continue to enhance the capabilities, each one an investment, the cumulative result is a richer and more robust management experience. Below, we highlight some of the newest features.
New troubleshooting tool for mobile devices
Part of diagnosing an issue is not only defining what is wrong but also what is not wrong. Our customers asked for a simple way to temporarily remove apps and configurations from a device managed in Microsoft Intune as part of the troubleshooting process. The result is a feature we call Remove apps and configuration (RAC). Before RAC, removing settings involved excluding devices from policy assignments or removing users from groups, and then waiting for devices to check in. After diagnosing the device, those assignments and group memberships would need to be restored one by one. Now, RAC affords a set of useful troubleshooting steps:
Real-time monitoring of which policies and apps are removed/restored
Selective restore of individual apps and policies
Temporary removal of apps and policies with an automated restore in 8 to 24 hours
Policy assignments and group membership remain unchanged
This initial release will be distributed in early July. It will support iOS/iPadOS and Android corporate-owned devices, and it will be available to GCC, GCC High, and DoD environments on release. For more information on this tool, follow the update on the Microsoft 365 roadmap.
Windows enrollment attestation preview is here
Last month we talked about device attestation capabilities coming to Intune. This month, the public preview of Windows enrollment attestation is starting to roll out with the new reporting and Device Attest action.
This feature builds on attestation by applying it to enrollment. Applicable Windows devices have their enrollment credentials stored in the device hardware, in this case Trusted Platform Module (TPM) 2.0, and Intune can then attest to this storage—meaning that the device enrolled securely. Successful devices show as Completed in the report. Devices that are Not Started or Failed can be retried using the new Device Attest action at the top of the report. This will be available in public preview by the end of June.
Screenshot of the preview of the device attestation status report in the Intune admin center listing the name, ID, and primary UPN of a device that failed device attestation.
Stay up to date on the release of this capability through the public Microsoft 365 roadmap.
More granular endpoint security access controls
Role-based access control (RBAC) enhances organizations’ ability to configure access to specific workloads while maintaining a robust security posture. Our customers asked for even more granular controls to help scope security work across geographic areas, business units, or different teams to only relevant information and features. In this latest release, we are adding specific permission sets to enable more flexibility in creating custom roles for:
Endpoint detection and response
Application control
Attack surface reduction
We plan to have new permission sets for all endpoint security workloads in the future.
We know that many of our customers use a custom role with the Security baselines permission to manage security workloads, so we are automatically adding the new permissions to this role. This way, no permissions will be lost for existing users. New custom roles that are granted the Security baselines permission will not include the new permissions by default; for those roles, the Security baselines permission covers only the workloads that don't yet have specific permission sets.
This update also applies to customers using the Microsoft Defender console to manage security policies, and it is available in GCC, GCC High, and DoD environments. Read more about granular RBAC permissions.
So much of what we do is a direct result of customer feedback. Please join our community, visit the Microsoft Feedback portal, or leave a comment on this post. We value all your input, so please share it, especially after working with these exciting new capabilities.
The latest edition of Sync Up is live now in your favorite podcast app! This month, Arvind Mishra and I explored all the announcements and energy of the Microsoft 365 Community Conference! Even better, we had on-site interviews with a host of OneDrive experts!
Mark Kashman (you read that right, the Mark Kashman, of Intrazone!) talked about our experiences at the conference, including the Age of Copilots, the magic of in-person conferences, and some of the amazing SharePoint-related features that were shown off!
Lesha Bhansali shared how OneDrive is available everywhere, from Windows to Mac to Teams to Outlook, making it easy for your users to be productive inside the apps that they already use!
Carter Green talked about the latest improvements we’re bringing to the OneDrive desktop app, including colored folders and Sync health report exports!
Vishal Lodha made a second consecutive appearance on Sync Up to talk about the amazing customer interactions we had, including an amazing panel where customers shared their experiences with OneDrive and how they’ve unlocked that power for their users!
We wanted to provide you with an important update to the deprecation schedule for the two Admin Audit Log cmdlets, as part of our ongoing commitment to improve security and compliance capabilities within our services. The two Admin Audit Log cmdlets are:
Search-AdminAuditLog
New-AdminAuditLogSearch
As communicated in a previous blog post, the deprecation of Admin Audit Log (AAL) and Mailbox Audit Log (MAL) cmdlets was initially planned to occur simultaneously on April 30th, 2024. However, to ensure a smooth transition and to accommodate the feedback from our community, we have revised the deprecation timeline.
We would like to inform you that the Admin Audit Log cmdlets will now be deprecated separately from the Mailbox Audit Log cmdlets, with the final date set for September 15, 2024.
This change allows for a more phased approach, giving you additional time to adapt your processes to the new Unified Audit Log (UAL) cmdlets, which offer enhanced functionality and a more unified experience.
What This Means for You
The Admin Audit Log cmdlets will be deprecated on September 15, 2024.
The Mailbox Audit Log cmdlets will have a separate deprecation date, which will be announced early next year.
We encourage customers to begin transitioning to the Unified Audit Log (UAL) cmdlet, Search-UnifiedAuditLog, as soon as possible. Alternatively, you can explore using the Audit Search Graph API, which is currently in Public Preview and is expected to become Generally Available by early July 2024.
Next Steps
If you are currently using any one or both of the above-mentioned Admin Audit Log cmdlets, you will need to take the following actions before September 15, 2024:
For Search-AdminAuditLog, you will need to replace it with Search-UnifiedAuditLog in your scripts or commands. To get the same results as Search-AdminAuditLog, you will need to set the RecordType parameter to ExchangeAdmin. For example, if you want to search for all Exchange admin actions in the last 30 days, you can use the following command:
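# Search the unified audit log for Exchange admin actions in the last 30 days
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) -RecordType ExchangeAdmin -ResultSize 5000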
For New-AdminAuditLogSearch, you will need to use the Microsoft Purview Compliance Portal to download your audit log report. The portal allows you to specify the criteria for your audit log search, such as date range, record type, user, and action. You can also choose to receive the report by email or download it directly from the portal. You can access the portal from the Microsoft Purview home page. More details on using the compliance portal for audit log searches can be found in the documentation.
Differences between UAL and AAL cmdlets
As you move from the AAL to the UAL cmdlets, you may notice some minor changes between them. In this section, we highlight some important differences between the input and output of the UAL cmdlet and those of the AAL cmdlets.
Input Parameter Differences
Admin Audit Log (AAL) cmdlets include certain parameters that are not directly available in the Unified Audit Log (UAL) cmdlets. However, we have identified suitable alternatives for most of them within the UAL that will allow you to achieve similar functionality.
Below are the four parameters that are supported in AAL and their alternatives in UAL (where present).
The “Cmdlets” parameter in AAL can be substituted with the “Operations” parameter in UAL. This will allow you to filter audit records based on the operations performed.
While UAL does not have a direct “ExternalAccess” parameter, you can use the “FreeText” parameter to filter for external access by including relevant keywords and terms associated with external user activities.
The “IsSuccess” property was always True in AAL because only the logs that succeeded were returned, so using or not using this parameter made no difference in the returned result set. Therefore, this property is no longer supported in the Search-UnifiedAuditLog cmdlet.
In AAL, you can use the “StartIndex” parameter to pick the starting index for the results. UAL doesn’t support this parameter. Instead, you can use the pagination feature of the Search-UnifiedAuditLog cmdlet to get a specific number of objects with the SessionId, SessionCommand, and ResultSize parameters.
Please Note: The SessionId that is returned in the output of Search-AdminAuditLog is a system set value and the SessionId that is passed as an input along with the Search-UnifiedAuditLog cmdlet is User set value. This parameter may have the same name but perform different functions for each cmdlet.
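For example, you could page through a large result set like this (the session id value is arbitrary, and the seven-day window is just an illustration):
# Repeat this same command to fetch subsequent pages until no more results are returned
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) -SessionId 'AALMigration01' -SessionCommand ReturnLargeSet -ResultSize 1000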
Output Differences
There are differences in how the audit log output is displayed by the AAL and UAL cmdlets. UAL returns an enhanced set of results, with enriched properties in JSON format. In this section, we point out a few major differences that should ease your migration journey.
Property in AAL        Equivalent Property in UAL
CmdletName             Operations
ObjectModified         Object Id
Caller                 UserId
Parameters             AuditData > Parameters (all the parameters and the values passed are present as JSON)
ModifiedProperties     AuditData > ModifiedProperties (modified values are present only when verbose mode is enabled using the Set-AdminAuditLogConfig cmdlet)
ExternalAccess         AuditData > ExternalAccess
RunDate                CreationDate
We are here to help
We are committed to providing you with the best tools and services to manage your Exchange Online environment, and we welcome your questions and feedback about this change. Please feel free to contact us through a comment on this blog post or by email at AdminAuditLogDeprecation[at]service.microsoft.com. We are always happy to hear from you and assist in any way we can.