Gartner recognizes Microsoft as Leader in Unified Communications as a Service and Meetings Solutions

This article is contributed. See the original author and article here.

Hybrid work has become a part of our reality, and we’ve had to rethink where, when, and how we work. Navigating this new normal is both challenging and uncomfortable at times.

The post Gartner recognizes Microsoft as Leader in Unified Communications as a Service and Meetings Solutions appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

How to copy an Azure SQL Database to a different server or subscription using Azure Automation

This article is contributed. See the original author and article here.

In this article we will demonstrate how to copy an Azure SQL DB to a different server or subscription using Azure Automation.


This article will explain the following:


1. The requirements for Azure SQL DB and how to fulfill them.


2. How to create an Azure Automation account and add a runbook to execute the copy operation.


 


 


1. Requirements for Azure SQL DB



  • Use a login that has the same name and password as the database owner of the source database on the source server. The login on the target server must also be a member of the dbmanager role, or be the server administrator login.

  • The steps below can be used to create the login on the source and target servers.


 


 

--Step# 1
--Create the login and user in the master database of the SOURCE server.

CREATE LOGIN loginname WITH PASSWORD = 'xxxxxxxxx';
GO
CREATE USER [loginname] FOR LOGIN [loginname] WITH DEFAULT_SCHEMA = [dbo];
GO
ALTER ROLE dbmanager ADD MEMBER loginname;
GO

--Step# 2
--Create the user in the source database and grant it db_owner membership.

CREATE USER [loginname] FOR LOGIN [loginname] WITH DEFAULT_SCHEMA = [dbo];
GO
ALTER ROLE db_owner ADD MEMBER loginname;
GO

--Step# 3
--Capture the SID of the user "loginname" from the master database.

SELECT [sid] FROM sysusers WHERE [name] = 'loginname';

--Step# 4
--Connect to the DESTINATION server and create the same login and user in its
--master database, reusing the SID captured in step 3.

CREATE LOGIN loginname WITH PASSWORD = 'xxxxxxxxx', SID = [SID of loginname login on source server];
GO
CREATE USER [loginname] FOR LOGIN [loginname] WITH DEFAULT_SCHEMA = [dbo];
GO
ALTER ROLE dbmanager ADD MEMBER loginname;
GO


With the script above, we have created an identical login on the source and target servers, which we will use in the Automation account to execute the copy.


 


2. Create an Azure Automation account and add a runbook



  1. Create an Automation account: From the Azure portal, go to Automation Accounts and create a new account. If you already have an Automation account, you can skip this step and use your existing account instead.
     

     



     



  2. Import the SqlServer module: Go to Modules in your Automation account and click Modules gallery. Search for sqlserver, select the module, and import it from the next screen.

     


                                                                                                                                                                    

  3. Create a credential to access Azure SQL DB: Now we need a credential that allows the Automation account to connect to the target SQL server and execute the copy operation. This must be the login we created in the steps above, which has access to both the source and destination servers. From the Automation account, click Credentials and provide a name for the credential, the password, and the password confirmation.

     



     


                                                                                                                                                               

  4. Create a variable for the Azure SQL DB server name: Create a variable that we will use to connect to the target SQL server, as the copy operation should be executed from the target server. The value of this variable should be the FQDN of the target server (servername.database.windows.net):

     


                                                                                                                                                             

  5. Create the runbook: Click Runbooks in the Automation account. On the create runbook screen, type a name for the runbook and choose PowerShell as the runbook type.

     


                                                                                                                                                               

  6. When the creation is completed, click on the created runbook, click Edit, and enter the sample code below (you can use the test pane to test the runbook and click Save to save it). A sketch for monitoring the copy's progress appears at the end of this section.

     $Cred = Get-AutomationPSCredential -Name "name_of_created_credential"

     $Server_Name = Get-AutomationVariable -Name "name_of_created_variable"

     $Query = "CREATE DATABASE copy13 AS COPY OF [source_server_name].[source_DB_name];"

     Invoke-Sqlcmd -ServerInstance $Server_Name -Credential $Cred -Query $Query -Encrypt

                                                                                                                                                              


  7. If you want to run this runbook on a schedule, you need to create a schedule and link the runbook to it. You can do this in the portal, or script it as shown in the sketch below.
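
If you prefer to script the schedule rather than click through the portal, here is a minimal sketch using the Az.Automation cmdlets. The resource group, account, runbook, and schedule names are placeholders for your own values.

# Create a daily schedule in the Automation account (names are placeholders).
New-AzAutomationSchedule `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -Name "DailyDbCopy" `
    -StartTime (Get-Date).AddHours(1) `
    -DayInterval 1

# Link the runbook to the schedule so it runs on that cadence.
Register-AzAutomationScheduledRunbook `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -RunbookName "Copy-AzureSqlDb" `
    -ScheduleName "DailyDbCopy"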


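As promised in step 6, here is a sketch for monitoring the copy's progress. The copy runs asynchronously, and you can track it from the master database of the target server via the sys.dm_database_copies view. These lines assume they run inside a runbook in the same Automation account, reusing the credential and variable created above.

# Check the progress of an in-flight database copy (run against the target server).
$Cred = Get-AutomationPSCredential -Name "name_of_created_credential"
$Server_Name = Get-AutomationVariable -Name "name_of_created_variable"
$StatusQuery = "SELECT database_id, start_date, percent_complete, error_desc FROM sys.dm_database_copies;"
Invoke-Sqlcmd -ServerInstance $Server_Name -Database "master" -Credential $Cred -Query $StatusQuery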


 


 


 


 

How 3 makers, 2 devs and a princess came together to save kittens for a hackathon

This article is contributed. See the original author and article here.

 

 


[Image: solution overview]



 


The Story (Michael)


 


Just before Southcoast Summit 2021 got started, the organizers hosted the Automate Everything – SS2021 Hackathon, where every solution revolved around Flic buttons. Wait, you don’t know what a Flic button is? It’s basically a wireless smart button that lets you control devices, apps and services. Push once, push twice, or hold the button, and let each variant trigger a different action. There are multiple use cases, in business but also in personal life, in which Flic buttons make your life easier. Check out the Flic homepage to learn more.

 


[Image: Petrol Push]



 


Meet Petrol Push, a modern-day organization with a clear mission: save kittens. There are hundreds of kittens all over Britain that get stuck in trees, get lost in the urban jungle, or need help in some other way. Luckily, Petrol Push maintains a huge fleet of volunteers to rescue kittens every day.

 


The challenge

As you may know, there is a petrol shortage happening right now, and of course you wonder: how can Petrol Push keep up their noble mission? Flic buttons and the Microsoft Power Platform gave them the ability to come up with a solution to help all their volunteers in their day-to-day work.

 


The solution

Every Petrol Push car got a Flic button installed, and whenever Petrol Push volunteers pass a gas station, they can indicate with a push of a button whether the gas station has fuel available or not. This information gets stored on a map, so every Petrol Push employee knows where fuel is available and where it’s not. This way the volunteers can keep their focus on their mission. They don’t need to drive around searching for fuel or worry about where to gas up. The community of volunteers takes care of that.

 


Petrol Push cares deeply about its volunteers and doesn’t want to put them in danger in any way. That’s why this solution comes with a little extra. Petrol Push workers don’t have to check the map over and over again to see whether anything has changed. If one of the volunteers finds a gas station where fuel is available, the button gets pushed and the fleet is notified with a song. That way the drivers know when to check the map for updates.

 


In times like these it might happen that our drivers get in trouble themselves: they run out of gas, get a flat tire, or something else. Once again, Petrol Push cares deeply about its volunteers, so the Flic button also provides a way to call other volunteers on the road for help. Once again with a song, so no other driver needs to check their phone. The position is indicated on the map, though, so that help can be arranged quickly. Only the supervisor gets an additional text message, in order to provide further information.

 


Note: you will probably know by now, but this use case exemplifies the ability to combine geographic location with notifications that are not text-based. In this way, we want to draw attention to how versatile Power Platform solutions are, and we also want to think about the people who can only use devices in a limited way. Please customize this use case to your needs. And always remember: only as a community are we strong, so let’s be inclusive.

 


Now, let’s dive into the details and see how this solution actually works.

 


The Flic and the flow (Tomasz)


 


At a high level, the flow was built to get the location of the driver who triggered it, then look up details of the closest petrol station (using the Azure Maps API), and finally save the station’s data together with its status into the database, so it can later be displayed in the app with the proper pin color. In the details, though, it’s much more interesting.

 


[Screenshot: the flow, part 1]

 


The flow can be triggered by any driver (1), and by any Flic event, which will be described later. Next, the bot looks up details of the button itself (2) to get its owner (3). This information is later used to record the data along with information about the driver.

 


[Screenshot: the flow, part 2]



 


Next, the flow calls the Azure Maps custom connector via a dedicated child flow (1), passing the latitude and longitude of the driver’s location. Coordinates are obtained via GPS from the driver’s phone, which is paired with the Flic button. Ideally this would be done with the action directly in the parent flow, but for unknown reasons we hit an issue when saving the flow with the action inside, so we moved it into a child flow. Don’t judge :)

The data returned by the child flow, representing details about the nearest petrol station, is then parsed (2).

 


Finally, the bot filters the existing stations' data by postal code to find a match (3). This is done using an OData expression: woi_postalcode eq '@{first(body('Parse_JSON')?['results'])?['address']?['extendedPostalCode']}'. The matching row ID is then saved into a variable (4). Naturally, if there is no station for the given postal code, the variable will be empty. We also made the assumption that there is at most one station for a given postal code :)
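
For readers who want to try the same lookup outside Power Automate, here is a hypothetical PowerShell sketch against the Dataverse Web API. The organization URL, the entity set name woi_stations, the woi_stationid column, and the access token are all assumptions, not names confirmed by the article.

# Query Dataverse for the station matching a postal code via an OData filter.
$orgUrl = "https://yourorg.crm.dynamics.com"    # placeholder environment URL
$postalCode = "SO14 3JF"                        # example postal code
$uri = "$orgUrl/api/data/v9.2/woi_stations?" +
       '$filter=' + "woi_postalcode eq '$postalCode'" +
       '&$select=woi_stationid,woi_name'
$headers = @{ Authorization = "Bearer $token"; Accept = "application/json" }   # $token assumed
$response = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
$rowId = $response.value | Select-Object -First 1 -ExpandProperty woi_stationid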

 


[Screenshot: the flow, part 3]

 


The process now checks whether the station’s row ID is empty (1). If it is, the station has to be created. Creating the record (2) takes all the details returned from the Azure Maps API, like the full address, station name, latitude and longitude, information about the driver who reported it, and finally the postal code. After that, the row ID of the created station is saved into the variable.

 


[Screenshot: the flow, part 4]






 


The process now moves on to check what kind of action occurred on the Flic. There are three possible activities:

 


Single click – there is petrol at the station,

Double click – there is no petrol at the station,

Long press – there is an issue and the driver requires assistance.

 


To check which action occurred, we use a switch action (1). Each branch executes the same actions, just with different statuses. First, the bot creates an entry in the Activities table (2) to record the latest status (one of Petrol, No petrol, or Issue) for the station, together with the details of the driver who reported it.

 


Once that is done, it updates the status (again, one of Petrol, No petrol, or Issue) on the station record itself (3). It then saves the created activity record’s OData ID into a variable, and finally it relates the records (4): the petrol station and the created activity record.
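
Expressed in code, that branching reduces to a simple mapping from click type to status. A hypothetical PowerShell sketch (the actual event names in the Flic trigger's payload may differ):

# Map a Flic event to the status recorded for the station.
function Get-StationStatus {
    param([string]$ClickType)
    switch ($ClickType) {
        'click'        { 'Petrol' }     # single click: fuel available
        'double_click' { 'No petrol' }  # double click: no fuel
        'hold'         { 'Issue' }      # long press: driver needs assistance
        default        { throw "Unknown Flic event: $ClickType" }
    }
}

Get-StationStatus -ClickType 'double_click'   # returns 'No petrol'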

 


[Screenshot: the flow, part 5]

 


Also worth mentioning is that the whole process is built using the try-catch pattern. All the business-logic actions are stored in the "Try" scope (1). If anything fails within the scope, it is caught by the "Catch" scope (2), whose "Run after" settings are configured so that it only executes if a previous action fails, times out, or is skipped.

 


The process in the "Catch" scope first filters (3) the results of the "Try" scope, using the expression result('Try'), to keep only the entries that report errors, e.g. @contains(createArray('Failed', 'TimedOut'), item()?['status']). Next, for each such record (4), it adds the error details to a string variable. Finally, the variable's contents are sent to an admin as a notification (5) and the whole process ends with a "Failed" outcome.

 


Show me something beautiful – The canvas app (Carmen)



With the data stored in Dataverse, a canvas app can be created to display the available information and show people where they can find fuel. The canvas app consists of a header (with the company logo, name, and a refresh icon) and a map control.

 


We are using the built-in map control, which allows us to display the gas stations with their appropriate color, automatically center on the user’s current location and display additional information about each gas station when selecting the location pin.

 


To get the location pins on the map, we added the Dataverse table as a source in the Items property of the map control. We are currently not doing any filtering, but this could be added if needed. The latitude, longitude, labels, and colors are each contained in a specific column within the data source. These are provided as values for the following properties (where the text between quotes is the name of the column in the Dataverse table):

 


ItemsLabels = "woi_name"

ItemsLatitudes = "woi_latitude"

ItemsLongitudes = "woi_longitude"

ItemsColors = "woi_color"

 


The woi_color column is defined as a calculated column whose value is driven by the Petrol Status column in the same table. Petrol Status contains the last known fuel status at the respective station. The colors are defined as hex values with the following mapping:

Last Known Status | Color     | Color name
Petrol            | "#66FF00" | Light Green
No petrol         | "#FF0000" | Red
Issue             | "#FFBF00" | Amber


 

 


The color of the grouped pins is defined by the PinColor property of the map control. It is set to Green, which is a darker color than the green used for the stations with fuel.

 


When a pin is selected, the info card is shown. This is defined by setting the InfoCards property of the map to 'Microsoft.Map.InfoCards'.OnClick. The fields shown on the info card are defined by editing the Fields list in the properties pane of the map. Four fields are shown on the info card:

 


Name

Address

Postal code

Modified on (to know when the station’s status was last updated)

 


This can be seen in the screenshot below.

 


 


[Screenshot: info card fields in the map control’s properties pane]


The resulting app shows a map with all identified gas stations and their last known status, indicated with the color of the pin. Selecting a specific gas station provides the user with more information on that station.

 

[Screenshot: the app showing a station’s details]

 


We need a real map – The custom connector to Azure Maps (Lee)


 


A key part of the solution is populating a list of petrol stations and their status based on presses of the Flic button. We initially looked at using the built-in Bing Maps Power Automate connector and actions to find the current address when a Flic button was used. However, this would return the nearest address, which is not necessarily a petrol station (e.g. it could be a house on the opposite side of the street that is deemed nearer).

 


To work around this, we created an Azure Maps resource in Azure. Azure Maps can return a list of addresses within a certain radius that fit a particular “POI (point of interest) category” – in this case a petrol station. Using the subscription-key (API key) from the Azure Maps resource, we were able to create a custom connector in Power Automate and query for the nearest petrol stations to the longitude and latitude when the Flic button was pressed.
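
The request behind that custom connector looks roughly like the following PowerShell sketch. The subscription key is a placeholder, and the free-text category query ("petrol station") is our assumption; check the Azure Maps POI category documentation for the exact category you need.

# Find the nearest petrol stations to a coordinate with Azure Maps POI category search.
$subscriptionKey = "<your-azure-maps-key>"              # placeholder
$lat = 50.9097                                          # example coordinates
$lon = -1.4044
$query = [uri]::EscapeDataString("petrol station")
$uri = "https://atlas.microsoft.com/search/poi/category/json" +
       "?api-version=1.0&query=$query&lat=$lat&lon=$lon&radius=5000" +
       "&subscription-key=$subscriptionKey"
$result = Invoke-RestMethod -Uri $uri -Method Get

# The first result is the closest match; the flow reads its extended postal code.
$nearest = $result.results | Select-Object -First 1
$nearest.poi.name
$nearest.address.extendedPostalCode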

 


Bring me the vibes – The Spotify connector (Yannick)


 


We like to celebrate victories and help each other in times of need, and what better way to do that than with music? We have a sound system in the office connected to Spotify, so let’s use that to keep everyone updated on things that happen on the road!

A new Power Automate flow triggers every time a new petrol station status is logged, except when no petrol was available. If someone finds petrol at a gas station, we get super excited for our colleague and play Fuel by Metallica in the office to have a small party. When someone gets in trouble, for whatever reason, we play Trouble by Coldplay (so we know we need to rush to the rescue) and a text message is sent to the manager.

 


Integrating with Spotify isn’t too difficult (the API is well documented) but requires the creation of a custom connector with the following API actions:

 


Get a User’s Available Devices: fetch a list of all devices currently connected to the Spotify service

Start/Resume a User’s Playback: play a song on a specific device

 


Combining both, we can first fetch all connected devices and then filter them on the device ID of our office sound system. If the device is connected, we can play the appropriate song for the occasion with the second API call. A sketch of the underlying calls follows below.
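
Under the hood, those two actions map to two Spotify Web API calls. A minimal sketch, assuming you already have an OAuth access token with the playback scopes and know your speaker's device name; the track URI is left as a placeholder:

# 1) List available playback devices and find the office sound system by name.
$headers = @{ Authorization = "Bearer $spotifyToken" }   # token assumed to exist
$devices = Invoke-RestMethod -Uri "https://api.spotify.com/v1/me/player/devices" -Headers $headers
$office = $devices.devices | Where-Object { $_.name -eq "Office Speaker" }   # assumed device name

# 2) If it is online, start playback of the appropriate song on that device.
if ($office) {
    $body = @{ uris = @("spotify:track:<track-id>") } | ConvertTo-Json   # placeholder track URI
    Invoke-RestMethod -Uri "https://api.spotify.com/v1/me/player/play?device_id=$($office.id)" `
        -Headers $headers -Method Put -Body $body -ContentType "application/json"
}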

And lastly, for the text message we’ll use Twilio. Luckily, they have an existing connector in Power Automate, so it’s only a matter of registering for a Twilio account, getting a number to send messages from, and configuring the action in our flow.

 


The princess and the push (Luise)


 


Right from the beginning of the hackathon, we took care of documenting our architecture decisions and how we would implement them. We set up a GitHub repository and invited everyone on the team so they could commit their files. We continued to document all major steps so that anyone could use this as a reference to explain our solution, even though each member was only in charge of their own workload. Gathering all the information and documenting while building ensured accuracy, but it also gave us an opportunity to think through the app and reflect on decisions.

 


The documentation includes screenshots of the flows and explains the data model and environment variables. We also published the solution itself in this repository to give the community a chance to play with our app.

 


What can we learn from this epic quest? (everybody)


 


As a group, we discussed the hackathon for quite a while even after it had ended, and we came up with four important lessons this experience taught us.

 


1. Do one thing the right way, instead of a million things in a messy way


What helped us build this solution in a short timespan was that each person on the team was responsible for a specific part of the solution. There was no context switching between the app studio and building the cloud flow, for example. Instead, we made some agreements at the beginning of the day and let each other know verbally and in the documentation if anything needed to change. This allowed each of us to focus on our own part, resulting in finished pieces of the puzzle.

 


2. Take care of documentation


Since development was decentralized, it was important that we could keep each other up to date on what we were doing. Therefore, we documented from the start. Since we were working against the clock, we had one person who constantly went around the table to see what each of us was working on and to make sure it was captured in the documentation. After the individual pieces were finished, this allowed us to put them together more easily.

 


3. 1 + 1 = 3


Or in our case 6 x 1 = 10 (or something). Each of us has a different background; no two are the same. Because of this, we were able to share different perspectives and find the most efficient way to create the different pieces of the puzzle: e.g. Azure Maps for station identification, canvas apps for a quick user interface, and cloud flows for logic. Since we were not limited to one area of expertise, our solution combines the best of different worlds. In the process, we all learned from each other, be it technical skills or an approach to tackling something. And since we were all eager to learn from and share with each other, we had a lot of fun doing it.

 


4. We are all developers


Each of us is building or creating something on a daily basis, be that using no-code, low-code or code-first platforms and tools. During the hackathon, we realized that our commonalities are more important than our differences. We share a common problem-solving and solution-oriented approach. We can define logic and we do it in very similar terms (if – then – else, anyone?). We can conceptualize solutions and explain them to each other. And then each of us can find some way using their own tools to build that solution. This is what makes us developers, not the language or tool set we build things in, but the approach and mindset we share. All of us are developers, and you can be one too.

 


 

IoT at Microsoft Ignite

This article is contributed. See the original author and article here.

It is fall season in the northern hemisphere, spring season in the southern and time for a new edition of Microsoft Ignite!


If you have not yet registered for the free virtual event, you can do so at https://myignite.microsoft.com.


 


As you navigate the many sessions and opportunities to connect with Microsoft’s product teams, you will want to build yourself a learning path, and if you are interested in learning more about IoT, here are some pointers to help you out.

Below is the list of all the IoT-related content featured at the event. As you will notice, this edition is focused on roundtables, to give you an opportunity not just to learn about the latest IoT technologies at Microsoft, but also to provide feedback and share your needs and pain points with the teams developing the next generation of IoT tools and services.

• Developing AI Edge modules on Windows for IoT devices (Product Roundtable – Martin Tuip, Terry Warwick): Linux is often used for AI modules, but the hurdle is the management of the devices in the infrastructure. What if you could use the best of both? Join the Azure EFLOW team to provide feedback and input on developing IoT Edge modules, what types of devices you want to create, and features you think we should add.

• Deploy IoT Solutions with Azure SQL Database (Connection Zone – Anna Hoffman, Davide Mauri): Many organizations are investing in IoT to increase operational efficiency, deliver better customer experiences, increase levels of security, enhance workplace safety, and reduce costs. This session will introduce how Azure SQL Database provides a price-performant backend, including templates that simplify deploying and configuring IoT solutions for any scenario.

• IoT in manufacturing (Product Roundtable – Fabian Frank, Jehona Morina, Sean Parham, Ranga Vadlamudi, David Walker): Tell the Azure IoT engineering team about your experiences with connected devices in manufacturing settings. We’ll share some materials on which we’d like your feedback. Your insights will help inform investments for upcoming releases.

• IoT devices are typically an organization’s weakest security link. Help us prioritize features to strengthen your IoT security posture (Product Roundtable – Yossi Basha, Nir Krumer, Phil Neray): Attacks on enterprise IoT devices are increasing, and it’s no wonder: IoT devices are not designed with security in mind the way nearly every other traditional type of endpoint is (workstations, servers, mobile devices). In addition, the ability to monitor such devices is limited due to the lack of a deployable agent that can perform configuration management (e.g. settings and patching) and internal security monitoring using high-fidelity endpoint signals. In this session, we are seeking your feedback so that we can learn and properly prioritize the roadmap for delivering IoT security capabilities to the Microsoft Defender suite.

• How to Develop a Security Vision and Strategy for Cyber-Physical and IoT/OT Systems (Breakout – Phil Neray, Katell Thielemann): Recent ransomware attacks that halted production for a gas pipeline operator and a food processor have raised board-level awareness of IoT and Operational Technology (OT) risk. Security leaders are now responsible for new threats from cyber-physical systems (CPS) and parts of the organization they never traditionally worried about. Join Katell Thielemann from Gartner® to discuss how to develop a CPS risk strategy using the “language of the business” to show security as a strategic business enabler. GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

• Onboarding to Azure IoT (On-Demand – Ricardo Minguez Pablos, Cory Newton-Smith): Learn how to onboard to Azure IoT through our recommended approach to solution building. We’ll share how to get started with an aPaaS offering that helps you crystallize your IoT solution needs, set a strategic direction with confidence, and deliver value to your organization. You will learn how Azure IoT is simplifying IoT by providing an out-of-the-box, ready-to-use UX and API surface that works with customization options.

• US Local Connection: Data is the new air – how customers use data & apps to drive innovation (Connection Zone): How do we break down data silos and derive valuable insights about your customers to deliver transformational experiences? Join us for this session to learn how to leverage data, AI, and IoT to build innovative applications that benefit both your employees and customers.

• Integrating Azure Sphere into products (Product Roundtable – Barry Bond, Susmitha Kothari, James Scott): Calling all developers and device builders who have used Azure Sphere or want to know more about how the MT3620 can accelerate your IoT journey. Join the Azure Sphere product team to hear what’s new and upcoming for the MT3620, and to share your feedback on what could be done to make the product better suit your needs.

• Manufacturing a Resilient Future with Microsoft Cloud for Manufacturing (Breakout – Çağlayan Arkan, Arun Kumar Bhaskara-baba, Zikar Dawood, Satish Thomas): Microsoft Cloud for Manufacturing drives new levels of asset and workforce productivity while streamlining and improving security for IT, OT, and industrial IoT across the manufacturing value chain. By aligning cloud services to industry-specific requirements, we give customers a starting point in the cloud that easily integrates into their existing operations. Microsoft Cloud for Manufacturing provides access to the broader portfolio of Microsoft cloud services, enabling manufacturers to begin where the need for technology or business transformation is most urgent.

• Ask the Experts: Manufacturing a Resilient Future with Microsoft Cloud for Manufacturing (Connection Zone – Valerio Frediani, Colin Masson, Pepijn Richter, Severin Wandji): Microsoft Cloud for Manufacturing drives new levels of asset and workforce productivity while streamlining and improving security for IT, OT, and industrial IoT across the manufacturing value chain. By aligning cloud services to industry-specific requirements, we give customers a starting point in the cloud that easily integrates into their existing operations. Microsoft Cloud for Manufacturing provides access to the broader portfolio of Microsoft cloud services, enabling manufacturers to begin where the need for technology or business transformation is most urgent.

• Scaling Unreal Engine in Azure with Pixel Streaming and Integrating Azure Digital Twins (On-Demand – Steve Busby, Erik Jansen, Maurizio Sciglio, Aaron Sternberg, David Weir-McCall): Deep dive into how Unreal Engine games or apps can be easily deployed and auto-scaled in Azure using Pixel Streaming, leveraging a new solution built by Azure Engineering that can save hundreds of hours of developer time building the infrastructure and management to deploy at massive scale. Additionally, learn how Unreal Engine can integrate with Azure Digital Twins to deliver immersive, live visualizations of your IoT and digital twin environments.

• Green Transformation: driving measurable sustainability every step of the way (On-Demand): Microsoft Cloud services are up to 98% more carbon efficient than traditional datacenters, but the sustainability benefits don’t stop there. In this session, our experts discuss how to infuse your digital transformation with sustainable practices every step of the way, from migration and application development to creating more value for clients and measuring results. We’ll also explore a real-world scenario utilizing IoT and edge technology to power green initiatives for a global manufacturer.

• Powering Cloud at the Edge: A Data Story Told Across Three Horizons (On-Demand – Jai Mishra, Girish Phadke): As your organization builds its digital core, how can you develop crucial cloud-native capabilities that will help you quickly scale? Join us to learn what to consider when building edge-to-cloud solutions, and the unique roles that AI, IoT, and cognitive services play in getting the most out of the edge. We’ll also share use cases that highlight end-to-end scenarios, the respective Azure edge solutions deployed, and the roles that data, IoT, and AI can play.

 

Customizing Lists best practices [guest blog and video]

This article is contributed. See the original author and article here.

Hello, my name is Norm Young, and I am a Microsoft MVP from St. Catharines, Ontario, Canada. I am a 20-plus-year data professional who has recently made the shift to the collaboration space using Microsoft Lists, Microsoft Teams, and SharePoint. Microsoft asked if I would share the way I approach leveraging technology and highlight a few recent custom solutions I put together. I’m thrilled to be here, so here we go.


 


Background


 


Many of the solutions I have built using Microsoft Lists reflect the current state of a business process. When I worked in higher education, we used Lists to track student exchange program information (school name, start/end dates, current student counts, etc.). In my current role we use Microsoft Lists to track our new customer onboarding program (client name, contact information, products to onboard, major milestones, etc.). Business processes tend to have multiple stages with start and end dates, persons responsible, and other forms of supporting metadata that make the execution of the business process possible. Additional value is brought to the Lists experience through integration with other apps and services like the Power Platform, with list data serving as the foundation for applications developed using Power Apps, workflows using Power Automate, and reporting using Power BI.


 


Approach


 


When I develop solutions using Microsoft Lists, I take a configure-first approach: the existing out-of-the-box functionality is evaluated to see if the minimum viable product can be achieved without customization. If that is not possible, then I extend functionality.


 


In the not-so-recent past, if I needed to hide or rearrange columns in the list new/edit form, I had to use Power Apps, and by doing so I created technical debt that someone would have to own and maintain. Don’t get me wrong, I think Power Apps is a great development platform, but not all solutions require customization. In the current new/edit form, hiding or rearranging columns is as simple as deselecting a checkbox or dragging columns into the desired order, with no Power Apps customization required.


 


In contrast, if the solution requires customization and the value of that customization outweighs the cost of the technical debt, then customization is the right choice. While working in higher ed, I developed a Microsoft Lists based solution that was used to track multi-factor authentication token assignments to users. The token identifier was a very long string of numbers with an associated barcode. If we implemented a no-customization solution, the user would be forced to enter the identifier manually. This implementation would be prone to data entry errors and would be unnecessarily mundane. Integrating Power Apps meant we could use the barcode scanner control to eliminate data entry errors and improve the user experience. In this case the cost of technical debt is outweighed by better data quality and an improved user experience.


 



 


Because of their availability and low barrier to use, Microsoft Lists are leveraged in many ways. Below are the use cases shown in the above video – with additional context and a few tips and tricks along the way for why and how I customized lists to create business solutions.


 


Use case | Application submission solution using Forms, Microsoft Lists and Power Automate


Lists can be used to store response data from Microsoft Forms using Power Automate allowing for additional information tracking or analysis using Power BI. Integrating these tools helps to create a self-service platform for enhancing and modernizing business processes. Check out this blog article to learn more: Get responses from Forms to Microsoft Lists using Power Automate.
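
In the article’s scenario Power Automate does the writing, but the same list write can be scripted. A minimal sketch using PnP PowerShell, where the site URL, list name, and column names are placeholders rather than anything from the original solution:

# Add a captured form response to a list (all names are placeholders).
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/Applications" -Interactive
Add-PnPListItem -List "Application Submissions" -Values @{
    "Title"          = "Jane Doe"
    "SubmissionDate" = (Get-Date)
}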


 




Connect Microsoft Forms to Power Automate flows to capture and populate Microsoft Lists list items.


 


Tip | When integrating apps and services with Microsoft Lists, it is important to remember that not all list column types translate to the destination data types, and some concessions in functionality may be required. Complex data types like the person column include additional information (email address, mobile phone number, department name, job title, first name, last name, and more) that may need to be split into separate columns to store the additional person information (email, first and last name) or handled in some programmatic way, like an expression in Power Automate.
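
To make that concession concrete: a person value often surfaces as a claims-encoded string, from which a flow or script must extract the parts it needs. A hypothetical PowerShell sketch of pulling the email out of a SharePoint claims login:

# A person field often surfaces as a claims string like this:
$claim = "i:0#.f|membership|jane.doe@contoso.com"

# The email address is the final pipe-delimited segment.
$email = ($claim -split '\|')[-1]
$email   # jane.doe@contoso.com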


 


Use case | Organization specific project management tracker using Microsoft Lists.


 


If you have created a list-based solution, why not share it with others in your organization using Custom List Templates? Custom List Templates are a great way to extend the value of one solution to others with similar needs by giving users a well-thought-out starting point that brings them value very quickly. Read more at: A first look at Templates in Microsoft Lists.


 




Microsoft Lists now supports custom list designs from your organization.


 


Use case | Issue tracking management using Microsoft Lists, Microsoft Teams and Power Automate.


 


Microsoft Lists solutions that leverage a target date or end date will find value added through reminders prior to that date. The value comes in the form of users not having to look manually for items that are coming due; the information is pushed to them. Get started with List reminders in Teams with this blog post: Send reminders to Teams from Microsoft Lists using Power Automate.


 




Use adaptive cards populated with Microsoft Lists list item data to push actionable requests into Microsoft Teams chats.


 


Tip | Another point of consideration is how column names are referenced internally in Microsoft Lists. How a column is created influences its internal name. For example, if we create two similarly named columns, we get two different internal names, as shown in the table below:

Method                  | External name | Internal name
From default experience | Column 1      | Column1
From List settings      | Column 2      | Column_x0020_2



 


Trick | A simple way to view the internal column name is to Edit Column (click on the column name) while in List Settings and check the Field value at the end of the URL.


 




Trick | Go to List Settings > Edit Column to view a column’s internal name.
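
If you would rather not click through List Settings for every column, PnP PowerShell can list the internal names in one go. A minimal sketch, with the site URL and list name as placeholders:

# List display names vs. internal names for every column in a list.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/Tracking" -Interactive
Get-PnPField -List "Issue tracker" |
    Select-Object Title, InternalName |
    Sort-Object Title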


 


Caution


 


Lists are great for centralizing and collaborating on data with teams but there are situations where lists are not the right fit. Solutions requiring complex data relationships and/or high-volume transactions should look to other offerings like Dataverse or Azure SQL Database as a more suitable platform.


 


Another caution when using lists as a data source is exceeding the List View Threshold, which happens when apps and services attempt to fetch more than 5,000 items from a list in a single call. This doesn’t mean a list can only store 5,000 items – in fact a list can have up to 30 million items – but it does mean that any single call to a list must return 5,000 or fewer items. Be sure to check out the Living Large with Large Lists and Large Libraries article for more information on the List View Threshold and how to design within it.
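
When reading a large list from a script, the usual way to stay under the threshold is to page the request rather than fetch everything in one call. A minimal sketch with PnP PowerShell (site URL and list name are placeholders):

# Retrieve items from a large list in pages of 2,000 to avoid the 5,000-item limit per call.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/Tracking" -Interactive
$items = Get-PnPListItem -List "Issue tracker" -PageSize 2000
$items.Count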


 


Final thoughts


 


Microsoft Lists are a great platform for collaborating on data in a single, low-barrier place, where solutions can start modestly and grow as requirements, experience, and features increase. When deciding to customize lists, take a configure-first approach, and when it makes sense, customize using apps and services like the Power Platform. To learn more, check out the Microsoft Lists Resource Center for the latest blogs, videos, demos, and access to learning resources.


 


If you want to see more examples of how to extend functionality in Microsoft Lists, be sure to check out my blog at https://normyoung.ca/. If you want to keep the conversation going, let’s connect on LinkedIn or follow me on Twitter.


 


I hope you found some value in this post and thanks for reading!


 


Cheers, Norm