This article is contributed. See the original author and article here.
This article was written by Nick Hughes, an avid contributor in the extended reality space, as part of our Humans of Mixed Reality Guest Blogger Series. Nick shares his approach and core best practices for the Mixed Reality stack and how YOU can bring value to XR projects.
The whole extended reality space is vast, fluid, and sometimes foolish to try to define. There are so many ways to leverage this innovative modality that it can also be difficult to pitch. The idea of bridging the gap between extended reality and the tangible world is still relatively fresh to most people. In this article, I’m going to share my experience taking a pragmatic approach to the Microsoft mixed reality stack and how you can bring value to your XR projects.
The first ever HoloLens I used is still in my office!
You’re probably thinking to yourself, “This guy is probably a Microsoft partner or some industry expert in mixed reality.” I assure you that two years ago I was running network cables down soot-covered walls in a foundry.
If there is someone who has asked all the silly questions, someone who has had to search the acronym on the Internet, it is me. However, that is why what I am going to share might be helpful to you too. I am a jack-of-all-trades-master-of-none type of person who just happens to be exceptional at translating all things technology into a humanly digestible form. This is going to land right in your backyard.
Two years ago, I was first exposed to XR. I put on a friend’s PlayStation VR headset and found myself captivated with the experience. The next couple of months had me buried in articles, YouTube videos, blogs, you name it. When I found the HoloLens, I couldn’t stop imagining all the business applications and use case scenarios. I knew this was going to be transformative, but I didn’t know it was going to radically change my entire career direction as well.
Today, I help lead an extended reality team that covers the entire globe. We have deployed HoloLens and Remote Assist in nearly twenty countries and on almost every continent. This technology has dramatically transformed the way we can conduct business and I’m thrilled to have the opportunity to share with everyone how we did it.
SHARE THIS AS SOON AS YOU CAN I cannot stress enough the importance of excitement. Gordon B. Hinckley said, “Every good citizen adds to the strength of a nation,” and that should be your mindset when you first get your hands on a HoloLens. Your first goal should be to create as many “good citizens” of XR as possible. These people will help you develop the organic business excitement that will be instrumental in the adoption, peer-aided adoption, and funding – yes, funding! – of your efforts. A word of caution, though: if you post all these showy videos in public places, once that excitement gets to a certain point, you’ll have to deliver on that capability. Try to be intentional and align your program sponsors before it gets to that level. They may want you to temper that excitement and keep it regulated.
BECOME A PART OF THE MIXED REALITY COMMUNITY If you’ve found us here on the Mixed Reality Tech Community, you’re off to a good start! I and most ordinary humans can only partially consume all the information that is whirring around about XR. Developers are releasing original ideas and capabilities each day. A couple of suggestions that I recommend are the Microsoft Mixed Reality Community, Twitter, and the Global XR Community. These have been very effective in building my XR wisdom and helping flatten my learning curve.
Don’t be intimidated – it is going to seem foreign to you at first and that’s OK. As my friend, Steve Kaminski over at Microsoft would say, “just hang on by your fingernails to what you don’t know at first. If you stay engaged, it will all begin to make sense soon”. If some of the content doesn’t hit home, go jump into AltSpaceVR for a support group or post something in the mixed reality forums. The XR community is wonderfully accommodating and we’re all here to learn together.
REMOTE ASSIST, TRAINING, INTEGRATION – SAY IT WITH ME The most straightforward value proposition is the cost savings from eliminating travel altogether. You will crush geographical barriers, traverse border closings, and eliminate the impact of COVID travel bans with ease. This will be huge for anyone doing business globally. All three of those elements can compound into a trifecta business case that is impossible to ignore. Case in point: we are getting complete returns on our Microsoft stack investments in less than six months – closer to three – of having a device in operation. It practically pays for itself. Now THAT is powerful.
Training is another significant piece of business that is challenging with the current health concerns. Remote Assist and Guides both facilitate pseudo-face-to-face training as well as virtually guided training. This eliminates the need for another person to be present, helps mitigate risk, and reduces labor cost. The icing on the cake is that I can normally train a slow adopter of technology to use Guides in less than an hour because it is so simple. Two weeks into our rollout, a senior manager came into an MTC meeting and asked, “What is the next big thing? What does the future look like?” I can confidently answer that with a single word: integration. It isn’t a new technology that is next, but how we use extended reality. Connect your MES systems, connect your ERP, and show your “good citizens” how to advance customized applications. That is like opening Pandora’s box.
CAPTURE THE STORY AND MAKE IT REAL Microsoft, specifically, has done a great job of making sure there are enough videos and knowledge available that some of the stories can be told without too much trouble. However, to get that essential excitement, you’re going to want to show yourself and your people using XR at your own locations and in your own ways. So, let me accelerate that effort for you. As soon as you get your HoloLens 2, grab a laptop on the same network, go to the device portal, and then to the capture section. Now, put on your HoloLens and record what it sees. You can capture a brilliant first-person perspective of what you’re viewing. This will be meaningful when you’re making your marketing materials, especially for your executive stakeholders.
You will also want to go buy a Microsoft Display Adapter. I know, I’m sounding like a bit of a salesman for Microsoft (trust me, I wish I got commissions too!), but this adapter is small, inexpensive, and works without any configuration. Open the box, plug it into the HDMI and USB ports, and magically stream the first-person video directly to the television. This comes in very handy when you’re showing someone the HoloLens for the first time!
KEEP ORBITING If you take any advice from me it is this, “stay present and keep orbiting”. This technology will sell itself but you must keep in touch and you must be ready to seize an opportunity when it presents itself. You’ll be working quietly for a week and then someone will tell you about this new thing they want to explore! Yes! That is exactly what you want to happen. Carpe diem, friends!
Do not let those kinds of opportunities go to waste, because those “good citizens” are realizing the additional capability it brings to the business. Be a facilitator and help them develop their ideas. That will resonate back to the business in a grand way, especially when the users can build and then share their own story. You’ll see dividends from the peer-aided adoption that comes from those interactions.
Remote Assist will consistently be a high-value, transformative product for your business. You will want to continue to nurture and develop people to expertly use all the features and to share that capability with others. The more people using Remote Assist, the more people are exposed to XR in an easy-to-digest way. That is vital to your long-term success. You’re not going to be able to implement this new communication method with an isolated group of skilled operators.
Extended reality is here to stay! There is no doubt about it. Believe me – if you don’t get on board now, you’ll be ten years behind in just three years.
I welcome you to connect with me on Twitter and share your experiences! I love hearing what others are doing on the implementation side of things. @TheNerdNick #doSomethingGreatToday #XR
This article is contributed. See the original author and article here.
Sync Up is your monthly podcast hosted by the OneDrive team, taking you behind the scenes of OneDrive and shedding light on how OneDrive connects you to all your files in Microsoft 365 so you can share and work together from anywhere. You will hear from the experts behind the design and development of OneDrive, as well as customers and Microsoft MVPs. Each episode will also give you news and announcements, special topics of discussion, and best practices for your OneDrive experience.
In episode 18, cohosts Jason Moore and Ankita Kirti talk with Cory Kincaid, a Customer Success Manager for Modern Work, who advises customers on how to use technologies like Teams and OneDrive to improve their business.
You’ll also get guidance on how to combat current workplace issues like “employee burnout” and “screen fatigue” as well as find out the team’s favorite article of “pandemic clothing.”
Jason Moore is the Principal Group Program Manager for OneDrive and the Microsoft 365 files experience. He loves files, folders, and metadata. Twitter: @jasmo
Ankita Kirti is a Product Manager on the Microsoft 365 product marketing team responsible for OneDrive for Business. Twitter: @Ankita_Kirti21
Cory Kincaid is a Customer Success Manager for Modern Work, who advises customers on how to use technologies like Teams and OneDrive to improve their business productivity.
Additional guests:
Ryan Voelki and Tatyanah Castillo, also customer success managers for Modern Work, who take a #HumansFirst approach to helping customers navigate their digital transformation.
Be sure to visit our show page to hear all the episodes, access the show notes, and get bonus content. And stay connected to the OneDrive community blog, where we’ll share more information per episode, guest insights, and take any questions from our listeners and OneDrive users. We also welcome your ideas for future episode topics and segments. Keep the discussion going in the comments below.
As you can see, we continue to evolve OneDrive as a place to access, share, and collaborate on all your files in Office 365, keeping them protected and readily accessible on all your devices, anywhere. We, at OneDrive, will shine a recurring light on the importance of you, the user. We will continue working to make OneDrive and related apps more approachable. The OneDrive team wants you to unleash your creativity. And we will do this, together, one episode at a time.
Thanks for your time reading and listening to all things OneDrive,
This article is contributed. See the original author and article here.
In this blog I want to show you how you can build, test, and publish an FAQ bot for Microsoft Teams within minutes. We will use Power Virtual Agents for Teams, which means you will not need any license in addition to your Microsoft 365 license; for reference, see also the Power Virtual Agents for Microsoft Teams plan.
What is Power Virtual Agents?
Power Virtual Agents belongs, like Power Apps, Power Automate, and Power BI, to the Power Platform (wow, that was a powerFULL sentence :smiling_face_with_halo:). You can create chatbots that interact with users in apps and websites, trigger workflows, and more, without the need to write code. You can choose whether to use it in the Power Virtual Agents standalone web app or as an app within Microsoft Teams.
Let’s build a bot
I will guide you through creating an FAQ bot. To feed our bot, we will need some FAQs so that our bot can learn them. I will use an FAQ regarding licensing :nerd_face:, but you can choose any FAQ from a website, PDF, or even a Word file that you like.
Open Teams
Click on the Apps icon
Search for Power Virtual Agents
Click Add
Select the Team you want your bot to join
Give your bot a name and select a language that your bot shall understand (it should be the same language as your FAQ)
Click Chatbots – here you get an overview of ALL your chatbots
Add topics from any website
Click Topics
You will see that some basic topics have already been created for you. You can take a look at them later.
Click Suggested
Now we want to work on feeding our bot with the FAQ from the website that we selected.
Copy the URL of the FAQ website
Paste the URL into the Link to online content field
Click Add
Click Start
This may now take a couple of minutes. Grab a coffee in the meantime :hot_beverage:. Soon you will see the message that your new suggested topics are in:
Review & edit topics
You can now review and edit each topic:
Click Save Topic
After you are done reviewing and editing your topics, you will need to turn them on:
Switch the toggle to On
Train your bot by entering more trigger phrases. This way, it is more likely that the chatbot will understand users’ questions even if they don’t exactly match the trigger phrases.
Time to test the bot!
You can now review and edit your topics until you are happy with the results.
Publish your Bot to Microsoft Teams
Click the Publish icon
Click Add
Click Add to Teams
Use your bot
Conclusion & what’s next
It took us only a few minutes to create, test, and publish a chatbot that now works inside of Microsoft Teams. Want to do some more? We could extend the capabilities of our Power Virtual Agents bot: let’s say our bot can’t answer a question and needs to transfer the chat to a human agent, who will answer that question. What if we then trained the bot with that answer so that it gets smarter over time? I will cover that in one of my next blog posts. What do you use chatbots for? Did you already try to make a 5-minute bot? Please share below :)
This article is contributed. See the original author and article here.
Have you ever tried to cast hand shadows on a wall? It is the easiest thing in the world, and yet to do it well requires practice and just the right setup. To cultivate your #cottagecore aesthetic, try going into a completely dark room with just one lit candle, and casting hand shadows on a plain wall. The effect is startlingly dramatic. What fun!
Even a tea light suffices to create a great effect
In 2020, and now into 2021, many folks are reverting back to basics as they look around their houses, reopening dusty corners of attics and basements and remembering the simple crafts that they used to love. Papermaking, anyone? All you need is a few tools and torn up, recycled paper. Pressing flowers? All you need is newspaper, some heavy books, and patience. And hand shadows? Just a candle.
This TikTok creator has thousands of views for their hand shadow tutorials
But what’s a developer to do when trying to capture that #cottagecore vibe in a web app?
High Tech for the Cottage
While exploring the art of hand shadows, I wondered whether some of the recent work I had done for body poses might be applicable to hand poses. What if you could tell a story on the web using your hands, and somehow save a video of the show and the narrative behind it, and send it to someone special? In lockdown, what could be more amusing than sharing shadow stories between friends or relatives, all virtually?
Hand shadow casting is a folk art probably originating in China; if you go to tea houses with stage shows, you might be lucky enough to view one like this!
A Show Of Hands
When you start researching hand poses, it’s striking how much content there is on the web on the topic. There has been work since at least 2014 on creating fully articulated hands within the research, simulation, and gaming sphere:
MSR throwing hands
There are dozens of handpose libraries already on GitHub:
There are many applications where tracking hands is a useful activity:
• Gaming
• Simulations / Training
• “Hands free” uses for remote interactions with things by moving the body
• Assistive technologies
• TikTok effects :trophy:
• Useful things like Accordion Hands apps
One of the more interesting new libraries, handsfree.js, offers an excellent array of demos in its effort to move to a hands free web experience:
Handsfree.js, a very promising project
As it turns out, hands are pretty complicated things. They each include 21 keypoints (vs. PoseNet’s 17 keypoints for an entire body). Building a model to support inference for such a complicated grouping of keypoints has proven challenging.
There are two main libraries available to the web developer when incorporating hand poses into an app: TensorFlow.js’s handposes, and MediaPipe’s. HandsFree.js uses both, to the extent that they expose APIs. As it turns out, neither TensorFlow.js nor MediaPipe’s handposes are perfect for our project. We will have to compromise.
TensorFlow.js’s handposes allow access to each hand keypoint and the ability to draw the hand to canvas as desired. HOWEVER, it currently only supports single hand poses, which is not optimal for good hand shadow shows.
MediaPipe’s handpose models (which are used by TensorFlow.js) do allow for dual hands, BUT its API does not allow for much styling of the keypoints, so drawing shadows with it is not obvious.
One other library, fingerpose, is optimized for finger spelling in a sign language context and is worth a look.
Since it’s more important to use the Canvas API to draw custom shadows, we are obliged to use TensorFlow.js, hoping that either it will soon support multiple hands OR handsfree.js helps push the envelope to expose a more styleable hand.
Let’s get to work to build this app.
Scaffold a Static Web App
As a Vue.js developer, I always use the Vue CLI to scaffold an app using vue create my-app and creating a standard app. I set up a basic app with two routes: Home and Show. Since this is going to be deployed as an Azure Static Web App, I follow my standard practice of including my app files in a folder named app and creating an api folder to include an Azure function to store a key (more on this in a minute).
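The routing setup itself isn’t shown in this extract; a minimal sketch of what it might look like (assuming Vue 2 with vue-router 3, which is what the Vue CLI scaffolded at the time; file and view names are placeholders, not the author’s exact code) is:

// router/index.js – a sketch only
import Vue from "vue";
import VueRouter from "vue-router";
import Home from "../views/Home.vue";
import Show from "../views/Show.vue";

Vue.use(VueRouter);

const routes = [
  { path: "/", name: "Home", component: Home },
  { path: "/show", name: "Show", component: Show },
];

export default new VueRouter({ routes });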
In my package.json file, I import the important packages for using TensorFlow.js and the Cognitive Services Speech SDK in this app. Note that TensorFlow.js has divided its imports into individual packages:
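The package list itself isn’t reproduced in this extract, but the corresponding imports might look roughly like the following (these are the published package names for the split TensorFlow.js modules and the Speech SDK; the exact set in the author’s project may differ):

// a sketch of the individual imports, not the author’s exact list
import * as handpose from "@tensorflow-models/handpose";
import * as tf from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-webgl"; // registers the WebGL backend used by tf.setBackend
import * as sdk from "microsoft-cognitiveservices-speech-sdk";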
We will draw an image of a hand, as detected by TensorFlow.js, onto a canvas, superimposed onto a video supplied by a webcam. In addition, we will redraw the hand to a second canvas (shadowCanvas), styled like shadows:
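The template markup isn’t included in this extract; a minimal sketch of how the video and the two canvases could be laid out in the component (ref names and dimensions are illustrative, not the author’s exact markup) is:

<!-- a sketch only: the video feeds the model and is redrawn onto canvas; shadowCanvas shows the shadow -->
<div id="main">
  <video ref="video" playsinline style="display: none"></video>
  <canvas ref="canvas" width="600" height="500"></canvas>
  <canvas ref="shadowCanvas" width="600" height="500"></canvas>
</div>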
Working asynchronously, load the Handpose model. Once the backend is setup and the model is loaded, load the video via the webcam, and start watching the video’s keyframes for hand poses. It’s important at these steps to ensure error handling in case the model fails to load or there’s no webcam available.
async mounted() {
  await tf.setBackend(this.backend);
  // async load model, then load video, then pass it to start landmarking
  this.model = await handpose.load();
  this.message = "Model is loaded! Now loading video";
  let webcam;
  try {
    webcam = await this.loadVideo();
  } catch (e) {
    this.message = e.message;
    throw e;
  }

  this.landmarksRealTime(webcam);
},
Setup the Webcam
Still working asynchronously, set up the camera to provide a stream of images
async setupCamera() {
  if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
    throw new Error(
      "Browser API navigator.mediaDevices.getUserMedia not available"
    );
  }
  this.video = this.$refs.video;
  const stream = await navigator.mediaDevices.getUserMedia({
    video: {
      facingMode: "user",
      width: VIDEO_WIDTH,
      height: VIDEO_HEIGHT,
    },
  });
  // the original snippet is truncated here; a typical completion attaches the
  // stream to the video element and resolves once the metadata has loaded
  this.video.srcObject = stream;
  return new Promise((resolve) => {
    this.video.onloadedmetadata = () => resolve(this.video);
  });
},
Now the fun begins, as you can get creative in drawing the hand on top of the video. This landmarking function runs on every keyframe, watching for a hand to be detected and drawing lines onto the canvas – red on top of the video, and black on top of the shadowCanvas. Since the shadowCanvas background is white, the hand is drawn as white as well and the viewer only sees the offset shadow, in fuzzy black with rounded corners. The effect is rather spooky!
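That landmarking function isn’t reproduced in this extract; a simplified sketch of what such a loop could look like (method and ref names follow the snippets above; the drawHand helper and the styling values are assumptions) is:

// a sketch only: estimate hand landmarks each frame and draw to both canvases
async landmarksRealTime(video) {
  const ctx = this.$refs.canvas.getContext("2d");
  const shadowCtx = this.$refs.shadowCanvas.getContext("2d");

  const frameLandmarks = async () => {
    const predictions = await this.model.estimateHands(video);
    ctx.drawImage(video, 0, 0, video.width, video.height);
    shadowCtx.fillStyle = "white";
    shadowCtx.fillRect(0, 0, video.width, video.height);

    if (predictions.length > 0) {
      const keypoints = predictions[0].landmarks; // 21 [x, y, z] points
      this.drawHand(ctx, keypoints, "red"); // assumed helper that strokes the finger segments
      // on the shadow canvas the hand itself is white; the canvas shadow settings
      // produce the fuzzy black silhouette described above
      shadowCtx.shadowColor = "black";
      shadowCtx.shadowBlur = 20;
      this.drawHand(shadowCtx, keypoints, "white");
    }
    requestAnimationFrame(frameLandmarks);
  };
  frameLandmarks();
},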
Since TensorFlow.js allows you direct access to the keypoints of the hand and the hand’s coordinates, you can manipulate them to draw a more lifelike hand. Thus we can redraw the palm to be a polygon, rather than resembling a garden rake with points culminating in the wrist.
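As one illustration of that kind of manipulation (not the author’s exact code), the wrist and the base of each finger can be joined into a filled polygon that reads as a palm; the indices follow the MediaPipe hand landmark layout used by the handpose model:

// a sketch: connect the wrist (0) and finger bases (1, 5, 9, 13, 17) into a palm polygon
drawPalm(ctx, keypoints, color) {
  const palmIndices = [0, 1, 5, 9, 13, 17];
  ctx.beginPath();
  palmIndices.forEach((i, n) => {
    const [x, y] = keypoints[i];
    n === 0 ? ctx.moveTo(x, y) : ctx.lineTo(x, y);
  });
  ctx.closePath();
  ctx.fillStyle = color;
  ctx.fill();
},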
With the models and video loaded, keyframes tracked, and hands and shadows drawn to canvas, we can implement a speech-to-text SDK so that you can narrate and save your shadow story.
To do this, get a key from the Azure portal for Speech Services by creating a Service:
You can connect to this service by importing the sdk:
import * as sdk from “microsoft-cognitiveservices-speech-sdk”;
…and start audio transcription after obtaining an API key, which is stored in an Azure function in the /api folder. This function gets the key stored in the Azure portal in the Azure Static Web App where the app is hosted.
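The function itself isn’t shown in this extract; a minimal sketch of what such an HTTP-triggered function could look like (the /api/getKey route and the SPEECH_KEY application setting name are assumptions) is:

// api/getKey/index.js – a sketch only: return the Speech key stored as an application setting
module.exports = async function (context, req) {
  context.res = {
    body: process.env["SPEECH_KEY"], // assumed setting name configured on the Static Web App
  };
};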
async startAudioTranscription() {
  try {
    // get the key
    const response = await axios.get("/api/getKey");
    this.subKey = response.data;
    // sdk
In this function, the SpeechRecognizer gathers text in chunks that it recognizes and organizes into sentences. That text is printed into a message string and displayed on the front end.
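The recognizer setup is cut off in the snippet above; assuming the key retrieved from the function is paired with a service region, the continuation could look roughly like this (the region value and the way text is appended to the message are illustrative, not the author’s exact code):

// a sketch of the recognizer wiring inside startAudioTranscription
const speechConfig = sdk.SpeechConfig.fromSubscription(this.subKey, "westus"); // region is an assumption
const audioConfig = sdk.AudioConfig.fromDefaultMicrophoneInput();
const recognizer = new sdk.SpeechRecognizer(speechConfig, audioConfig);

// append each recognized sentence to the message shown on the front end
recognizer.recognized = (_sender, event) => {
  if (event.result.reason === sdk.ResultReason.RecognizedSpeech) {
    this.message += " " + event.result.text;
  }
};
recognizer.startContinuousRecognitionAsync();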
Display the Story
In this last part, the output cast onto the shadowCanvas is saved as a stream and recorded using the MediaRecorder API:
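That recording code isn’t included in this extract; a minimal sketch using the standard canvas captureStream / MediaRecorder pattern (the frame rate, MIME type, and how the resulting blob is surfaced are assumptions) could be:

// a sketch only: record the shadow canvas into a downloadable video blob
const stream = this.$refs.shadowCanvas.captureStream(30); // 30 fps is an assumption
const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
const chunks = [];

recorder.ondataavailable = (event) => chunks.push(event.data);
recorder.onstop = () => {
  const blob = new Blob(chunks, { type: "video/webm" });
  this.videoUrl = URL.createObjectURL(blob); // e.g. bound to a download link in the template
};
recorder.start();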
This app can be deployed as an Azure Static Web App using the excellent Azure plugin for Visual Studio Code. And once it’s live, you can tell durable shadow stories!