This article is contributed. See the original author and article here.
Happy Friday, MTC’ers! I hope you’ve had a good week and that March is treating you well so far. Let’s dive in and see what’s going on in the community this week!
MTC Moments of the Week
First up, we are shining our MTC Member of the Week spotlight on a community member from Down Under, @Doug_Robbins_Word_MVP! True to his username, Doug has been a valued MVP for over 20 years and an active contributor to the MTC across several forums, including M365, Word, Outlook, and Excel. Thanks for being awesome, Doug!
And last but not least, over on the Blogs, @Hung_Dang has shared a skilling snack for you to devour during your next break, where you can learn how to make the most of your time with the help of Windows Autopilot!
Unanswered Questions – Can you help them out?
Every week, users come to the MTC seeking guidance or technical support for their Microsoft solutions, and we want to help highlight a few of these each week in the hopes of getting these questions answered by our amazing community!
Did you know that the little wave-shaped blob of toothpaste you squeeze onto your toothbrush has a name? It’s called a “nurdle” (the exact origin of this term isn’t clear), but there was even a lawsuit in 2010 that Colgate filed against Glaxo (the maker of Aquafresh) to prohibit them from depicting the nurdle in their toothpaste packaging. Glaxo countersued, and the case was settled after a few months. The more you know!
pgBadger is one of the most comprehensive Postgres troubleshooting tools available. It gives users insight into a wide variety of events happening in the database, including:
(Vacuums tab) Autovacuum actions – how many ANALYZE and VACUUM actions were triggered by the autovacuum daemon, including the number of tuples and pages removed per table.
(Temp Files tab) Distribution of temporary files and their sizes, and the queries that generated temporary files.
(Locks tab) Types of locks in the system, the most frequent waiting queries, and the queries that waited the longest. Unfortunately, there is no information about which query is holding a lock; only the queries that are blocked are shown.
You can generate a pgBadger report from Azure Database for PostgreSQL Flexible Server in multiple ways:
Using Diagnostic Settings and redirecting logs to a storage account; mount storage account onto VM with BlobFuse.
Using Diagnostic Settings and redirecting logs to a storage account; download the logs from storage account to the VM.
Using Diagnostic Settings and redirecting logs to Log Analytics workspace.
Using plain Server Logs*
*Coming soon!
In this article we will describe the first solution – using Diagnostic Settings and redirecting logs to a storage account. At the end of the exercise we will have a storage account filled with logs from the Postgres Flexible Server and an operational VM with direct access to the logs stored in the storage account, as shown in the picture below:
generate pgBadger report from Azure Database for PostgreSQL Flexible Server
To be able to generate the report you need to configure the following items:
Adjust Postgres Server configuration
Create storage account (or use existing one)
Create Linux VM (or use existing one)
Configure Diagnostic Settings in Postgres Flexible Server and redirect logs to the storage account.
Mount storage account onto VM using BlobFuse
Install pgBadger on the VM
Ready to generate reports!
Step 1 Adjust Postgres Server configuration
Navigate to the Server Parameters blade in the portal and modify the following parameters:
log_line_prefix = '%t %p %l-1 db-%d,user-%u,app-%a,client-%h '   # Please mind the space at the end!
log_lock_waits = on
log_temp_files = 0
log_autovacuum_min_duration = 0
log_min_duration_statement = 0   # 0 is recommended only for test purposes; for production usage, please consider a much higher value, like 60000 (1 minute), to avoid excessive usage of resources
Adjust Postgres Server configuration
After the change, hit “Save”:
Save changed Postgres parameters
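If you prefer the command line to the portal, the same parameters can be set with the Azure CLI. This is a sketch: the resource group and server names below are placeholders taken from the example paths later in this article, so substitute your own.

```shell
# Set the same logging parameters via the Azure CLI (names are placeholders).
az postgres flexible-server parameter set \
  --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm \
  --name log_lock_waits --value on
az postgres flexible-server parameter set \
  --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm \
  --name log_temp_files --value 0
az postgres flexible-server parameter set \
  --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm \
  --name log_autovacuum_min_duration --value 0
az postgres flexible-server parameter set \
  --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm \
  --name log_min_duration_statement --value 0
```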
Step 2 Create storage account (or use existing one)
Please keep in mind that the storage account needs to be created in the same region as the Azure Database for PostgreSQL Flexible Server. You can find the instructions here.
Step 3 Create Linux VM (or use existing one)
In this blog post we will use Ubuntu 20.04 as an example, but nothing stops you from using an rpm-based system; the only difference is the way BlobFuse and pgBadger are installed.
Step 4 Configure Diagnostic Settings in Postgres Flexible Server and redirect logs to the storage account.
Navigate to the Diagnostic settings page of your Azure Database for PostgreSQL Flexible Server instance in the Azure portal and add a new diagnostic setting with the storage account as a destination:
Hit the Save button.
Step 5 Mount storage account onto VM using BlobFuse
In this section you will mount the storage account onto your VM using BlobFuse. This way you will see the logs on the storage account as standard files in your VM. First, let's download and install the necessary packages. The commands for Ubuntu 20.04 are as follows (feel free to simply copy and paste them):
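The installation commands did not survive in this copy of the post; based on Microsoft's published BlobFuse setup steps for Ubuntu 20.04, they look roughly like this:

```shell
# Register the Microsoft package repository, then install BlobFuse and FUSE (Ubuntu 20.04)
wget https://packages.microsoft.com/config/ubuntu/20.04/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update
sudo apt-get install -y blobfuse fuse
```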
Use a ramdisk for the temporary path (Optional Step)
The following example creates a ramdisk of 16 GB and a directory for BlobFuse. Choose the size based on your needs. This ramdisk allows BlobFuse to open files up to 16 GB in size.
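The commands for this step are missing from this copy of the article; a sketch following BlobFuse's documented ramdisk setup (the mount point and directory names are illustrative):

```shell
# Create a 16 GB ramdisk and a temporary cache directory for BlobFuse
sudo mkdir /mnt/ramdisk
sudo mount -t tmpfs -o size=16g tmpfs /mnt/ramdisk
sudo mkdir /mnt/ramdisk/blobfusetmp
# Make the cache directory writable by the user who will run blobfuse
sudo chown $(whoami) /mnt/ramdisk/blobfusetmp
```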
You can authorize access to your storage account by using the account access key, a shared access signature, a managed identity, or a service principal. Authorization information can be provided on the command line, in a config file, or in environment variables. For details, see Valid authentication setups in the BlobFuse readme.
For example, suppose you are authorizing with the account access keys and storing them in a config file. The config file should have the following format:
Please prepare the following file in an editor of your choice. You will find the values for accountName and accountKey in the Azure portal; the container name is the same as in the example above. The accountName is the name of your storage account, not the full URL.
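The example config file referred to above is missing from this copy of the article. BlobFuse's documented key-based config format is three lines; the account values below are placeholders, and the container name shown is the one Azure Diagnostic Settings typically creates for Postgres logs:

```
accountName <your storage account name>
accountKey <your storage account key>
containerName insights-logs-postgresqllogs
```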
Please navigate to your storage account in the portal and then choose Access keys page:
Copy the accountName and accountKey and paste them into the file. Copy the content of your file into the fuse_connection.cfg file in your home directory, then mount your storage account container onto a directory in your VM:
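The mount command itself was lost in this copy; with BlobFuse it looks roughly like this (the mount point and the ramdisk cache path follow the earlier examples, and the config file path is an assumption):

```shell
# Mount the storage container onto ~/mycontainer using the config prepared above
mkdir -p ~/mycontainer
sudo blobfuse ~/mycontainer \
  --tmp-path=/mnt/ramdisk/blobfusetmp \
  --config-file=/home/pgadmin/fuse_connection.cfg \
  -o attr_timeout=240 -o entry_timeout=240 -o negative_timeout=120 -o allow_other
```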
sudo -i
cd /home/pgadmin/mycontainer/
ls   # check if you see the container mounted
# Please use the Tab key for directory autocompletion; do not copy and paste!
cd resourceId=/SUBSCRIPTIONS/<your subscription id>/RESOURCEGROUPS/PG-WORKSHOP/PROVIDERS/MICROSOFT.DBFORPOSTGRESQL/FLEXIBLESERVERS/PSQLFLEXIKHLYQLERJGTM/y=2022/m=06/d=16/h=09/m=00/
head PT1H.json # to check if file is not empty
At this point you should be able to see some logs being generated.
Step 6 Install pgBadger on the VM
Now we need to install the pgBadger tool on the VM. For Ubuntu, simply use the command below:
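The command is missing from this copy of the article; on Ubuntu, pgBadger is available from the standard repositories:

```shell
sudo apt-get install -y pgbadger
pgbadger --version   # verify the installation
```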
Choose the file you want to generate the pgBadger report from and go to the directory where the chosen PT1H.json file is stored. For instance, to generate a report from 2022-05-23, 9 o'clock, you need to go to the following directory:
cd /home/pgadmin/mycontainer/resourceId=/SUBSCRIPTIONS/***/RESOURCEGROUPS/PG-WORKSHOP/PROVIDERS/MICROSOFT.DBFORPOSTGRESQL/FLEXIBLESERVERS/PSQLFLEXIKHLYQLERJGTM/y=2022/m=05/d=23/h=09/m=00
Since PT1H.json is a JSON file and the Postgres log lines are stored in the message and statement values of the JSON, we need to extract the logs first. The most convenient tool for the job is jq, which you can install on Ubuntu using the following command:
sudo apt-get install jq -y
Once jq is installed, we need to extract the Postgres log from the JSON file and save it in another file (PT1H.log in this example):
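The extraction command itself is missing from this copy of the post. A sketch of the idea: pull the message field out of each diagnostic record with jq, then run pgBadger over the result with the same prefix configured in Step 1 (the exact JSON field layout may differ slightly in your diagnostic logs):

```shell
# Extract the Postgres log lines from the diagnostic JSON into a plain log file
jq -r '.properties.message' PT1H.json > PT1H.log

# Generate the HTML report; --prefix must match log_line_prefix from Step 1
pgbadger --prefix '%t %p %l-1 db-%d,user-%u,app-%a,client-%h ' PT1H.log -o pgbadger_report.html
```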
In our daily lives we engage across multiple communication channels depending on the time and place. It’s no different when we engage in a sales conversation. Customers and prospects are busy people who don’t always have the time to join a booked meeting, take a call or even engage via e-mail. Yet when the need arises, we want them to be able to get in touch with ease.
Now in Dynamics 365 Sales we have made connecting with customers even easier by adding an SMS communication channel for our users. This channel provides another option in the seller's kit bag to aid connection.
It’s a convenient and non-intrusive option to send reminders, provide quick updates, or respond to customer queries.
Today, most customers already collaborate with their key contacts and business decision makers via various chat channels, including SMS. However, these conversations often happen on sellers' personal devices or via custom SMS platforms that need to be connected and set up in a CRM application. With this new update, sellers can get started quickly, knowing the conversation is tracked so the team stays informed. We can stop data silos and reduce IT spend on activation.
Let’s look at how this latest update can help engage customers:
Connect with customers on their terms. Allow sellers to interact in a way that suits their customer while maintaining a full history tracked back into the Dynamics 365 Sales application.
Improve team visibility. Help sellers and supporting teams engage with the full context on conversations enabling better relationship building and removing information repetition.
Maintain continuity. New account teams and sellers can quickly pick up the history with stakeholders without missing a beat.
Inform next steps. With connected conversation history a sales team can accurately plan what actions to take to close deals faster.
Organizations will be able to:
Ensure all the touchpoints sellers have with their customers are captured as part of the sales application for better tracking and planning.
Enable sellers to spend less effort managing multiple conversations with all their key contacts.
Let’s dive in to see our key capabilities and learn how implementing this across sales scenarios will benefit your organization.
What are the capabilities available to the users?
We are adding the SMS communication channel as a new activity record in Dynamics 365 Sales. It will power several workflows, including:
Send and receive SMS messages using the chat editor from various touchpoints like the up next widget in the sales accelerator workspace, contact or lead forms. You can use the chat editor to send manually typed messages or invoke pre-created templates to send to the customer.
SMS chat editor
Add automated/manual SMS step in a sequence – While creating sequences, add SMS as a manual or automated step. This enables sellers to send reminders or quick updates to key contacts as part of the sales process.
Adding SMS step to a sequence
Create SMS templates for personal use or to share with the organization based on permissions. As a user, you can create templates which are best suited to your relevant SMS-related scenarios. These templates can be:
Invoked manually from the SMS chat editor in context of specific entities, such as a contact or an opportunity.
Additionally, you can leverage the personalization capability while creating the templates, to add a personal touch to the messages.
Creating an SMS template
View the conversation details on the entity timeline. Often, multiple members of the sales team talk to the same customer as part of a sales process. This creates communication gaps within the sales team because not everyone is aware of the other, parallel conversations. With this capability, all the conversations happening with different team members will be captured as “conversation blocks” on the entity timeline. All team members can refer to the timeline for information. They can also expand each block to view the details of each customer conversation in the context of the entity.
Viewing SMS conversation on the entity timeline
How to get started:
Administrators will need to set up the SMS provider within the Dynamics 365 Sales application and assign numbers to enable additional communication channels. Once done, sellers or sales team can leverage the assigned number to send and receive SMS.
Let’s look at the key steps involved in the process:
Set up using our current providers. The account will then be used to configure the SMS channel within the Dynamics 365 Sales application.
Connect to Dynamics 365 Sales. Administrators can then easily associate the SMS channel provider within the Dynamics 365 Sales application by following the setup wizard.
Assign sellers. Once the provider configuration is completed, the admin or sales manager can assign the available SMS numbers to their sales team or specific sellers. Once the sellers log in with their credentials, they will be able to leverage the assigned numbers to use the SMS capabilities.
Number assignment
Start today with additional communication channels:
Reduce the time and effort your sellers are putting into managing, tracking and entering relevant engagement details into CRM.
Help existing sellers and sales teams be more productive and have more efficient conversations.
Help new starters quickly get up to speed on customer conversation history.
We’re always looking for feedback and would like to hear from you. Please head to the Dynamics 365 Sales – Forums, Blogs, Support to start a discussion, ask questions, and tell us what you think!
If you are not yet a Dynamics 365 Sales customer, check out the Dynamics 365 Sales webpage where you can take a guided tour or get a free 30-day trial.
Expect your apps to work on Windows 11. If you still need extra reassurance, test only your most critical apps with Test Base for Microsoft 365. This efficient and cost-effective testing service is available today to all IT pros, along with developers and independent software vendors.
We hope you’re already experiencing the safer, smoother, and more collaborative work environment with the new Windows 11 features. We’re also here to help you ensure business continuity by maintaining high application compatibility rates and supporting you in testing the mission-critical applications that you manage. Many of you have traditional (read legacy) testing systems that are provisioned in on-premises lab infrastructure. This setup has a myriad of challenges: limited capacity, manual setup and configuration, cost associated with third-party software, etc. That’s why we created Test Base for Microsoft 365. It’s a cloud-based testing solution that allows IT pros and developers to test your applications against pre-release and in-market versions of Windows and Office in a Microsoft-managed environment. Scale your testing, use data-driven insights, and ensure fewer deployment issues and helpdesk escalations with targeted testing of critical apps.
Let’s take a closer look at how you can leverage Test Base to optimize your testing with:
Proactive issue discovery with insights
Automated testing against a large matrix
Worry-free interoperability assurance
Support throughout the entire validation lifecycle
Proactive issue discovery with insights
If you’re on a mission to be more proactive with issue detection, give Test Base a try. Do you wish to have more time to test your apps before the monthly Windows updates go out? We know reactive issue detection is stressful after your end users have already been impacted. So why not leverage prebuilt test scripts in our smooth onboarding experience! With that, let Test Base perform the install, launch, close, and uninstall actions on your apps 30 times for extra assurance. Get helpful reports and know that your apps won’t have any foundational issues on the new version of Windows.
Use prebuilt or personalized scripts to test updates
Here’s how you can use the prebuilt test scripts and get actionable insights:
Click on the Edit Package tab and select a pre-generated test script from the left-hand side pane.
Follow the step-by-step onboarding experience from there!
The following image shows the Package Preview step of the onboarding experience, illustrating a custom script to test the installation of an application. You can also choose to write your own functional tests to validate the specific functionality of your applications.
Pre-generated template scripts available on the New Package page inside Test Base
After you configure your test matrix on the following screen, we automatically test your applications against Windows updates. You’ll also get an execution report and summary of any failures. Use this report to see data on your applications’ reliability, memory, and CPU utilization across all your configured tests. At the end, see Event Trace Logs (ETLs) and a video recording of the test execution to easily debug any issues that you find.
Script execution report in the test results summary in Test Base
Compare CPU usage for feature updates
As part of the test results, view CPU utilization insights for your applications during Windows feature updates. Within the same onboarding experience in Test Base > New Package > Create New Package Online > Test Matrix tab, you can view a side-by-side analysis between your configured run and a baseline run. Select any of the familiar OSs as the baseline, and it will appear below your test graph. In the following example, the tested December monthly security build (2022.12 B) shows up on the top, while the baseline November monthly security build (2022.11 B) is listed on the bottom. The results are summarized above both charts with a regression status and details.
Side-by-side CPU utilization analysis of a test run against the baseline inside Test Base
How do we determine that? We check for a statistically significant difference in CPU usage for processes between the baseline and target run. A run is regressed if one or more relevant processes experiences a statistically significant increase at one of our observed percentiles. Use these insights for faster regression detection and to ensure enough lead time to fix any issues you might find. Learn more about the CPU regression analysis in our official documentation.
Automated testing against a large matrix
Can you identify with managing scores of devices that run multiple different versions of Windows? Then you might worry about eventually facing an unexpected compatibility issue between Windows and another Microsoft first-party app update. Test Base is integrated with the Windows build system, so we can help you test your apps against builds as soon as they are available and before they are released to Windows devices. Use our comprehensive OS coverage and automated testing to detect potential issues as early as possible, improve your SLA, and reduce labor costs.
Select New Package from the left-hand side menu.
Within the onboarding guide, click on the Test Matrix page.
Select the OS update type from among security update, feature update, or both.
Use the drop-down menu to select all applicable OS versions (either in-market or pre-release) that you want to test.
Optional: Select Inside Channel if you want to include tests on the latest available features.
Choose your OS baseline for insight.
Note: To choose a baseline OS, evaluate which OS version your organization is currently using. This will offer you more regression insights. You can also leave it empty if you prefer to focus on new feature testing.
The Test Base testing matrix of in-market and pre-release security and feature updates
That’s how you can easily automate testing against in-market and pre-release Windows security and feature updates. Start today with these considerations:
To always get the latest features available, select a Windows Insider channel to run your tests on.
In case you still don’t find the OS you want to use as a baseline, let us know!
Worry-free interoperability assurance
We know your users have multiple applications running on their devices at the same time: mail app, browser, collaboration tools, etc. Sometimes, updates affect the interaction between applications and the OS, which could then affect the application’s performance.
Do you find yourself wishing for better predictability of regressions from Microsoft product updates or more lead time to fix occasional resulting issues? Use our interoperability tests with detailed reliability signals just for that. Find them within the same Test Base onboarding experience:
Select the New Package flow.
Navigate to the Configure test tab.
Toggle on Pre-install Microsoft apps under Test type.
The Pre-install Microsoft apps option toggled on in the New package configuration in Test Base
For interoperability assurance, you’d want to look for any signals of regressions after running the test. From the test summary, select the run you want to examine closer, and navigate to the Reliability tab. Here’s what a successful test would look like, comparing the number of crashes between the target test run and the baseline.
Detailed reliability signals for feature update test results in Test Base
With Test Base, you can conduct pre-release testing against monthly Microsoft 365 and Office 365 updates for extra confidence that you are covered across a broad range of Microsoft product updates. Leverage the automation in Test Base to schedule monthly tests whenever the monthly updates become available! For that, learn how to pick security testing options in the Set test matrix step in our official documentation.
Support throughout the entire validation lifecycle
The best part about Test Base is that you can use it throughout the entire Windows and Microsoft 365 apps validation lifecycle, supported by people and tools you trust. Use our support not only for the annual Windows updates, but also for the monthly security updates.
If you’re a developer, take advantage of integrations with familiar development tools like Azure DevOps and GitHub to test as you develop. That way, you can catch regressions before they are introduced to end users. Check out our documentation on how you can integrate Test Base into your Continuous Integration/Continuous Delivery pipeline.
Whether you’re an app developer or an IT pro, work with an App Assure engineer to get support with remediating any issues you find.
The table below outlines some of the key benefits that Test Base provides across the end-to-end validation lifecycle: from testing environment, to testing tools and software, and, finally, to testing services.
Testing environment
Testing tools & software
Testing services
Elastic cloud capacity
Access to pre-release Windows and Microsoft 365 Apps builds
Sign up for a free trial of Test Base to try out these features and optimize your testing process! To get started, simply sign up for an account via Azure or visit our website for more information.
Check out more information and best practices on how to integrate Test Base into your Continuous Integration/Continuous Delivery pipeline in our GitHub documentation. We’ve also put together some sample packages that you can use to test out the onboarding process:
In this guest blog post, Omri Eytan, CTO of odix, discusses how businesses relying on Microsoft 365 can protect themselves from file-based attacks with Filewall, available via Microsoft Azure Marketplace.
Many theses have been written about the latest pandemic's impact on our working habits and preferences, and how companies had to adopt new processes to keep business alive.
We see numerous reports describing the work-from-home trend becoming the new reality of the hybrid working environment. This had a huge impact on IT departments, which had to enable a secure yet transparent working environment for all employees, wherever they work. A McAfee report showed a 50 percent increase in cloud use across enterprises in all industries during COVID, while the number of threat actors targeting cloud collaboration services increased 630 percent over the same period!
Furthermore, the reports highlighted the increase in the number of threats mainly targeting collaboration services such as Microsoft 365.
Microsoft 365 security approaches: attack channel vs. attack vector protection
To begin with, businesses must take responsibility for their cloud SaaS deployments when it comes to security, data backup, and privacy. Microsoft 365 business applications (Exchange Online, OneDrive, SharePoint, and Teams) are no different.
Security solutions offering protection for Microsoft 365 users can be divided into two main methodologies:
Protecting attack channel: dedicated security solutions designed to protect users from various attack vectors within a specific channel such as email. The email channel has many third-party solutions working to protect users against various attack vectors such as phishing, spam, and malicious attachments.
Protecting attack vector: advanced security solutions aiming to protect users from a specific attack vector across multiple channels such as email, OneDrive, and SharePoint.
These approaches remind us of an old debate when purchasing IT products: Should the company compromise a bit on innovation and quality and purchase a one-stop-shop type of solution, or is it better to choose multiple best-of-breed solutions and be equipped with the best technology available?
Security solutions for Microsoft 365 are no different.
Protecting Microsoft 365 users against file-based attacks
This article focuses on one of the top attack vectors hackers use: the file-based attack vector. Research shows the top file types used to embed malware in channels like email are commonly used, like Word, Excel, and PDF. Hackers use these file types because people tend to click and open them naturally. When malicious code is embedded well (e.g., in nested files), the file bypasses common anti-malware solutions such as anti-virus and sandbox methods that scan files for threats and block the files if malware was detected.
Deep file analysis (DFA) technology, introduced by odix, was designed to handle all commonly used files and offers a detectionless approach. With DFA, all malware, including zero-day exploits, is prevented and the user gets a safe copy of the original file.
What are DFA and CDR?
DFA or CDR (content disarm and reconstruction) describes the process of creating a safe copy of an original file by including only the safe elements from the original file. CDR focuses on verifying the validity of the file structure on the binary level and disarms both known and unknown threats. The detectionless approach ensures all files that complete the sanitization process successfully are malware-free copies and can be used safely.
odix, an Israel-based cybersecurity company driving innovation in content disarm and reconstruction technology, developed the FileWall solution to complement and strengthen existing Microsoft security systems. FileWall, available in Microsoft AppSource and Azure Marketplace, helps business users easily strengthen Microsoft 365 security within a few clicks.
How FileWall works with Microsoft security technology
FileWall integrates with the Microsoft Graph security API and Microsoft Azure Sentinel, bringing malware protection capabilities with an essential added layer of deep file analysis technology containing CDR proprietary algorithms.
FileWall blocks malicious elements embedded in files across Microsoft 365 applications including Exchange Online, SharePoint, OneDrive, and Teams. The unique DFA process is also extremely effective in complex file scenarios such as nested files and password-protected attachments where traditional sandbox methods could miss or result in lengthy delays and disruption of business processes.
Empowering Microsoft 365 security: granular filtering per channel
FileWall includes a modern admin console so the Microsoft 365 administrator can set security policies and gain overall control of all files and attachments across Exchange Online, SharePoint, OneDrive, and Teams. The FileWall file type filter lets the admin define which file types are permitted in the organization and which should be blocked. This minimizes the attack surface the organization is exposing via email and collaboration services by eliminating the threat vectors available in certain file types.
The type filter has three main controls:
On/off: enabling or disabling the filter functionality on all file types
Work mode: the ability to create preset lists of permitted and non-permitted file types for specific users within the organization
Default settings: a suggested default policy by FileWall which includes 204 file types categorized as dangerous, including executable files (*.exe), Windows batch files (*.bat), Windows links (*.lnk), and others
How FileWall complements Defender’s safe attachments
As a native security solution within the Microsoft 365 deployment, FileWall doesn’t harm productivity. Consequently, all FileWall’s settings have been configured to complement Microsoft 365 Defender. Combining the two products provides high levels of security with multi antivirus, sandbox, and CDR capabilities. While the sandbox can manage executables and active content, FileWall handles all commonly used files such as Microsoft Office, PDF, and images. As most organizational traffic consists of non-executable documents, this method can reduce sandbox load by 90 percent to 95 percent, lowering total costs and improving the average latency.
FileWall enhances the existing Microsoft type filter and allows additional granular controls over the types of files that are allowed to enter the organization while enforcing these restrictions on nested and embedded files as well.
Call for Certified Microsoft CSPs who wish to increase revenues
Microsoft CSPs can bundle FileWall via Microsoft Partner Center for their customers. odix offers generous margins to Microsoft CSPs who join the FileWall partner program.
FileWall-certified partners are eligible for a free NFR license according to odix terms of use.
Today marks a significant shift in endpoint management and security. We are launching the Microsoft Intune Suite, which unifies mission-critical advanced endpoint management and security solutions into one simple bundle.
Join Microsoft at GTC, a global technology conference running March 20 – 23, 2023, to learn how organizations of any size can power AI innovation with purpose-built cloud infrastructure from Microsoft.
Microsoft’s Azure AI supercomputing infrastructure is uniquely designed for AI workloads and helps build and train some of the industry’s most advanced AI solutions. From data preparation to model and infrastructure performance management, Azure’s comprehensive portfolio of powerful and massively scalable GPU-accelerated virtual machines (VMs) and seamless integration with services like Azure Batch and open-source solutions helps streamline management and automation of large AI models and infrastructure.
Attend GTC to discover how Azure AI infrastructure optimized for AI performance can deliver speed and scale in the cloud and help you reduce the complexity of building, training, and bringing AI models into production. Register today! GTC Developer Conference is a free online event.
Microsoft sessions at NVIDIA GTC
Add the below Microsoft sessions at GTC to your conference schedule to learn about the latest Azure AI infrastructure and dive deep into a variety of use cases and technologies.
Nidhi Chappell, General Manager, Azure HPC, AI, SAP and Confidential Computing
Kathleen Mitford, Corporate Vice President, Azure Marketing, Microsoft
Manuvir Das, Head of Enterprise Computing, NVIDIA
Azure’s purpose-built AI infrastructure is enabling leading organizations in AI to build a new era of innovative applications and services. The convergence of cloud flexibility and economics, with advances in cloud performance, is paving the way to accelerate AI initiatives across simulations, science, and industry. Whether you need to scale to 80,000 cores for MPI workloads, or you’re looking for AI supercomputing capabilities, Azure can support your needs. Learn more about Azure’s AI platform and our latest updates, and hear about customer experiences.
Microsoft offers some of the most powerful and massively scalable virtual machines, optimized for AI workloads. Join us for an in-depth look at the latest updates for Azure’s ND series based on NVIDIA GPUs, engineered to deliver a combination of high-performance, interconnected GPUs working in parallel that can help you reduce complexity, minimize operational bottlenecks, and deliver reliability at scale.
Next-Generation AI for Improving Building Security and Safety
Adina Trufinescu, Senior Program Manager, Azure Specialized Compute, Microsoft
Computer Vision – AI Video Analytics
Deep Learning Institute Workshops and Labs
We are proud to host NVIDIA’s Deep Learning Institute (DLI) training at NVIDIA GTC. Attend full-day, hands-on, instructor-led workshops or two-hour free training labs to get up to speed on the latest technology and breakthroughs. Hosted on Microsoft Azure, these sessions enable and empower you to leverage NVIDIA GPUs on the Azure platform to solve the world’s most interesting and relevant problems.
Whether your project is big or small, local or global, Microsoft Azure is empowering companies worldwide to push the boundaries of AI innovation. Learn how you can make AI your reality.
This month, we’re bringing new AI-powered capabilities to Microsoft Teams Premium, helping keep everyone aligned with Microsoft Viva Engage, and sharing new Loop components in Whiteboard to help your team collaborate in sync.