This NCPW, let’s talk about impersonation scams
This article was originally posted by the FTC. See the original article here.
Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.
This article is contributed. See the original author and article here.
Hi, I am Samson Amaugo, a Microsoft MVP and a Microsoft Learn Student Ambassador. I love writing and talking about all things DotNet. I am currently a student at the Federal University of Technology, Owerri. You can connect with me on LinkedIn via My LinkedIn Profile.
Have you ever thought about going through all your GitHub repositories, taking note of the languages used, aggregating them, and visualizing the results in Excel?
Well, that is what this post is all about, except you don’t have to do it manually in a mundane way.
With the aid of the Octokit GraphQL library and the Microsoft Graph .NET SDK, you can code up a cool tool that automates this process.
To build out this project in a sandboxed environment with the appropriate permissions, I signed up on the Microsoft 365 Dev Center to create an account that I could use to interact with Microsoft 365 products.
The outcome of the project can be seen below.
To get started, create an appsettings.json file in the root of the project with the following content:
{
  "AzureClientID": "eff50f7f-6900-49fb-a245-168fa53d2730",
  "AzureClientSecret": "vUx8Q~plb15_Q~2ZscyfxKnR6VrWm634lIYVRb.V",
  "AzureTenantID": "33f6d3c4-7d26-473b-a7f0-13b53b72b52b",
  "GitHubClientSecret": "ghp_rtPprvqRPlykkYofA4V36EQPNV4SK210LNt7",
  "NameOfNewFile": "chartFile.xlsx"
}
You will need to replace the credentials above with your own.
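For context, AzureClientID, AzureClientSecret, and AzureTenantID come from an Azure AD app registration, and GitHubClientSecret is a GitHub personal access token. If you want to create the app registration from the command line, here is a hedged sketch using the Azure CLI (the display name is a placeholder, and the Microsoft Graph application permission the app needs, such as Files.ReadWrite.All, still has to be granted admin consent in the portal):
# Register an app and create a client secret (illustrative; copy the returned appId, password, and tenant into appsettings.json)
az ad app create --display-name "GitHubLanguagesToExcel"
az ad app credential reset --id <appId-from-previous-output>
az account show --query tenantId -o tsv # prints the AzureTenantID value
Next, a small Config class reads these values from appsettings.json at runtime: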
using Microsoft.Extensions.Configuration;

namespace MicrosoftGraphDotNet
{
    internal class Config
    {
        // Define properties to hold configuration values
        public string? AzureClientId { get; set; }
        public string? AzureClientSecret { get; set; }
        public string? AzureTenantId { get; set; }
        public string? GitHubClientSecret { get; set; }
        public string? NameOfNewFile { get; set; }

        // Constructor to read configuration values from appsettings.json file
        public Config()
        {
            // Create a new configuration builder and add appsettings.json as a configuration source
            IConfiguration config = new ConfigurationBuilder()
                .SetBasePath(Directory.GetCurrentDirectory())
                .AddJsonFile("appsettings.json")
                .Build();

            // Bind configuration values to the properties of this class
            config.Bind(this);
        }
    }
}
In the Program.cs file, I imported the namespaces I would need by adding the code below:
// Import necessary packages
using System.Text.Json;
using Octokit.GraphQL;
using Octokit.GraphQL.Core;
using Octokit.GraphQL.Model;
using Azure.Identity;
using Microsoft.Graph;
using MicrosoftGraphDotNet;
// retrieve the config
var config = new Config();
// Define user agent and connection string for GitHub GraphQL API
var userAgent = new ProductHeaderValue("YOUR_PRODUCT_NAME", "1.0.0");
var connection = new Connection(userAgent, config.GitHubClientSecret!);
// Define GraphQL query to fetch repository names and their associated programming languages
var query = new Query()
    .Viewer.Repositories(
        isFork: false,
        affiliations: new Arg<IEnumerable<RepositoryAffiliation?>>(
            new RepositoryAffiliation?[] { RepositoryAffiliation.Owner })
    ).AllPages().Select(repo => new
    {
        repo.Name,
        Languages = repo.Languages(null, null, null, null, null).AllPages().Select(language => language.Name).ToList()
    }).Compile();
// Execute the GraphQL query and deserialize the result into a list of repositories
var result = await connection.Run(query);
var languages = result.SelectMany(repo => repo.Languages).Distinct().ToList();
var repoNameAndLanguages = JsonSerializer.Deserialize<List<Repository>>(JsonSerializer.Serialize(result));
// Define a class to hold repository data
class Repository
{
    public string? Name { get; set; }
    public List<string>? Languages { get; set; }
}
// Define credentials and access scopes for Microsoft Graph API
var tokenCred = new ClientSecretCredential(
config.AzureTenantId!,
config.AzureClientId!,
config.AzureClientSecret!);
var graphClient = new GraphServiceClient(tokenCred);
// Define the file name and create a new Excel file in OneDrive
var driveItem = new DriveItem
{
Name = config.NameOfNewFile!,
File = new Microsoft.Graph.File
{
}
};
var newFile = await graphClient.Drive.Root.Children
.Request()
.AddAsync(driveItem);
// Define the address of the Excel table and create a new table in the file
var address = "Sheet1!A1:" + (char)('A' + languages.Count) + repoNameAndLanguages?.Count();
var hasHeaders = true;
var table = await graphClient.Drive.Items[newFile.Id].Workbook.Tables
.Add(hasHeaders, address)
.Request()
.PostAsync();
The code that represents the data above can be seen below:
// Define the first row of the Excel table with the column headers
var firstRow = new List<string> { "Repository Name" }.Concat(languages).ToList();
// Convert the repository data into a two-dimensional list
List<List<string>> totalRows = new List<List<string>> { firstRow };
foreach (var value in repoNameAndLanguages!)
{
    var row = new List<string> { value.Name! };
    foreach (var language in languages)
    {
        row.Add(value.Languages!.Contains(language) ? "1" : "0");
    }
    totalRows.Add(row);
}
// Add a new row to the table with the total number of repositories for each language
var languageTotalRow = new List<string>();
// Add "Total" as the first item in the list
languageTotalRow.Add("Total");
// Loop through each programming language in the header row
for (var languageIndex = 1; languageIndex < totalRows[0].Count; languageIndex++)
{
    // Set the total count for this language to 0
    var languageTotal = 0;
    // Loop through each repository in the table
    for (var repoIndex = 1; repoIndex < totalRows.Count; repoIndex++)
    {
        // If the repository uses this language, increment the count
        if (totalRows[repoIndex][languageIndex] == "1")
        {
            languageTotal++;
        }
    }
    // Add the total count for this language to the languageTotalRow list
    languageTotalRow.Add(languageTotal.ToString());
}
// Add the languageTotalRow list to the bottom of the table
totalRows.Add(languageTotalRow);
// Create a new WorkbookTableRow object with the totalRows list serialized as a JSON document
var workbookTableRow = new WorkbookTableRow
{
Values = JsonSerializer.SerializeToDocument(totalRows),
Index = 0,
};
// Add the new row to the workbook table
await graphClient.Drive.Items[newFile.Id].Workbook
    .Tables[table.Id].Rows
    .Request()
    .AddAsync(workbookTableRow);
// Add a new chart to the worksheet with the language totals as data
await graphClient.Drive.Items[newFile.Id].Workbook.Worksheets["Sheet1"].Charts
.Add("ColumnClustered", "Auto", JsonSerializer.SerializeToDocument($"Sheet1!B2:{(char)('A' + languages.Count)}2, Sheet1!B{repoNameAndLanguages.Count() + 3}:{(char)('A' + languages.Count)}{repoNameAndLanguages.Count() + 3}"))
.Request()
.PostAsync();
// Print the URL of the new file to the console
Console.WriteLine(newFile.WebUrl);
And that’s the end of this article. I hope you enjoyed it and got to see how I used Microsoft Graph .NET SDK to automate this process.
To learn more about Microsoft Graph API and SDKs:
Microsoft Graph https://developer.microsoft.com/graph
Develop apps with the Microsoft Graph Toolkit – Training
Hack Together: Microsoft Graph and .NET
This is a hackathon for beginners to get started building scenario-based apps using .NET and Microsoft Graph. In this hackathon, you will kick-start learning how to build apps with Microsoft Graph and develop apps based on the given Top Microsoft Graph Scenarios, with a chance to win exciting prizes while meeting Microsoft Graph Product Group Leaders, Cloud Advocates, MVPs, and Student Ambassadors. The hackathon starts on March 1st and ends on March 15th. Participants are encouraged to follow the Hack Together Roadmap for a successful hackathon.
Demo/Sample Code
You can access the code for this project at https://github.com/sammychinedu2ky/MicrosoftGraphDotNet
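If you want to try it locally, here is a hedged sketch (assuming you have the .NET SDK installed, the project sits at the repository root, and appsettings.json is filled in as described above):
# Clone the sample repository and run it
git clone https://github.com/sammychinedu2ky/MicrosoftGraphDotNet
cd MicrosoftGraphDotNet
dotnet run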
This article is contributed. See the original author and article here.
Happy Friday, MTC’ers! I hope you’ve had a good week and that March is treating you well so far. Let’s dive in and see what’s going on in the community this week!
MTC Moments of the Week
First up, we are shining our MTC Member of the Week spotlight on a community member from Down Under, @Doug_Robbins_Word_MVP! True to his username, Doug has been a valued MVP for over 20 years and an active contributor to the MTC across several forums, including M365, Word, Outlook, and Excel. Thanks for being awesome, Doug!
Jumping to events, on Tuesday, we closed out February with a brand new episode of Unpacking Endpoint Management all about Group Policy migration and transformation. Our hosts @Danny Guillory and @Steve Thomas (GLADIATOR) sat down for a fun policy pizza party with guests @mikedano, @Aasawari Navathe, @LauraArrizza, and @Joe Lurie from the Intune engineering team. We will be back next month to have a conversation about Endpoint Data Loss Prevention and Zero Trust – click here to RSVP and add it to your calendar!
Then on Wednesday, we had our first AMA of the new month with the Microsoft Syntex team. This event focused on answering the community’s questions and listening to user feedback about Syntex’s new document processing pay-as-you-go metered services with a stacked panel of experts including @Ian Story, @James Eccles, @Ankit_Rastogi, @jolenetam, @Kristen_Kamath, @Shreya_Ganguly, and @Bill Baer. Thank you to everyone who contributed to the discussion, and stay tuned for more Syntex news!
And last but not least, over on the Blogs, @Hung_Dang has shared a skilling snack for you to devour during your next break, where you can learn how to make the most of your time with the help of Windows Autopilot!
Unanswered Questions – Can you help them out?
Every week, users come to the MTC seeking guidance or technical support for their Microsoft solutions, and we want to help highlight a few of these each week in the hopes of getting these questions answered by our amazing community!
In the Exchange forum, @Ian Gibason recently set up a Workspace mailbox that is not populating the Location field when using Room Finder, leading to user requests being denied. Have you seen this before?
Perhaps you can help @infoatlantic over in the Intune forum, who’s looking for a way to sync a list of important support contacts to company owned devices.
Upcoming Events – Mark Your Calendars!
——————————–
For this week’s fun fact…
Did you know that the little wave-shaped blob of toothpaste you squeeze onto your toothbrush has a name? It’s called a “nurdle” (the exact origin of this term isn’t clear), but there was even a lawsuit in 2010 that Colgate filed against Glaxo (the maker of Aquafresh) to prohibit them from depicting the nurdle in their toothpaste packaging. Glaxo countersued, and the case was settled after a few months. The more you know!
Have a great weekend, everyone!
This article is contributed. See the original author and article here.
PgBadger is one of the most comprehensive Postgres troubleshooting tools available. It allows users to have insight into a wide variety of events happening in the database including:
You can see a sample pgBadger report here.
You can generate a pgBadger report from Azure Database for PostgreSQL Flexible Server in multiple ways:
*Coming soon!
In this article we will describe the first solution: using Diagnostic Settings and redirecting logs to a storage account. At the end of the exercise, we will have a storage account filled with logs from the Postgres Flexible Server and an operational VM with direct access to the logs stored in the storage account, as shown in the picture below:
generate pgBadger report from Azure Database for PostgreSQL Flexible Server
To be able to generate the report you need to configure the following items:
Navigate to the Server Parameters blade in the portal and modify the following parameters:
log_line_prefix = '%t %p %l-1 db-%d,user-%u,app-%a,client-%h ' # Please mind the space at the end!
log_lock_waits = on
log_temp_files = 0
log_autovacuum_min_duration = 0
log_min_duration_statement=0 # 0 is recommended only for test purposes, for production usage please consider much higher value, like 60000 (1 minute) to avoid excessive usage of resources
Adjust Postgres Server configuration
After the change, hit “Save”:
Save changed Postgres parameters
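If you prefer scripting this step instead of using the portal, the same parameters can be set with the Azure CLI. This is a hedged sketch; the resource group and server name below are placeholders taken from the example paths later in this article:
# Set the logging parameters via the Azure CLI (illustrative resource names)
az postgres flexible-server parameter set --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm --name log_line_prefix --value '%t %p %l-1 db-%d,user-%u,app-%a,client-%h '
az postgres flexible-server parameter set --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm --name log_lock_waits --value on
az postgres flexible-server parameter set --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm --name log_temp_files --value 0
az postgres flexible-server parameter set --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm --name log_autovacuum_min_duration --value 0
az postgres flexible-server parameter set --resource-group PG-WORKSHOP --server-name psqlflexikhlyqlerjgtm --name log_min_duration_statement --value 0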
Please keep in mind that the storage account needs to be created in the same region as the Azure Database for PostgreSQL Flexible Server. Please find the instructions here.
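For example, here is a hedged sketch of creating the storage account with the Azure CLI (the account name, resource group, and region are placeholders; pick the same region as your Flexible Server):
az storage account create --name pgbadgerlogs123 --resource-group PG-WORKSHOP --location westeurope --sku Standard_LRS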
In this blog post we will use Ubuntu 20.04 as an example, but nothing stops you from using an rpm-based system; the only difference will be in the way BlobFuse and pgBadger are installed.
Navigate to the Diagnostic settings page of your Azure Database for PostgreSQL Flexible Server instance in the Azure Portal and add a new diagnostic setting with the storage account as the destination:
Hit the Save button.
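The same diagnostic setting can also be created from the command line. This is a hedged sketch (both resource IDs are placeholders, and the log category for Flexible Server is assumed to be PostgreSQLLogs):
az monitor diagnostic-settings create --name pg-logs-to-storage \
  --resource "/subscriptions/<sub-id>/resourceGroups/PG-WORKSHOP/providers/Microsoft.DBforPostgreSQL/flexibleServers/<server-name>" \
  --storage-account "/subscriptions/<sub-id>/resourceGroups/PG-WORKSHOP/providers/Microsoft.Storage/storageAccounts/<storage-account-name>" \
  --logs '[{"category":"PostgreSQLLogs","enabled":true}]'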
In this section you will mount storage account to your VM using BlobFuse. This way you will see the logs on the storage account as standard files in your VM. First let’s download and install necessary packages. Commands for Ubuntu 20.04 are as follows (feel free to simply copy and paste the following commands):
wget https://packages.microsoft.com/config/ubuntu/20.04/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update -y
For other distributions please follow the official documentation.
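With the Microsoft package repository registered, BlobFuse itself can then be installed on Ubuntu (assuming the blobfuse package from that repository):
sudo apt-get install blobfuse -y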
The following example creates a ramdisk of 16 GB and a directory for BlobFuse. Choose the size based on your needs. This ramdisk allows BlobFuse to open files up to 16 GB in size.
sudo mkdir /mnt/ramdisk
sudo mount -t tmpfs -o size=16g tmpfs /mnt/ramdisk
sudo mkdir /mnt/ramdisk/blobfusetmp
sudo chown <youruser> /mnt/ramdisk/blobfusetmp
You can authorize access to your storage account by using the account access key, a shared access signature, a managed identity, or a service principal. Authorization information can be provided on the command line, in a config file, or in environment variables. For details, see Valid authentication setups in the BlobFuse readme.
For example, suppose you are authorizing with the account access keys and storing them in a config file. The config file should have the following format:
accountName myaccount
accountKey storageaccesskey
containerName insights-logs-postgresqllogs
Please prepare this file in an editor of your choice. You will find the values for accountName and accountKey in the Azure Portal; the container name is the same as in the example above. The accountName is the name of your storage account, not the full URL.
Please navigate to your storage account in the portal and then choose Access keys page:
Copy the accountName and accountKey and paste them into the file. Save the content as the fuse_connection.cfg file in your home directory, then mount your storage account container onto a directory in your VM:
vi fuse_connection.cfg
chmod 600 fuse_connection.cfg
mkdir ~/mycontainer
sudo blobfuse ~/mycontainer --tmp-path=/mnt/ramdisk/blobfusetmp --config-file=/home/<youruser>/fuse_connection.cfg -o attr_timeout=240 -o entry_timeout=240 -o negative_timeout=120
sudo -i
cd /home/<youruser>/mycontainer/
ls # check if you see container mounted
# Please use tab key for directory autocompletion; do not copy and paste!
cd resourceId=/SUBSCRIPTIONS/<your subscription id>/RESOURCEGROUPS/PG-WORKSHOP/PROVIDERS/MICROSOFT.DBFORPOSTGRESQL/FLEXIBLESERVERS/PSQLFLEXIKHLYQLERJGTM/y=2022/m=06/d=16/h=09/m=00/
head PT1H.json # to check if file is not empty
At this point you should be able to see some logs being generated.
Now we need to install pgBadger tool on the VM. For Ubuntu please simply use the command below:
sudo apt-get install -y pgbadger
For other distributions, please follow the official documentation.
Choose the file you want to generate the pgBadger report from and go to the directory where the chosen PT1H.json file is stored. For instance, to generate a report for 2022-05-23 at 9 o’clock, you need to go to the following directory:
cd /home/pgadmin/mycontainer/resourceId=/SUBSCRIPTIONS/***/RESOURCEGROUPS/PG-WORKSHOP/PROVIDERS/MICROSOFT.DBFORPOSTGRESQL/FLEXIBLESERVERS/PSQLFLEXIKHLYQLERJGTM/y=2022/m=05/d=23/h=09/m=00
Since PT1H.json is a JSON file and the Postgres log lines are stored in the message and statement values of the JSON, we need to extract the logs first. The most convenient tool for the job is jq, which you can install using the following command on Ubuntu:
sudo apt-get install jq -y
Once jq is installed, we need to extract the Postgres log from the JSON file and save it in another file (PT1H.log in this example):
jq -r '.properties | .message + .statement' PT1H.json > PT1H.log
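If you want a single report covering more than one hour, you can first combine the log lines from every hourly PT1H.json under the mounted container. This is a hedged sketch; the output file name is arbitrary:
# Extract the Postgres log lines from every hourly blob into one file
find /home/<youruser>/mycontainer -name PT1H.json -print0 | sort -z | xargs -0 -I{} jq -r '.properties | .message + .statement' {} > all_hours.log
You can then point pgbadger at all_hours.log instead of PT1H.log in the command below.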
Finally, we are ready to generate the pgBadger report:
pgbadger --prefix='%t %p %l-1 db-%d,user-%u,app-%a,client-%h ' PT1H.log -o pgbadgerReport.html
Now you can download your report either from the Azure Portal (your storage account) or by using the scp command:
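A hedged example of the scp approach (user name, VM address, and report path are placeholders):
# Copy the generated report from the VM to your local machine
scp azureuser@<vm-public-ip>:/home/<youruser>/mycontainer/<path-to-report>/pgbadgerReport.html .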
Happy Troubleshooting!
This article is contributed. See the original author and article here.
In our daily lives we engage across multiple communication channels depending on the time and place. It’s no different when we engage in a sales conversation. Customers and prospects are busy people who don’t always have the time to join a booked meeting, take a call or even engage via e-mail. Yet when the need arises, we want them to be able to get in touch with ease.
Now, in Dynamics 365 Sales, we have made connecting with customers even easier by adding an SMS communication channel for our users. This channel provides another option in the seller’s kit bag to aid connection.
It’s a convenient and non-intrusive option to send reminders, provide quick updates, or respond to customer queries.
Today, most customers are already collaborating with their key contacts and business decision makers via various chat channels, including SMS. However, these conversations often happen on sellers’ personal devices or via custom SMS platforms that need to be connected and set up in a CRM application. With this new update, sellers can get started quickly, knowing the conversation is tracked so the team stays informed. We can stop data silos and reduce IT spend on activation.
Organizations will be able to:
Let’s dive in to see our key capabilities and learn how implementing this across sales scenarios will benefit your organization.
We are adding the SMS communication channel as a new activity record in Dynamics 365 Sales. It will power several workflows, including:
Additionally, you can leverage the personalization capability while creating the templates, to add a personal touch to the messages.
Administrators will need to set up the SMS provider within the Dynamics 365 Sales application and assign numbers to enable additional communication channels. Once done, sellers or sales team can leverage the assigned number to send and receive SMS.
Let’s look at the key steps involved in the process:
Start today with additional communication channels:
Configure the SMS channel provider and start using SMS as part of your sales workflows: Engage with customers through text messages | Microsoft Learn
We’re always looking for feedback and would like to hear from you. Please head to the Dynamics 365 Sales – Forums, Blogs, Support to start a discussion, ask questions, and tell us what you think!
If you are not yet a Dynamics 365 Sales customer, check out the Dynamics 365 Sales webpage where you can take a guided tour or get a free 30-day trial.
*Available for Dynamics 365 Sales Enterprise and above. For pricing, please visit: Sales Pricing Compare Plans | Microsoft Dynamics 365
The post How to engage customers with the right communication channels appeared first on Microsoft Dynamics 365 Blog.
This article is contributed. See the original author and article here.
Expect your apps to work on Windows 11. If you still need extra reassurance, test only your most critical apps with Test Base for Microsoft 365. This efficient and cost-effective testing service is available today to all IT pros, along with developers and independent software vendors.
We hope you’re already experiencing the safer, smoother, and more collaborative work environment with the new Windows 11 features. We’re also here to help you ensure business continuity by maintaining high application compatibility rates and supporting you in testing the mission-critical applications that you manage. Many of you have traditional (read legacy) testing systems that are provisioned in on-premises lab infrastructure. This setup has a myriad of challenges: limited capacity, manual setup and configuration, cost associated with third-party software, etc. That’s why we created Test Base for Microsoft 365. It’s a cloud-based testing solution that allows IT pros and developers to test your applications against pre-release and in-market versions of Windows and Office in a Microsoft-managed environment. Scale your testing, use data-driven insights, and ensure fewer deployment issues and helpdesk escalations with targeted testing of critical apps.
Let’s take a closer look at how you can leverage Test Base to optimize your testing with:
If you’re on a mission to be more proactive with issue detection, give Test Base a try. Do you wish to have more time to test your apps before the monthly Windows updates go out? We know reactive issue detection is stressful after your end users have already been impacted. So why not leverage prebuilt test scripts in our smooth onboarding experience! With that, let Test Base perform the install, launch, close, and uninstall actions on your apps 30 times for extra assurance. Get helpful reports and know that your apps won’t have any foundational issues on the new version of Windows.
Here’s how you can use the prebuilt test scripts and get actionable insights:
The following image shows the Package Preview step of the onboarding experience, illustrating a custom script to test the installation of an application. You can also choose to write your own functional tests to validate the specific functionality of your applications.
Pre-generated template scripts available on the New Package page inside Test Base
After you configure your test matrix on the following screen, we automatically test your applications against Windows updates. You’ll also get an execution report and summary of any failures. Use this report to see data on your applications’ reliability, memory, and CPU utilization across all your configured tests. At the end, see Event Trace Logs (ETLs) and a video recording of the test execution to easily debug any issues that you find.
Script execution report in the test results summary in Test Base
As part of the test results, view CPU utilization insights for your applications during Windows feature updates. Within the same onboarding experience in Test Base > New Package > Create New Package Online > Test Matrix tab, you can view a side-by-side analysis between your configured run and a baseline run. Select any of the familiar OSs as the baseline, and it will appear below your test graph. In the following example, the tested December monthly security build (2022.12 B) shows up on the top, while the baseline November monthly security build (2022.11 B) is listed on the bottom. The results are summarized above both charts with a regression status and details.
Side-by-side CPU utilization analysis of a test run against the baseline inside Test Base
How do we determine that? We check for a statistically significant difference in CPU usage for processes between the baseline and target run. A run is regressed if one or more relevant processes experiences a statistically significant increase at one of our observed percentiles. Use these insights for faster regression detection and to ensure enough lead time to fix any issues you might find. Learn more about the CPU regression analysis in our official documentation.
Can you identify with managing scores of devices that run multiple different versions of Windows? Then you might worry about eventually facing an unexpected compatibility issue between Windows and another Microsoft first-party app update. Test Base is integrated with the Windows build system, so we can help you test your apps against builds as soon as they are available and before they are released to Windows devices. Use our comprehensive OS coverage and automated testing to detect potential issues as early as possible, improve your SLA, and reduce labor costs.
Note: To choose a baseline OS, evaluate which OS version your organization is currently using. This will offer you more regression insights. You can also leave it empty if you prefer to focus on new feature testing.
The Test Base testing matrix of in-market and pre-release security and feature updates
That’s how you can easily automate testing against in-market and pre-release Windows security and feature updates. Start today with these considerations:
We know your users have multiple applications running on their devices at the same time: mail app, browser, collaboration tools, etc. Sometimes, updates affect the interaction between applications and the OS, which could then affect the application’s performance.
Do you find yourself wishing for better predictability of regressions from Microsoft product updates or more lead time to fix occasional resulting issues? Use our interoperability tests with detailed reliability signals just for that. Find them within the same Test Base onboarding experience:
The pre-install Microsoft apps option is toggled on in the New package configuration in Test Base
For interoperability assurance, you’d want to look for any signals of regressions after running the test. From the test summary, select the run you want to examine closer, and navigate to the Reliability tab. Here’s what a successful test would look like, comparing the number of crashes between the target test run and the baseline.
Detailed reliability signals for feature update test results in Test Base
With Test Base, you can conduct pre-release testing against monthly Microsoft 365 and Office 365 updates for extra confidence that you are covered across a broad range of Microsoft product updates. Leverage the automation in Test Base to schedule monthly tests whenever the monthly updates become available! For that, learn how to pick security testing options in the Set test matrix step in our official documentation.
The best part about Test Base is that you can use it throughout the entire Windows and Microsoft 365 apps validation lifecycle, supported by people and tools you trust. Use our support not only for the annual Windows updates, but also for the monthly security updates.
The table below outlines some of the key benefits that Test Base provides across the end-to-end validation lifecycle: from testing environment, to testing tools and software, and, finally, to testing services.
Testing environment | Testing tools & software | Testing services
Sign up for a free trial of Test Base to try out these features and optimize your testing process! To get started, simply sign up for an account via Azure or visit our website for more information.
Interested in learning more? Watch this contextualized walk-through of the Test Base service: How to build app confidence with Test Base. You can also leverage our series of how-to videos.
Check out more information and best practices on how to integrate Test Base into your Continuous Integration/Continuous Delivery pipeline in our GitHub documentation. We’ve also put together some sample packages that you can use to test out the onboarding process:
Finally, join our Test Base for Microsoft 365 Tech Community or GitHub Discussions Community to share your experiences and connect with other users.
Continue the conversation. Find best practices. Bookmark the Windows Tech Community and follow us @MSWindowsITPro on Twitter. Looking for support? Visit Windows on Microsoft Q&A.
This article is contributed. See the original author and article here.
In this guest blog post, Omri Eytan, CTO of odix, discusses how businesses relying on Microsoft 365 can protect themselves from file-based attacks with Filewall, available via Microsoft Azure Marketplace.
Much has been written about the pandemic’s impact on our working habits and preferences, and about how companies had to adopt new processes to keep business alive.
We see numerous reports describing the work-from-home trend becoming the new reality of the hybrid working environment. This had a huge impact on IT departments, which had to enable a secure yet transparent working environment for all employees, wherever they work. A McAfee report showed a 50 percent increase in cloud use across enterprises in all industries during COVID, while the number of threat actors targeting cloud collaboration services increased 630 percent over the same period!
Furthermore, the reports highlighted an increase in the number of threats mainly targeting collaboration services such as Microsoft 365.
Microsoft 365 security approaches: attack channel vs. attack vector protection
To begin with, businesses must take responsibility for their cloud SaaS deployments when it comes to security, data backup, and privacy. Microsoft 365 business applications (Exchange Online, OneDrive, SharePoint, and Teams) are no different.
Security solutions offering protection for Microsoft 365 users can be divided into two main methodologies:
These approaches remind us of an old debate when purchasing IT products: Should the company compromise a bit on innovation and quality and purchase a one-stop-shop type of solution, or is it better to choose multiple best-of-breed solutions and be equipped with the best technology available?
Security solutions for Microsoft 365 are no different.
Protecting Microsoft 365 users against file-based attacks
This article focuses on one of the top attack vectors hackers use: the file-based attack vector. Research shows the top file types used to embed malware in channels like email are commonly used formats such as Word, Excel, and PDF. Hackers use these file types because people tend to click and open them naturally. When malicious code is embedded well (e.g., in nested files), the file bypasses common anti-malware solutions such as anti-virus and sandbox methods, which scan files for threats and block them if malware is detected.
Deep file analysis (DFA) technology, introduced by odix, was designed to handle all commonly used files and offers a detectionless approach. With DFA, all malware, including zero-day exploits, is prevented and the user gets a safe copy of the original file.
What are DFA and CDR?
DFA or CDR (content disarm and reconstruction) describes the process of creating a safe copy of an original file by including only the safe elements from the original file. CDR focuses on verifying the validity of the file structure on the binary level and disarms both known and unknown threats. The detectionless approach ensures all files that complete the sanitization process successfully are malware-free copies and can be used safely.
odix, an Israel-based cybersecurity company driving innovation in content disarm and reconstruction technology, developed the FileWall solution to complement and strengthen existing Microsoft security systems. FileWall, available in Microsoft AppSource and Azure Marketplace, helps business users easily strengthen Microsoft 365 security within a few clicks.
How FileWall works with Microsoft security technology
FileWall integrates with the Microsoft Graph security API and Microsoft Azure Sentinel, bringing malware protection capabilities with an essential added layer of deep file analysis technology containing CDR proprietary algorithms.
FileWall blocks malicious elements embedded in files across Microsoft 365 applications including Exchange Online, SharePoint, OneDrive, and Teams. The unique DFA process is also extremely effective in complex file scenarios such as nested files and password-protected attachments where traditional sandbox methods could miss or result in lengthy delays and disruption of business processes.
Empowering Microsoft 365 security: granular filtering per channel
FileWall includes a modern admin console so the Microsoft 365 administrator can set security policies and gain overall control of all files and attachments across Exchange Online, SharePoint, OneDrive, and Teams. The FileWall file type filter lets the admin define which file types are permitted in the organization and which should be blocked. This minimizes the attack surface the organization is exposing via email and collaboration services by eliminating the threat vectors available in certain file types.
The type filter has three main controls:
How FileWall complements Defender’s safe attachments
As a native security solution within the Microsoft 365 deployment, FileWall doesn’t harm productivity. Consequently, all FileWall’s settings have been configured to complement Microsoft 365 Defender. Combining the two products provides high levels of security with multi antivirus, sandbox, and CDR capabilities. While the sandbox can manage executables and active content, FileWall handles all commonly used files such as Microsoft Office, PDF, and images. As most organizational traffic consists of non-executable documents, this method can reduce sandbox load by 90 percent to 95 percent, lowering total costs and improving the average latency.
FileWall enhances the existing Microsoft type filter and allows additional granular controls over the types of files that are allowed to enter the organization while enforcing these restrictions on nested and embedded files as well.
Call for Certified Microsoft CSPs who wish to increase revenues
FileWall for Microsoft 365 is available for direct purchase through Microsoft AppSource, Azure Marketplace, or via the FileWall-certified partner program for Microsoft CSPs.
Microsoft CSPs can bundle FileWall via Microsoft Partner Center for their customers. odix offers generous margins to Microsoft CSPs who join the FileWall partner program.
FileWall-certified partners are eligible for a free NFR license according to odix terms of use.
This article is contributed. See the original author and article here.
Today marks a significant shift in endpoint management and security. We are launching the Microsoft Intune Suite, which unifies mission-critical advanced endpoint management and security solutions into one simple bundle.
The post The Microsoft Intune Suite fuels cyber safety and IT efficiency appeared first on Microsoft 365 Blog.