by Contributed | Sep 8, 2021 | Technology
This article is contributed. See the original author and article here.
As stated in an earlier blog, Zero Trust Architecture resets the security posture of the organization to act as if hostile adversaries have both internal and external access to the network. You must assume that a breach has occurred or is imminent in a Zero Trust approach to cybersecurity. The traditional Defense in Depth strategy of protecting the perimeter of the digital estate and trusting everyone inside is no longer a sound cybersecurity premise. Identity is the new perimeter for Zero Trust Architecture and Azure Active Directory is at the core.
In the 1979 film, “When a Stranger Calls” the police are doing forensics on a series of threatening phone calls that Jill, the babysitter, keeps receiving. Finally, in the plot twist, Sergeant Sacker calls Jill back to inform her that they have traced the calls to their origin, “We’ve traced the call. It’s coming from inside the house.” Unlike Jill’s unfortunate circumstances our organizations must presume external actors have infiltrated our digital estate with a run of the house—including compromised identity and credentials.
Compromised Credentials Are a Hacker’s Favorite Things
In Verizon’s 2021 Data Breach Investigations Report, external actors continually execute successful breaches by leveraging compromised credentials. The attacks tend to look like the work of internal actors until forensic investigations reveal that the actor is in fact an external threat that has infiltrated the digital estate. This is why one of the most foundational practices within the Identification and Authentication (IA) Domain, and arguably all of the Cybersecurity Maturity Model Certification (CMMC) framework, is IA.3.083: “Use multifactor authentication for local and network access to privileged accounts and for network access to non-privileged accounts.” After bypassing weak authentication through credential hijacking, the first stage of the breach can remain undetected for months or years while adversaries move laterally through your network.
Since the beginning of 2021, there have been more than two dozen ‘headliner’ cyber breaches—ranging from the top social networks to mobile carriers to local retailers in our communities. The principal targets of these attacks have been credentials and identity data—email addresses, phone numbers, hashed passwords, and other Personally Identifiable Information (PII). Most often, an email address or phone number is half of the information needed to compromise login credentials. Combined with phishing attacks—the most prevalent attack type—adversaries have a treasure trove of identity credentials that can be leveraged across a multitude of organizations.
As Alex Simons, CVP, Identity Program Management at Microsoft stated, “The only people in the world who love passwords are hackers!” Compromised credentials remain the number one method that bad actors use to gain access to your organization. In fact, Microsoft Security Research found that the risk of credential compromise could be reduced by up to 99% simply by enabling Multifactor Authentication (MFA) across your enterprise.
Fortifying the New Identity-Centric Perimeter
This brings us back to adopting a Zero Trust mindset for your digital culture and estate. And yes, it is a cultural shift for everyone. By moving the digital perimeter from the network edge to identity, you no longer need to focus on whether bad actors are internal or external threats. No one accessing the network is implicitly trusted nor granted privileges beyond what is necessary for task completion. Even if an organization follows CMMC IA.3.083 mentioned above, it is imperative to control and track sessions and activities attached to a single identity.
Large, highly resourced companies are still challenged to perform this level of identity management and monitoring across their digital estate. With over 75% of the Defense Industrial Base (DIB) being small businesses, effectively resourcing and standing up a proactive Security Operations Center (SOC) remains a daunting task for meeting the compliance requirements of CMMC Level 3 and beyond. Two CMMC domains alone – Audit and Accountability (AU) and Incident Response (IR) – could require a full FTE at a 200-person company, for example. The only financial mitigation or offset comes through the use of technology or a Managed Security Services Provider (MSSP).
Introducing Microsoft Defender for Identity

Microsoft Defender for Identity (MDI), previously known as Azure Advanced Threat Protection or Azure ATP, is one of those technologies that can help organizations protect and monitor user identities at scale. Organizations deployed on Microsoft 365 GCC or GCC High can take an identity-centric approach and evaluate user sign-in behaviors in real time, along with device and application risk profiles. Not only can MDI ingest and analyze user activities (i.e., multiple data points around each authentication attempt and session), but Microsoft also correlates suspicious user behavior with verified malicious attacks happening across millions of other cloud environments to generate actionable intelligence.
For example, if a user attempts to log in to an application from Birmingham, AL, and five minutes later the same account attempts a second login from Lubbock, TX, on a non-standard device, this behavior needs to be flagged. First, unless the user has broken the space-time barrier—teleporting roughly 1,000 miles in under five minutes—it’s unlikely that this impossible travel is authentic. Second, the device profile has changed and doesn’t match what’s known or common for that individual. Third, Microsoft Defender for Identity uses machine learning to analyze new threat patterns and determine whether your organizational risk has increased or your organization is under attack.
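Defender for Identity’s detection models are far richer than this, but the core impossible-travel check in the example can be sketched in a few lines of Python. This is a simplified illustration, not product behavior; the coordinates and the 550 mph airliner-speed threshold are assumptions chosen for the sketch:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # mean Earth radius ~= 3959 miles

def is_impossible_travel(loc_a, loc_b, minutes_apart, max_mph=550):
    """Flag sign-in pairs whose implied ground speed exceeds a plausible airliner."""
    if minutes_apart <= 0:
        return haversine_miles(*loc_a, *loc_b) > 0
    implied_mph = haversine_miles(*loc_a, *loc_b) / (minutes_apart / 60)
    return implied_mph > max_mph

birmingham_al = (33.52, -86.81)
lubbock_tx = (33.58, -101.86)

# Two sign-ins roughly 870 miles apart, five minutes apart: clearly flagged.
print(is_impossible_travel(birmingham_al, lubbock_tx, minutes_apart=5))  # True
```

The same pair of sign-ins two hours apart implies a speed around 430 mph and would pass this naive check, which is why the real service also weighs device profile and learned behavior.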
The above example and capabilities align to System & Communications Protection SC.3.190 “Protect the authenticity of communications” and System & Information Integrity SI.2.216 “Monitor organizational systems… to detect attacks and indicators of potential attacks.”
Microsoft Defender for Identity gets a superset of capabilities when paired with other components of Microsoft 365 Defender, a fully functioning Extended Detection and Response (XDR) suite. By correlating data from apps, email, and endpoints, your organization gains a comprehensive view of your threat landscape and procedures to mitigate and remediate attacks. More importantly, by analyzing suspicious behaviors in real-time, your security operations team can proactively hunt for threats versus waiting for a breach to occur.
Identity Monitoring and Defense Built for Microsoft US Sovereign Cloud
In a previous article, we laid out the differences and benefits between Microsoft’s Commercial and Government Cloud offerings. Defender for Identity is built on the FedRAMP High accredited Microsoft Azure Government Cloud and includes interoperability with Microsoft 365 GCC, GCC High, and DoD. MDI can be licensed per user through the F5 and E5 licenses, or as a standalone license.
Microsoft Defender for Identity for GCC, GCC High, and DoD leverages the same underlying technologies and provides the same capabilities as the commercial instance of Defender for Identity, with a few exceptions:
- Integration with Microsoft Defender for Endpoint (On Roadmap)
- VPN integration (On Roadmap)
Not only will MDI integrate with Azure Active Directory and your organization’s Microsoft 365 environment, but it will also analyze on-premises Active Directory and provide insights on where improvements can be made across a company’s entire identity estate.

Identity is the First Step to a Zero Trust Architecture
The DIB is already undertaking its most significant digital transformation with CMMC, and the Department of Homeland Security is now taking a deeper look at the program. For DIB companies, pursuing Zero Trust Architecture and CMMC compliance are not mutually exclusive endeavors. Nor is Zero Trust the latest bandwagon or industry buzzword that everyone is incorporating into their marketing. Zero Trust is a fundamental shift in our culture to not only protect our digital estate more effectively, but to also enable a higher degree of secure collaboration between organizations without boundaries.
Identity is the first step on our Zero Trust journey. Once identity is fortified, we can shift to the security and protection of new devices, apps, IoT, and more. Bad actors and the threat landscape are evolving at cloud speed. These bad actors will continually attack the DIB supply chain through social engineering and pretexting to get your identity credentials—the keys to your digital estate.
This conversation is not done: some of the questions raised in this article remain open, and we want you to pose questions of your own about your identity perimeter, Zero Trust Architecture, CMMC, and other security and compliance topics. The best way to get those questions asked and answered in real time is at one of our upcoming events and webinars. We are expanding this conversation in the DIB community to accelerate the transformation. Let’s work together to secure our national supply chain and innovate like never before.
by Contributed | Sep 7, 2021 | Business, Microsoft 365, Technology
Microsoft acquires Clipchamp to help you express yourself through the power of video. The Clipchamp team is a creative powerhouse dedicated to quality and great customer outcomes—and we welcome them wholeheartedly as kindred spirits.
The post Microsoft acquires Clipchamp to empower creators appeared first on Microsoft 365 Blog.
Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.
by Contributed | Sep 7, 2021 | Technology
More and more Azure Sentinel customers are opting for long-term retention of their logs in Azure Data Explorer (ADX), either due to compliance regulations, or because they still want to be able to perform investigations on their archived logs in the event of a security incident.
As the Azure Sentinel ingestion price includes 90 days of retention for free, the option of keeping the logs for longer periods in Azure Data Explorer is preferred by many (see Using Azure Data Explorer for long term retention of Azure Sentinel logs – Microsoft Tech Community).
Even though the Azure Sentinel + ADX solution requires little to no maintenance, we wanted to give our customers a way to keep an eye on the number of events and the overall status of their ADX clusters and databases. For this reason, we have created two tools: the ADXvsLA workbook and the ADX Health Playbook. The workbook lets you review the number of logs in Azure Sentinel and ADX and the overall health of your ADX cluster. The playbook sends you a warning if an unexpected delay in ingestion into ADX is detected.
Below, we will describe both in more detail:
ADXvsLA Workbook
When you open the workbook, you can select the following parameters:
- the ADX cluster and database
- the Azure Sentinel workspace from which the logs are exported to that ADX cluster
- the time range for which you want to see data
Use the Show Help toggle to see a detailed explanation of each section.

Raw Tables
When you ingest logs from Azure Sentinel to ADX, the logs are first ingested into an intermediate table holding the raw data. A function attached to an update policy then transforms this raw data and saves it to its destination table with the correct mapping. Afterwards, the raw data is deleted, which is why you will typically see that these raw tables are empty; their retention policy should also be set to 0 days.

Final ADX Tables
In this section, you will see information about the final ADX tables, which have the right schema and can be queried from Azure Sentinel. You will find information regarding the row count, size, retention policy, hot cache size, and more.

Select one of the table names to generate the comparison section. This is where you can see the differences between the table on ADX and on your Log Analytics workspace. Then, select the time range for which you want to see the comparison.
In the table you will find:
- The number of entries in ADX, in Log Analytics, and the difference in the number of logs between them
- How long it has been since the last log was received
- The timestamp of the last logs
- The number of new logs received in Log Analytics since the last log in ADX was received

Notice the New in Log Analytics column
- In the screenshot, you can see there are 52 logs in the “New in Log Analytics” column. This means that, at the time we compared the tables, there were 52 entries that had not reached ADX yet.
If this happens, you should compare the timestamp and the difference for the last log that was received. In this case, it is around 15 minutes. Delays of 30 minutes or less are expected, so this means your tables are working as expected.
- It is also possible that you see a negative number in the New in Log Analytics column. This could happen if, due to the lag in ADX, there were Log Analytics logs from the previous period that were received in ADX during the current period. Let’s suppose that you ingested 1000 logs in Log Analytics on the previous 24h window, but only 990 reached ADX in that period; and then you ingested 1000 logs again on the current 24h window, and all those logs, plus the 10 logs from the previous day, reached ADX. In this case, you will see that the “New in Log Analytics” column would say -10. In these cases, you only need to look at the LastTM difference. If it is around 30 minutes or less, then it will be fine.
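The arithmetic behind the “New in Log Analytics” column, and the guidance for reading it, can be sketched as follows. This is a simplified model of the logic; the real workbook and playbook compute these values with queries against both stores:

```python
def new_in_log_analytics(la_count, adx_count):
    """Rows seen in Log Analytics minus rows landed in ADX for the same window.
    A negative value means ADX also caught up on lagged rows from an earlier window."""
    return la_count - adx_count

def needs_attention(la_count, adx_count, last_tm_lag_minutes, max_lag_minutes=30):
    """A nonzero difference is fine as long as the last-timestamp lag stays
    within the expected ingestion delay of about 30 minutes."""
    return new_in_log_analytics(la_count, adx_count) != 0 and last_tm_lag_minutes > max_lag_minutes

# Day 1: 1000 rows ingested into Log Analytics, only 990 reach ADX in the window.
assert new_in_log_analytics(1000, 990) == 10
# Day 2: 1000 new rows plus the 10 lagged rows all land in ADX.
assert new_in_log_analytics(1000, 1010) == -10
# 52 pending rows with a ~15 minute lag: working as expected, no warning.
assert not needs_attention(1052, 1000, last_tm_lag_minutes=15)
```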
Finally, at the bottom of the workbook you will see metrics regarding events received, events dropped, received data, volume and other metrics.
ADX Health Playbook
The ADX Health Playbook periodically compares the number of logs in your Azure Sentinel tables and ADX tables (every 24h by default) and sends you a warning via email if it detects a difference in the number of logs that may require your attention (that is, in the “New in Log Analytics” column mentioned previously). Because it takes logs a few minutes to reach ADX after having been ingested into Log Analytics, the query in the playbook by default looks at the period between 25 hours and 30 minutes ago.
Please read the accompanying readme.md file on GitHub to set it up.
We hope you find these tools useful! If you have any suggestions for improving this content or any questions, please leave us a comment.
by Contributed | Sep 6, 2021 | Technology
Welcome to the next installment in our blog series highlighting Microsoft Learn Student Ambassadors who achieved the highest milestone of Gold and have recently graduated from university. Each blog features a different student and highlights their accomplishments, their experience in the Student Ambassadors community, and what they’re up to now.
Today we’d like to introduce Haimantika Mitra who is from India and graduated recently from the Siliguri Institute of Technology.

Responses have been edited for clarity and length.
When you joined the Microsoft Learn Student Ambassadors community in January 2020, did you have specific goals you wanted to reach, and did you achieve them? How has the program helped to prepare you for the next chapter in your life?
Since joining, my life has taken a different turn, a good turn!
When I first joined the community, I had very little to no idea about community building or about tech. In general, I was a person with an ambition–I was always up for learning, but I had no idea where to start. The Student Ambassadors community has helped me face imposter syndrome [editor’s note: this is the belief that you are not as capable as others perceive you to be]. The community has helped me learn tech skills that bagged me my first internship, build a social brand for myself, and make some good friends for life.
In my initial days, I used to attend a lot of events organized by my fellow Student Ambassadors and the community. I was introduced to new tech industry leaders who inspired me to learn and grow. I can clearly recall when I attended an event in April 2020 on Power Apps by Microsoft’s Dona Sarkar. She gave us a small assignment to go through a Microsoft Learn module. Being totally awed by her and the technology, I immediately completed it, starting my journey of learning Microsoft Power Platform. After that day, I never looked back–I kept learning and sharing. I was conducting events and hackathons and interacting with a lot of inspiring people. To date, I continue to learn and deliver, but this community has given me everything I ever dreamed of.
In the Student Ambassadors community, what was the top accomplishment that you’re the proudest of and why?
It is a bit difficult to choose one event, because I had so many great ones that I am proud of! But being a speaker at Microsoft Build 2020 is something that I am very ecstatic about. I never imagined being a part of a global event–it was my first and thus very special. From speaking in front of a mirror to addressing such a huge audience, I am proud of who I have become. This event helped me gain the confidence I was lacking for so long. It introduced me to some amazing personalities, and helped me get involved in the community more.
I’ve spoken at various other Microsoft events and built solutions for people, specifically for the black, Asian, and minority ethnic (BAME) communities. I’ve been a part of the Black Minds Matter hackathon and have helped women in my country and the EMEA region upskill on Power Platform.
I posted about what I am learning every day, and as a result, in my final year of university, I was approached by various companies to work on their Power Platform teams. The opportunities I received from the Student Ambassador program gave me the necessary push. Everything else followed, and it was magical!
What do you have planned for after graduation? What’s next for you?
I will continue with community work. I consider myself a product of the community, and I know there are many like me who are looking for a direction. I wish to be that person who can provide them with direction. I will also be joining Microsoft in a full-time capacity as a support engineer. It is a dream to me; all the learnings that I had from the community helped me get closer to it.
If you were to describe the community to a student who is interested in joining, what would you say about it to convince him or her to join?
Most students have a common question: “How do I get started in tech?” I would simply say to them that if they are looking for the answer, this is the right place to be! I shall also brief them on the amazing perks such as the 1:1 mentoring sessions we have, Microsoft Training Certification vouchers, access to LinkedIn learning, tech-specific leagues headed by Microsoft developer advocates, the fun we have in the community calls, and more.
What advice would you give to new Student Ambassadors?
Embrace the opportunity that they are receiving. Initially attend as many sessions as possible, use Microsoft Learn (the best place to upskill), make use of all the opportunities that Ambassadors are given, and check Teams [editor’s note: this is the communication platform Ambassadors and program managers use to communicate and collaborate] for 10 minutes a day to make sure that you do not miss any notifications or opportunities.
What is your motto in life, your guiding principle?
“Technology for everyone.” I am trying my best to bring more people into tech rather than having them be scared of it. I look forward to growing this goal and helping as many people as I can.
What is one random fact about you that few people know about?
People have seen the side of me that hustles and works hard, but what they do not know is that I am a “serial chiller”. There are times when I pull all-nighters binge-watching TV or just lying down and doing nothing.
We wish you the best of luck in all your future endeavors, Haimantika!
by Contributed | Sep 3, 2021 | Technology
This blog has been authored by Ranvijay Kumar, Principal Program Manager, Microsoft Health & Life Sciences
HL7 Fast Healthcare Interoperability Resources (FHIR®) is quickly becoming the de facto standard for persisting and exchanging healthcare data. FHIR specifies a high-fidelity and extensible information model for capturing details of healthcare entities and events.
This article will teach you a simple approach to creating analytical data marts by exporting, transforming, and copying data from Azure API for FHIR to Azure Synapse Analytics, a limitless analytics service designed for data warehousing and big data workloads. You can take your analytics from Business Intelligence (BI) to Artificial Intelligence (AI) with Synapse, thanks to its deep integration with Power BI, Azure Machine Learning, and Azure Cognitive Services.

In this approach, as illustrated in the diagram, you will use the $export operation in Azure API for FHIR to export FHIR resources in NDJSON format (newline-delimited JSON) to Azure storage. You will then use T-SQL from either the serverless or the dedicated SQL pools in Synapse to query those NDJSON files and optionally save the results into tables for further analysis.
Exporting FHIR data to Azure storage
Azure API for FHIR implements the $export operation defined by the FHIR spec to export all – or a filtered subset – of FHIR data in NDJSON format. It also supports de-identified export to enable secondary use of healthcare data. You can configure the server to export the data to any kind of Azure Storage account; however, we recommend exporting to ADLS Gen 2 for best alignment with Synapse.
Let’s consider a scenario in which data scientists want to analyze clinical data of patients who are former smokers. For the study, data scientists need an initial copy of data from the FHIR server followed by incremental data for the same set of patients every month for the next two years.
The first step to get this data is to identify the patients in the FHIR server who are former smokers. The following GET call searches the FHIR server using the LOINC code 72166-2 (Tobacco smoking status) for Observation, and SNOMED code 8517006 (Former smoker) for Observation value-concept to get subjects of the observations who are former smokers. You may need to use different codes depending on how your data is coded.
https://{{fhirserverurl}}/Observation?code=72166-2&value-concept=8517006&_elements=subject
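If you script this step, the patient references can be collected from the returned Bundle. Here is a minimal Python sketch; the Bundle shape follows the FHIR specification, and paging through `link` entries is omitted for brevity:

```python
def patient_refs_from_bundle(bundle):
    """Collect distinct Patient references from the subject of each Observation."""
    refs = []
    for entry in bundle.get("entry", []):
        ref = entry.get("resource", {}).get("subject", {}).get("reference")
        if ref and ref not in refs:
            refs.append(ref)
    return refs

# A trimmed example of what the search above might return.
sample_bundle = {
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {"resourceType": "Observation",
                      "subject": {"reference": "Patient/44f6f10e-96c2-4802-b857-4861f1802522"}}},
        {"resource": {"resourceType": "Observation",
                      "subject": {"reference": "Patient/44f6f10e-96c2-4802-b857-4861f1802522"}}},
    ],
}

print(patient_refs_from_bundle(sample_bundle))
# ['Patient/44f6f10e-96c2-4802-b857-4861f1802522']
```

Note that one patient can have many matching Observations, so deduplication matters before building the Group.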
You need to save this list of patients to enable exporting their clinical data monthly. There are a few options to manage a collection of resources in FHIR. Since Group is supported by the $export operation, you will manage the collection of patient resource IDs as a Group. Use the results from the above search query to create a person-type Group.
{
  "resourceType": "Group",
  "id": "1",
  "type": "person",
  "actual": true,
  "member": [
    {"entity": {"reference": "Patient/44f6f10e-96c2-4802-b857-4861f1802522"}},
    ... other patient entities from the result ...
  ]
}
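Assembling that Group resource from the search results is a straightforward transformation. A hedged Python sketch follows; you would then PUT or POST the result to the server’s /Group endpoint:

```python
def build_patient_group(patient_refs, group_id="1"):
    """Build the person-type Group resource shown above from Patient references."""
    return {
        "resourceType": "Group",
        "id": group_id,
        "type": "person",
        "actual": True,
        "member": [{"entity": {"reference": ref}} for ref in patient_refs],
    }

group = build_patient_group(["Patient/44f6f10e-96c2-4802-b857-4861f1802522"])
```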
Once you have a Group, you can export all the data related to the patients in the Group with the following async REST call:
Note: Azure API for FHIR takes an optional container name to simplify the organization of exported data.
https://{{fhirserverurl}}/Group/{{GroupId}}/$export?_container={{BlobContainer}}
You can also use the _type and _typefilter parameters in the $export call to restrict the resources you want to export. Finally, you can use the _since parameter in the $export call to do incremental exports every month for two years, meeting your original requirement. This parameter restricts the export to resources that have been created or updated since the supplied time.
https://{{fhirserverurl}}/Group/{{GroupId}}/$export?_container={{BlobContainer}}&_since=2021-02-06T01:09:53.526+00:00
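The export URLs above can be composed programmatically. A small Python sketch follows; the base URL and container name are placeholders, and remember that $export is asynchronous, so the real call also needs a `Prefer: respond-async` header and polling of the returned `Content-Location`:

```python
from urllib.parse import urlencode

def build_export_url(fhir_base, group_id, container, since=None):
    """Compose a Group-level $export URL; `since` restricts the export to
    resources created or updated after the supplied instant."""
    params = {"_container": container}
    if since is not None:
        params["_since"] = since
    return f"{fhir_base}/Group/{group_id}/$export?{urlencode(params)}"

url = build_export_url("https://myfhirserver.example", "1", "former-smokers",
                       since="2021-02-06T01:09:53.526+00:00")
```

`urlencode` percent-encodes the `+` and `:` characters in the timestamp, which keeps the query string valid.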
Now that you have data in ADLS Gen 2, let’s talk about Synapse and see how you can load it to Synapse.
About Azure Synapse Analytics
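The exported files land as NDJSON, one FHIR resource per line, and Synapse’s serverless or dedicated SQL pools can query them directly with T-SQL (for example via OPENROWSET). The flattening such a query performs can be sketched in Python; the column selection here is purely illustrative:

```python
import json

def ndjson_to_rows(ndjson_text, columns):
    """Flatten one NDJSON export file into rows containing only the requested
    top-level fields -- roughly what a SELECT over the raw lines does."""
    rows = []
    for line in ndjson_text.splitlines():
        if line.strip():
            resource = json.loads(line)
            rows.append({col: resource.get(col) for col in columns})
    return rows

# Two sample lines such as an exported Patient.ndjson file might contain.
export_file = (
    '{"resourceType": "Patient", "id": "44f6f10e-96c2-4802-b857-4861f1802522", "gender": "male"}\n'
    '{"resourceType": "Patient", "id": "0b7ee870-1111-2222-3333-444444444444", "gender": "female"}\n'
)

print(ndjson_to_rows(export_file, ["id", "gender"]))
```

In practice you would save such flattened results into Synapse tables so downstream analysis does not reparse the JSON on every query.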
Create a pipeline
You can use a variety of REST clients such as Postman to export the data from the FHIR server, and use Synapse Studio or any other SQL client to run the T-SQL statements. However, it is a good idea to convert these steps into a robust data movement pipeline using Synapse Pipelines. You can use the Synapse Web activity for triggering the export, and the Stored procedure activity to run the T-SQL statements in the pipeline.
Conclusion
You can use the FHIR $export API and T-SQL to transform and move all or a filtered subset of data from FHIR server to Synapse Analytics. After the initial data load, the _since parameter in the $export operation can be used to do incremental data load. An ETL pipeline with the steps mentioned in this article can be used to keep the data in the FHIR server and the Synapse Analytics in sync.
FHIR® is a registered trademark of Health Level Seven International, registered in the U.S. Trademark Office, and is used with their permission.