Migrating to SQL: Introduction to SSMA (Ep. 3) | Data Exposed

This article is contributed. See the original author and article here.

Looking to leave Oracle behind? Watch this episode of Data Exposed with Alexandra Ciortea to find out how to ease into the migration journey by automating Oracle migrations to SQL Server or Azure SQL using SQL Server Migration Assistant. Customers looking to migrate from other database systems to Azure experience challenges in schema conversion and data movement. Microsoft SQL Server Migration Assistant (SSMA) is a suite of five tools designed to automate heterogeneous database migration to Azure SQL and SQL Server on-premises from Oracle, DB2, MySQL, Microsoft Access, and SAP ASE.


Watch on Data Exposed



Resources:

Let’s talk about Coronavirus scams

This article was originally posted by the FTC. See the original article here.

Spotted a Coronavirus Scam? Tell your friends. Then tell the FTC: ReportFraud.ftc.gov. ftc.gov/PassItOn #OlderAmericansMonth

During this past year, the COVID-19 pandemic and its economic fallout have reminded us how important it is to help each other through difficult times. In May, as we celebrate Older Americans Month, remember that one of the best ways to help your friends and family is to pass on what you know about how to spot and avoid Coronavirus-related scams. 

Here are some things to share:

For more tips to share with your community, visit Pass It On and subscribe to Consumer Alerts. And if you spot a scam, report it at ReportFraud.ftc.gov.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Azure Marketplace new offers – Volume 138

This article is contributed. See the original author and article here.


We continue to expand the Azure Marketplace ecosystem. For this volume, 70 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Applications


Advanced Shelf Intelligence for Retail Inventory: Pensa’s retail shelf intelligence data service delivers real-time data and reports that show an entire product category by brand, SKU, day, and store so you can effectively manage your store shelves by reducing stockouts and optimizing assortments.


AIMMO Enterprise Data Annotation Platform: The AIMMO Enterprise Data Annotation Platform is a cloud-based solution for managing self-service data labeling projects for your AI models. Available in English and Korean, the platform includes other useful tools like workflow management and customizable data extraction.


AlmaLinux 8 Latest: Cognosys offers this preconfigured image of AlmaLinux OS 8 on Microsoft Azure. AlmaLinux OS is a stable Linux distribution providing an enterprise-grade server operating system you can rely on for running critical workloads.


AlmaLinux 8 Minimal: Cognosys offers this preconfigured image of a minimal installation of AlmaLinux OS 8 on Microsoft Azure. AlmaLinux OS is a stable Linux distribution providing an enterprise-grade server operating system you can rely on for running critical workloads.


AlmaLinux 8.3: Cognosys offers this preconfigured image of AlmaLinux OS 8.3 on Microsoft Azure. AlmaLinux OS is a stable Linux distribution providing an enterprise-grade server operating system you can rely on for running critical workloads.


Circulus: Circulus provides small and midsize businesses with the tools to harness the power and convenience of automated accounts payable workflows. Streamline processes, enhance data quality, tighten accounts payable controls, and consolidate bill management into a single interface with Circulus.


edcom Vacation Manager: Vacation Manager is an intuitive web application that provides automated leave and absence management for organizations using Microsoft 365. Available in English and German, Vacation Manager features an integrated calendar that displays an overview of all vacations, absences, and substitutions.


Effizency Sales: Accelerate your B2B energy services sales with Effizency’s platform on Microsoft Azure. Rapidly generate accurate energy solution proposals, provide flexible investment options to clients, close deals remotely, and reach more clients interested in reducing energy costs.


G-Image Digital Asset Management (DAM): G-Image is a digital asset management (DAM) solution designed for companies that manage large streams of images and need to archive, categorize, and retrieve them in a simple, fast, and customizable manner.


Glance for Financial Services: Glance for Financial Services allows bankers to see a customer’s screen and guide them through simple or complex digital transactions. Glance is an embedded service that can be added to an existing website or app and is engineered for enterprise security and privacy compliance.


GUI on Ubuntu 20.04: This preconfigured image from Lotus Beta Analytics provides the XFCE desktop environment (GUI) on Ubuntu 20.04. The XFCE desktop environment enhances users’ Linux experience with an intuitive way to operate various Linux functions and applications.


HR Communications Plugin for Microsoft 365: Velaku’s HR communications plugin for Microsoft 365 makes engaging employees fast, easy, and efficient across all Microsoft 365 communication channels. The full-service enterprise solution features cost-efficient volume-based pricing that’s perfect for companies of all sizes.


iBinder foundation: Designed for all stakeholders in the construction industry, iBinder is a scalable, flexible, and secure information management platform hosted on Microsoft Azure. Drive collaboration across teams and manage all data with one end-to-end solution.


Icertis Risk Management App: Mitigate business risk through proactive assessment, discovery, and continuous monitoring with the ICI Risk Management application from Icertis. The Risk Management application conforms to any risk model to reduce the impact of operational, financial, and reputational risk.


Immuta Automated Data Governance: Immuta is a modern data access and control solution for Microsoft Azure data ecosystems. Empower your data engineering and DataOps teams to improve productivity, increase security, and unlock more data by automating cloud data access and privacy controls.


Intelligent Data Analytics Platform: ADDO AI’s highly scalable, AI-enabled big data platform delivers a centralized data management infrastructure. Enable efficient decision-making for business users, data accessibility for downstream digital applications, and support for data science and machine learning workloads.


Interoperable Backbone: Clinical Data Repository: Extend your facility’s hospital information system and optimize the flow of information between requesting systems and dispensing systems with 4WARD’s interoperable medical device platform on Microsoft Azure. This application is available only in Italian.


LoopUp Cloud Telephony – Direct Routing: LoopUp Cloud Telephony connects Microsoft Phone System to the public switched telephone network (PSTN) using Direct Routing over a premium voice network. This allows users to make and receive external calls. LoopUp is available as a native app for Microsoft Teams or as a standalone solution.


Maxis i.StartupBi SaaS: Maxis i.StartupBi is a hybrid cloud dashboard that provides a single pane of glass to view the operations of your organization via Microsoft Power BI. This customizable offer is available as a monthly or annual subscription.


Medical Speech to Text (STT): Sayint Speech to Text for the healthcare industry automatically converts audio into text, providing automated transcriptions of doctor-narrated prescriptions, discharge summaries, and other conversations related to the medical domain.


Multi-cloud Data Services for Dell EMC PowerScale: Faction’s Multi-cloud Data Services for Dell EMC PowerScale enable you to connect your PowerScale file scale-out storage directly to public clouds, including Microsoft Azure. Gain an on-demand, highly available cloud consumption model for compute workloads and storage.


nDivision Azure Migrate project template: The project template from nDivision uses Microsoft Azure Migrate to walk users through creating an Azure Migrate project in a customer’s Azure subscription. Azure Migrate is a set of tools and services for discovery and migration of servers from on-premises to Azure or from Azure to Azure.


Pilbara Insights for Higher Education: Pilbara provides an outsourced service including designing, building, and maintaining an activity-based costing (ABC) model for your higher education institution. ABC uncovers the mix of human, physical, and financial resources used by your institution to maximize mission attainment.


Redis Sentinel Exporter Container Image: Bitnami provides this preconfigured container image of Redis Sentinel Exporter, which gathers Redis Sentinel statistics and exports them via HTTP for Prometheus consumption.


Rubrik Cloud Data Management: Rubrik takes advantage of the flexibility and cost-effectiveness of Microsoft Azure by consolidating disparate hardware and software components into a single software platform for complete enterprise data management. It helps recover up to 80 percent of admin time with modernized data protection.


SAS 9.4 SaaS: SAS 9 includes tools that enable users to access nearly any data source, analyze the data, and transform it into meaningful and valuable visualizations that help decision-makers gain a quick understanding of critical issues.


Semantix API Management: Semantix API Management is an integration platform with a powerful framework that allows you to create API gateways simply and quickly. This application is available only in Portuguese.


Semantix Integration Platform: Available only in Portuguese, Semantix Integration Platform empowers users to develop integrations with little or no coding. Create your own components or use over 200 ready-made components from CRM systems, e-commerce platforms, ERP systems, and more.


Semantix Intelligent Chat: Semantix Intelligent Chat is a comprehensive customer engagement solution that uses artificial intelligence to automate service. This application is available only in Portuguese.


Semantix Live Shopping: Semantix allows sellers to interact with customers via live video streaming and answer any product questions in real time. Customers can then purchase a product directly through Semantix’s built-in payment system without leaving the site. This application is available only in Portuguese.


SNL Banker: SNL Banker is an intuitive reporting solution offering community banks and credit unions streamlined tools that provide data insights to inform daily decision-making. Get the essential intelligence you need to assess daily performance metrics, identify risk, analyze opportunities, and more.


Symfony Container Image: Bitnami provides this preconfigured container image of Symfony, an open-source PHP framework for web applications.


TIM InstantML for Alteryx – Starter Pack: TIM InstantML for Alteryx Starter Pack delivers a unique combination of two technologies to enable those who work with time series data to benefit from machine learning for predictive and prescriptive scenarios.


TIM InstantML for Excel – Starter Pack: The TIM Forecasting add-in for Microsoft Excel supports TIM’s RTInstantML technology, enabling users to get direct forecasts based on the data in their Excel files.


TIM InstantML for Qlik – Starter Pack: This TIM server-side extension enables Qlik Sense users to benefit from TIM’s capabilities without leaving the familiar Qlik Sense environment. The Qlik Sense TIM Starter Pack is your gateway to augmented machine learning for deeper insights into your data.


Wavefront Adapter for Istio Container Image: Bitnami offers this preconfigured container image of Wavefront Adapter for Istio. Wavefront Adapter for Istio is a lightweight tool written in Go that exposes Istio metrics to Wavefront and supports Istio v1.4+ and Kubernetes v1.15+.


Windows Server 2019 with IIS: Belinda offers this preconfigured image of Windows Server 2019 with Internet Information Services (IIS). This version of Windows Server 2019 offers improved performance and is ideal for customers looking to deploy a pre-installed IIS for low or heavy-traffic websites and web applications.


YOSI: Yoshi is a Microsoft Azure-based solution that provides a virtual assistant who delivers a complete customer service experience for both internal and external users in your organization.



Consulting services


AI and ML – 6-Week Proof of Concept: In this six-week engagement, Abersoft will assess your organization’s data requirements and identify how machine learning and artificial intelligence can help improve your data pipeline. This proof of concept includes a working prototype using Microsoft Azure Machine Learning and Azure Cognitive Services.


App Modernization Quick Start: 2-Week Assessment: Cloocus’s two-week assessment is designed to help customers understand their existing architecture and the cost associated with running their IT environment. It will also provide a detailed plan on how to make the most cost-effective transition to Azure.


Azure Backup Plus Services: 1-Month Implementation: Azure Backup Plus Services is AccTech Systems’ cloud-based service that uses Microsoft Azure to back up, protect, and restore your data. This one-month implementation will make it easy to define backup policies and protect a wide range of enterprise workloads.


Azure Backup Services: 1-Month Implementation: In this one-month implementation AccTech Systems will replace existing onsite or off-site backups with a cloud-based solution that is reliable, secure, and cost-effective.


Azure Synapse Migration – 2 Hour Workshop: Hexaware’s two-hour workshop offers cost-effective ways to re-platform your entire data warehouse landscape to Microsoft Azure Synapse Analytics in a secure and timely manner. Ensure your data is encrypted at rest as well as in transit.


Azure Transformation Program: 5-Week Implementation: This five-week implementation from Heroes B.V. will help you define the right cloud-based strategy for every application in your portfolio, no matter what your industry is, and guide you through the execution in five steps.


Building a Data Analytics Foundation: 10-Week Implementation: Available only in Japanese, Albert’s implementation will help your organization build a data analysis infrastructure on Microsoft Azure. Deliverables include identifying issues related to data analysis, a data analysis platform design document, and a manual for utilizing the platform.


Cloocus – DB Modernization: 3-Week Assessment: Cloocus’s three-week assessment is designed to help customers make the most cost-effective migration of their SQL database environment to Microsoft Azure with almost no downtime while maximizing performance and security.


Cloud Discovery and Strategy – 1-Week Briefing: In this free one-week briefing Kainos will provide an actionable cloud adoption readiness assessment aligned with your business needs. This process will include a high-level strategy for migrating to Microsoft Azure.


Cloud Insight as a Service – 2-Hour Proof of Concept: Telindus’ proof of concept provides an overview of the current state and the potential improvement areas of your Microsoft Azure infrastructure. The subscription-based model allows you to purchase the full Cloud Insight Service, which implements the improvements outlined in the proof of concept.


Cloud Managed Service: Metanet TPlatform offers managed services that safely and reliably monitor the health of your Microsoft Azure environment. Available only in Korean, Metanet TPlatform’s services are tailored to your organization’s needs.


Cloud Native Modernization – 1-Week Briefing: Kainos’ architecture, development, and operations technologists will help you embrace cloud-native and Microsoft Azure PaaS services to modernize existing legacy applications. Deliver enhanced scalability, reliability, cost effectiveness, performance, and security with Kainos and Azure.


Cloud Security: 1-Month Assessment: This Microsoft Azure Cloud Security Assessment from eacs delivers the analysis and visibility you need to detect, respond to, and prevent security and compliance gaps in your Azure environment. Deliverables include clear, actionable recommendations to improve your cloud security posture.


Data Architecture Modernization: 6-Week Assessment: DataArt’s modernization approach for Microsoft Azure includes Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory, Azure SQL Server, and more. Evolve an insight-driven organization through data management and agile business intelligence systems and practices.


Datacenter Migration to Azure Cloud: IF-Tech provides analysis and consulting services for migrating your datacenter infrastructure to Microsoft Azure. Benefit from state-of-the-art Azure services and facilitate your organization’s cloud journey. This offer is available only in German.


Datacenter Migration – 4 to 8 Week Implementation: Officeline’s Datacenter Migration implementation helps your organization execute its cloud journey by assessing your environment and requirements, migrating workloads to Microsoft Azure, optimizing your infrastructure, and securing and managing your Azure environment.


Desktop as a Service – 2-Week Proof of Concept: In this proof of concept, Metsys will deploy an operational Windows Virtual Desktop environment for Windows 10 according to your organization’s governance, security, and compliance requirements. This service is available only in French.


Digital Clinical Care: 10-Week Implementation: Persistent Systems offers this 10-week engagement enabling you to implement a digital clinical care (DCC) experience built on Microsoft Azure Cloud for Healthcare. Leverage software-driven recommendations for patient care and improve healthcare outcomes across your organization.


Dynamics Integration using Azure: 2-Day Workshop: Mazars’ two-day workshop shows you how Microsoft Azure services deliver enterprise-grade integration solutions that link your Microsoft Dynamics 365 back-office instance and your critical line of business systems.


Facial Recognition System: 6-Week Implementation: Synnex will implement a facial recognition system combining IoT and AI services and built on Microsoft Azure. Enable your company to better manage staff attendance using sensors, cameras, monitors, and other devices with results displayed in Microsoft Power BI for further analysis.


HR Chat Bot: 8-Week Implementation: Ulteam will develop and implement an HR chatbot tailored to your business needs and based on Microsoft Azure Bot Services and Azure Cognitive Services. This service is available only in Russian.


Infra Migration on Azure: 4-Week Implementation: Megazone Cloud will work with your stakeholders to develop and implement a systematic plan for migrating your current IT infrastructure to Microsoft Azure. This service is available only in Korean.


Insight Accelerator: 30-Day Implementation: Unify Consulting’s Insight Accelerator is an artificial intelligence solution that reduces the barriers to entry for document research and analysis. Implement Insight Accelerator and enable your organization to handle the three Vs of data management: volume, velocity, and variety.


Kainos – Evolve EMR on Azure – 1-Week Briefing: Provided as a fully managed service on Microsoft Azure, Evolve will deliver a unified digital care record to enable seamless clinical workflows for your healthcare organization. Leverage secure, audited storage and management of patient documentation with digital workflows and electronic forms.


Kubernetes: 5-Day Proof of Concept: Container technologies such as Kubernetes have become essential in application modernization journeys. In this five-day proof of concept, Inetum-Realdolmen will set up and configure Azure Kubernetes Service (AKS), deploy a sample application, and demonstrate the automated scaling of AKS.


Managed Security Operations Centre (SOC) Briefing: In this three-hour briefing call Risual’s managed Security Operation Center team will cover how they can deliver a 24×7 service that detects and responds to cybersecurity threats in your cloud or onsite environment.


Modern Data Warehouse – Pilot Implementation: BI Applications offers the benefits of scaling your Microsoft Power BI solutions with a modern and analytical data warehouse pilot implementation. This service is available only in Spanish.


Smart Buildings: 4-Week Implementation: Persistent Systems’ four-week implementation provides your organization with a roadmap to convert its existing structures into “smart spaces” with the use of Microsoft Azure IoT.


Smart Data Platform: 10-Day Implementation: Improve relevancy, adoption, and user satisfaction of your data solutions in this 10-day implementation from Macaw. Deliverables include real-time machine learning, reliable predictions, and actionable insights based on a pay-per-use model.


Windows Virtual Desktop Consulting Service: 4-Week Implementation: Customize your Windows Virtual Desktop experience with Daoudata’s four-week implementation service. This offering includes planning, cost optimization, and proof of concept aligned with your requirements. This service is available only in Korean.


Windows Virtual Desktop: 1-Day Workshop: Data Market Bilgi Hizmetleri offers a one-day Windows Virtual Desktop workshop that guides you through different desktop virtualization scenarios and helps you pick the one that is best for your organization.


Windows Virtual Desktop: 2-Week Proof of Concept: Learn how you can eliminate hardware constraints and increase agility with Compunet’s Windows Virtual Desktop two-week proof of concept. This scalable, turnkey solution meets the current and future needs of today’s virtual business environment.



Difficulty Generating a Memory Dump

This article is contributed. See the original author and article here.

Hi there!


My name is Teeda, and I am a Support Escalation Engineer on the Windows Performance Team at Microsoft. This blog post provides several suggestions and workarounds when there is difficulty generating a memory dump for bugcheck issues (or even hang scenarios). Special thanks to my colleague, Alisse, for assembling this documentation.


 


Think about the goal…


Is a bugcheck occurring, and are you trying to get a memory dump from it?  If so, you can skip the parts about manually triggering a dump.  However, you may want to use these settings to test whether you can get a memory dump; this will be faster than waiting for the next bugcheck.


 


Do you need to crash the machine manually?  If so, pay attention to the type of machine (virtualized or physical) and the situation we are working with.


 


Is this a virtual machine?


VMware allows you to create a snapshot of a machine, which can then be converted to a memory dump.  Often, this is easier than trying to generate the memory dump manually.


  1. Capture the snapshot in the VMware console with "Take Snapshot", either at the bugcheck screen or, for other issues, at the time of the issue.

  2. Go to the following website: https://labs.vmware.com/flings/vmss2core




    • On the left-hand side, check the Agree and Download box.

    • Change the Dropdown to the appropriate OS (vmss2core-sb-8456865.exe).

    • Click on download.




  3. Once you have downloaded the file, save it on the C drive to a folder called C:\Snapshot

  4. Copy the vmss or vmsn/vmem file that you wish to convert to that folder.

  5. Open an elevated command prompt and run the following commands:

    1. cd C:\Snapshot

      • For VMs whose guest OS is Windows 7 / 2008 R2 or earlier, use: vmss2core-sb-8456865 -W <snapshot.vmsn/Suspend.vmss> <snapshot.vmem>

      • For VMs whose guest OS is Windows 8.1 / 2012 or later, use: vmss2core-sb-8456865 -W8 <snapshot.vmsn/Suspend.vmss> <snapshot.vmem>



    2. Replace '<snapshot.vmsn/Suspend.vmss> <snapshot.vmem>' with the names of your snapshot files.

    3. This process may take a few minutes depending on the size of the snapshot, but it will create a memory.dmp file in the C:\Snapshot folder.
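For example, with a suspended-state snapshot whose files are named Suspend.vmss and Suspend.vmem (hypothetical names for illustration) on a Windows Server 2019 guest, the conversion would look like this:

cd C:\Snapshot
vmss2core-sb-8456865 -W8 Suspend.vmss Suspend.vmem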




There is also the option to use the NMI switch in VMWare as an alternative if taking a snapshot is not an option.  Please note you will still need to configure for a memory dump whether it be kernel or complete: https://kb.vmware.com/s/article/2149185

 


Hyper-V allows you to save the state of a machine, which can then be converted to a memory dump.



  • To do this, please right-click the VM in Hyper-V Manager and click "Save" to save its state.  Saved state files will be created at the location of the virtual hard disk.

  • To allow the VM to continue running, right-click the server and click Start. Please note the OS version of the host machine, as this will be needed to use the correct tool for conversion.

  • You will need to engage Microsoft to convert the saved state files (.bin/.vsv or .vmrs).


Alternatively, you can also configure for a manual Hyper-V crash using: 
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\hyperkbd\crashdump
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\hyperkbd\Parameters


Configuration information found here: Forcing a System Crash from the Keyboard – Windows drivers | Microsoft Docs


 


For Azure machines, Azure engineers can grab a memory dump or use NMI:


Configure for complete memory dump:


Step 1: Change page file size



  • Verify the machine has enough free space for 2x the RAM before continuing.

  • Launch File Explorer, then right-click This PC. Select Properties

  • Click Advanced system settings on the System page. Make sure you are on the Advanced tab.

  • Click Settings under the Performance area.

  • Click the Advanced tab, and then click Change under the Virtual memory area.

    • Note: To enable the system partition, you must uncheck the "Automatically manage paging file size for all drives" check box.







  • Select the C: drive for page file location.

  • Click Custom Size. Set the value of Initial size and Maximum size to the amount of physical RAM that is installed plus 256 megabytes (MB) under the Custom Size button. (RAM in GB × 1024 + 256 = size in MB)

  • Click Set, and then click OK.


Step 2: Configure for a complete memory dump file



  • Go back to Advanced system settings page

  • Click Settings under Startup and Recovery, and then make sure Complete memory dump is selected.

    • Note: If you want to enable the complete memory dump option, manually set the CrashDumpEnabled registry entry to 0x1 under the following registry subkey and restart Windows: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\CrashControl



  • Ensure the path is C:\Windows\MEMORY.DMP (%SystemRoot%\MEMORY.DMP)
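If you prefer to script this, the same setting can be applied from an elevated command prompt with the standard reg.exe tool (a minimal sketch; restart Windows afterward for it to take effect):

reg add "HKLM\SYSTEM\CurrentControlSet\Control\CrashControl" /v CrashDumpEnabled /t REG_DWORD /d 1 /f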





  • Click OK

  • Reboot VM for settings to take effect


Step 3: Enable Boot Diagnostics for NMI Crash



  • Login to Azure portal > select VM > Serial Console





  • Note: Serial Console requires boot diagnostics enabled





  • If not enabled, go to Boot Diagnostics > click Settings > Turn On > Save




 


Step 4: Send NMI during issue



  • When computer is in problem state > Serial Console > click Send Command [1] > click Send Non-Maskable Interrupt (NMI) [2]





  • Click Send NMI





  • Dump will be generated.





  • After the dump completes, log in to the VM; the dump will be in C:\Windows\MEMORY.DMP




 


 


For AWS machines, try using these steps: https://docs.aws.amazon.com/AWSEC2/latest/WindowsGuide/diagnostic-interrupt.html


For Nutanix machines, please engage the vendor to capture the memory dump.



Do you have the correct configuration?


Step 1: Change your page file size



  • Verify the machine has enough free space for 2x the RAM before continuing.

  • Go to Advanced system settings

  • On the System page, click the Advanced tab.

  • Click Settings under the Performance area.

  • Click the Advanced tab, and then click Change under the Virtual memory area.

    • Note: To enable the system partition, you must clear the "Automatically manage paging file size for all drives" check box.



  • Select the C: drive for pagefile location.

  • Click Custom Size. Set the value of Initial size and Maximum size to the amount of physical RAM that is installed plus 256 megabytes (MB) under the Custom Size button.

  • Click Set, and then click OK three times


Step 2: Configure for a complete memory dump file



  • Go back to Advanced system settings

  • On the System page, click the Advanced tab.

  • Click Settings under the Writing debugging information area (Startup and Recovery), and then make sure complete memory dump is selected.

    • If complete memory dump is not an option here, enable it by manually setting the CrashDumpEnabled registry entry to 0x1 under the following registry subkey: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\CrashControl




Step 3: Apply the settings 



  • Ensure there is more space available on the C drive than there is RAM on the machine.

  • Please restart the machine for the settings to take effect


 


More Options


Try to use DedicatedDumpFile.sys – How to use the DedicatedDumpFile registry value to overcome space limitations on the system drive when capturing a system memory dump | Microsoft Docs

Manual Dump Trigger Options


NMI


Does this machine have an NMI switch? This would be in the Integrated Lights Out (iLO) web interface. Create a DWORD value called NMICrashDump under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\CrashControl and set it to 1.  Then reboot the machine for the setting to take effect.
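As a sketch, the same value can be created from an elevated command prompt:

reg add "HKLM\SYSTEM\CurrentControlSet\Control\CrashControl" /v NMICrashDump /t REG_DWORD /d 1 /f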


 


Keyboard initiated


For a USB keyboard, create the following registry entry:



  • In HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\kbdhid\Parameters, create a value named CrashOnCtrlScroll, and set it equal to a REG_DWORD value of 0x01.


For a PS/2 Keyboard, create the following registry entry:



  • In HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\i8042prt\Parameters, create a value named CrashOnCtrlScroll, and set it equal to a REG_DWORD value of 0x01.


Then reboot the machine for the setting to take effect.
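Equivalently, from an elevated command prompt (a sketch; the first command is for USB keyboards, the second for PS/2 keyboards):

reg add "HKLM\System\CurrentControlSet\Services\kbdhid\Parameters" /v CrashOnCtrlScroll /t REG_DWORD /d 1 /f
reg add "HKLM\System\CurrentControlSet\Services\i8042prt\Parameters" /v CrashOnCtrlScroll /t REG_DWORD /d 1 /f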


Note: you will need to hold the right Ctrl key and press the Scroll Lock (ScrLk) key twice to trigger the dump with the above settings. If the machine does not have those keys available, there are other options: Forcing a System Crash from the Keyboard – Windows drivers | Microsoft Docs


Ex: Left Ctrl + Space Bar:


HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\kbdhid\CrashDump


Create a DWORD value named Dump1Keys set to 20 (hex)


Create a DWORD value named Dump2Key (note: no "s" here) set to 3d (hex)
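As a command-line sketch (assuming the kbdhid CrashDump key shown above):

reg add "HKLM\System\CurrentControlSet\Services\kbdhid\CrashDump" /v Dump1Keys /t REG_DWORD /d 0x20 /f
reg add "HKLM\System\CurrentControlSet\Services\kbdhid\CrashDump" /v Dump2Key /t REG_DWORD /d 0x3d /f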


 


NotMyFault


Use NotMyFault to initiate a crash: NotMyFault – Windows Sysinternals | Microsoft Docs


 


Change the Settings



  • Ensure there is enough space to capture the memory dump.  We need enough space for the page file, and for the memory dump itself which will be the size of the page file.

  • Disable the Autoreboot: (HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\CrashControl\AutoReboot)

  • Change the memory dump location to another spot on a local drive

  • Ensure the option "Overwrite Any Existing File" (found in Control Panel > System) is selected. It is a good idea to leave this box checked and to move or copy the current Memory.dmp file.


 


There is dump logging


You can create a DWORD registry value HKLM\SYSTEM\CurrentControlSet\Control\CrashControl\EnableLogFile set to 1.  You will need to crash the machine twice; then you will see a dumpstack.log file at the root of the C drive, which will keep track of what occurs during the action of writing to the page file.
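A sketch of the corresponding command:

reg add "HKLM\SYSTEM\CurrentControlSet\Control\CrashControl" /v EnableLogFile /t REG_DWORD /d 1 /f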


 


Is ASR enabled?


Hardware vendors, such as HP, IBM, and Dell, may provide an Automatic System Recovery (ASR) feature. You should disable this feature during troubleshooting. For example, if HP and Compaq’s ASR feature is enabled in the BIOS, disable this feature while you are troubleshooting to generate a complete memory.dmp file. For the exact steps, contact your hardware vendor.



Antivirus and Encryption



  • Check for any dump filter drivers.

  • Remove the encryption to test.


 


What else?



  • It is possible the paging file on the boot drive is not large enough. To use the “Write Debugging Information To” feature to obtain a complete memory dump file, the paging file on the boot drive must be at least as large as physical memory + 100 MB. When you create a kernel memory dump file, the file is usually around one-third the size of the physical memory on the system. Of course, this quantity will vary, depending on your circumstances.

  • Also possible there is not room for the Memory.dmp file in the path specified for writing the memory dump.

  • It is possible that the SCSI controller is bad, or the system crash is caused by a bad SCSI controller board.

  • If you specify a non-existent path, a dump file will not be written. For example, if you specify the path as C:\Dumpfiles\Memory.dmp and no C:\Dumpfiles folder exists, a dump file will not be written.

  • Is the Host Guardian Service enabled on either the host or the guest?  There are several settings which may prevent dumps from writing.  Managing the Host Guardian Service | Microsoft Docs


Grab that Page file!


Ensure the Autoreboot key is set to 0, and when the bugcheck occurs, boot into WinRE.  Grab the pagefile.sys and rename it to memory.dmp.
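From the WinRE command prompt, that can look like the following sketch (drive letters may differ under WinRE, so verify which volume holds your Windows installation first):

ren C:\pagefile.sys memory.dmp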


 

Microsoft Defender for Identity native alert page in Microsoft 365 Defender

This article is contributed. See the original author and article here.

We are excited to announce that starting today, Microsoft Defender for Identity alerts are natively integrated into Microsoft 365 security center (security.microsoft.com) with a dedicated Identity alert page format. This marks the first step in our journey to introduce the full Microsoft Defender for Identity experience into the unified Microsoft 365 Defender portal and is a continuation of the convergence motion to integrate protection across domains, which started with Defender for Office 365 and Defender for Endpoint.


 


The new Identity alert page unlocks value for Microsoft Defender for Identity customers such as better cross-domain signal enrichment and new automated identity response capabilities. It ensures that we can best help our customers to stay secure and help improve the efficiency of security operations. To learn more about Microsoft 365 Defender, check out this dedicated Tech Community blog.


 


Alerts and investigation


 


Alerts are a key experience when working with any security product. That's why Defender for Identity is continuously investing in research and engineering efforts to provide new alerts for attack techniques, tools, and vulnerabilities. Starting today, Microsoft Defender for Identity alerts are available to view within the Microsoft 365 Defender portal.


 




(Figure 1. Alert experience in Microsoft 365 security center)


 


One of the benefits of investigating alerts through Microsoft 365 security center is that Microsoft Defender for Identity alerts are further correlated with information obtained from each of the other products in the suite. These enhanced alerts are consistent with the other Microsoft 365 Defender alert formats originating from Microsoft Defender for Office 365 and Microsoft Defender for Endpoint. The new page effectively eliminates the need to navigate ('tab out') to another product portal to investigate alerts associated with identity.




 


 


(Figure 2. Side panel for device entity that is enriched by both Microsoft Defender for Endpoint and Microsoft Defender for Identity)


 


The new alert page maintains a similar look and feel to Defender for Identity while adapting to the Microsoft 365 Defender user experience and style.


 


Not just a new home…


 


Alerts are now in one common alert queue with Defender for Office 365, Defender for Endpoint, Microsoft Cloud App Security, and various compliance workload alerts. Another stand-out feature for alerts originating from Defender for Identity is that they can now trigger the Microsoft 365 Defender automated investigation and response (AIR) capabilities, including automatically remediating alerts and mitigating tools and processes that can contribute to the suspicious activity.




 (Figure 3. Automatic alert investigation based on Microsoft Defender for Identity alert)


 


How do I get started?


 


Defender for Identity alerts can easily be accessed from either the Incidents or Alerts queue. Open either of these areas, and then you can filter by Service Sources to see the specific alerts you’re looking for.


 




 (Figure 4. Microsoft 365 security menu)


 




(Figure 5. Filter options for alert view)


 


As always, we’d love to know what you think.


Leave us feedback directly in the Microsoft 365 security center.

Customer Key support for Microsoft Teams now Generally Available!

This article is contributed. See the original author and article here.

Service encryption with Microsoft 365 Customer Key
Microsoft 365 provides baseline, volume-level encryption through BitLocker and Distributed Key Manager (DKM), which ensures customer data is always encrypted at rest in the Microsoft 365 service. Microsoft 365 offers an added layer of encryption at the application layer for content, including data from Exchange Online, SharePoint Online, OneDrive, and Teams, called service encryption.



Microsoft 365 Customer Key is built on service encryption, providing a layer of encryption at the application layer for data at rest, and allows the organization to provide and control the encryption keys used to encrypt customer data in Microsoft's datacenters. Customer Key provides additional protection against viewing of data by unauthorized systems or personnel, complementing BitLocker disk encryption in Microsoft datacenters. Customer Key enhances the ability of organizations to meet the demands of compliance requirements that specify key arrangements with the cloud service provider, assisting customers in meeting regulatory or compliance obligations for controlling root keys.

Microsoft 365 Customer Key now supports Microsoft Teams!
After you provide the keys, Microsoft 365 uses them to encrypt data at rest as described in the Online Services Terms (OST). The organization can create a data encryption policy (DEP) and assign it to encrypt certain Microsoft 365 data for all tenant users. While multiple DEPs can be created per tenant, only one DEP can be assigned at a time. For customers already using Customer Key for Exchange Online and SharePoint Online, data encryption policies add broader control and now include support for Microsoft Teams! Once a DEP is created and assigned, it will encrypt the following data for all tenant users:



  • Teams chat messages (1:1 chats, group chats, meeting chats and channel conversations)

  • Teams media messages (images, code snippets, video messages, audio messages, wiki images)

  • Teams call and meeting recordings stored in Teams storage

  • Teams chat notifications, Teams chat suggestions by Cortana, Teams status messages

  • User and signal information for Exchange Online

  • Exchange Online mailboxes that aren’t already encrypted using mailbox level DEPs

  • Microsoft Information Protection exact data match (EDM) data – (data file schemas, rule packages, and the salts used to hash the sensitive data)


When a DEP is assigned, encryption begins automatically but will take some time to complete, depending on the size of the tenant. For Microsoft Information Protection and Teams, the Customer Key DEP encrypts new data from the time of DEP assignment; we are working to bring support for encrypting past data. For Exchange Online, the DEP starts encrypting all existing and new data.
For more details on using Microsoft 365 Customer Key across multiple workloads and how to get started, please see Service encryption with Customer Key.
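For illustration, a multi-workload DEP is created and assigned through Exchange Online PowerShell; a hedged sketch (cmdlet names as documented for Customer Key, with a placeholder policy name and placeholder Azure Key Vault key URIs):

New-M365DataAtRestEncryptionPolicy -Name "Contoso-DEP" -AzureKeyIDs "https://contosovault1.vault.azure.net/keys/key1", "https://contosovault2.vault.azure.net/keys/key2" -Description "Tenant-wide data encryption policy"
Set-M365DataAtRestEncryptionPolicyAssignment -DataEncryptionPolicy "Contoso-DEP"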

Setting up https for Teams Tabs projects – without ngrok

This article is contributed. See the original author and article here.

I've started using the new Microsoft Teams Toolkit, which is a Visual Studio Code extension and generator for Teams applications. One thing I noticed is a little challenge when creating tabs, due to the requirement to use SSL. The documentation is fine and explains how to trust your local project, but I found it a little painful since the certificates only last one month and there's a different one for each project, so I need to repeat the process frequently. Your teammates will need to do that as well.





 


Here is an alternative approach in which you create your own certificate authority and build certs from that so you can install just one root certificate across all your projects! Each teammate can have their own certs, so you can collaborate as much as you wish and nobody has to go installing certs.


 



NOTE: Did you know that the Teams Toolkit uses Create React App (CRA) for tabs? Create React App is a toolchain from Facebook (who created React in the first place); it's very popular and well supported! If you need help, search on "Create React App" and you can find a plethora of helpful articles; this one helped me figure this out!



Step 1: Create and trust a certificate authority (CA)


This step only needs to be done once for as many projects as you wish. It assumes you already have Node.js installed, as required by the Teams Toolkit.


 


a. Create a safe/private folder somewhere and go there in your favorite command-line tool, and run these commands:


npm install -g mkcert
mkcert create-ca --organization "MyOrg" --validity 3650
mkcert create-cert --ca-key "ca.key" --ca-cert "ca.crt" --validity 3650


 


NOTE: 3650 is the number of days your certs will be valid; feel free to change it. You can use --help on mkcert to reveal other options, such as setting an organization name and location (the default org is "Test CA") and customizing the domain names for your certificate (the default is "localhost,127.0.0.1").



This will create a new Certificate Authority and a certificate that was issued from it. You should see 4 files:

FILE        DESCRIPTION
ca.crt      Certificate for your new CA
ca.key      Private key for your new CA
cert.crt    Certificate for use in projects
cert.key    Private key for use in projects


b. Now you need to trust the certificate for your new CA; by doing that any cert you create will be trusted with no additional action on your part.


On Windows



  • Double click on the ca.crt file and click “Install Certificate”.




  • Choose Local Machine and click next.




  • Select “Place all certificates in the following store” and then click the “Browse” button. Choose “Trusted Root Certification Authorities” click “OK” to close the dialog box, and then click “Next”.


  • Restart all instances of your browser to force it to re-read its trusted roots. If in doubt, reboot your computer.


On Mac



  • Double click on the ca.crt file, which should be found under /Users/[your-name]/. It will launch Keychain Access app.

  • Enter your password or use Touch ID when prompted. 

  • The new certificate (in this case, “MyOrg”) should be added. Double-click it. 

  • In a new window, expand the Trust section of the certificate details. Select “Always Trust” for every option. 

  • Close the window. Enter your password or use Touch ID again if you are asked. Now the certificate is trusted. 

  • Restart all instances of your browser to force it to re-read its trusted roots. If in doubt, reboot your computer.


On Linux


There are more steps on Linux as most browsers don’t use the operating system’s certificate store, and a tool called certutil is needed to modify the browsers’ cert?.db files. This article explains how to install your new root certificate on Linux.


Step 2 – Add the certs to your project


This is what you need to do for each project.


a. Create a new folder in your project folder (the same level as the package.json file) called .cert. Copy the cert.crt and cert.key files into this folder.


b. Modify your .env file to tell the local web server to use your cert:


HTTPS=true

SSL_CRT_FILE=./.cert/cert.crt

SSL_KEY_FILE=./.cert/cert.key


c. Prevent saving the certs to your git repository by adding a line to the .gitignore file.



.cert


Azure Active Directory SSO Tabs


Tabs that implement Azure Active Directory Single Sign-On need to implement more than just a web page; they need to implement a web service to exchange the SSO token for an access token that the app can use to call downstream services such as the Microsoft Graph. This is explained in this blog article, or this one, more clearly than in the documentation.


When the yo teams generator creates an SSO tab, this web service is hosted using the same web server as the page itself.


When the Teams Toolkit generates one, however, it creates a separate web server for the web service, so there really are two endpoints that need to be SSL enabled. The web service is in a folder called api-server. To enable SSL here, follow these steps:



  1. Add these lines to the .env file in the api-server folder:


HTTPS=true
SSL_CRT_FILE=../.cert/cert.crt
SSL_KEY_FILE=../.cert/cert.key
CORS_ORIGIN=https://devappsforteams.local:3000


2. Immediately above the line app.get('/getGraphAccessToken') in server.ts or server.js, add these lines to allow the cross-origin call from the web page (port 3000) to the web service (port 5000):


const cors = require('cors');
app.use(cors({
    origin: process.env.CORS_ORIGIN
}));
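Note: cors is a third-party Express middleware package; if it isn't already a dependency in the api-server project, install it first:

npm install cors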


3. Near the bottom of the same file, replace the line


app.listen(port);


with this code:


const fs = require('fs');
const https = require('https');
var privateKey = fs.readFileSync(process.env.SSL_KEY_FILE);
var certificate = fs.readFileSync(process.env.SSL_CRT_FILE);

https.createServer({
    key: privateKey,
    cert: certificate
}, app).listen(port);


Working in a team


Each team member needs to do Step 1 on their computer just once. When a developer starts working on a project they can simply copy their .cert folder into their project and go to work.


Many thanks to my colleague Tomomi Imura for documenting the Mac instructions and providing screen shots.


Do you have ideas on how to do this better, especially in a project team? Please chime in using the comments; thanks!


Step-By-Step: Migrating Active Directory Certificate Service From Windows Server 2008 R2 to 2019

This article is contributed. See the original author and article here.

Windows Server 2008 R2 reached end of support from Microsoft on January 14th, 2020. In a previous post, steps were detailed for Active Directory Certificate Service migration from 2008 R2 to 2019, but they required the new Windows Server 2019 server to have the same name as the previous 2008 R2 server.  Many of you have reached out asking for an update of the steps to reflect Active Directory Certificate Service migration from 2008 R2 to 2016 / 2019 with a different server name.  A solution has been found and tested, with repeatable steps shared below.


 


NOTE: The following was tested in a lab environment. While the solution was successful it may not reflect your organization’s current setup. Please test the steps below in a lab environment prior to implementing on production.


 


Step 1: Backup Windows Server 2008 R2 certificate authority database and its configuration
 



  1. Log in to Windows 2008 R2 Server as member of local administrator group

  2. Go to Start > Administrative Tools > Certificate Authority

  3. Right Click on Server Node > All Tasks > Backup CA
     
    Certification Authority Backup CA
     

  4. Click Next on the Certification Authority Backup Wizard screen

  5. Click both check boxes to select both items to backup and provide the backup path for the file to be stored
     
    Certification Authority Backup Wizard Item Selection
     

  6. Click Next

  7. Provide a password to protect private key and CA certificate file and click on next to continue

  8. Click Finish to complete the process


Step 2: Backup CA Registry Settings


 



  1. Click Start > Run > type regedit and click OK

  2. Expand the key in the following path: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\CertSvc

  3. Right click on the Configuration key and click Export

  4. Provide a name and location for the backup file, then click Save to complete the backup
     
    Backup CA Registry Settings
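For reference, the same backup can also be scripted from an elevated command prompt (a minimal sketch; C:\CABackup is a placeholder folder and MyPassword a placeholder for the protection password):

certutil -p MyPassword -backup C:\CABackup
reg export "HKLM\SYSTEM\CurrentControlSet\Services\CertSvc\Configuration" C:\CABackup\CertSvc-Configuration.reg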


Backup of the Certificates is now complete and the files can now be moved to the new Windows 2016 / 2019 server.


 


CA Backup complete


 


Step 3: Uninstall CA Service from Windows Server 2008 R2


 



  1. Navigate to Server Manager

  2. Click Remove Roles under Roles Summary to start the Remove Roles Wizard, and then click Next
     
    Uninstalling a CA


  3. Click to clear the Active Directory Certificate Services check box and click Next
     
    Removing Active Directory Certificate Services
     

  4. Click Remove on the Confirm Removal Options page

  5. If Internet Information Services (IIS) is running and you are prompted to stop the service before you continue with the uninstall process, click OK

  6. Click Close

  7. Restart the server to complete the uninstall


Step 4: Install Windows Server 2016 / 2019 Certificate Services


 


*NOTE: The screenshots below show the server name as WS2019 to highlight which server we are working on. This step-by-step uses screenshots from Windows Server 2019; the Windows Server 2016 process is the same, with similar screenshots.
 



  1. Log in to Windows Server 2019 as Domain Administrator or member of local administrator group

  2. Navigate to Server Manager > Add roles and features

  3. Click on next to continue in the Add Roles and features Wizard

  4. Select Role-based or Feature-based installation and click next

  5. Keep the default selection from the server selections window and click next
     
    Windows Server 2019 Server Selections
     

  6. Select Active Directory Certificate Services, click next in the pop up window to acknowledge the required features that need to be added, and click next to continue
     
    Adding Active Directory Certificate Services
     

  7. Click Next in the Features section to continue

  8. Review the brief description about AD CS and click next to continue

  9. Select Certificate Authority and Certification Authority Web Enrollment, click next in the pop up window to acknowledge the required features that need to be added, and click next to continue
     
    Windows Server 2019 Add Role Services
     

  10. Review the brief description about IIS and click next to continue

  11. Leave the default and click next to continue

  12. Click Install to begin the installation process

  13. Close the wizard once it is complete


 


Step 5: Configure AD CS


 


In this step, we will look into configuration and restoring the backup created previously


 



  1. Navigate to Server Manager > AD CS

  2. In the right-hand panel, a message will appear as shown in the following screenshot; click More
     
    AD CS
     

  3. Click Configure Active Directory Certificate Services… in the pop-up window
     
    Configure Active Directory Certificate Service
     

  4. In the Role Configuration wizard, ensure the proper credential for Enterprise Administrator is shown and click next to continue

  5. Select Certification Authority and Certification Authority Web Enrollment and click next to continue

  6. Ensure Enterprise CA is selected the setup type and click next to continue

  7. Select Root CA as the CA type and click next to continue

  8. With this being a migration, select Use existing private key and Select a certificate and use its associated private key and click next to continue
     
    AD CS Configuration
     

  9. Click Import in the AD CS Configuration window

  10. Browse to and select the key backed up from the Windows 2008 R2 server, provide the password used for protection, and click OK.
     
    Import Existing Certificate
     

  11. With the key successfully imported, select the imported certificate and click next to continue

  12. Leave the default certificate database path and click next to continue

  13. Click Configure to proceed with the configuration process

  14. Close the configuration wizard once it is complete

  15. Open a Command Prompt as Administrator

  16. Run the following command to stop Certificate Services:
     

    net stop certsvc


  17. Open the registry file exported from the Windows Server 2008 R2 server in Notepad
     
    NOTE: Please ensure you have tested this in a lab prior to completing these steps. While the solution was successful in the lab, it may not reflect your organization’s current setup and may disrupt your service. Microsoft is not liable for any possible disruption that may occur.


  18. Locate CAServerName and change the value to the name of the NEW Windows Server 2016 / 2019 server
     
    Modify registry file
     

  19. Save the changes in Notepad
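
The wizard-driven configuration in steps 1-14 can also be scripted with the ADCSDeployment module, and the registry edit in steps 15-19 can be automated with a string replacement. A minimal sketch, assuming hypothetical paths C:\CABackup\ContosoRootCA.p12 for the backed-up key and C:\CABackup\CAConfig.reg for the exported registry file:

# Sketch: configure the Enterprise Root CA from the backed-up key, add Web
# Enrollment, then stop the service ahead of the restore
Install-AdcsCertificationAuthority -CAType EnterpriseRootCA `
    -CertFile "C:\CABackup\ContosoRootCA.p12" `
    -CertFilePassword (Read-Host "Backup password" -AsSecureString)
Install-AdcsWebEnrollment
net stop certsvc

# Sketch: update CAServerName by replacing the old host name in the exported
# .reg file ("WS2008R2" and "WS2019" are placeholder server names)
(Get-Content "C:\CABackup\CAConfig.reg") -replace 'WS2008R2', 'WS2019' |
    Set-Content "C:\CABackup\CAConfig.reg"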


 


Step 6: Restore CA Backup


 



  1. Navigate to Server Manager > Tools > Certification Authority

  2. Right-click the server node > All Tasks > Restore CA

  3. A window will appear confirming the stop of Active Directory Certificate Services. Click OK to continue.
     
    Confirm stop of Active Directory Certificate Services

  4. Click Next to start the Certification Authority Restore Wizard

  5. Select both check boxes to restore both items, and provide the path to the backup to restore from
     
    Certification Authority Restore Wizard

  6. Enter the password used to protect the private key during the backup process and click Next

  7. Click Finish to complete the restore process

  8. Click Yes to restart Active Directory Certificate Services
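
The same restore can also be run from an elevated prompt with certutil. A sketch, where C:\CABackup is a placeholder backup folder and the password is the one used during the backup:

# Sketch: restore the CA database and key from the backup, then restart the service
certutil -f -p "YourBackupPassword" -restore "C:\CABackup"
net start certsvc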


 


Step 7: Restore Registry info


 



  1. Navigate to the folder containing the backed-up registry key with the newly edited CAServerName value, then double-click the file and click Run to start the restore

  2. Click Yes to proceed with the registry key restore

  3. Click OK once the restore confirmation is displayed
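
Equivalently, the edited file can be imported from an elevated prompt. A sketch with a placeholder file name:

# Sketch: import the edited registry backup on the new server
reg import "C:\CABackup\CAConfig.reg"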


 


Step 8: Reissue Certificate Templates


 


With the migration process now complete, it is time to reissue the certificate templates.


 



  1. Under Server Manager, navigate to Tools > Certification Authority

  2. Right-click the Certificate Templates folder > New > Certificate Template to Issue

  3. From the certificate templates list, select the appropriate certificate template and click OK
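
If you have many templates to bring back, certutil can publish a template from the command line as well. A sketch, where WebServer is a placeholder template name:

# Sketch: re-publish an existing certificate template on the new CA
certutil -SetCATemplates +WebServer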


 


This completes the Active Directory Certificate Services migration steps from Windows Server 2008 R2 to 2016 / 2019 with a different server name.


 


The following video also shares steps surrounding this process as well as migrating DNS.


 


https://channel9.msdn.com/Shows/IT-Ops-Talk/Windows-2008-End-Of-Support-Active-Directory-Migration/player?WT.mc_id=modinfra-27462-abartolo

Build real-time application with lightweight server and Azure Web PubSub service

Build real-time application with lightweight server and Azure Web PubSub service

This article is contributed. See the original author and article here.

With the growth of the internet, the demand for real-time experiences has expanded to web applications that offer live, synchronous interaction with the world. Data must be efficiently processed and delivered to produce a responsive, real-time experience: for example, cross-platform chat applications with live video, group collaboration in remote education, live dashboards for IoT, and instant notifications and alerts for IT systems.


 


The Azure Web PubSub service (AWPS) helps you build real-time web applications easily, at large scale and with high availability, so you can focus on your own business instead of infrastructure. The service lets you build real-time web applications based on WebSocket technology and the publish-subscribe pattern. It supports a broad range of client platforms, and you also have the flexibility to leverage the WebSocket community ecosystem.


 


In some scenarios, we need the server to process data between clients: for example, implementing language moderation for a cross-platform chat room, scaling and calibrating raw data for logistics location tracking, or computing statistics for a live dashboard. In other cases, you may want a more efficient model that routes data between clients directly, with only a lightweight server. Taking group collaboration in remote education as an example, you may want to build a whiteboard application for remote customers that synchronizes custom events between clients.


 


The Azure Web PubSub service supports both scenarios: a server that processes messages and a lightweight server. To help you build applications with a lightweight server, AWPS provides a predefined subprotocol, json.webpubsub.azure.v1, which empowers clients to publish and subscribe directly. We call a client that supports this subprotocol a “PubSub WebSocket client”. Let’s walk through how to use this subprotocol to build a chat room with a lightweight server. You can build the application with any programming language that supports the WebSocket API; we use JavaScript as the example here. If you are using another language or runtime, such as Node.js or Python, replace the APIs accordingly.


 


Create an instance of AWPS


First, sign in to the Azure portal with your Azure account. You can create a new free instance by searching for “Web PubSub” or by finding it in the “Web” category.




 


Once the instance is created, go to the “Client URL Generator” in the “Keys” tab to generate the “Client Access URL”. Make sure the URL has the “Send To Groups” and “Join/Leave Groups” roles.




 


Create the PubSub WebSocket Client


The client uses the Client_Access_URL and the subprotocol json.webpubsub.azure.v1 to create the WebSocket connection. In general, you would generate the URL and token on the server side with the connection string; to simplify this demo, we just copy the URL from the portal directly.


 

// PubSub WebSocket client
var publisher = new WebSocket('Client_Access_URL', 'json.webpubsub.azure.v1');
var subscriber = new WebSocket('Client_Access_URL', 'json.webpubsub.azure.v1');

 


 


Join a group and subscribe to its messages


You need to join a group before you can receive its messages. The message format to join a group is as follows.


 

{
    "type": "joinGroup",
    "group": "<group_name>"
}

 


 


Once you have joined the group, it is easy to receive messages from that specific group via the onmessage event, as in the code snippet below.


 

subscriber.onopen = function () {
    subscriber.send(JSON.stringify({
        "type": "joinGroup",
        "group": "group1"
    }));
}

subscriber.onmessage = function (e) {
    console.log(e.data);
}

 


 


Publish a text message to the group


You can publish a text message to a specific group with the following message format, provided you have the proper permission on the Client Access URL. Joining the group first is not required.


 

{
    "type": "sendToGroup",
    "group": "<group_name>",
    "dataType" : "text",
    "data": "Hello World!"
}

 


 


Here is the code snippet in JavaScript:


 

publisher.onopen = function () {
    publisher.send(JSON.stringify({
        "type": "sendToGroup",
        "group": "group1",
        "dataType" : "text",
        "data": "Hello World!"
    }));
}

 


 


Next Steps


Now you have learned how to use Azure Web PubSub to publish and subscribe between clients, and you can use it to build a real application such as a chat room, as in this online demo and the sample code. You can also find more helpful resources in the getting started content. We look forward to your feedback and ideas to help us improve via the Azure Feedback Forum!

Running PowerShell against all Azure subscriptions

This article is contributed. See the original author and article here.

 


Q: How do I run my PowerShell command against all my Azure subscriptions? 


A: Easy – Use the cmdlet I wrote when I ran into the same problem. 


  


When you go from one Azure subscription to two, three, or hundreds, it is no longer trivial to run a single command against all your subscriptions in PowerShell. I was working with one subscription that quickly expanded to three, then soon more than a dozen. Opening a new PowerShell host for each environment and switching between them was too much work; I needed an easy way to assess everything across all my subscriptions. My solution was to write Invoke-AzureCommand, which allows you to run a script block against every subscription easily. To use it, install AzureHelper, put your code in a script block, and run Invoke-AzureCommand to do the repetitive work of cycling the script block across all your subscriptions.


 


Disclaimer: The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.  


  


1.  Install AzureHelper


To get started, install the AzureHelper module using the following command: 


 

Install-Module AzureHelper

 


If you prefer, you could also download it from GitHub here: www.GitHub.com/PaulHCode/AzureHelper 


 


2. Put your code in a script block


Put whatever commands you want to run against all of your subscriptions into a script block. If you are new to script blocks, check out more information on script blocks here.


For example, I want to find all Azure disks larger than 512 GB across my subscriptions. To find these, I put the following script block together.


 

$DiskScriptBlock = {Get-AzDisk | Where{$_.DiskSizeGB -gt 512}} 

 


 


3. Run your script block against all subscriptions 


Running the script block against all subscriptions is as easy as the example below. 


 

Invoke-AzureCommand -AllSubscriptions -ScriptBlock $DiskScriptBlock | FT ResourceGroupName, Name, DiskSizeGB 

 


This example gives the output from every subscription, but if the same resource group name exists in multiple subscriptions, it isn’t clear which subscription contains the resource. To fix that, we use a calculated property to include the name of the subscription, as seen in the following example.
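
For instance, here is a sketch extending the earlier disk query with such a calculated property (the Subscription column name is arbitrary):

$DiskScriptBlock = {
    Get-AzDisk | Where-Object { $_.DiskSizeGB -gt 512 } |
        Select-Object ResourceGroupName, Name, DiskSizeGB,
            @{N = 'Subscription'; E = { (Get-AzContext).Subscription.Name }}
}
Invoke-AzureCommand -AllSubscriptions -ScriptBlock $DiskScriptBlock |
    Format-Table ResourceGroupName, Name, DiskSizeGB, Subscription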


  


Are you concerned about deallocated VMs sitting around that you don’t need anymore?  Use the following: 


 

$MyScriptBlock = {get-azvm -Status | Where{$_.PowerState -eq 'VM deallocated'} | Select Name, ResourceGroupName, @{N='Subscription';E={(Get-AzContext).Subscription.Name}}} 

Invoke-AzureCommand -ScriptBlock $MyScriptBlock -AllSubscriptions 

 


 


Okay, that sure makes a quick query easier, but what if I want to do something a little more complex in my script block that needs arguments passed in? I’m glad you asked. Invoke-AzureCommand also supports passing an array of arguments into your script block, as seen in the example here.


 

$ArgumentList = @() 
$ArgumentList+=512    # The first parameter is the minimum disk size 
$ArgumentList+="westus2"    # The second parameter is the Azure region to search 

$BetterDiskScriptBlock = { 
   param($disksize, $region) 
   Get-AzDisk | Where{$_.DiskSizeGB -gt $disksize} | Where{$_.Location -eq $region} 
}  

Invoke-AzureCommand -ScriptBlock $BetterDiskScriptBlock -AllSubscriptions -ArgumentList $ArgumentList 

 


 


You can make this example shorter by passing the arguments directly to Invoke-AzureCommand like this. 


 

Invoke-AzureCommand -ScriptBlock $BetterDiskScriptBlock -AllSubscriptions -ArgumentList 512,"westus2" 

 


 


Have fun scripting!