Experiencing Latency, Data Gap and Alerting failure for Azure Monitoring – 07/18 – Investigating

This article is contributed. See the original author and article here.

Update: Saturday, 18 July 2020 11:17 UTC

We continue to investigate issues within Azure Monitoring services. Some customers continue to experience data access issues, data latency and data loss, incorrect alert activation, and missed or delayed alerts; alerts created during the impact window may not be viewable in the Azure portal in multiple regions. We are working to establish the start time of the issue; initial findings indicate the problem began at ~07:58 UTC on 07/18. We currently have no estimate for resolution.

  • Work Around: None
  • Next Update: Before 07/18 14:30 UTC

-Anmol


Initial Update: Saturday, 18 July 2020 08:58 UTC

We are aware of issues within Application Insights and Log Analytics and are actively investigating. Some customers may experience data access issues in the Azure portal, incorrect alert activation, latency, and data loss in multiple regions.

  • Work Around: None. 
  • Next Update: Before 07/18 11:00 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Madhuri


How to find SQL Server Replication related jobs and T-SQL statements



Verbose logging is heavily used in replication troubleshooting. To enable it, you need to find the right job, which is not easy when you have hundreds of replication jobs on one server.

 

 

Here is how:

1. Distribution agent

The following query lists all distribution agent jobs, including push and pull subscriptions. (You may need to add more clauses to customize it.) By default, the SQL Server Agent job names equal the agent names for push subscriptions, unless you explicitly modify the job names.

use distribution -- in the distributor server
if not exists (select 1 from sys.tables where name = 'MSreplservers')
begin
select job.name as JobName, a.name as AgentName, a.publisher_db, a.publication as publicationName, sp.name as publisherName, ss.name as subscriber, a.subscriber_db, a.local_job
from MSdistribution_agents a
inner join sys.servers sp on a.publisher_id = sp.server_id -- publisher
inner join sys.servers ss on a.subscriber_id = ss.server_id -- subscriber
left join msdb..sysjobs job on job.job_id = a.job_id
where a.subscription_type <> 2 -- filter out anonymous subscribers
end
else
begin
select job.name as JobName, a.name as AgentName, a.publisher_db, a.publication as publicationName, sp.srvname as publisherName, ss.srvname as subscriber, a.subscriber_db, a.local_job
from MSdistribution_agents a
inner join MSreplservers sp on a.publisher_id = sp.srvid -- publisher
inner join MSreplservers ss on a.subscriber_id = ss.srvid -- subscriber
left join msdb..sysjobs job on job.job_id = a.job_id
where a.subscription_type <> 2 -- filter out anonymous subscribers
end

 


For push subscriptions, you can use the job name directly to find the job on the distributor server.


For pull subscriptions (local_job = 0), run the following query in the subscription database on the subscriber server.

use subdb2 -- in the subscriber server
go
select job.name as JobName, distribution_agent as AgentName, *
from MSreplication_subscriptions s
inner join msdb.dbo.sysjobs job on s.agent_id = job.job_id

 

By default, the SQL Server Agent job names equal the agent names for pull subscriptions, unless you explicitly modify the job names.


2. Merge agent

The following query lists all merge agent jobs, including push and pull subscriptions. (You may need to add more clauses to customize it.) By default, the SQL Server Agent job names equal the merge agent names.

use distribution -- in the distributor server
if not exists (select 1 from sys.tables where name = 'MSreplservers')
begin
select job.name as JobName, a.name as AgentName, a.publisher_db, a.publication as publicationName, sp.name as publisherName, ss.name as subscriber, a.subscriber_db, a.local_job
from MSmerge_agents a
inner join sys.servers sp on a.publisher_id = sp.server_id -- publisher
inner join sys.servers ss on a.subscriber_id = ss.server_id -- subscriber
left join msdb..sysjobs job on job.job_id = a.job_id
end
else
begin
select job.name as JobName, a.name as AgentName, a.publisher_db, a.publication as publicationName, sp.srvname as publisherName, ss.srvname as subscriber, a.subscriber_db, a.local_job
from MSmerge_agents a
inner join MSreplservers sp on a.publisher_id = sp.srvid -- publisher
inner join MSreplservers ss on a.subscriber_id = ss.srvid -- subscriber
left join msdb..sysjobs job on job.job_id = a.job_id
end


For push subscriptions, you can use the job name directly to find the job on the distributor server.


For pull subscriptions (local_job = 0), run the following query in the subscription database on the subscriber server.

use subdb6 -- in the subscriber server
go
select job.name, sub.publisher, sub.publisher_db, sub.publication
from msdb..sysjobs job
inner join msdb..sysjobsteps jobStep on job.job_id = jobStep.job_id
inner join MSsubscription_properties sub on sub.job_step_uid = jobStep.step_uid

 


You can use the job name directly to find the job on the subscriber server.


3. Snapshot agent

The following query lists all snapshot agent jobs. (You may need to add more clauses to customize it.) By default, the SQL Server Agent job names equal the snapshot agent names, unless you explicitly modify the job names.

 

use distribution -- in the distributor server
if not exists (select 1 from sys.tables where name = 'MSreplservers')
begin
select job.name as JobName, a.name as AgentName, publisher_db, publication, s.data_source as publisher,
case publication_type
when 0 then 'Transactional'
when 1 then 'Snapshot'
when 2 then 'Merge'
end as publication_type
from MSsnapshot_agents a
inner join sys.servers s on a.publisher_id = s.server_id
inner join msdb..sysjobs job on a.job_id = job.job_id
end
else
begin
select job.name as JobName, a.name as AgentName, publisher_db, publication, s.srvname as publisher,
case publication_type
when 0 then 'Transactional'
when 1 then 'Snapshot'
when 2 then 'Merge'
end as publication_type
from MSsnapshot_agents a
inner join MSreplservers s on a.publisher_id = s.srvid
inner join msdb..sysjobs job on a.job_id = job.job_id
end


You can use the job name directly to find the job on the distributor server.


4. Logreader agent

The following query lists all log reader agent jobs. (You may need to add more clauses to customize it.) By default, the SQL Server Agent job names equal the log reader agent names.

use distribution -- in the distributor server
if not exists (select 1 from sys.tables where name = 'MSreplservers')
begin
select job.name as JobName, a.name as AgentName, publisher_db, s.name as publisher
from MSlogreader_agents a
inner join sys.servers s on a.publisher_id = s.server_id
inner join msdb..sysjobs job on job.job_id = a.job_id
end
else
begin
select job.name as JobName, a.name as AgentName, publisher_db, s.srvname as publisher
from MSlogreader_agents a
inner join MSreplservers s on a.publisher_id = s.srvid
inner join msdb..sysjobs job on job.job_id = a.job_id
end

 

Please note: all publications in the same publication database share the same log reader agent job, and the agent name equals the job name by default, unless the user modifies the job.


In some complex cases, you may need to review all the T-SQL statements issued by these agent jobs.

Here is how:

 

One agent has more than one connection to the servers; the connections have different session ids but share the same process id. Once the process id is identified, you can use it as a filter in a SQL Server Profiler trace or an Extended Events (XEvent) session.
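As a sketch of this filtering (the process id 1832 is a hypothetical value), the sessions belonging to one agent can also be listed through the sys.dm_exec_sessions DMV, the documented successor of sys.sysprocesses:

```sql
-- Hypothetical example: list every session opened by the agent whose
-- operating-system process id is 1832; all of an agent's connections
-- share this host_process_id even though their session ids differ
SELECT session_id, program_name, host_process_id
FROM sys.dm_exec_sessions
WHERE host_process_id = 1832;
```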

  1. Distribution agent

The distribution agent connects to both the distributor and the subscriber, and it has more than one connection on the distributor. The application name of one of the connections equals the agent name; that is how you can find all the connections of one specific agent.

 

1) For example, the following query returns six agents:

use distribution
select * from MSdistribution_agents


2) Let’s say I need to check the T-SQL of the first agent.

If SQL Server trace files are collected on the distributor and the subscriber, use the agent name to filter the ApplicationName column and find the process id in the distributor trace. Then use that process id to filter queries in the distributor and subscriber traces.

 

If you need to troubleshoot online, run the following statement on the distributor server:

select hostprocess as PID from sys.sysprocesses where program_name = 'NODE1SQLAG-AdventureWorks-TranPubTest1-NODE3SQLAG-6'

 


3) It returns the process id of this agent. You can then use the process id to find all the sessions on both the distributor and subscriber servers.

On the distributor server:

select @@servername, * from sys.sysprocesses where hostprocess = 1832


On the subscriber server:

select @@servername, program_name, * from sys.sysprocesses where hostprocess = 1832


4) An alternative is to run the following query in the subscription database, then use the hostprocess value to filter:

use subdbName
select hostprocess, * from sys.sysprocesses where spid in
(
select spid from MSsubscription_agents
)

 

 

2. Merge agent

The behavior of the merge agent is the same as the distribution agent’s. It connects to the publisher, distributor, and subscriber.

1) Run the following query on the distributor server:

use distribution
select * from MSmerge_agents


2) Let’s say I need to check the T-SQL of the first agent.

If SQL Server trace files are collected on the publisher, distributor, and subscriber, use the agent name to filter the ApplicationName column and find the process id in the publisher trace. Then use that process id to filter queries in the publisher, distributor, and subscriber traces.

 

3) If you need to troubleshoot online, run the following statement on the publisher server:

select @@servername, hostprocess from sys.sysprocesses where program_name in
(
'NODE1SQLAG-AdventureWorks-MergePubTest-NODE3SQLAG-1'
)


4) Use this process id to filter queries on the distributor and subscriber, the same way as for the distribution agent.

 

 

3. Snapshot agent

The behavior of the snapshot agent is the same as the distribution agent’s. It connects to the publisher and distributor.

1) Run the following query on the distributor server:

use distribution
select * from MSsnapshot_agents


2) Let’s say I need to check the T-SQL of the first agent.

If SQL Server trace files are collected on the publisher and distributor, use the agent name to filter the ApplicationName column and find the process id in the publisher trace. Then use that process id to filter queries in the publisher and distributor traces.

 

3) If you need to troubleshoot online, run the following statement on the publisher server:

select @@servername, hostprocess from sys.sysprocesses where program_name in
(
'NODE1SQLAG-AdventureWorks-MergePubTest-NODE3SQLAG-1'
)

 

4) Use this process id to filter queries on the publisher and distributor, the same way as for the distribution agent.

 

4. Logreader agent

The log reader agent is a little different. The agent name in MSlogreader_agents does not match the application name directly; you need to use the following pattern to filter: the application name of the log reader agent has the form Repl-LogReader-number-publicationDBName-number.

 

 

For example, if you need to collect the process id of the log reader for AdventureWorks, run the following query on both the publisher and distributor servers:

select program_name, hostprocess, @@servername from sys.sysprocesses where program_name like 'Repl-LogReader%' and program_name like '%AdventureWorks%'


[Guest Blog] 7 Lessons from a 24 year tech career


This article was written by Microsoft employee Sonia Cuff as part of the Humans of IT Guest Blogger Series. Sonia shares about the universal truths that have appeared during her 24 years in tech that can apply to any career.

 

I have been in a technical role of some kind for the past 24 years. Twenty-four years! I had no idea that following my heart would take me on an adventure through this winding career path, but incrementally, every twist and turn has brought me to a dream job here at Microsoft.

I wasn’t always interested in tech. At school, I did “computers” as a subject and I was good at it, but I didn’t fall in love with BASIC coding on the Apple IIe. In fact, I wanted to be a flight attendant. A paid job in a bank straight after high school lured me away from my plans to study travel & tourism, so I never went into the travel industry. After two years in a bank branch, spending a fair amount of time on the “user” end of helpdesk calls as the resident branch tech expert (I was only 18 years old then!), they asked me to relocate to a different city and join the IT department there. It was December 1995 when I learnt lesson 1:

 

“Lesson 1: Change your plans.”

 

Sometimes opportunities come your way that didn’t fit your plans. And sometimes they are even better than your plans.

 

Could I work in an IT department, with no formal training? I had no idea, but they were willing to take a chance on me so we come to lesson 2:

 

“Lesson 2: If you are not sure – say yes.”

 

I was thrown into a project to build and deploy new workstations and servers to our branches nationwide. And I hated it. I had great colleagues who taught me everything, including one vivid session around a whiteboard explaining TCP/IP and subnet masks. But I struggled through the project work and the after-hours deployments, until I figured out why – I missed working with people! So, do I suck it up, after my employer had relocated me, or did I speak up about it? Lesson 3!

 

“Lesson 3: If you don’t like it, then leave.”

 

OK, there are all sorts of nuances with lesson 3. Understand that at the time I was living with a partner who could financially support me if this was going to be the end of my tech career, while I looked for a new job or did go and study, and we had no dependents. That makes it so much easier to take a risk and leave a role you are not enjoying. That said, sometimes you do have to suck it up for a while to keep feeding your family and paying the mortgage.

 

After confessing how I was feeling to my team leader… they gave me another chance in the second level support team. It was going to be sink or swim, so I dived into the basics – xcopying DOS files, installing token ring adapter drivers and more. Here’s where the groundwork was built in with lesson 4:

 

“Lesson 4: Learn and understand the basics.”

 

Tech concepts I learnt then are still foundational for the work I do today, and helped me build a strong troubleshooting skill. When things stopped working, I knew what to try next. I also built great working relationships with my colleagues, listening to those who had been in the industry longer than me and taking their advice. This was really informal mentoring before mentoring was a thing, and I soaked it up like a sponge. I jumped into email systems (MS Mail, Microsoft Exchange) and a large migration project, rising to become a Lotus Domino Systems Architect!

 

Now I’m at the peak of my tech skills inside this global organization – but do I really know what I’m talking about compared to others in the industry? Hello lesson 5!

 

“Lesson 5 – Take a chance”

 

I wrote up my CV, went for some interviews, and landed a role with a small IBM Partner.
The rest of my career has mostly been a repeat of those 5 main lessons. I continued to learn, continued to listen, and continued to take chances – then moving into a team leader role and finally as a Service Delivery Manager inside a large systems integrator.

 

After almost losing it all due to burnout, I changed my plans again – left the country and started a managed service provider looking after small and medium businesses. Here I learnt lesson 6:

 

“Lesson 6: Share your stories”

 

I became active on social media, attending user groups and conferences, and also blogging. I shared how I fixed errors or deployed things, and earned the title of Microsoft MVP. Then lesson 7 appeared:

 

“Lesson 7: Always be learning”

 

My tech career has been nothing but learning, and that was about to ramp up again with an opportunity to interview at Microsoft as a Cloud Advocate. In Azure. When I’d been focused on Office 365. Again, all of the qualities I’d grown throughout my career, plus my commitment to sharing with technical communities, meant I was an ideal candidate, even without an impressive Azure resume. The last two years in this role have been Learn – Share – Repeat!

 

There are many other lessons in this story, including one about taking care of yourself, and one about the stresses of being a manager or business owner. Some lessons are easier to say than they are to do.

And some lessons absolutely apply to people differently depending on their circumstances at the time.

 

Ultimately, I’ve found that my 24-year career has been directed by these simple truths, and they’re likely to resonate with people in different tech roles too. Picking the hottest technology to specialize in is not as important as these human skills you will use to navigate your career.

 

Always be learning, and stay open to whatever the next change of plans looks like, for you.

 

#HumansofIT

#CareerJourneys

 

Enjoyed this article from Sonia? Be sure to watch the recording of her Microsoft Ignite 2019 session, “An Introvert’s Guide to the IT Industry”: https://myignite.techcommunity.microsoft.com/sessions/81710?source=sessions.

You can also connect with Sonia by following her on Twitter at @SoniaCuff. 

How to use calendars in Project?



As a project manager, your job is to schedule tasks accurately to ensure everything happens at the right time. However, before you begin to schedule, you need to determine the working hours and days it will take to complete these tasks.

 

Project for the web comes with a default work template that can be automatically applied to all projects. This template is based on the average work schedule: it specifies work hours of 9AM through 5PM from Monday through Friday. If your work does not follow this schedule, you can create a new work template to suit your project’s needs. This blog post will walk you through the steps of changing the working hours and days of a project in Microsoft Project.

Project for the web is built on the Microsoft Power Platform. Because of this, some aspects of calendar and assignment setup need to be done in Power Apps.

 

To access Dynamics 365:

  1. While signed in to Office 365, open a browser window and go to https://web.powerapps.com
  2. On the PowerApps page, select Apps.
  3. On the Apps page, in the Org Apps tab, select Project.


From here, you can make changes to your projects and resources, implementing behind-the-scenes capabilities that can help you with project planning. 

 

Note: For the purposes of this article, we will refer to assignees as resources. This is the term used by Dynamics 365 to refer to people assigned to complete tasks, but this term is not generally used in Project for the web.

 

Creating a work hours template

A work hour template (WHT) defines working days and hours and can later be applied to a project or resource. For example, a user might have a “Night Shift” WHT defined as Monday-Sunday 8PM-7AM, or a “Weekend Shift” WHT defined as Thursday-Sunday, 8AM-5PM.

There are several ways you can create a work hours template; however, all work templates are based on the calendars of bookable resources. A bookable resource is anything that can be scheduled. You can create a bookable resource in Dynamics 365 and configure its working hours. These working hours will define when the resource can be assigned work.

 

Note: You will not be able to create new bookable resources or work hours templates if you are not a Dynamics 365 admin. If you follow these steps and do NOT see the buttons discussed here, ask your Office 365 admin to help you with this work.

 

To create a bookable resource:

  1. From the Project Power Apps page, select Resources.
  2. Select New.
  3. Add a resource type, user, and name to your bookable resource, then select Save and Close. This will bring you back to the Resources page. Note: To learn more about bookable resources, check out the Set up bookable resources article. In summary, “Resource type” indicates who and what the resource is, “User” indicates who owns the bookable resource, and “Name” is the name of the bookable resource.
  4. Select the name of your newly created resource.
  5. On the top ribbon, select Show Work Hours.

     

  6. Select one of the working days. This should open a new view with options to ‘Edit’ and ‘Delete’ the working hours.
  7. Select ‘Edit’, and then ‘All events in the series’.
  8. From this pane, select your desired working days and working hours.

     

  9. Select ‘Save’ on the Hours pane 
  10. Select Save and Close until you get back to your resource page.

Once you’ve configured a Bookable Resource’s calendar, you can create a work hours template based on this resource. You can do this in one of the following two ways.

To create a work template from the Active Work Hour Templates page:

  1. From the Project Power Apps page, select the Projects menu on the bottom of the left pane, and then select Settings.
  2. On the Project Settings Parameters page, select Calendar Templates.
  3. On the Active Work Hour Templates page, select New.
  4. Name your work template and (optionally) add a description
  5. Use the drop-down menu to select the Template Resource that you would like to base the work hours on.
  6. Select Save and Close. Your new work hours template will display on the Active Work Hour Templates page.

To create a work template from the Resources page:

  1. On the Resources page, select the resource you want to base your work hours on.
  2. Select Save Calendar As, enter a name for the work hours template, and then select Save.
  3. When you’re done changing options, select Save and Close.

 

Applying a calendar to a project

Once you’ve created a work hours template, you can apply that template directly to a project.

To apply a work hours template to a project:

  1. In your project, open the Project Settings pane.
  2. Select the Calendar drop-down menu and select the work hours template you want to apply to the project.

If you do not see the Calendar field in Project details, this means you only have one available work hours template.

Once you have configured the calendar field to the right WHT, your work should only be scheduled during the working hours defined on the template.

 

FAQ

Q: Where can I find out more?

A: Check out our article: Create and apply work calendars in Project for the web

 

Q: Can I configure a resource calendar?

A: You can apply a calendar to a resource from the Resources page in Power Apps. From the Resources page, select the resources that you want to apply a calendar to, and then select Set Calendar. In the Work Template window, select the work template that you want to apply to your resource.

You can learn more about applying a calendar to a resource.

 

Q: Who has access to this functionality?

A: Dynamics 365 admins have the ability to create and apply calendars. If you follow these steps and do not see the mentioned buttons, then you do not have admin access to Project. You should contact your admin for help with calendar issues.

SAP on Azure General Update – July 2020



1. New VM Type Certified & Generally Available – Intel Based Ev4

Azure Edsv4, based on the latest-generation Intel Cascade Lake CPU, is now certified and supported for both Hana and NetWeaver AnyDB. Hana OLTP and OLAP are both certified.

 

The VM types available are:

VM Size              vCPU   RAM (GiB)   SAPS      Hana Certified   NW Certified
Standard_E2ds_v4     2      16          3,142     No               Yes
Standard_E4ds_v4     4      32          6,284     No               Yes
Standard_E8ds_v4     8      64          12,569    No               Yes
Standard_E16ds_v4    16     128         25,138    No               Yes
Standard_E20ds_v4    20     160         31,422    Yes              Yes
Standard_E32ds_v4    32     256         50,275    Yes              Yes
Standard_E48ds_v4    48     384         75,413    Yes              Yes
Standard_E64ds_v4    64     504         100,550   Yes              Yes


Hana is certified for productive use on Eds_v4 VM sizes of E20ds_v4 and larger, in combination with Azure Storage options, under the following conditions:

  1. Premium Disk: /hana/data
  2. UltraSSD Disk: /hana/data or /hana/log
  3. ANF: /hana/data and /hana/log
  4. Any storage type may be used for /hana/shared and/or /usr/sap/SID
  5. It is not supported to use Premium or UltraSSD Disk for /hana/data and ANF for /hana/log

 

More information on Certification can be found in the SAP IaaS Hardware Certification Directory and more information about Hana Storage Configuration on Azure can be found here.

 

All Ev4 VMs are certified and supported for NetWeaver on Azure Premium Storage and Azure UltraSSD Disk. The Azure NetApp Files service will be available for AnyDB later.

For a complete list of certified VM types for SAP applications, review SAP Note 1928533 – SAP Applications on Azure: Supported Products and Azure VM types.

To check the availability of these virtual machine types in nearby Azure regions, see here.

 

Note: As of July 2020, only Edsv4 and Ddsv4 are supported. Esv4, which has no internal SSD disk, is not supported.

2. Confirm VM Availability in Zones Before Deployment

The PowerShell command below can be used to confirm which VM types are available in an Azure region’s zones prior to deploying VMs. It is important to run it in the actual subscription that will host the VMs. After planning the VM types and zones, submit a quota request to begin deploying.

Get-AzComputeResourceSku | where {$_.Locations.Contains("southeastasia") -and $_.LocationInfo[0].Zones -ne $null -and $_.ResourceType.Equals("virtualMachines")}

 


In the above example, the B-series is available in all three zones in Singapore, but Mv2 is available only in zones 1 and 3.
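The filter can also be narrowed to a single VM size. This is a minimal sketch, assuming the Az.Compute module is installed and that Standard_E8ds_v4 is the (hypothetical) size of interest:

```powershell
# Hypothetical example: show the zones in which Standard_E8ds_v4 is
# offered in Southeast Asia for the current subscription
Get-AzComputeResourceSku -Location "southeastasia" |
    Where-Object { $_.ResourceType -eq "virtualMachines" -and $_.Name -eq "Standard_E8ds_v4" } |
    ForEach-Object { $_.LocationInfo[0].Zones }
```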

3. Multi-SID Clustering for RedHat & Suse & a New Fencing Agent for Suse 12.x & 15.1

Multi-SID clustering of up to five ASCS instances on the same two-node cluster is now supported for both Suse and Redhat.

 

High availability for SAP NetWeaver on Azure VMs on SUSE Linux Enterprise Server for SAP applications multi-SID guide

High availability for SAP NetWeaver on Azure VMs on Red Hat Enterprise Linux for SAP applications multi-SID guide

 

It is highly recommended that, prior to installing a Linux multi-SID cluster, a “test lab” installation is performed on some test VMs; great care must be taken to document IP addresses and port numbers.

 

Suse now supports the same STONITH mechanism that is implemented for Redhat: the Suse Pacemaker service is now able to connect to the Azure fabric and restart a hung or fenced VM. This functionality requires an update to the Python/Python3 libraries and the Azure SDK package (python-azure-mgmt-compute on SLES 12; python3-azure-mgmt-compute on SLES 15).

After installing these packages, the Python versions can be verified with the following commands:

python --version
python3 --version
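On SLES 15, for example, the package installation and verification might look like the sketch below (package names are taken from the note above; exact availability depends on your SLES release and registration):

```shell
# Install the Azure SDK compute package required by the Azure fence agent
# (use python-azure-mgmt-compute instead on SLES 12)
sudo zypper install python3-azure-mgmt-compute

# Verify the interpreter version afterwards
python3 --version
```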

4. Recommended Blogs for SAP on Azure Customers & Consultants  

My colleague Anjan Banerjee has developed some very useful blogs based on real customer deployments.  The topics for these blogs are based on questions from consultants.  For example “Can I run Business Objects on SQL Azure DB PaaS?”. 

 

Installation of SAP Netweaver ABAP 7.50 on DB2/UDB 11.1 with HADR (Highly available Database Environment) in Azure Cloud

SAP Single Sign-on : Kerberos/SPNEGO Setup for AS-JAVA

SAP On Azure : High-Availability Setup of SAP Content Server 7.53 with MaxDB in Windows Environment

SAP On Azure : HIGH AVAILIABILITY setup for SAP NETWEAVER with SAP ASE 16 DB on WINDOWS SERVER

SAP on AZURE: HIGH AVAILIABILITY setup for SAP BusinessObjects Business Intelligence 4.2 SP8 with SQL server on Windows

SAP on AZURE: SAP BusinessObjects Business Intelligence Platform Setup with Azure SQL DB (Managed PaaS database)

 

The full list of blogs can be found here.

Another useful blog, from Etienne Dittrich, can be found here.

5. Requirements for SAP on Windows & Linux: OS Boot Disk

Several customer escalations have been traced to inadequate performance of the OS boot disk. Based on these support cases, it is strongly recommended to use Premium Storage for the OS boot disk. The same limitations and constraints that exist for Standard storage, documented in SAP Note 2367194 – Use of Azure Premium SSD Storage for SAP DBMS Instance, also apply to the OS boot disk.

The minimum recommended premium disk for a large, high-performance VM (such as E64v3 or larger) is P15 or higher.

In extreme cases, a small OS boot disk on Standard storage can cause a VM to appear to freeze under high IO load.

6. Azure Storage Updates

Azure Storage options and features continue to expand and improve.  Below is a brief list of features recently released and in preview:

  1. Disk bursting allows certain disk types to temporarily exceed their quota for a specified period of time. Note: the disk-level quota increase does not change the VM-level quota.
  2. The Hana storage guide for Azure has been updated. Key changes include:
    1. LVM stripe size recommendations have changed slightly in response to new performance testing results.
    2. A specific recommendation to ensure /hana/data, /hana/log and /hana/shared are always placed in separate volume groups.
    3. On most Linux OSes, the IO scheduler changes from NOOP to NONE:

https://www.suse.com/support/kb/doc/?id=7024299

https://www.suse.com/support/kb/doc/?id=7024298

  3. Azure Files Premium for Windows SMB 3.x with Active Directory integration – now in preview
  4. Azure Files Premium for NFS 4.1 – now in private preview
  5. Azure Shared Disk – in preview and currently under evaluation for suitability for SAP solutions
  6. sFTPaaS – sFTP as a service is in planning. This feature would eliminate the requirement for VMs to provide sFTP services

A new blog will be released when Azure Files Premium for Windows and Linux is available for SAP customers

7. SQL Server Backup to URL – How to Throttle Backup Throughput   

Modern releases of SQL Server support backing up directly to URL across multiple target blob files.  The throughput to multiple blob files can approach the network throughput quota for a VM type.  When the backup traffic saturates the VM's network quota, the VM may become unresponsive and problems such as AlwaysOn initiating a failover may occur. 

To prevent these issues the backup parameter MAXTRANSFERSIZE can be limited. 

The precise value for MAXTRANSFERSIZE depends on the number of blob files, the VM size and other factors. 

Start by testing MAXTRANSFERSIZE = 3145728 and monitor network throughput and stability.  If the throughput is still too high, halve the MAXTRANSFERSIZE value and retest.    

 

BACKUP DATABASE <DB SID> TO
<URL PATH>
WITH COMPRESSION, BUFFERCOUNT = 4, MAXTRANSFERSIZE = 3145728, BLOCKSIZE = 65536, CHECKSUM, FORMAT, STATS = 5,
     ENCRYPTION ( ALGORITHM = AES_256, SERVER CERTIFICATE = ' + @P_CERT + ' );

 

Additional links are here and here
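The halving approach described above can be sketched as a simple tuning loop (illustrative Python only, not an official tool; the simulated saturation results are assumptions):

```python
# SQL Server's documented bounds for MAXTRANSFERSIZE are 64 KB to 4 MB
MIN_TRANSFER = 65536

def next_transfer_size(current, nic_saturated):
    """Halve MAXTRANSFERSIZE while backup traffic still saturates the VM NIC."""
    if not nic_saturated:
        return current
    return max(current // 2, MIN_TRANSFER)

size = 3145728
# simulated monitoring results: the first two test backups still saturate the NIC
for saturated in (True, True, False):
    size = next_transfer_size(size, saturated)
print(size)  # 786432
```

In practice each "saturated" observation comes from monitoring the VM network counters during a test backup, as described above.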

8. SQL Server 2019 Generally Available for NetWeaver Customers   

SQL Server 2019 is now Generally Available for NetWeaver systems. The minimum supported release for SAP applications is SQL Server 2019 CU3.  It is generally recommended to apply the latest Support Pack and Cumulative Update available.

New features in SQL Server 2019 can be found here

2779625 – Setting up Microsoft SQL Server 2019

2807743 – Release planning for Microsoft SQL Server 2019

2779607 – Configuration Parameters for SQL Server 2019

2922820 – DBSL Support for SQL Server 2019

2656107 – Support for Microsoft ODBC Drivers for SQL Server in SAP NetWeaver

 

Business Objects supports both Windows Server 2019 and SQL Server 2019.  In addition to the SQL Server 2019 IaaS solution, Azure SQL DB PaaS is fully supported on Windows platforms for SAP BO releases 4.2 SP8, 4.3 and higher

9. Update on Support Matrix for SAP on Azure  

In recent months many new features have become available for SAP customers.  The list below is a very brief overview of recommended features and updated documentation

  1. Proximity Placement Groups (PPG) are recommended for every SAP installation
  2. SuSE 15.1 fully supported for HANA & NetWeaver on Azure
  3. SuSE 12 Support Pack 5 fully supported for HANA & NetWeaver on Azure
  4. RHEL 8.1 – in testing.  RHEL 8.0 will not be supported for Hana and will not be supported on Azure
  5. RHEL 7.7 & 7.8 are not certified for Hana yet by SAP
  6. Oracle Linux 7.7 on Mv2 now supported
  7. Recommended stack for Oracle customers – OEL 7.7 + Oracle 19c (19.6) + Grid + ASM.  Oracle 18 is not recommended
  8. Windows 2019 – fully supported for NetWeaver and most standalone SAP components.  Hyper-V support matrix can be found here
  9. SAP ASE documentation update  
  10. DB2 High Availability HADR on Azure – HADR is available for Linux only.  Windows support is not released

10. Update on Azure Site Recovery  

Azure Site Recovery is a very popular feature for SAP customers.  The support matrix for Azure Site Recovery A2A can be found here

Use the "Find" function to search the support matrix – for example, search for "zone" to determine zone support.

ASR Azure to Azure now supports:

  1. Zone to Zone replication
  2. Replication from one Region to a specific Zone in another Region
  3. Generation 2 Images
  4. All OS commonly deployed by SAP customers: Windows 2019, SuSE 15.1, SuSE 12.5, RedHat and OEL 7.7
  5. ADE for Linux and Windows
  6. Proximity Placement Groups

Transcontinental ASR (example: Primary in USA and DR in Europe) – contact Microsoft.  Some customers have asked about "Tertiary ASR", which typically involves Zone-to-Zone ASR within a single region plus additional ASR to another Region.  This is not possible as of July 2020.

 

Additional information on Azure Site Recovery for SAP solutions can be found here:

https://docs.microsoft.com/en-us/azure/site-recovery/site-recovery-sap

https://aka.ms/asr_sap

 

11. SAP on Azure YouTube Channel

The official SAP on Azure YouTube Channel contains a lot of content on Azure Data Factory, OData, PowerBI, Single Sign-On, IoT integration with SAP and other topics.  Suggestions for additional topics for the YouTube channel can be posted in this blog or the YouTube channel comments section.

https://www.youtube.com/c/SAPonAzure/videos

12. SAP on Azure – Customer Success Stories   

Two interesting customer success stories are available.  A large pharmaceutical customer, Zuellig Pharma, moved their entire datacenter to Azure, including a VLDB Suite on Hana system.  This Suite on Hana system runs on an Mv2 416 instance and 13 E32v3 application servers.   The project was executed by T-Systems

A recent Press Release can be found here 

 

Swiss Re has deployed a new S4 system on Azure.  The partner was Cognizant, the SAP on Azure Microsoft Partner of the Year 2020

13. SQL Server AlwaysOn Setup & Configuration    

The setup and configuration of SQL Server AlwaysOn has been fully automated and integrated into SWPM.

The previous versions of scripts such as sap_revlogin should not be used anymore as they may cause inconsistencies and support problems.

For all new installations, migrations, upgrades and homogeneous or heterogeneous system copies SAP Note 1772688 – SQL Server AlwaysOn and SAP applications should be followed. 

Some recommended guidelines:

  1. Do not use sap_revlogin
  2. Use the latest version of SWPM available
  3. A failover from AlwaysOn Primary to Secondary is always required as certain objects can only be created in the active AlwaysOn replica
  4. After running the AlwaysOn Setup procedure in SWPM run DBA Cockpit in SAPGUI and check the “AlwaysOn Setup Check”
  5. Perform a test failover and check the ABAP or Java application server can start normally.  Check trace files (such as dev_w0) if there is any issue
  6. Check SQL Server Agent standard jobs are correctly configured on all AlwaysOn nodes
  7. Ensure Transaction Log and DB backup jobs/procedures are configured such that in the event of a failover (planned or unplanned) the Transaction Log continues to be backed up (to ensure recoverability and avoid Log Full situations)
  8. If the DB is protected with TDE follow the procedures for AlwaysOn + TDE

14. SAP Hana 2.0 Support Pack 5 – Released

SAP has released Hana 2.0 Support Pack 5.  New features can be found here  

2932865 – SAP HANA 2 SPS05 Revision 050.00

 

Hana 2.0 Revision Strategy

2378962 – SAP HANA 2.0 Revision and Maintenance Strategy  

2235581 – SAP HANA: Supported Operating Systems

 

Additional Links & Notes

Azure Monitor for SAP Solutions is now in Public Preview.  It is highly recommended to test this feature: https://azure.microsoft.com/en-us/blog/azure-monitor-for-sap-solutions-is-now-in-preview/

 

Learning Journey for SAP ASE

SAP ASE Learning Journey – Administration & Monitoring

SAP ASE Learning Journey – Installation & Upgrade

A utility to check the latency between Availability Zones  

https://github.com/Azure/SAP-on-Azure-Scripts-and-Utilities/tree/master/AvZone-Latency-Test
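As a rough illustration of what such a utility measures, TCP connect round-trip time can be sampled with a few lines of Python (a sketch only, not the AvZone-Latency-Test tool itself; the loopback target is purely for demonstration):

```python
import socket
import statistics
import time

def tcp_latency_ms(host, port, samples=5):
    """Median TCP connect round-trip time to host:port, in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        times.append((time.perf_counter() - start) * 1000.0)
    return statistics.median(times)

# demonstration against a local listener; to measure cross-zone latency,
# point host/port at a VM deployed in another Availability Zone
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(16)
latency = tcp_latency_ms("127.0.0.1", server.getsockname()[1])
server.close()
print(f"median connect latency: {latency:.3f} ms")
```

Taking the median over several samples reduces the effect of one-off scheduling jitter, which matters when comparing zone pairs whose latencies differ by fractions of a millisecond.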

SQL Server releases, support packs and cumulative updates  

https://techcommunity.microsoft.com/t5/sql-server/bg-p/SQLServer/label-name/SQLReleases

Azure Backup for SAP Hana – support matrix

https://docs.microsoft.com/en-us/azure/backup/sap-hana-backup-support-matrix

https://docs.microsoft.com/en-us/azure/backup/backup-azure-vm-backup-faq

Recommended blog for Hana troubleshooting.  It is highly recommended to load the scripts from SAP Note 1969700 – SQL Statement Collection for SAP HANA onto every Hana DB instance: https://blogs.sap.com/2017/09/04/health-checks-of-hana-system/

Latest news about SAP Kernels  

https://wiki.scn.sap.com/wiki/display/SI/SAP+Kernel%3A+Important+News

 

A very useful PowerShell script for parsing SAP NetWeaver trace files is below.  Thanks to Ashley Zebrowski for providing this.   PowerShell is available for Windows and Linux: https://docs.microsoft.com/en-us/powershell/azure/install-az-ps?view=azps-4.3.0

 

while ($true) {
    # collect lines around the search hits in the dev_w* trace files, keeping only timestamp lines
    $s = Get-Content .\dev_w* | sls '<search string goes here>' -Context 2 | Out-String -Stream | sls '2020'
    $a = @()
    foreach ($line in $s) {
        $date = [datetime]::ParseExact(($line -replace '  C ', ''), 'ddd MMM dd HH:mm:ss:fff yyyy', $null)
        $a += $date
    }
    Clear-Host
    "Most recent connection drops:"
    $a | sort -Unique -Descending | select -First 20 | %{ $_.ToString('yyyy/MM/dd HH:mm:ss.fff K') }
    Start-Sleep -Seconds 5   # avoid re-reading the trace files in a tight loop
}
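For teams that prefer Python, the same trace timestamp format can be parsed with a short sketch (the format string is taken from the PowerShell above; the sample line is hypothetical):

```python
from datetime import datetime

MONTHS = {name: i for i, name in enumerate(
    "Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec".split(), 1)}

def parse_trace_timestamp(text):
    """Parse a dev_w* timestamp such as 'Sat Jul 18 07:58:12:345 2020'
    ('ddd MMM dd HH:mm:ss:fff yyyy' - note milliseconds follow a colon)."""
    _, month, day, clock, year = text.split()
    hour, minute, second, millis = clock.split(":")
    return datetime(int(year), MONTHS[month], int(day),
                    int(hour), int(minute), int(second), int(millis) * 1000)

print(parse_trace_timestamp("Sat Jul 18 07:58:12:345 2020"))
# 2020-07-18 07:58:12.345000
```

The month lookup table avoids locale-sensitive `%b` parsing, which can misbehave on servers configured with a non-English locale.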

 

Microsoft has released a new RDP client that is highly recommended https://docs.microsoft.com/en-us/azure/virtual-machines/workloads/sap/hana-vm-operations-storage