The purpose of this article is to highlight a backup solution that can be designed for a HANA database on AWS without using third-party tools.
NOTE: Backup/recovery solutions are generally designed based on business RTO/RPO. This is just one of many backup/recovery solutions that can be used for a HANA database.
You have installed the HANA database on AWS, and now you want to design a backup and recovery strategy. Below is an outline of a solution that can be used to achieve backup/recovery in AWS.
HANA Backup on Local EBS Volume:
Perform or schedule complete and delta (incremental and differential) data backups, along with log backups, to a local EBS volume (/backup). You can use Throughput Optimized HDD (st1) for the /backup file system.
NOTE: A backup on a local EBS volume (Throughput Optimized HDD, st1) has a throughput of 165 MB/s, but you can improve this by striping the local backup volumes, or by using SSD volume types with striping.
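As a sketch, striping could be set up with LVM across several st1 volumes. The device names, stripe count, and volume group name below are assumptions for illustration; check your actual devices with lsblk.

```shell
# Stripe four st1 EBS volumes into one logical volume for /backup.
# Device names are illustrative -- verify yours with lsblk first.
pvcreate /dev/nvme1n1 /dev/nvme2n1 /dev/nvme3n1 /dev/nvme4n1
vgcreate vg_backup /dev/nvme1n1 /dev/nvme2n1 /dev/nvme3n1 /dev/nvme4n1

# -i 4: stripe across all four physical volumes; -I 256: 256 KiB stripe size
lvcreate -n lv_backup -i 4 -I 256 -l 100%FREE vg_backup

mkfs.xfs /dev/vg_backup/lv_backup
mount /dev/vg_backup/lv_backup /backup
```

With four st1 volumes striped this way, sequential backup throughput scales roughly with the number of volumes, at the cost of managing more devices.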
You can schedule HANA database backups using any of the options below.
Option 1: Use transaction DB13, selecting the database connection you created in DBCO, to configure the jobs.
Option 2: Use SAP HANA cockpit. With the latest cockpit versions, SAP has added the ability to schedule backup jobs and enable a retention policy, which makes the cockpit a central tool for backups.
Option 3: Use custom scripts to implement the backup strategy and schedule them as cron jobs.
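Option 3 might look like the following crontab fragment for the `<sid>adm` user, driving hdbsql through a secure user store key. The key name HANABACKUP, the SID HDB, the instance path, and the schedule are all assumptions; adjust them to your landscape and RPO.

```shell
# Crontab fragment (crontab -e as <sid>adm). HANABACKUP is an assumed
# hdbuserstore key pointing at SYSTEMDB with backup privileges.

# Complete data backup of the tenant DB every Sunday at 01:00
0 1 * * 0 /usr/sap/HDB/HDB00/exe/hdbsql -U HANABACKUP "BACKUP DATA FOR HDB USING FILE ('/backup/data/DB_HDB/COMPLETE')"

# Incremental data backup daily at 01:00, Monday through Saturday
0 1 * * 1-6 /usr/sap/HDB/HDB00/exe/hdbsql -U HANABACKUP "BACKUP DATA INCREMENTAL FOR HDB USING FILE ('/backup/data/DB_HDB/INCR')"
```

Log backups do not need a cron entry: HANA writes them automatically at the interval configured via the log_backup_timeout_s parameter.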
Schedule Incron Job: Instead of a time-based cron job that moves files from the local file system to S3, you can use an event-driven incron job, which sets real-time inotify kernel watches on the backup directories and executes a given task (the S3 upload) as soon as a file appears.
In the figure above, data and log backups are written to /backup (the local file system). When a log backup is generated in /backup/log/SYSTEMDB, it triggers an event whose action, defined in the incron table, moves the file from /backup/log/SYSTEMDB/ to the S3 bucket.
The advantage of using an incron job is that your backups stay in sync with S3 at all times, so you retain a copy of each backup even if the /backup file system is corrupted or deleted for some reason.
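As a sketch, an incrontab entry for the SYSTEMDB log backup directory could look like the line below (the bucket name is an assumption). In incron's table syntax, $@ expands to the watched directory and $# to the file name that triggered the event.

```shell
# incrontab -e entry: one line per watched directory.
# IN_CLOSE_WRITE fires only once HANA has finished writing the backup file.
/backup/log/SYSTEMDB IN_CLOSE_WRITE aws s3 cp $@/$# s3://my-hana-backups/log/SYSTEMDB/$#
```

You would add a similar line for each data and log backup directory (SYSTEMDB and every tenant DB), since incron watches are per directory and not recursive.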
Adjust S3 lifecycle: Once the files are in S3, you can adjust their storage class based on business requirements.
There is no need to keep log backups in S3 after a certain time, so it is advisable to delete them directly, whereas data backups can be moved to less expensive storage after a certain number of days and then deleted after X days, based on the retention policy defined in your organization.
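A lifecycle configuration implementing this could be applied with the AWS CLI as sketched below. The bucket name, prefixes, and day counts are assumptions; align them with your retention policy.

```shell
# Hypothetical lifecycle rules: expire log backups quickly, tier data
# backups to Glacier before expiring them. Numbers are illustrative.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "expire-log-backups",
      "Filter": { "Prefix": "log/" },
      "Status": "Enabled",
      "Expiration": { "Days": 30 }
    },
    {
      "ID": "tier-then-expire-data-backups",
      "Filter": { "Prefix": "data/" },
      "Status": "Enabled",
      "Transitions": [ { "Days": 30, "StorageClass": "GLACIER" } ],
      "Expiration": { "Days": 365 }
    }
  ]
}
EOF

aws s3api put-bucket-lifecycle-configuration \
  --bucket my-hana-backups --lifecycle-configuration file://lifecycle.json
```

This keeps recent data backups in Standard storage for fast restores, while older ones age into cheaper storage automatically.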
Even though SAP HANA is an in-memory database (data resides in memory), it has its own persistence layer. Data is regularly saved from memory to disk at savepoints, and changes are written to redo log files; in case of a failure, these can be used for recovery.
You can gather the backup frequency and retention policy in a form like the one below to get an understanding of what you are implementing.
|Backup|Schedule|Path on Local File System|Retention Period – /backup|Retention Period – S3|
|---|---|---|---|---|
|Data Backup|Incremental Data Backup – Daily; Complete Data Backup – Weekly|System DB: /backup/data/SYSTEMDB; Tenant DB: /backup/data/DB_(SID)|7 or 15 days|Depends on business|
|Log Backup|Automatic – 15 minutes|System DB: /backup/log/SYSTEMDB; Tenant DB: /backup/log/DB_(SID)|7 or 15 days|Depends on business*|

*There is no use in keeping log backups on S3 for long.
Local Backup File Retention
As described in the “Solution Implementation” section, configure a retention policy to delete local backup files older than X days. There are three ways to configure retention for local backup files, each with its pros and cons, so apply the one best suited to your landscape.
HANA Cockpit: As highlighted in “Solution Implementation”, with the new version of the cockpit you can schedule a retention policy. It also handles housekeeping of the backup catalog and deletion of the corresponding local backup files.
HANACleaner: Configure HANA cleanup with SAP HANACleaner, an expert tool designed by SAP Support. With this script you can perform housekeeping of the backup catalog and backups, as well as several other cleanup tasks such as removing trace files, audit logs, etc. Please refer to SAP Note 2399996 for details.
Custom Script: HANA Cockpit and HANACleaner do not check whether a backup has been moved to S3; they delete local backup files regardless. If, for example, the incrontab service is not running and backups are not syncing to S3, that deletion could remove your only copy. To cover this case, you can develop a custom script that syncs the local backup folder with the S3 bucket before deleting local files.
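A minimal sketch of such a script is shown below, assuming the /backup layout described above and a bucket named my-hana-backups (both assumptions). The AWS command is taken from a variable so the upload step can be stubbed out for testing or dry runs.

```shell
#!/bin/bash
# Sketch: sync local backups to S3, then prune local files older than the
# retention window. Bucket name and defaults are assumptions.
AWS_CMD="${AWS_CMD:-aws}"                  # overridable, e.g. for dry runs
BUCKET="${BUCKET:-s3://my-hana-backups}"
RETENTION_DAYS="${RETENTION_DAYS:-15}"

sync_and_prune() {
    local dir="$1"
    # Push anything not yet uploaded (covers gaps if incron was down);
    # strip the /backup prefix so the S3 layout mirrors the local one.
    $AWS_CMD s3 sync "$dir" "$BUCKET${dir#/backup}" || return 1
    # Only after a successful sync, delete files past the retention window.
    find "$dir" -type f -mtime +"$RETENTION_DAYS" -delete
}

# Example: sync_and_prune /backup/data/SYSTEMDB
```

Because the find step runs only when the sync succeeds, a broken S3 upload leaves the local copies in place instead of silently losing them.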
Note: This backup/recovery solution can be used in other clouds as well as on premises.