Ransomware attacks are becoming more common and more damaging. In this blog, our colleague Thomas Brinkman discusses three best practices for reducing data loss with a good database strategy.
Limited data loss thanks to backup
If you fall victim to a ransomware attack, you have to pay a substantial ransom to have your hostage data released. Even then, you may have to wait a long time before your data is actually decrypted. And that is often not the end of it: after a ransomware attack you will probably want to clean up your infrastructure and redeploy the database, even if you choose not to pay for your hostage data. The first step is then to fall back on the backups you have made. Depending on your backup strategy, restoring from a backup may limit your data loss to an acceptable level. But what if both the database and the backups are encrypted? Then the backups are no longer usable.
The whole chain under attack
Let me outline a scenario. Imagine a computer connected to a NAS system, a storage device attached to the network (Network-Attached Storage). The NAS system is automatically mounted within the operating system via its IP address and login credentials. This automatic link ensures that a backup can always be written in the background, without human intervention. Within the NAS system it is possible to write a backup onward to, for example, another NAS system or to a storage location in the cloud. In the event of a ransomware attack, there is a good chance that the ransomware will install itself on the computer through a user action and thus, unintentionally, take the entire chain of backups down with it.
How can you avoid this scenario?
Below, I describe three best practices that help prevent your database, including its backups, from being held hostage.
1 Data immutability
Data immutability (immutable data) is perhaps the best strategy against unintentional data loss due to, for example, ransomware or human error. Object storage interfaces, such as S3 Object Storage, offer the option of turning on the Object Lock feature. With this feature enabled, you can set immutability parameters for individual objects stored on the medium. This makes the data completely immune to modification or deletion by anyone, even the service account or management account of the organization from which the backup task is started. For an extra layer of security, it is recommended to store the backups encrypted. Image source: cloudian.com
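To make this concrete, here is a minimal Python sketch of how an individual backup object can be given an Object Lock retention period via the S3 API. The bucket name and object key are hypothetical, and the actual upload call (commented out) assumes the boto3 library; the helper only builds the lock parameters.

```python
from datetime import datetime, timedelta, timezone

def object_lock_params(key: str, retain_days: int) -> dict:
    """Build the extra put_object parameters that request an
    Object Lock retention on an individual object."""
    retain_until = datetime.now(timezone.utc) + timedelta(days=retain_days)
    return {
        "Key": key,
        # COMPLIANCE mode: nobody, not even the root or management
        # account, can shorten the retention or delete the object
        # before the retain-until date has passed.
        "ObjectLockMode": "COMPLIANCE",
        "ObjectLockRetainUntilDate": retain_until,
    }

# Hypothetical key for an encrypted database dump:
params = object_lock_params("backups/db-dump.enc", retain_days=30)

# With boto3 the upload would then look roughly like:
# s3.put_object(Bucket="backup-bucket", Body=data, **params)
```

Note that Object Lock must be enabled on the bucket itself when it is created; the per-object parameters above only work on such a bucket.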
2 An isolated backup
You can opt for a separate server that can reach the DBMS software but exists only to create backups and write them to an external storage medium. That medium can be either on-premises or in a private or public cloud. Because the server is isolated from the rest of the network as much as possible, you keep the attack surface as small as possible. In addition, instead of having the DBMS software push the data to the backup target, it is advisable to choose a pull mechanism in which the separate backup server fetches the data itself. With a push mechanism, an attack on the production environment puts the files on both the production machine and the backup machine at risk, because the production machine holds the credentials for the backup target. And because the production environment runs many services, it is more likely to be compromised than the backup environment. So the pull mechanism is more secure.
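The direction of the pull mechanism can be sketched as follows. This is a simplified illustration: the local file copy stands in for what would in practice be, for example, an SSH/rsync fetch or a DBMS-native backup command initiated from the backup server. The function names and paths are hypothetical.

```python
import shutil
from pathlib import Path

def pull_backup(production_dump: Path, backup_dir: Path) -> Path:
    """Runs ON the isolated backup server: it reaches out to the
    production side and fetches the dump. The production machine
    never initiates the transfer and never holds credentials for
    the backup target, so ransomware on the production side cannot
    follow the chain into the backups."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    destination = backup_dir / production_dump.name
    # Stand-in for the actual network fetch (rsync over SSH,
    # a streaming dump, etc.):
    shutil.copy2(production_dump, destination)
    return destination
```

The key design point is not the copy itself but where the code runs and where the credentials live: only the backup server can reach the storage medium, never the other way around.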
3 The 3-2-1 backup plan
Within the IT world, the 3-2-1 backup plan is a respected strategy: keep three copies of your data (your production data and two backups), on two different media (for example disk and tape), with one copy off-site for disaster recovery. The rise of the cloud, however, has made this strategy outdated. Nowadays, a 3-2-2 backup plan is recommended: three copies of your data, stored on two different devices, with two copies off-site, one at an external location and one in the cloud. Image source: https://www.backupvergelijker.nl/3-2-1-backup-regel/
If the two devices on which you store local copies are in the same building, they are often both affected if a disaster occurs. By continuously replicating your data to the cloud, the amount of data at risk is minimized. In addition, by replicating your data offsite as well as to the cloud, you are at less risk if you are hit by a natural disaster, such as a hurricane or fire, or by a cyber attack.
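The counting rules above are easy to get wrong when a backup plan grows organically, so it can help to check them mechanically. The sketch below is a minimal, assumption-laden encoding of the 3-2-2 rule; the location and device labels are hypothetical categories, not part of any standard.

```python
from dataclasses import dataclass

@dataclass
class Copy:
    location: str  # e.g. "on-site", "off-site", "cloud"
    device: str    # e.g. "disk", "tape", "object-storage"

def satisfies_3_2_2(copies: list[Copy]) -> bool:
    """Check the 3-2-2 rule: at least three copies, on at least
    two different device types, with at least two copies off-site,
    one of which is in the cloud."""
    enough_copies = len(copies) >= 3
    enough_devices = len({c.device for c in copies}) >= 2
    offsite = [c for c in copies if c.location in ("off-site", "cloud")]
    enough_offsite = (len(offsite) >= 2
                      and any(c.location == "cloud" for c in offsite))
    return enough_copies and enough_devices and enough_offsite

plan = [
    Copy("on-site", "disk"),          # production data
    Copy("off-site", "tape"),         # external location
    Copy("cloud", "object-storage"),  # cloud replica
]
```

Running `satisfies_3_2_2(plan)` on the example plan returns `True`; dropping the cloud copy would make it fail the off-site requirement.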
Want to know more?
Want to know which backup strategy is best for your organization? Then feel free to contact us.