T-SQL Tuesday #87: Data Theft is Red, Backup Encryption is Blue…


T-SQL Tuesday #87: Fixing Old Problems with Shiny New Toys

This month’s T-SQL Tuesday, hosted by Matt Gordon (b/t), is about using new features in SQL Server (from 2014 onwards) to fix old problems.

I’ll throw my hands up right now: I’m not a security guru. Never have been, likely never will be. That said, I’m still cautious about what permissions are handed out (I’ve had to fight against being given Domain Admin rights before), and about how data is stored. I have to be – ensuring the safety of that data is part of my job.

Not all threats you’ll encounter are external; in fact, some of the greater threats are probably there right now, right beside you. Disgruntled employees, BYOD without proper policies or implementation, someone looking to make a quick quid or two, or even just carelessness – data being restored here, there and everywhere. I’m not trying to sound paranoid, but as much as we don’t like to, these are things that need to be considered. However you want to look at it, security is a problem, and it’s a big one.

For those of us only able to gaze wistfully at the gold-leafed Valentine’s roses of Enterprise Edition, TDE is not an option, so we need to look at other alternatives, which usually means muddling around with a whole host of third-party applications. Then in came SQL Server 2014 with a little extra piece of the puzzle – native backup encryption.

There was an all-in-one application, a complete Practice Management System (PMS) that dealt with case data, accounts and employee records. The SQL Server itself was locked down pretty well – there was no RDP, and access to the drives themselves was locked down tight. All, that is, except for the drive holding the backups. The server had been set up to perform full backups daily at 2AM, and then transaction log backups every 10 minutes. At around 4AM another server would spin up, connect, copy all the backups to its encrypted volume, clean up “expired” backups, then power down. Backups were then left on the SQL Server for a restore job that ran around 10AM, before being cleared down.

Management was happy with the setup, so nothing more was thought of it. Requests to set up backup encryption were met with a chorus of “Well, it works as it is…” That was, until the emails started.

There were several tables with triggers against them, the idea being that as new cases with certain conditions were entered into the system, emails highlighting things such as potential risk would fire off. Team leaders, doing as they were meant to do, went off in search of these cases within the PMS so they could be continued. Except they didn’t exist. This probably happened a few times, with them putting it down to the case having been removed, before they raised it further up the chain. There were no audit logs showing these cases being created in the first place, let alone deleted. So up and up the chain it went, until finally we were involved.

Traces, shadowing the user that seemed to be raising these alerts – nothing; these phantom cases just didn’t exist. If these cases weren’t being entered into, or deleted from, this server, and there was no trace of the mails being queued within dbmail, then the emails couldn’t have been coming from this server. So we went back and searched the email headers for an IP address… bingo!

Prior to the 10AM job, the backups were being copied in an ad-hoc fashion to another server and restored there. Normally we have a whole process for restoring to another server, which strips out any sensitive information and sanitises the database, but these copies were bypassing that process entirely. This was probably not the only copy, either.

Now that we knew what was happening, we were able to get the copies we knew of sanitised. Fortunately there was no malicious intent, but it allowed us to further the case for using backup encryption. Un-sanitised data would always be backed up with encryption, able to be restored only to a certain set of servers where the sanitisation process would run. Eventually backups were copied to a locked-down (permissions-wise) network share instead of the convoluted process of copying to the encrypted-volume server. Management was satisfied, team leaders were satisfied, and another gremlin was binned.
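For anyone curious what that actually looks like, here’s a rough sketch of the backup side of it. The database name, path and certificate name below are placeholders rather than anything we actually used:

-- Assumes a database master key and a certificate named BackupCert
-- already exist on the instance (setup sketched at the end of the post).
BACKUP DATABASE [PracticeManagement]
TO DISK = N'D:\Backups\PracticeManagement.bak'
WITH ENCRYPTION (
         ALGORITHM = AES_256,
         SERVER CERTIFICATE = BackupCert
     );

Trying to restore that backup on any other server fails unless the certificate (and its private key) has been restored there first, which is exactly how restores ended up limited to the approved sanitisation servers.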

I still had to gaze wistfully at the gold-leafed Valentine’s rose…

Setting up backup encryption:
https://blogs.msdn.microsoft.com/mvpawardprogram/2014/06/02/sql-server-2014-backup-encryption/
http://www.sqlservercentral.com/blogs/mssqlfun/2014/05/19/database-backup-encryption-with-sql-server-2014/
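If you just want the short version, the one-time setup is roughly the following. The certificate name, file paths and passwords are placeholders – pick your own, and store them somewhere safe:

USE master;
GO

-- A database master key in master, protected by a password
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'UseAProperPasswordHere!1';
GO

-- The certificate the backups will be encrypted with
CREATE CERTIFICATE BackupCert
    WITH SUBJECT = 'Backup encryption certificate';
GO

-- Back up the certificate and its private key; without these files
-- the encrypted backups cannot be restored anywhere, including on a
-- rebuilt copy of the original server.
BACKUP CERTIFICATE BackupCert
    TO FILE = 'D:\Certs\BackupCert.cer'
    WITH PRIVATE KEY (
        FILE = 'D:\Certs\BackupCert.pvk',
        ENCRYPTION BY PASSWORD = 'AnotherProperPasswordHere!1'
    );

The certificate then only gets restored (CREATE CERTIFICATE … FROM FILE … WITH PRIVATE KEY) on the servers that are allowed to restore those backups.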
