When designing an application or database, a very useful SQL Server feature can easily slip through the net and be forgotten: partitioning.
Using SQL Server partitioning within a large database can help resolve some basic problems such as:
- Maintenance: indexes can be reorganised online at the partition level, and from SQL Server 2014 a single partition can also be rebuilt online
- Partition Elimination: a query can read only the partitions it needs rather than touching, and locking, the entire table
- Backup and Recovery: older archive partitions can be placed on Read Only filegroups, so only the active ReadWrite partitions need to be backed up, reducing backup time, file size and recovery time
- Partition Switching: small or large volumes of data can be moved or removed within the same database in seconds (a basic setup and switch are sketched below)
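To give a flavour of what that looks like, here is a minimal sketch of a monthly-partitioned table and a partition switch into an archive table. The table, column, filegroup and boundary values are all hypothetical, chosen for illustration rather than taken from the client's schema.

```sql
-- Hypothetical monthly partitioning setup (names and dates are illustrative).
-- RANGE RIGHT means each boundary date starts a new partition.
CREATE PARTITION FUNCTION pfMonthly (date)
AS RANGE RIGHT FOR VALUES ('2014-01-01', '2014-02-01', '2014-03-01');

-- Everything mapped to PRIMARY to keep the sketch short; in practice each
-- partition (or group of partitions) would sit on its own filegroup.
CREATE PARTITION SCHEME psMonthly
AS PARTITION pfMonthly ALL TO ([PRIMARY]);

CREATE TABLE dbo.JourneyLog
(
    JourneyLogId bigint IDENTITY(1,1) NOT NULL,
    JourneyDate  date          NOT NULL,   -- partitioning column
    VehicleId    int           NOT NULL,
    Payload      nvarchar(400) NULL,
    CONSTRAINT PK_JourneyLog
        PRIMARY KEY CLUSTERED (JourneyDate, JourneyLogId)
) ON psMonthly (JourneyDate);

-- Partition switching: move January (partition 2 under RANGE RIGHT) into an
-- empty archive table with the same structure on the same filegroup.
-- This is a metadata-only operation, so it completes in seconds.
CREATE TABLE dbo.JourneyLog_Archive
(
    JourneyLogId bigint        NOT NULL,
    JourneyDate  date          NOT NULL,
    VehicleId    int           NOT NULL,
    Payload      nvarchar(400) NULL,
    CONSTRAINT PK_JourneyLog_Archive
        PRIMARY KEY CLUSTERED (JourneyDate, JourneyLogId)
) ON [PRIMARY];

ALTER TABLE dbo.JourneyLog
    SWITCH PARTITION 2 TO dbo.JourneyLog_Archive;
```

In a real deployment each partition (or group of partitions) would normally live on its own filegroup, which is also what makes it possible to mark older filegroups Read Only and back up only the active ReadWrite data.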
The Problem
I recently did some work for a transport company that was developing an in-house application made up of several .NET services, all sharing and accessing the same data in a single large database.
The database in use at the time had no partitioning in place. It contained about 70 tables, some of which received millions of rows per day, used foreign key relationships on most tables, and stored about 4 TB of data per year.
The in-house .NET services were struggling to retrieve the required data in an acceptable time frame. They were also suffering from continuous deadlocks, with each service fighting to select, insert or update data in the same tables while SQL Server escalated locks across the related tables to guarantee integrity. The development team, under a lot of pressure and with little to no time available because of other tasks and defects, still needed to resolve the performance issues.
From a DBA perspective, maintenance was also a concern. With no partitioning, entire indexes had to be rebuilt or reorganised, and with some tables taking about 10 million records per day, doing that on a regular basis would slowly start to impact the database.
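For contrast, this is roughly what partition-level maintenance looks like, reusing the hypothetical table and index names from the sketch above:

```sql
-- Partition-level maintenance on the hypothetical table above.
-- Reorganising a single partition is always an online operation.
ALTER INDEX PK_JourneyLog ON dbo.JourneyLog
    REORGANIZE PARTITION = 2;

-- Rebuilding a single partition online requires SQL Server 2014 or later;
-- on earlier versions a partition-level rebuild has to run offline.
ALTER INDEX PK_JourneyLog ON dbo.JourneyLog
    REBUILD PARTITION = 2 WITH (ONLINE = ON);
```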
To summarise:
- No partitioning
- Non-stop deadlocks
- Limited developer resources to implement business-logic validation as an alternative to performing data checks with foreign keys alone
- Limited time to provide and implement a solution
- Slow response times for queries
The Solution
The only solution available at the time that could be managed by the DBAs was to replace all the foreign keys with triggers that validate the data where needed.
Before I continue (as I can hear most DBAs shouting), the trigger solution was only ever a temporary one, chosen because of the limited resources and time available. Several tests were performed with the triggers, using table variables, temp tables and additional indexes to make sure we got the best performance from each one. We also compared other options such as snapshot isolation, which still caused deadlocks and increased tempdb usage.
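To make the approach concrete, here is a sketch of what one of those validation triggers might look like, standing in for a foreign key. The table and column names are hypothetical; the real triggers also tuned locking hints and used temp tables or table variables where that tested faster.

```sql
-- Hypothetical validation trigger standing in for a foreign key from
-- JourneyLog.VehicleId to a parent Vehicle table (names are illustrative).
CREATE TRIGGER dbo.trJourneyLog_ValidateVehicle
ON dbo.JourneyLog
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Reject the whole statement if any new or changed row points at a
    -- VehicleId that does not exist in the parent table.
    IF EXISTS
    (
        SELECT 1
        FROM inserted AS i
        WHERE NOT EXISTS
        (
            SELECT 1
            FROM dbo.Vehicle AS v
            WHERE v.VehicleId = i.VehicleId
        )
    )
    BEGIN
        ROLLBACK TRANSACTION;
        THROW 50001, 'JourneyLog references a VehicleId that does not exist.', 1;
    END;
END;
```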
The trigger solution allowed:
- Partitioning to be implemented quickly and easily
- Queries to be tuned to use partition elimination in stored procedures (see the sketch after this list)
- Old data to be archived easily using partition switching, for both performance and recovery
- The locks taken to validate data to be controlled within the trigger
- Query performance to improve dramatically
- Backup and recovery times to be reduced
- Backup file sizes to be reduced
- Maintenance to be performed at the partition level
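As a sketch of partition elimination inside a stored procedure (hypothetical names again): filtering on the partitioning column lets SQL Server read, and lock, only the partitions that hold the requested range rather than the whole table.

```sql
-- Hypothetical stored procedure relying on partition elimination: the WHERE
-- clause filters on the partitioning column, so only the matching partition
-- of dbo.JourneyLog is read (and locked) rather than the whole table.
CREATE PROCEDURE dbo.usp_GetJourneysForDay
    @Day date
AS
BEGIN
    SET NOCOUNT ON;

    SELECT JourneyLogId, JourneyDate, VehicleId, Payload
    FROM dbo.JourneyLog
    WHERE JourneyDate = @Day;

    -- For checking: $PARTITION shows which partition a given value maps to.
    -- SELECT $PARTITION.pfMonthly(@Day) AS PartitionNumber;
END;
```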
Since the solution was implemented, the non-stop deadlocks that were occurring every couple of minutes have gone, and a process that used to take hours now completes within an hour. The one or two deadlocks that remain are down to the application design, not the database.
The Future
The next step is to redesign the application and database so that the triggers can be gradually removed, and the application changed to perform some validation in the business layer instead of relying only on foreign keys.
Although the triggers have proven very useful in this situation, they are only a temporary measure, and a proper validation process should be put in place for this application.