By: Ben Snaidero | Updated: 2016-03-03
Problem
SQL Server 2016 introduced a new feature called Row-Level Security (RLS), which gives you the ability to provide fine-grained access control to the rows in a table based on a function that you define. While I understand the importance of securing data, I still want the best possible performance while keeping that data secure. This tip looks at the performance overhead involved in implementing this new feature.
Solution
In this tip we will look strictly at the performance overhead of the new Row-Level Security feature and not go into too much detail on the feature itself. If you're not familiar with this feature, or would like a more detailed explanation of how to implement it, have a look at the following tips, which provide a good explanation of these topics.
- SQL Server 2016 Row Level Security Example
- Implement Row Level Security Natively with SQL Server 2016
Test Setup
The first thing we need for our test is a table with some sample data to query. Since most use cases for this feature involve restricting the data a specific user is allowed to see, let's create a sample data table along with a user table linked to it by a foreign key, which we will use later when we create our security predicate function. The following T-SQL creates these tables and loads a fairly large amount of sample data into them for our performance test.
-- Create SampleUser table
CREATE TABLE [dbo].[SampleUser](
   [UserID] [int] NOT NULL PRIMARY KEY,
   [Username] [varchar](30)
   )
GO

-- Create SampleData table and foreign key to SampleUser table
CREATE TABLE [dbo].[SampleData](
   [RowKey] [int] NOT NULL PRIMARY KEY,
   [CreateDate] [datetime] NOT NULL,
   [OtherDate] [datetime] NOT NULL,
   [VarcharColumn] [varchar](20) NULL,
   [IntColumn] [int] NULL,
   [FloatColumn] [float] NULL,
   [UserID] [int] NOT NULL
   )
GO

ALTER TABLE [dbo].[SampleData] WITH CHECK
   ADD CONSTRAINT [FK_SampleData_SampleUser] FOREIGN KEY([UserID])
   REFERENCES [dbo].[SampleUser] ([UserID])
GO

-- Load SampleUser table
DECLARE @val INT
SELECT @val=1
WHILE @val < 51
BEGIN
   INSERT INTO SampleUser VALUES (@val,'User' + cast(@val AS VARCHAR))
   SELECT @val=@val+1
END
GO

-- Load SampleData table
DECLARE @val INT
SELECT @val=1
WHILE @val < 5000000
BEGIN
   INSERT INTO SampleData VALUES
      (@val,
       getdate(),
       DATEADD(DAY, ABS(CHECKSUM(NEWID()) % 365),'2015-01-01'),
       'TEST' + cast(round(rand()*100,0) AS VARCHAR),
       round(rand()*100000,0),
       round(rand()*10000,2),
       round(rand()*49,0)+1)
   SELECT @val=@val+1
END
GO
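As a side note, the row-by-row loop above can take quite a while to load five million rows. If you would rather not wait, a set-based alternative such as the sketch below (my own variation, not part of the original scripts) produces an equivalent data set. It assumes the tables created above already exist and are empty.

-- Optional set-based load for SampleData (a sketch; assumes the empty tables from above)
;WITH N AS (
   -- generate 4,999,999 rows by cross joining a large catalog view with itself
   SELECT TOP (4999999) 1 AS x
   FROM sys.all_objects a CROSS JOIN sys.all_objects b CROSS JOIN sys.all_objects c
)
INSERT INTO dbo.SampleData (RowKey, CreateDate, OtherDate, VarcharColumn, IntColumn, FloatColumn, UserID)
SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),             -- unique RowKey 1..4,999,999
       getdate(),
       DATEADD(DAY, ABS(CHECKSUM(NEWID()) % 365), '2015-01-01'),
       'TEST' + cast(ABS(CHECKSUM(NEWID())) % 101 AS VARCHAR),
       ABS(CHECKSUM(NEWID())) % 100001,
       round(rand(CHECKSUM(NEWID())) * 10000, 2),
       (ABS(CHECKSUM(NEWID())) % 50) + 1                       -- UserID between 1 and 50
FROM N
GO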
To make this test as realistic as possible, we should also create a couple of indexes on columns that would typically be indexed in a real production system. With no indexes on the tables, every query would simply perform a full table/index scan and we would not see any difference in execution times. Here is the T-SQL to create these indexes.
-- create indexes
CREATE NONCLUSTERED INDEX [IX_SampleData_CreateDate] ON [dbo].[SampleData] ([CreateDate] ASC)
GO
CREATE NONCLUSTERED INDEX [IX_SampleData_UserID] ON [dbo].[SampleData] ([UserID] ASC)
GO
CREATE NONCLUSTERED INDEX [IX_SampleUser_Username] ON [dbo].[SampleUser] ([Username] ASC)
GO
We will also need to create two different users to run our test queries in order to cover both branches of our security predicate function. One user will have sysadmin privileges so they can see all of the data; the other will have only read access to the tables in our test database. This will make more sense when you see the security predicate function a little further down. Here is the T-SQL to create these users.
-- sysadmin user
CREATE LOGIN [test_sa] WITH PASSWORD=N'#######',
   DEFAULT_DATABASE=[master], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
GO
ALTER SERVER ROLE [sysadmin] ADD MEMBER [test_sa]
GO

-- regular user
CREATE LOGIN [User10] WITH PASSWORD=N'#######',
   DEFAULT_DATABASE=[master], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
GO
USE [TestDB]
GO
CREATE USER [User10] FOR LOGIN [User10]
GO
ALTER ROLE [db_datareader] ADD MEMBER [User10]
GO
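If you want to double-check the two accounts before running anything, a quick sanity check like the one below (my own addition, assuming the login and user names used above) confirms the role memberships.

-- Confirm test_sa is a sysadmin (returns 1)
SELECT IS_SRVROLEMEMBER('sysadmin', 'test_sa') AS test_sa_is_sysadmin
GO
-- Confirm User10 is a member of db_datareader in the test database (returns 1)
USE [TestDB]
GO
SELECT IS_ROLEMEMBER('db_datareader', 'User10') AS user10_is_datareader
GO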
Baseline Test
Now that all the setup is complete, let's first run some queries using each test account to get baseline performance numbers. To keep the result set of each query the same with and without the security policy in place, the baseline queries that run under the User10 account need an extra join to the SampleUser table so the same per-user filter is applied. Below is the T-SQL for these baseline queries. Also make sure that you start a SQL Profiler trace before running them so you can capture the performance statistics.
-- run using test_sa account
-- query 1
SELECT *
FROM SampleData d
   INNER JOIN SampleUser u on d.UserID=u.UserID
WHERE u.Username = 'User10'
GO
-- query 2
SELECT *
FROM SampleData d
WHERE CreateDate > '2016-02-03 19:59:47.560'
GO

-- run using User10 account
-- query 3
SELECT *
FROM SampleData d
   INNER JOIN SampleUser u on d.UserID=u.UserID
WHERE u.Username = user_name()
GO
-- query 4
SELECT *
FROM SampleData d
   INNER JOIN SampleUser u on d.UserID=u.UserID
WHERE u.Username = user_name()
   AND CreateDate > '2016-02-03 19:59:47.560'
GO
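If you prefer not to run a Profiler trace, a rough alternative is to capture similar numbers per session with SET STATISTICS TIME and SET STATISTICS IO. The figures land in the Messages tab rather than a trace, but they give you CPU time, elapsed time, and logical reads for each query.

-- Alternative to a Profiler trace: per-query CPU/elapsed time and logical reads
SET STATISTICS TIME ON
SET STATISTICS IO ON
GO
-- ... run the test queries here ...
SET STATISTICS TIME OFF
SET STATISTICS IO OFF
GO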
Now that we have some baseline numbers, let's set up Row-Level Security (RLS). Our security predicate function is pretty straightforward and simply filters based on the user querying the table. Notice that the function has to query the SampleUser table in order to map the UserID column in SampleData to an actual username. Here is the T-SQL to create this function.
CREATE FUNCTION dbo.fn_securitypredicateUser (@UserID int)
   RETURNS TABLE
   WITH SCHEMABINDING
AS
   RETURN SELECT 1 AS [fn_securitypredicateUser_result]
   FROM dbo.SampleUser SU
   WHERE (SU.UserID = @UserID AND SU.Username = user_name())
      OR IS_SRVROLEMEMBER(N'sysadmin') = 1
GO
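Before binding the function to a security policy you can test it on its own. The sketch below is my own quick check, not part of the original scripts: it impersonates User10 and applies the function to every UserID, so only UserID 10 should come back (if User10 cannot select from the function, grant it SELECT permission first).

-- Quick test of the predicate function by impersonating User10
-- GRANT SELECT ON dbo.fn_securitypredicateUser TO [User10]   -- only if needed
EXECUTE AS USER = 'User10'
GO
SELECT SU.UserID, fn.fn_securitypredicateUser_result
FROM dbo.SampleUser SU
   CROSS APPLY dbo.fn_securitypredicateUser(SU.UserID) fn
GO
REVERT
GO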
The last step required is to create the security policy that uses this function and map it to the table we want to apply it to. Here is the T-SQL required to apply this policy to the SampleData table.
CREATE SECURITY POLICY fn_security
   ADD FILTER PREDICATE dbo.fn_securitypredicateUser(UserID) ON dbo.SampleData
GO
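At this point you can confirm the policy is in place by checking the catalog views, and you can also turn it off and back on with ALTER SECURITY POLICY if you want to re-run the baseline numbers later without dropping anything. A quick sketch:

-- Confirm the security policy and its filter predicate
SELECT p.name, p.is_enabled, sp.predicate_definition
FROM sys.security_policies p
   INNER JOIN sys.security_predicates sp ON sp.object_id = p.object_id
GO
-- Toggle the policy off/on instead of dropping it (useful for re-running baselines)
ALTER SECURITY POLICY fn_security WITH (STATE = OFF)
GO
ALTER SECURITY POLICY fn_security WITH (STATE = ON)
GO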
Now let's rerun our test queries. Remember, as mentioned above, that with RLS now in place we no longer need to join to the SampleUser table when querying with the User10 account; the security predicate function takes care of filtering the data. Below are the updated T-SQL queries.
-- run using test_sa account
-- query 1
SELECT *
FROM SampleData d
   INNER JOIN SampleUser u on d.UserID=u.UserID
WHERE u.Username = 'User10'
GO
-- query 2
SELECT *
FROM SampleData d
WHERE CreateDate > '2016-02-03 19:59:47.560'
GO

-- run using User10 account
-- query 3
SELECT *
FROM SampleData d
GO
-- query 4
SELECT *
FROM SampleData d
WHERE CreateDate > '2016-02-03 19:59:47.560'
GO
Row-Level Security Performance Results
We can see from the SQL Profiler results below that there is some overhead that can be attributed to this new feature.
Looking at the statistics for the admin user (queries 1 and 2), which can see all data regardless of whether RLS is configured, we really only see some additional CPU usage. This is not much of a concern because, in most environments, users with sysadmin privileges would not be running ad hoc queries very often.
The regular user statistics (queries 3 and 4) show a much more interesting result. Query 3 is a SELECT with no WHERE clause; queries coming in through an application should never be written this way, but a regular user running ad hoc queries might do it. The results below show the extra reads the database engine performs when a query like this is executed. This happens because the security predicate has to be evaluated for every row in order to check that the user is allowed to see it, and in this case that means scanning a very large number of records. The last query is probably the most realistic one we would see coming from an application, or even from an ad hoc user, but even in this example we can see the overhead caused by evaluating the predicate for each record the query reads.
| Query   | Account | RLS Enabled | CPU (ms) | Reads    | Writes | Duration (ms) |
|---------|---------|-------------|----------|----------|--------|---------------|
| query 1 | test_sa | NO          | 390      | 34675    | 0      | 1031          |
| query 1 | test_sa | YES         | 875      | 34179    | 0      | 1206          |
| query 2 | test_sa | NO          | 0        | 9376     | 0      | 69            |
| query 2 | test_sa | YES         | 15       | 9395     | 0      | 73            |
| query 3 | User10  | NO          | 500      | 34617    | 0      | 1199          |
| query 3 | User10  | YES         | 52813    | 10034257 | 0      | 52817         |
| query 4 | User10  | NO          | 0        | 9378     | 0      | 7             |
| query 4 | User10  | YES         | 31       | 15341    | 0      | 44            |
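If you want to see where the extra work comes from in your own tests, one option (my suggestion, not something covered in this tip) is to capture the actual execution plan with SET STATISTICS XML; with the filter predicate in place, the plans for the queries run as User10 should include the join to dbo.SampleUser that the inlined predicate function adds.

-- Capture the actual plan to see the work added by the inlined security predicate
SET STATISTICS XML ON
GO
SELECT * FROM SampleData d WHERE CreateDate > '2016-02-03 19:59:47.560'
GO
SET STATISTICS XML OFF
GO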
Even with the results shown above, this new feature definitely has benefits: it provides a way to restrict access to data without modifying any existing queries and without users even knowing that the data they receive is being filtered. The only real question is whether this benefit outweighs the additional overhead it puts on your database environment.
Next Steps
- Read more on other ways to restrict data access:
- Read more on SQL Server Encryption
- A method to block a DBAs data access