Comparing SQL Server CPU usage with arithmetic for float versus numeric datatypes

By: Ben Snaidero   |   Updated: 2018-12-11   |   Related: > Database Design


Problem

A lot of my time as a SQL Server DBA is focused on performance. One of the things I enjoy most about database performance is squeezing the best possible performance out of every query that hits the database. Sometimes performance gains come not only from having the correct indexes on a table's column(s), but also from using the right datatypes. In this tip we will show how arithmetic operations on the float datatype use less CPU than their numeric counterparts.

Solution

For those who don't know the difference between the real/float and the numeric/decimal datatypes: at a high level, the former are approximate-number datatypes, which means that not every number can be represented exactly and values are instead stored as the closest possible approximation. The latter are exact-number datatypes, and the values stored in them are represented with exact precision. That said, if your application requires exact precision then the following test really won't benefit you, but if it does not, the testing below will show you how using approximate datatypes could give you some performance benefit.
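
To see the difference concretely, here is a quick illustration (separate from the test below): the float total drifts to the nearest binary approximation of 0.3, while the numeric total stays exact.

declare @f float         = 0.1
declare @n numeric(19,8) = 0.1
-- the float sum only approximates 0.3, so the equality check fails;
-- the numeric sum is exactly 0.30000000
select case when @f + @f + @f = 0.3 then 'equal' else 'not equal' end as float_sum_check,   -- not equal
       case when @n + @n + @n = 0.3 then 'equal' else 'not equal' end as numeric_sum_check  -- equal
go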

Test Setup

For the following test we are going to set up two tables. One table will contain two float columns and the other table will contain two numeric columns. The TSQL to create and load matching random data into the tables (matching except that the float values are stored as approximations) is as follows.

create table testnumeric (
   num1 numeric(19,8),
   num2 numeric(19,8))
go

create table testfloat (
   f1 float,
   f2 float)
go

declare @random1 numeric(19,8)
declare @random2 numeric(19,8)
declare @x integer
select @x=1
while @x < 500000
begin
   select @random1 = rand() * 100000.0 + 1.0,
          @random2 = rand() * 100000.0 + 1.0;
   insert into testnumeric values (@random1,@random2);
   insert into testfloat values(cast(@random1 as float),cast(@random2 as float));
   select @x = @x + 1;
end
go
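
As a side note, if you would rather avoid the row-by-row loop, a set-based load along the following lines should produce an equivalent data set. This is only a sketch: it seeds RAND() with NEWID() for each row (RAND() by itself returns the same value for every row in a single statement) and uses a cross join of sys.all_objects purely as a row generator.

insert into testnumeric (num1, num2)
select top (499999)
       cast(rand(checksum(newid())) * 100000.0 + 1.0 as numeric(19,8)),
       cast(rand(checksum(newid())) * 100000.0 + 1.0 as numeric(19,8))
from sys.all_objects a cross join sys.all_objects b;
go

-- copy the identical values into the float table so the two tables still match
insert into testfloat (f1, f2)
select cast(num1 as float), cast(num2 as float) from testnumeric;
go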

Note: The numeric type was chosen with the precision and scale above to keep the space used by each column as close as possible so the test would not be affected by any differences in column size. Using sp_spaceused below you can see they are quite close.

sp_spaceused 'testnumeric'
go

sp_spaceused 'testfloat'
go
Name          Rows     Reserved   Data       Index_size   Unused
testnumeric   499999   14088 KB   14040 KB   8 KB         40 KB
testfloat     499999   13000 KB   12992 KB   8 KB         0 KB
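
The small difference comes down to per-value storage: numeric(19,8) (precision in the 10-19 range) is stored in 9 bytes per value while the default float, float(53), uses 8 bytes. You can confirm this with DATALENGTH once the tables are loaded.

select top 1 datalength(num1) as numeric_bytes from testnumeric;  -- 9
go
select top 1 datalength(f1) as float_bytes from testfloat;  -- 8
go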

Testing

In order to measure the CPU used by each query we will use the SQL Server SET statement to enable query timing output. Once we enable the STATISTICS TIME option using the following statement, after each query completes we will see the CPU and total elapsed time for the parse, compile and execution phases of that query.

set statistics time on
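
For reference, the timing information appears in the Messages output after each query; for the numeric query below it looked roughly like this:

 SQL Server Execution Times:
   CPU time = 1062 ms,  elapsed time = 5810 ms.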

In order to test this we will perform 4 basic math operations (+,-,*,/) on the pair of columns for every row in each table. The TSQL for these two queries is as follows.

select num1+num2,num1-num2,num1*num2,num1/num2 from testnumeric;  
go

select f1+f2,f1-f2,f1*f2,f1/f2 from testfloat; 
go

Looking at the results from the statistics we enabled, we can see that the operations on the float columns not only used considerably less CPU (only about 37% of the CPU consumed by the numeric query), they also completed faster.

Name          CPU (ms)   Duration (ms)
testnumeric   1062       5810
testfloat     391        4652

This first test showed us that, overall, arithmetic on a float column uses less CPU than arithmetic on a numeric column. Now let's see whether the extra CPU used with the numeric columns comes from a single operation or whether every operation is cheaper with float. To test this we will run each item in the select list above as its own query. The TSQL for these statements is below.

select num1+num2 from testnumeric;  -- CPU 359ms   TOTAL 4176ms
go
select f1+f2 from testfloat;  -- CPU 235ms  TOTAL 3713ms
go
select num1-num2 from testnumeric;  -- CPU 328ms   TOTAL 4454ms
go
select f1-f2 from testfloat;  -- CPU 203ms  TOTAL 3713ms
go
select num1*num2 from testnumeric;  -- CPU 344ms   TOTAL 4009ms
go
select f1*f2 from testfloat;  -- CPU 219ms  TOTAL 3808ms
go
select num1/num2 from testnumeric;  -- CPU 547ms   TOTAL 4186ms
go
select f1/f2 from testfloat;  -- CPU 296ms  TOTAL 4077ms
go

The test results below confirm that every operation against the float datatype uses less CPU than its numeric counterpart, anywhere from 35-45% less.

Name          Operation   CPU (ms)   Duration (ms)
testnumeric   +           359        4176
testfloat     +           235        3713
testnumeric   -           328        4454
testfloat     -           203        3713
testnumeric   *           344        4009
testfloat     *           219        3808
testnumeric   /           547        4186
testfloat     /           296        4077
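
Division shows the largest gap of the four operations. One possible contributor is that SQL Server widens the precision and scale of a numeric division result (capping precision at 38), so each row works with a wider intermediate type than the column definitions alone suggest. If you are on SQL Server 2012 or later you can check the result type of the division with the sys.dm_exec_describe_first_result_set DMV; this is just an illustrative check, not part of the original test.

select name, system_type_name, precision, scale
from sys.dm_exec_describe_first_result_set(
   N'select num1/num2 as divide_result from testnumeric', null, 0);
go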

Conclusion

This simple test has shown that if your application requirements do not demand the accuracy of the numeric datatype, then switching your columns to the float datatype could result in considerably less resource usage when performing arithmetic on those columns in your SQL Server environment. As always, you should test with your own application and data to confirm you see an improvement before making any permanent changes in your production environment.
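
If you do decide a column can safely be switched, the change itself is a straightforward ALTER TABLE; just keep in mind that the conversion touches every row, and any indexes, constraints, or computed columns referencing the column may need to be dropped and recreated first (shown here against the test table only).

alter table testnumeric alter column num1 float;
alter table testnumeric alter column num2 float;
go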

About the author
Ben Snaidero has been a SQL Server and Oracle DBA for over 10 years and focuses on performance tuning.




