Wednesday, March 9, 2011

Parameter sniffing is one of the oldest blockers in query optimisation. Once understood correctly, it can be quite easy to resolve.

Let us create a table of about 10,000 records and name it LargeTable. Here is the script for creating it.

CREATE TABLE LargeTable (
ID INT,
SomeInt int,
RandomDate DATETIME
)
GO

;WITH DataPopulate (RowNo,Strng,ADate) AS (
-- anchor row: the rare value 5
SELECT 1 AS RowNo, 5 as Strng, DATEADD(dd, FLOOR(RAND()*75454),'1753/01/01')
UNION ALL
-- roughly every tenth row gets the rare value 5; the rest get the common value 89
SELECT RowNo+1, CASE WHEN RowNo%10=0 THEN 5 ELSE 89 END as Strng, DATEADD(dd, FLOOR(RAND(RowNo*96322)*85454),'1753/01/01')
FROM DataPopulate WHERE RowNo<10000
)
INSERT INTO LargeTable
SELECT * FROM DataPopulate
OPTION (MAXRECURSION 10000)
GO

CREATE CLUSTERED INDEX idx_Large1 on LargeTable (ID)
GO
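Before creating the procedure in the next step, it is worth confirming the skew this script produces. A quick sketch (not part of the original scripts) to check that 5 is the rare value (about 1,000 rows) and 89 the common one (about 9,000 rows):

-- Check the distribution of SomeInt values
SELECT SomeInt, COUNT(*) AS RowCnt
FROM LargeTable
GROUP BY SomeInt
GO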


Now let us create a stored procedure that returns the rows from the table matching a given SomeInt value.


CREATE PROCEDURE TestSniffing @int int AS
SELECT * from largetable where someint = @int
GO

Let us execute the SP:

Exec TestSniffing 5



Look at the part highlighted in red in the screenshot. The Estimated Number of Rows and the Actual Number of Rows match. So far so good.

Now let us change the parameter value to 89, since 89 is another value with a large number of records in the table.

Exec TestSniffing 89

Now refer to the part highlighted in red in the following screenshot.



The Estimated Number of Rows and the Actual Number of Rows now differ by about 8,000.
Why did this happen? The answer is simple: parameter sniffing. The plan was compiled and cached when the SP was first executed with the value 5 (roughly 1,000 rows), and that same plan was reused for 89 (roughly 9,000 rows), so the row estimates no longer fit and the cached plan can be sub-optimal.
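To see that it really is the first execution that fixes the cached plan, you can clear the plan cache and reverse the order of the calls. A sketch, for a dev/test box only, since DBCC FREEPROCCACHE flushes every cached plan on the instance:

DBCC FREEPROCCACHE   -- dev/test only: flushes the entire plan cache
Exec TestSniffing 89 -- plan is now compiled ("sniffed") for 89 (~9,000 rows)
Exec TestSniffing 5  -- reuses that plan, so the estimates skew the other way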

Now, how should this be resolved? Most of the available solutions suggest copying the parameter into a local variable inside the SP and using that variable as the predicate in the actual query. Let us examine whether that solves the problem.

First, let us alter the SP accordingly.

ALTER PROCEDURE TestSniffing @int int AS
Declare @varint int
Set @varint=@int
SELECT * from largetable where someint = @varint
GO

Now let us execute the SP with values 5 and 89 and examine the graphical execution plan.

Exec TestSniffing 5



Again, pay attention to the part marked in red in the screenshot. The difference between the Estimated and the Actual Number of Rows has decreased, but they still do not match. With a local variable the optimizer cannot sniff a specific value at compile time, so it falls back to an average-density estimate for the column instead of a value-specific one.

Let us try changing the parameter value to 89

Exec TestSniffing 89

Again there is a large difference between the Actual and the Estimated Number of Rows; the average-density guess fits neither parameter value well.



So what should be the solution?

Well, one can execute the SP using the WITH RECOMPILE option, something like this:

Exec TestSniffing 89 With Recompile

But before doing that, alter the SP back to its original state:

ALTER PROCEDURE TestSniffing @int int AS
SELECT * from largetable where someint = @int
GO

The query plan generated now shows an exact match between the Estimated and the Actual Number of Rows.



Let us try it for a value of 5:

Exec TestSniffing 5 With Recompile



It gives an exact match between the Estimated and the Actual Number of Rows.

Adding the "With recompile" option seem to be the only way that to create a exact count between estimated and actual row count.Even updating the statistics did not seem to help in my example.

Saturday, January 22, 2011

Today I am going to post about why you should NOT shrink log files periodically. Let us create a test database and name it "Test".

Create Database Test
Go


Now, to get the log space used by our newly created database, you can use the following DBCC command.

DBCC SQLPERF(logspace)

In our case the log size for the newly created database is 0.484375 MB.




Now let us create a single table, say T, with a single column id, insert about a million records into it, and then run DBCC SQLPERF(logspace) again to get the log size.
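The population script itself is not shown above; a minimal sketch that matches the description (table T, a single id column, about a million rows generated from a cross join of a catalog view) might be:

Create Table T (id Int)
Go

-- Generate ~1 million rows by cross joining a system catalog view
Insert Into T (id)
Select Top (1000000) ROW_NUMBER() Over (Order By (Select Null))
From sys.all_objects a Cross Join sys.all_objects b
Go

DBCC SQLPERF(logspace)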

As you can see, the log size has now increased by about 300 percent.



The following query returns the logical name of the log file for the database we just created (sys.sysfiles is deprecated; sys.database_files is the newer catalog view, but either works here).

Use Test
Go

select name from sys.sysfiles

In our case it is Test_log.

Now let us shrink the log file using the DBCC SHRINKFILE command.

DBCC SHRINKFILE(Test_log)

The above command shrinks the log file back to approximately its size when the database was created.



Now let us delete the records from the only table in the database. This will cause a lot of log records to be written to the log file.

Delete from T where id>0

Again, let us check the log file size.

Wow, the log file has now bloated to almost 460 percent of its original size.


So the conclusion from all of the above is that the log file will grow again no matter how many times you shrink it.

So always have a good backup strategy to keep the log file size in check. Shrinking the log should be reserved for genuine emergencies.
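For a database in the FULL recovery model that means taking regular transaction log backups, which allow SQL Server to reuse the inactive part of the log instead of growing the file. A minimal sketch (the backup path is an assumption):

-- Assumed path; regular log backups let inactive log space be reused
Backup Log Test To Disk = 'C:\Backups\Test_log.trn'
Go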

Thursday, June 3, 2010

NULL values in Unique keys

Recently I posted a question at SQLTeam.com

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=145497

asking about NULL values in Unique keys.

Clearly, BOL states:

One of the differences between a primary key constraint and a unique key constraint is that null values are allowed in a column with a unique constraint.

But it never states how many NULL values are allowed.
If you read my post at SQLTeam.com above, you will see you get nowhere.
Man, talk about confusion; Microsoft as usual is leading the way.
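For what it is worth, the behaviour is easy to test yourself: SQL Server treats NULLs as equal for a unique constraint, so only one NULL row is allowed per unique column. A quick sketch (the table name is mine):

Create Table UniqTest (col Int Unique)
Insert Into UniqTest Values (NULL) -- succeeds
Insert Into UniqTest Values (NULL) -- fails with a duplicate key error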

Monday, May 24, 2010

Let's play with XML

Below is a nested XML document that contains customer info and order details. I need to get the order details for each customer in the XML. One way to do it would be using OPENXML and sp_xml_preparedocument (a sketch of that approach follows the resultset below).

With the enhancements to the XML datatype in SQL Server 2005 and beyond, it can easily be done using XQuery. This is how:

declare @xmlData XML
set @XMLData =
'<root><row>
<custid>1</custid>
<custname>name1</custname>
<order>
<orderdetail orderid = "1" orderdate = "2008-01-01" />
<orderdetail orderid = "2" orderdate = "2008-01-02" />
</order>
</row>
<row>
<custid>2</custid>
<custname>name2</custname>
<order>
<orderdetail orderid = "3" orderdate = "2008-01-03" />
<orderdetail orderid = "4" orderdate = "2008-01-04" />
</order>
</row>
</root> '

select @xmldata

select x.i.value('../../custid[1]','varchar(40)') as custid,
x.i.value('../../custname[1]','varchar(40)')as custname,
x.i.value('@orderid','int')as orderid,
x.i.value('@orderdate','date')as orderdate
from @xmldata.nodes('/root/row/order/orderdetail')x(i)

The resultset for the above query is:

custid  custname  orderid  orderdate
1       name1     1        2008-01-01
1       name1     2        2008-01-02
2       name2     3        2008-01-03
2       name2     4        2008-01-04
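For comparison, the older OPENXML approach mentioned above might look something like this sketch (note the prepare/remove document calls around it, and that it assumes @XMLData is still in scope):

Declare @hdoc Int
Exec sp_xml_preparedocument @hdoc OUTPUT, @XMLData

-- Rowset over the orderdetail nodes, walking up to the parent row for customer columns
Select custid, custname, orderid, orderdate
From OPENXML(@hdoc, '/root/row/order/orderdetail', 1)
With (
custid varchar(40) '../../custid',
custname varchar(40) '../../custname',
orderid int '@orderid',
orderdate date '@orderdate'
)

Exec sp_xml_removedocument @hdoc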

Friday, May 7, 2010

Getting numeric or character values from an alphanumeric string

Sometimes it becomes necessary to extract all the numeric or all the character values from an alphanumeric string.
The idea in the solution I have posted is to split the string into one row per character in a derived table,
using a numbers table @tbl, and then concatenate the character values, excluding the numeric ones, in my example below.
If you need the numeric values from the string, simply replace like '[a-z]' with like '[0-9]' (see the variant after the code).

Declare @str As Varchar(50)
Select @str='klasdjfkl034ioerut'
Declare @value As Varchar(40)=''

-- Numbers table: one row per character position in @str
Declare @tbl As Table(number Int Identity)
Insert Into @tbl Default Values
While SCOPE_IDENTITY() < LEN(@str)
Insert Into @tbl Default Values

-- Split @str into single characters and concatenate only the letters
Select @value=@value + value From
(
Select number,SUBSTRING(@str,number,1) As value From @tbl Where number Between 1 And LEN(@str)
)t Where value like '[a-z]' Order By number

Select @value -- klasdjfklioerut
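Using the same @str and @tbl from above, the numeric variant is just the filter flipped; for the sample string it should return '034':

-- Same derived table, numeric filter instead of character filter
Select @value=''
Select @value=@value + value From
(
Select number,SUBSTRING(@str,number,1) As value From @tbl Where number Between 1 And LEN(@str)
)t Where value like '[0-9]' Order By number

Select @value -- 034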

Wednesday, May 5, 2010

Difference between 2 dates in Hours

Sometimes it is necessary to show the difference between 2 dates in hours. The query can be modified into a function accepting 2 parameters, Date1 and Date2 (a sketch follows the query). Here is the query:

Declare @dt1 datetime='12-May-2010 09:00:00'
Declare @dt2 datetime='13-May-2010 04:01:00'

Select
RIGHT('0'+CONVERT(varchar(2),DATEDIFF(mi,@dt1,@dt2)/60),2)+ ':' +
RIGHT('0'+CONVERT(varchar(2),DATEDIFF(mi,@dt1,@dt2)%60),2)+ ':' +
RIGHT('0'+CONVERT(varchar(2),DATEDIFF(SECOND,@dt1,@dt2)%60),2) As DiffInHours
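Wrapped as a function, as suggested above, it might look like this sketch (fn_DiffInHours is my name for it; note that RIGHT(...,2) caps the hours part at two digits, so differences beyond 99 hours will be truncated):

Create Function dbo.fn_DiffInHours (@dt1 datetime, @dt2 datetime)
Returns varchar(8)
As
Begin
-- Build HH:MM:SS from total minutes and seconds between the two dates
Return RIGHT('0'+CONVERT(varchar(2),DATEDIFF(mi,@dt1,@dt2)/60),2)+ ':' +
RIGHT('0'+CONVERT(varchar(2),DATEDIFF(mi,@dt1,@dt2)%60),2)+ ':' +
RIGHT('0'+CONVERT(varchar(2),DATEDIFF(SECOND,@dt1,@dt2)%60),2)
End
Go

Select dbo.fn_DiffInHours('12-May-2010 09:00:00','13-May-2010 04:01:00') As DiffInHours -- 19:01:00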