Using sp_ExecuteSql to Run Dynamic SQL



Introduction

Lately it seems like I am being bombarded by Dynamic SQL and people who insist it is the only way to accomplish something. Dynamic SQL is almost never the best way to accomplish a given task, but sometimes (and I do mean just sometimes) it is. In this post I will show why to use sp_ExecuteSql and how to convert an existing procedure to use it properly.

The Example

Below is an example stored procedure I came up with that takes multiple parameters and, based on those parameters, builds and executes a query. It is intentionally simple and could certainly be written without dynamic SQL, but it works well for this purpose.

CREATE PROCEDURE dbo.GetTableInfo @object_id int = null, @name sysname = null
AS
DECLARE @sql_command nvarchar(max)

SELECT @sql_command = N'
SELECT [name],
[object_id]
FROM sys.tables
'

IF @object_id IS NOT NULL
 BEGIN
    SELECT @sql_command = @sql_command + N'WHERE [object_id] = ' + CAST(@object_id AS nvarchar)
    IF @name IS NOT NULL
     BEGIN
        SELECT @sql_command = @sql_command + N'
AND [name] = ''' + @name + ''''
     END
 END
ELSE
 BEGIN
    IF @name IS NOT NULL
     BEGIN
        SELECT @sql_command = @sql_command + N'WHERE [name] = ''' + @name + ''''
     END
    ELSE
     BEGIN
        SELECT @sql_command = N''
     END
 END

PRINT @sql_command
EXEC sp_executesql @sql_command

The stored procedure takes two parameters, @object_id and/or @name. Based on the inputs, a WHERE clause is created and appended to the rest of the SQL statement. In case anyone tries to pull a fast one and get a full table list by passing null for both arguments, a check clears the string so that nothing is returned. At this point the example looks fairly robust and somewhat bulletproof.

The Test(s)

Testing the stored procedure should be fairly simple. To keep things easy I will test in msdb by searching for the table sysjobs by name, object_id and then both. I will then test for the possibility of SQL Injection.

Test #1

EXEC dbo.GetTableInfo @name = 'sysjobs'

Returns:

SELECT [name],
[object_id]
FROM sys.tables
WHERE [name] = 'sysjobs'

name              object_id
----------------- ---------------
sysjobs           277576027

(1 row(s) affected)

Test #2

EXEC dbo.GetTableInfo @object_id = 277576027

Returns:

SELECT [name],
[object_id]
FROM sys.tables
WHERE [object_id] = 277576027

name              object_id
----------------- ---------------
sysjobs           277576027

(1 row(s) affected)

Test #3

EXEC dbo.GetTableInfo @object_id = 277576027, @name = 'sysjobs'

Returns:

SELECT [name],
[object_id]
FROM sys.tables
WHERE [object_id] = 277576027
AND [name] = 'sysjobs'

name              object_id
----------------- ---------------
sysjobs           277576027

(1 row(s) affected)

Test #4 – SQL Injection

EXEC dbo.GetTableInfo @name = 'sysjobs'' OR ''1''=''1''; SELECT name FROM sys.server_principals WHERE name != '''

Returns:

SELECT [name],
[object_id]
FROM sys.tables
WHERE [name] = 'sysjobs' OR '1'='1'; SELECT name FROM sys.server_principals WHERE name != ''

name              object_id
----------------- ---------------
syssubsystems     5575058
sysproxysubsystem 21575115
restorefilegroup  30623152
.
.
.

(97 row(s) affected)

name
-------------------------------
sa
public
sysadmin
.
.
.

(33 row(s) affected)

The Problem

Testing looked great until the SQL injection test. By mangling the parameter passed into the stored procedure I was able to get it to dump a list of all the logins on the server. There is no reason a call to a stored procedure could not be injected there instead, followed by "--" to comment out the rest of the string. If xp_cmdshell were unsecured on this server, the box would now belong to anyone who could exploit this vulnerability. This is about as bad as it gets, short of leaving the sa password blank.

The Fix

In the example above I made a key error to prove a point. Even though I used sp_ExecuteSql, I used it in a way that was no better than using the EXEC statement. The proper way to use it is to build a parameterized SQL statement and pass the parameters into sp_ExecuteSql separately. I know this sounds like a lot of extra work, but an important thing to note is that all of the parameters can be passed on every call and will only be used if they appear in the SQL string that is passed in.
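
To make the pattern concrete before walking through the full fix, here is a minimal sketch (my illustration, not part of the original procedure) of what "passing the parameters separately" looks like: the statement text references parameters by name, the second argument declares them, and the values follow. @object_id is declared and supplied even though this particular statement never uses it, which is exactly what lets one parameter list serve every branch of the procedure.

-- Minimal sp_executesql sketch: @name is used by the statement, @object_id is
-- declared and passed but simply ignored because the statement never references it.
DECLARE @sql_command nvarchar(max),
        @sql_parameters nvarchar(max)

SELECT  @sql_command = N'SELECT [name], [object_id] FROM sys.tables WHERE [name] = @name',
        @sql_parameters = N'@object_id int, @name sysname'

EXEC sp_executesql @sql_command, @sql_parameters, @object_id = NULL, @name = N'sysjobs'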

The added benefit of parameterized SQL is the elimination of procedure cache bloat. The example above creates a unique query string, causing a new plan to be added to the procedure cache every time the procedure is called unless the exact same parameters are passed. The fixed version will add just 3 plans to the procedure cache (one per distinct query shape), perhaps slightly more if there are wide variations in the data, but far fewer than the one-plan-per-call the original example adds.
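
If you want to verify the cache bloat claim for yourself, one rough way (my addition, not from the original post) is to run the procedure with a handful of different parameter values and then count how many cached plans reference the generated query text, once for the original procedure and once for the fixed one:

-- Rough check of plan cache bloat: count cached plans whose text contains the
-- dynamic query's FROM clause. Expect roughly one plan per distinct string with
-- the original procedure and only a few parameterized plans with the fixed version.
SELECT COUNT(*) AS cached_plan_count
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE N'%FROM sys.tables%'
  AND st.text NOT LIKE N'%dm_exec_cached_plans%' --exclude this query itself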

Here is the fixed version:

CREATE PROCEDURE dbo.GetTableInfo @object_id int = null, @name sysname = null
AS
DECLARE @sql_command nvarchar(max),
        @sql_parameters nvarchar(max)

SELECT @sql_command = N'
SELECT [name],
[object_id]
FROM sys.tables
',
       @sql_parameters = N'@object_id int, @name sysname'

IF @object_id IS NOT NULL
 BEGIN
    SELECT @sql_command = @sql_command + N'WHERE [object_id] = @object_id'
    IF @name IS NOT NULL
     BEGIN
        SELECT @sql_command = @sql_command + N'
AND [name] = @name'
     END
 END
ELSE
 BEGIN
    IF @name IS NOT NULL
     BEGIN
        SELECT @sql_command = @sql_command + N'WHERE [name] = @name'
     END
    ELSE
     BEGIN
        SELECT @sql_command = N''
     END
 END

PRINT @sql_command
EXEC sp_executesql @sql_command, @sql_parameters, @object_id = @object_id, @name = @name

GO

The Re-Test(s)

The same tests are now run against the fixed procedure: searching msdb for the sysjobs table by name, by object_id, and by both, and then re-testing for SQL injection.

Test #1

EXEC dbo.GetTableInfo @name = 'sysjobs'

Returns:

SELECT [name],
[object_id]
FROM sys.tables
WHERE [name] = @name

name              object_id
----------------- ---------------
sysjobs           277576027

(1 row(s) affected)

Test #2

EXEC dbo.GetTableInfo @object_id = 277576027

Returns:

SELECT [name],
[object_id]
FROM sys.tables
WHERE [object_id] = @object_id

name              object_id
----------------- ---------------
sysjobs           277576027

(1 row(s) affected)

Test #3

EXEC dbo.GetTableInfo @object_id = 277576027, @name = 'sysjobs'

Returns:

SELECT [name],
[object_id]
FROM sys.tables
WHERE [object_id] = @object_id
AND [name] = @name

name              object_id
----------------- ---------------
sysjobs           277576027

(1 row(s) affected)

Test #4 – SQL Injection

EXEC dbo.GetTableInfo @name = 'sysjobs'' OR ''1''=''1''; SELECT name FROM sys.server_principals WHERE name != '''

Returns:

SELECT [name],
[object_id]
FROM sys.tables
WHERE [name] = @name

name              object_id
----------------- ---------------

(0 row(s) affected)

Conclusion

Dynamic SQL should be considered a highly specialized tool of last resort and, when it truly is the right tool, used in the way I have described here. As I have shown, misuse can lead to your server and the data stored on it no longer belonging to you.

Stored Procedures to Store and Get Database Users with All Permissions



Introduction
Traditionally, when doing a restore, moving a database, etc., a DBA would simply go into Management Studio and use the Generate Scripts wizard to script all logins and permissions. This approach can take several minutes and, as I recently learned, does not script database-level permissions.

Solution
Bothered by the amount of effort and the fact that I kept forgetting to script out the permissions until just after I had wiped them out by starting a restore, I set out to create scripts that I could just include as steps in my restore jobs. Given that the restore wipes out the database, I knew I had to have 2 steps. The first step stores the permissions before the restore, while the second puts them back after. In the spirit of keeping the restore jobs simple, I wrapped up all of the logic into 2 stored procedures that do not require any arguments.

The first stored procedure, sp_dba_StoreDatabasePermissions, stores the database roles, users, role memberships, and permissions to a table in msdb. I use msdb here because everybody has it and it is not master. The table is named for the database it corresponds to, followed by '_permission_store'. The permission store table has 2 columns: the first is the actual SQL command and the second is the order to run it in. The ordering is done in groups, with all roles ranked to run first, followed by users, then adding users to roles, and finally the permissions that correspond to the users and roles. The stored procedure makes use of a synonym to point to the permission store, cutting down on the use of dynamic SQL and enhancing readability. The logic to get the permissions is based on a script by Narayana Vyas Kondreddi.

USE [master]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER OFF
GO
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[sp_dba_StoreDatabasePermissions]') AND type in (N'P', N'PC'))
    DROP PROCEDURE [dbo].[sp_dba_StoreDatabasePermissions]
GO

CREATE PROCEDURE [dbo].[sp_dba_StoreDatabasePermissions]

AS

 BEGIN

    DECLARE @table_name sysname,
            @create_cmd nvarchar(4000)

    SELECT  @table_name = db_name() + '_permission_store'

    IF NOT EXISTS(SELECT * FROM msdb.sys.sysobjects WHERE name = @table_name)
     BEGIN
        SELECT  @create_cmd = 'CREATE TABLE [msdb].[dbo].' + QUOTENAME(@table_name) + ' (command nvarchar(max), run_order int)'
     END

    EXEC    sp_executesql  @create_cmd

    SELECT  @create_cmd = 'IF NOT EXISTS (SELECT * FROM sys.synonyms WHERE name = N''permission_store'') CREATE SYNONYM permission_store FOR [msdb].[dbo].' + QUOTENAME(@table_name)
    EXEC    sp_executesql  @create_cmd

    INSERT  permission_store
    SELECT  'IF NOT EXISTS(SELECT * FROM sys.database_principals WHERE name = ''' + name + ''') CREATE ROLE [' + name + ']' AS Command,
            0 AS run_order
    FROM    sys.database_principals
    WHERE   type_desc = 'DATABASE_ROLE'
                AND is_fixed_role = 0

    INSERT  permission_store
    SELECT  'IF NOT EXISTS(SELECT * FROM sys.database_principals WHERE name = ''' + name + ''') CREATE USER [' + name + '] FOR LOGIN [' + name + ']' + ISNULL(' WITH DEFAULT_SCHEMA=[' + default_schema_name + ']', '') AS Command,
            1 AS run_order
    FROM    sys.database_principals
    WHERE   type_desc in ('WINDOWS_GROUP', 'SQL_USER', 'WINDOWS_USER') --Updated to include WINDOWS_USER on 10/5/2011. Thanks to Alex Hatcher for catching that.

    INSERT  permission_store
    SELECT  'EXEC sp_addrolemember @rolename=''' + dp_r.name + ''', @membername =''' + dp_m.name + '''' AS Command,
            2 AS run_order
    FROM    sys.database_role_members drm
            INNER JOIN sys.database_principals dp_r
                ON drm.role_principal_id = dp_r.principal_id
            INNER JOIN sys.database_principals dp_m
                ON drm.member_principal_id = dp_m.principal_id
    WHERE   dp_m.name NOT IN ('dbo')

    INSERT      permission_store
    SELECT      'IF EXISTS(SELECT * FROM sys.objects WHERE name = ''' + obj.name + ''' AND USER_NAME(schema_id) = ''' + USER_NAME(obj.schema_id) + ''' AND type = ''' + RTRIM(obj.type) + ''') ' +
                CASE WHEN perm.state <> 'W' THEN perm.state_desc ELSE 'GRANT' END
                + ' ' + perm.permission_name + ' ' + 'ON ' + QUOTENAME(USER_NAME(obj.schema_id)) + '.' + QUOTENAME(obj.name)
                + CASE WHEN cl.column_id IS NULL THEN '' ELSE '(' + QUOTENAME(cl.name) + ')' END
                + ' TO ' + QUOTENAME(usr.name) COLLATE database_default
                + CASE WHEN perm.state <> 'W' THEN '' ELSE ' ' + 'WITH GRANT OPTION' END AS Command,
                3 AS run_order
    FROM        sys.database_permissions AS perm
                INNER JOIN sys.objects AS obj
                    ON perm.major_id = obj.[object_id]
                INNER JOIN sys.database_principals AS usr
                    ON perm.grantee_principal_id = usr.principal_id
                INNER JOIN sys.database_principals AS adm
                    ON perm.grantor_principal_id = adm.principal_id
                LEFT JOIN sys.columns AS cl
                    ON cl.column_id = perm.minor_id AND cl.[object_id] = perm.major_id
 END
GO
EXEC sp_MS_marksystemobject 'sp_dba_StoreDatabasePermissions'
GO

The second stored procedure in the pair, sp_dba_GetDatabasePermissions, is very simple. First it checks for the synonym in the database and creates it if it is missing. After verifying the synonym, the stored procedure opens a cursor against the permission store and executes the commands, ordered ascending by run_order. After running all of the commands, the stored procedure checks the value of the optional parameter @keep_permission_store to see whether it should clean up the permission store table or leave it in place. The default behavior is to drop the permission store when done with it.

USE [master]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER OFF
GO
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[sp_dba_GetDatabasePermissions]') AND type in (N'P', N'PC'))
    DROP PROCEDURE [dbo].[sp_dba_GetDatabasePermissions]
GO

CREATE PROCEDURE [dbo].[sp_dba_GetDatabasePermissions] @keep_permission_store bit = 0

AS

 BEGIN


    DECLARE @command nvarchar(max)

    SELECT  @command = 'IF NOT EXISTS (SELECT * FROM sys.synonyms WHERE name = N''permission_store'') CREATE SYNONYM permission_store FOR [msdb].[dbo].' + QUOTENAME(db_name() + '_permission_store')
    EXEC    sp_executesql  @command

    DECLARE commands CURSOR FAST_FORWARD READ_ONLY FOR
        SELECT      command
        FROM        dbo.permission_store
        ORDER BY    run_order

    OPEN commands

    FETCH NEXT from commands INTO @command

    WHILE(@@FETCH_STATUS <> -1)
     BEGIN
        PRINT @command
        EXEC sp_executesql @command
        FETCH NEXT from commands INTO @command
     END

    CLOSE commands

    DEALLOCATE commands

    IF @keep_permission_store != 1
     BEGIN
        SELECT @command = 'DROP TABLE ' + base_object_name FROM sys.synonyms WHERE name = 'permission_store'
        DROP SYNONYM dbo.permission_store
        PRINT @command
        EXEC sp_executesql @command
     END
 END
GO
EXEC sp_MS_marksystemobject 'sp_dba_GetDatabasePermissions'
GO
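
Putting the pair together, a restore job ends up with steps along these lines; this is just a sketch with a placeholder database name and backup path, not taken from my actual jobs:

-- Step 1: capture roles, users, role memberships and permissions into msdb
EXEC MyDatabase.dbo.sp_dba_StoreDatabasePermissions

-- Step 2: the restore itself wipes out the database principals and permissions
RESTORE DATABASE MyDatabase
FROM DISK = N'X:\Backups\MyDatabase.bak'
WITH REPLACE

-- Step 3: put everything back from the permission store
EXEC MyDatabase.dbo.sp_dba_GetDatabasePermissions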

Other Uses
It seems like every day I find more uses for these stored procedures. Most recently I have been running them at replication subscribers before making any changes at the publisher. The @keep_permission_store flag of sp_dba_GetDatabasePermissions comes in really handy here. Passing a value of 1 allows permissions to be put back several times, a lifesaver when things do not go right on the first try.
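
For example, at a subscriber that call is as simple as the following (the database name is a placeholder):

-- Re-apply the captured permissions but leave the permission store table in msdb
-- so the commands can be run again if the change has to be retried.
EXEC MySubscriberDB.dbo.sp_dba_GetDatabasePermissions @keep_permission_store = 1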

I also use these stored procedures to copy permissions between databases on different servers. I started out by running sp_dba_StoreDatabasePermissions, then selecting from the permission store table ordered by run_order with results to text, and copying, pasting, and running the commands on the new server. After adding the PRINT statement to sp_dba_GetDatabasePermissions, it became easier to call sp_dba_StoreDatabasePermissions and then sp_dba_GetDatabasePermissions right away, copying the commands from the messages window and pasting them into a query window to run wherever I want.
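
The select against the permission store is nothing fancy; something along these lines does the trick, with the table name following the <database name>_permission_store convention (the database name here is a placeholder):

-- Pull the captured commands in execution order so they can be copied to another server.
SELECT   command,
         run_order
FROM     msdb.dbo.[MyDatabase_permission_store]
ORDER BY run_order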

Wrap-Up
These stored procedures have served me well and I hope they will serve you well too. I would love to hear how people are using them and any suggestions for enhancements. As always, scripts on the internet are like Halloween candy: check them thoroughly before consuming.

Script Individual User Rights in a Database with PowerShell

Introduction

Tonight on Twitter the call went out on #SqlHelp looking for a way to script just the database permissions for a specific user. I wrote a script once that I was pretty sure did that, so I jumped in to help. It turns out that I had lost that particular script and it really did not do the right thing anyway. Somewhat embarrassed, I decided I had better stay up and get a script written to solve the problem rather than admit to wasting someone’s time.

The Result

After a couple of hours of hacking through the various SMO classes I was able to cobble together the script below. I had forgotten how much I enjoyed challenges like this from my developer days, so this was actually quite a treat.

Here is the code:
 

# Load the SMO assemblies if they are not already available in the session
# (not needed when running inside SQLPS, which loads them for you)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null

$serverName="localhost"
$databaseName="AdventureWorks"

$serverConnection = new-object Microsoft.SqlServer.Management.Common.ServerConnection
$serverConnection.ServerInstance=$serverName

$server = new-object Microsoft.SqlServer.Management.SMO.Server($serverConnection)
$database = $server.Databases[$databaseName]
foreach($user in $database.Users)
{
    # Database-level permissions (CONNECT, CREATE TABLE, etc.) granted to the user
    foreach($databasePermission in $database.EnumDatabasePermissions($user.Name))
    {
        Write-Host $databasePermission.PermissionState $databasePermission.PermissionType "TO" $databasePermission.Grantee
    }
    # Object-level permissions (SELECT, EXECUTE, etc.) granted to the user
    foreach($objectPermission in $database.EnumObjectPermissions($user.Name))
    {
        Write-Host $objectPermission.PermissionState $objectPermission.PermissionType "ON" $objectPermission.ObjectName "TO" $objectPermission.Grantee
    }
}
$server.ConnectionContext.Disconnect()

Update
The discussion continues on this one.  The current thinking is that you should be able to call User.Script(). Jonathan Kehayias (Blog, Twitter) will be logging a Connect item and I will post voting information here.

A Busy/Accidental DBA’s Guide to Managing VLFs

Introduction

Properly managing VLFs can make or break the performance of your databases. There is a ton of information out there on the proper management of VLFs, but nothing I have found that tries to boil it down to the most important parts. So here it is, my attempt at A Busy/Accidental DBA’s Guide to Managing VLFs.

What are VLFs?

When SQL Server allocates new space in a log file it does so in units called Virtual Log Files (VLFs), meaning every growth of a transaction log file is made up of 4 or more VLFs. Think of VLFs as small files within the log file that are easier for SQL Server to manage than one large file. (There really is a lot more to it than that, but rather than lift from BOL I will refer you to this page for a more detailed explanation.)
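
If you want to see the VLFs in one of your own databases, the quickest way is the undocumented but widely used DBCC LOGINFO command, which returns one row per VLF for the current database (the database name below is just an example):

USE [msdb]
GO
DBCC LOGINFO    --one row per VLF; the row count is the VLF count
GO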

Why Manage VLFs?

Having too many, or in some cases too few, VLFs can cause sluggish database performance. I have also heard of cases where database recovery took far longer than expected because the log file contained too many VLFs.

How Many VLFs Should I have?

To quote someone much wiser: “It depends”. I use 50 VLFs as my rule of thumb because it is much easier to have a simple rule and it is a safe number in most cases. I do suggest reading this article: Transaction Log VLFs – too many or too few? before committing to a number of your own, especially if you are working with VLDBs.

How do I Manage VLFs?

Managing VLFs is a 2 step process. Step 1 is figuring out how many VLFs you have in each of your transaction logs. Step 2 is deciding what number of VLFs is acceptable to you and then shrinking and growing the log files to get them back under that threshold. I have included scripts below that will help you identify and remediate high VLF counts. They could probably be wrapped up into a single script, but I prefer to have control over what is running when, so I can monitor for any issues the maintenance might cause.

Many people also add a step 3 where they increase the auto-growth increment of their database. I tend to avoid raising the auto-growth unless the database is new. The log should only grow very rarely on a mature database; constantly having to address VLFs in a particular database’s log could be a sign of a larger problem like auto-shrink being turned on.

What if I Just Shrink the Log and Let it Grow Back?

There is a misconception that shrinking a log and increasing the auto-growth increment is enough to remediate high VLF counts. While shrinking a log file may lower the VLF count temporarily, the VLFs will come right back when the log file grows back. The article Transaction Log VLFs – too many or too few? lays out how many VLFs will be added based on the growth increment. Rephrased from the article:

  •       If the file growth is up to 64MB, the new portion of the log file will contain 4 VLFs
  •       If the file growth is more than 64MB and up to 1GB, the new portion of the log file will contain 8 VLFs
  •       If the file growth is larger than 1GB, the new portion of the log file will contain 16 VLFs

Based on that, if an 80GB log with 100 VLFs were shrunk to remove VLFs and then allowed to auto-grow back to 80GB with a larger auto-growth increment, say 4GB, the log would end up with 20 growths * 16 VLFs each = 320 VLFs.
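
The flip side of that math is that you can control the VLF count by growing a log back in a small number of large, deliberate steps instead of letting auto-growth do it. As a rough sketch (the database name, logical file name, and sizes are placeholders), rebuilding an 80GB log in three steps of roughly 27GB each would add 16 VLFs per step, or about 48 VLFs on top of whatever the shrunken file already contains:

ALTER DATABASE [MyDatabase] MODIFY FILE (NAME = N'MyDatabase_log', SIZE = 27000MB)
GO
ALTER DATABASE [MyDatabase] MODIFY FILE (NAME = N'MyDatabase_log', SIZE = 54000MB)
GO
ALTER DATABASE [MyDatabase] MODIFY FILE (NAME = N'MyDatabase_log', SIZE = 81920MB)
GO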

How Many VLFs are in My Databases?

This script will return the VLF count for each database on the server it is run on. I am not sure of the origins of the script but I can say it works for me.  If you know or are the original author of this script please let me know so I can give proper credit or replace the script with a link to a more current version.

DECLARE @query varchar(1000),
        @dbname varchar(1000),
        @count int

SET NOCOUNT ON

DECLARE csr CURSOR FAST_FORWARD READ_ONLY
FOR
SELECT name
FROM master.dbo.sysdatabases

CREATE TABLE ##loginfo
(
    dbname varchar(100),
    num_of_rows int)

OPEN csr

FETCH NEXT FROM csr INTO @dbname

WHILE (@@fetch_status <> -1)
BEGIN

CREATE TABLE #log_info
(
    fileid tinyint,
    file_size bigint,
    start_offset bigint,
    FSeqNo int,
    [status] tinyint,
    parity tinyint,
    create_lsn numeric(25,0)
)

SET @query = 'DBCC loginfo (' + '''' + @dbname + ''') '

INSERT INTO #log_info
EXEC (@query)

SET @count = @@rowcount

DROP TABLE #log_info

INSERT ##loginfo
VALUES(@dbname, @count)

FETCH NEXT FROM csr INTO @dbname

END

CLOSE csr
DEALLOCATE csr

SELECT dbname,
       num_of_rows
FROM ##loginfo
WHERE num_of_rows >= 50 --My rule of thumb is 50 VLFs. Your mileage may vary.
ORDER BY dbname

DROP TABLE ##loginfo

Updated August 27, 2013. Here is a version that works on SQL 2012:

DECLARE @query varchar(1000),
 @dbname varchar(1000),
 @count int

SET NOCOUNT ON

DECLARE csr CURSOR FAST_FORWARD READ_ONLY
FOR
SELECT name
FROM sys.databases

CREATE TABLE ##loginfo
(
 dbname varchar(100),
 num_of_rows int)

OPEN csr

FETCH NEXT FROM csr INTO @dbname

WHILE (@@fetch_status <> -1)
BEGIN

CREATE TABLE #log_info
(
 RecoveryUnitId tinyint,
 fileid tinyint,
 file_size bigint,
 start_offset bigint,
 FSeqNo int,
[status] tinyint,
 parity tinyint,
 create_lsn numeric(25,0)
)

SET @query = 'DBCC loginfo (' + '''' + @dbname + ''') '

INSERT INTO #log_info
EXEC (@query)

SET @count = @@rowcount

DROP TABLE #log_info

INSERT ##loginfo
VALUES(@dbname, @count)

FETCH NEXT FROM csr INTO @dbname

END

CLOSE csr
DEALLOCATE csr

SELECT dbname,
 num_of_rows
FROM ##loginfo
WHERE num_of_rows >= 50 --My rule of thumb is 50 VLFs. Your mileage may vary.
ORDER BY dbname

DROP TABLE ##loginfo

How Do I Lower a Database’s VLF Count?

Once armed with a list of databases that have high VLF counts, the next step is to shrink each log as small as possible and then grow it back to its original size, ideally in a single growth. This is best done during off-peak times. I wrote the following script to perform those exact steps, given the appropriate USE statement. You may have to run it multiple times to get to a low enough VLF count.

/*USE <db_name>*/ --Set db name before running using drop-down above or this USE statement

DECLARE @file_name sysname,
@file_size int,
@file_growth int,
@shrink_command nvarchar(max),
@alter_command nvarchar(max)

SELECT @file_name = name,
@file_size = (size / 128)
FROM sys.database_files
WHERE type_desc = 'log'

SELECT @shrink_command = 'DBCC SHRINKFILE (N''' + @file_name + ''' , 0, TRUNCATEONLY)'
PRINT @shrink_command
EXEC sp_executesql @shrink_command

SELECT @shrink_command = 'DBCC SHRINKFILE (N''' + @file_name + ''' , 0)'
PRINT @shrink_command
EXEC sp_executesql @shrink_command

SELECT @alter_command = 'ALTER DATABASE [' + db_name() + '] MODIFY FILE (NAME = N''' + @file_name + ''', SIZE = ' + CAST(@file_size AS nvarchar) + 'MB)'
PRINT @alter_command
EXEC sp_executesql @alter_command

In Closing

This has by no means been a comprehensive lesson in VLFs or transaction log management, but hopefully it is enough to get the job done. If you are looking for a more in-depth look at VLFs and transaction logs in general, I suggest reading the following articles: Understanding Logging and Recovery in SQL Server, Transaction Log VLFs – too many or too few? and 8 Steps to better Transaction Log throughput.

How to Shrink TempDB in SQL 2005

Introduction

From time to time you find yourself needing to shrink some space out of TempDB. Shrinking database files is never my first choice but sometimes it is the best I have. Many people think that you cannot shrink TempDB in SQL 2005, but I am going to show you how.

Why would I need to shrink TempDB?

Yesterday afternoon my pager started going crazy because an Ad-Hoc query that needed some tuning filled TempDB on a server. Luckily, the user only impacted their own query so it was easy to quickly identify them and work with the right people to get the query rewritten.

Once the immediate problem was resolved there had to be some cleanup. On this server, TempDB has 32 files (1 per processor) all on the same disk. The full database condition caused all kinds of alerts in our monitoring tools, from drive space alerts to too few growths remaining. There were 3 possible solutions to quiet the alerts:

1. Reboot – There is never a good time to reboot a production server

2. Turn off the Alerts – Not really an option. My preference would be for increasing the sensitivity

3. Shrink TempDB – Not a great option, but the best of the 3

Shrinking TempDB

Once we had decided that we would go ahead and shrink the files in TempDB it seemed like the hard part was done, but after running the following command:

USE [tempdb]

GO

DBCC SHRINKFILE (N'tempdev' , 5000)

GO

I got back the following:

DBCC SHRINKFILE: Page 1:878039 could not be moved because it is a work file page.

DbId   FileId      CurrentSize MinimumSize UsedPages   EstimatedPages
------ ----------- ----------- ----------- ----------- --------------
2      1           878040      640000      4672        4672

(1 row(s) affected)

DBCC execution completed. If DBCC printed error messages, contact your system administrator.

 

“Page could not be moved because it is a work file page.”…grrr. This is new behavior in SQL 2005 caused by the caching that is done in TempDB. I am not going to try to explain here how objects are cached in TempDB, but Kalen Delaney’s Inside SQL Server series is a great place to learn about it if you are interested (http://www.insidesqlserver.com/books.html). What is important is that the cached objects are tied to a query plan, and that by freeing the procedure cache you can make those objects go away, allowing you to shrink your files.

Trying again:

DBCC FREEPROCCACHE

GO

USE [tempdb]

GO

DBCC SHRINKFILE (N'tempdev' , 5000)

GO

This time it worked:

 

DBCC execution completed. If DBCC printed error messages, contact your system administrator.

DbId   FileId      CurrentSize MinimumSize UsedPages   EstimatedPages
------ ----------- ----------- ----------- ----------- --------------
2      1           640000      640000      264         264

(1 row(s) affected)

DBCC execution completed. If DBCC printed error messages, contact your system administrator.

I think I got lucky that the shrink worked on the first try. There will certainly be times when you have to try freeing the procedure cache and shrinking multiple times to get a file to shrink, but eventually it will get the job done.
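
If you end up in that situation, a rough sketch of the retry loop looks like this (the 5000MB target, the tempdev file name, and the retry limit are just examples; adjust them for your server):

USE [tempdb]
GO
DECLARE @attempts int
SET @attempts = 0

--Keep freeing the procedure cache and shrinking until the file reaches the target
--size or we give up. size is stored in 8KB pages, so size / 128 gives MB.
WHILE @attempts < 5
      AND EXISTS (SELECT * FROM tempdb.sys.database_files
                  WHERE name = N'tempdev' AND size / 128 > 5000)
 BEGIN
    DBCC FREEPROCCACHE
    DBCC SHRINKFILE (N'tempdev' , 5000)
    SET @attempts = @attempts + 1
 END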