
A VLF Example

There are plenty of references that discuss how the number of virtual log files (VLFs) in a database can have a negative impact on performance. Recently I ran into a real-world example of this. We were setting up a log shipping destination for an application database, something I have done countless times in my career. In this case, the database was about a terabyte. However, after setting up the destination with a restore of the production database, the transaction log restores immediately fell behind. The transaction log backups ran every 15 minutes, but the backup/copy/restore cycle was taking far longer than that. After ruling out several factors (disk IO, network contention, etc.), we discovered that the transaction log had over 32,000 VLFs. Following the recommendations of Kimberly Tripp at sqlskills.com, we shrank and re-grew the transaction log and ended up with around 70 VLFs. The backup/copy/restore cycle now takes only a few minutes, and our log shipping destination is fulfilling its purpose.

Kimberly Tripp: http://www.sqlskills.com/blogs/kimberly/8-steps-to-better-transaction-log-throughput/
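The shrink-and-regrow fix itself is only a few statements. The sketch below shows the general shape; the database name, log file name, paths, and target size are all placeholders, and you would take a log backup first so the inactive portion of the log can actually be shrunk:

```sql
-- Illustrative sketch of shrinking and re-growing a transaction log.
-- Database name, file names, paths, and sizes are placeholders.
USE [AppDB];
GO
-- Back up the log so inactive VLFs can be cleared
BACKUP LOG [AppDB] TO DISK = 'C:\SQL\Backups\AppDB_log.trn';
-- Shrink the log file as small as possible
DBCC SHRINKFILE (AppDB_log, 1);
-- Re-grow it to the target size in one (or a few) large increments,
-- which creates far fewer, larger VLFs than many small autogrowths
ALTER DATABASE [AppDB] MODIFY FILE (NAME = AppDB_log, SIZE = 8000MB);
```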

Transaction Log VLFs – too many or too few?

As an example of how too many VLFs can negatively impact performance, I created the following demonstration. First, I created a database with some questionable settings: specifically, a 1 MB transaction log set to auto-grow 1 MB at a time. I back the database up immediately so that the full recovery model takes effect and the log chain is established.

--Script 1: Create database
CREATE DATABASE [VLFTest] ON PRIMARY (
    NAME = N'VLFTest'
    ,FILENAME = N'C:\SQL\UserDBData\VLFTest.mdf'
    ,SIZE = 1700MB
    ,FILEGROWTH = 102400KB
) LOG ON (
    NAME = N'VLFTest_log'
    ,FILENAME = N'C:\SQL\UserDBLog\VLFTest_log.ldf'
    ,SIZE = 1024KB
    ,FILEGROWTH = 1024KB
)
GO

ALTER DATABASE VLFTest SET RECOVERY FULL

BACKUP DATABASE VLFTest TO DISK = 'C:\SQL\Backups\VLFTest.bak'

Now that I have this database, I will add a table to fill up.

--Script 2: Create table
USE VLFTest
GO
CREATE TABLE filler_up (
    filler_id INT IDENTITY(1, 1)
    ,fat_guid UNIQUEIDENTIFIER
    ,fat_text NVARCHAR(2000)
)

Now I run a loop to stuff this thing full. As a Bruce Springsteen fan, I decided to use the lyrics to The River to fill the nvarchar column.

--Script 3: Fill Table
SET NOCOUNT ON

DECLARE @the_river NVARCHAR(2000)
SET @the_river = 'I come from down in the valley
where mister when you''re young
They bring you up to do like your daddy done
Me and Mary we met in high school
when she was just seventeen
We''d ride out of this valley down to where the fields were green

We''d go down to the river
And into the river we''d dive
Oh down to the river we''d ride

Then I got Mary pregnant
and man that was all she wrote
And for my nineteenth birthday I got a union card and a wedding coat
We went down to the courthouse
and the judge put it all to rest
No wedding day smiles no walk down the aisle
No flowers no wedding dress

That night we went down to the river
And into the river we''d dive
Oh down to the river we did ride

I got a job working construction for the Johnstown Company
But lately there ain''t been much work on account of the economy
Now all them things that seemed so important
Well mister they vanished right into the air
Now I just act like I don''t remember
Mary acts like she don''t care

But I remember us riding in my brother''s car
Her body tan and wet down at the reservoir
At night on them banks I''d lie awake
And pull her close just to feel each breath she''d take
Now those memories come back to haunt me
they haunt me like a curse
Is a dream a lie if it don''t come true
Or is it something worse
that sends me down to the river
though I know the river is dry
That sends me down to the river tonight
Down to the river
my baby and I
Oh down to the river we ride'
DECLARE @counter INT = 1

WHILE @counter < 400000
BEGIN
    INSERT INTO filler_up (fat_guid, fat_text)
    VALUES (NEWID(), @the_river)

    SET @counter = @counter + 1
END

Loading this data took 4:52 on my machine. Now that the database has been bloated, we can run DBCC LOGINFO and see that we have, in my case, 1,722 virtual log files. Now, to emulate the issue I experienced, let's do a simple backup and restore.
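Counting the VLFs is a one-liner. DBCC LOGINFO returns one row per VLF in the current database; on SQL Server 2016 SP2 and later, the documented DMV sys.dm_db_log_info is the supported alternative:

```sql
-- One row per VLF in the current database
DBCC LOGINFO;

-- Supported alternative on SQL Server 2016 SP2 and later
SELECT COUNT(*) AS vlf_count
FROM sys.dm_db_log_info(DB_ID('VLFTest'));
```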

--Script 4: Backup and Restore
USE [master]
GO

BACKUP DATABASE VLFTest TO DISK = 'C:\SQL\Backups\VLFTest_manyVLFS.bak'
WITH STATS = 5, INIT

RESTORE DATABASE [VLFTest_restored]
FROM DISK = 'C:\SQL\Backups\VLFTest_manyVLFS.bak'
WITH FILE = 1
    ,MOVE N'VLFTest' TO N'C:\SQL\UserDBData\VLFTest_restored.mdf'
    ,MOVE N'VLFTest_log' TO N'C:\SQL\UserDBLog\VLFTest_restored_log.ldf'
    ,NOUNLOAD
    ,STATS = 5
GO

This backup and restore process took 3:41 on my laptop. Now, to see the difference a properly sized transaction log with far fewer VLFs can make, we will start over by dropping both databases and recreating the VLFTest database with a “correct” transaction log file. The only differences between this database creation script and the one above are the size and growth increment of the transaction log.

--Script 5: Create database with different log settings
DROP DATABASE VLFTest
DROP DATABASE VLFTest_restored

CREATE DATABASE [VLFTest] ON PRIMARY (
    NAME = N'VLFTest'
    ,FILENAME = N'C:\SQL\UserDBData\VLFTest.mdf'
    ,SIZE = 1700MB
    ,FILEGROWTH = 102400KB
) LOG ON (
    NAME = N'VLFTest_log'
    ,FILENAME = N'C:\SQL\UserDBLog\VLFTest_log.ldf'
    ,SIZE = 1600MB
    ,FILEGROWTH = 400MB
)
GO

ALTER DATABASE VLFTest SET RECOVERY FULL

BACKUP DATABASE VLFTest TO DISK = 'C:\SQL\Backups\VLFTest.bak'

Now, all I need to do is recreate the table and load the data using the same scripts as before (Scripts 2 and 3). This time the load took only 2:23, but that is because the transaction log did not have to autogrow, not because of the VLF count. Re-running DBCC LOGINFO shows only 24 VLFs now (under the classic VLF allocation rules, the initial 1600 MB file is carved into 16 VLFs, and a single 400 MB autogrowth adds 8 more). To finish the test, run the backup and restore process again (Script 4). This time the backup and restore took only 2:38, a significant improvement.
