Monday, December 29, 2008

Interview Question - DB Mirroring vs Clustering


Database mirroring and clustering can both serve the purpose of high availability and data protection, but they operate at different levels. Database mirroring works at the database level only, while clustering works at the server (instance) level, so all databases on the instance are covered. With clustering there is a single copy of your data and log files, whereas with mirroring you have copies of the files on both the principal and the mirror server, which improves your chances of recovery in case of disaster. You cannot mirror the system databases (master, msdb, etc.), and for mirroring to work the database recovery model must be set to FULL. I would describe mirroring as a sophisticated log shipping mechanism with automatic failover capability.
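As noted above, the FULL recovery model is a prerequisite for mirroring. A minimal sketch of the check and the change (the database name is hypothetical):

```sql
-- Check the current recovery model (a SIMPLE-recovery database cannot be mirrored)
SELECT name, DATABASEPROPERTYEX(name, 'Recovery') AS RecoveryModel
FROM master.dbo.sysdatabases
WHERE name = 'MyAppDB'

-- Switch the database to FULL recovery before configuring mirroring
ALTER DATABASE MyAppDB SET RECOVERY FULL
```

Remember that after switching to FULL you must take a full backup before log backups (and therefore mirroring) can begin.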


Wednesday, December 17, 2008

SQL server - version question

Most people who deal with SQL Server know how to get the current SQL Server version (the build number, in other words). Here is the command once again:

select @@version

The output would look something like,

----------------------------------------------------------------- ---------------------------------------------------------------
2008-11-23 19:13:25.29 Server Microsoft SQL Server 2008 (RTM) - 10.0.1600.22 (Intel X86) Jul 9 2008 14:43:34 Copyright (c) 1988-2008 Microsoft Corporation Developer Edition on Windows NT 5.1 (Build 2600: Service Pack 2)

Now, looking at the build number "10.0.1600.22" alone you really cannot make out what patch level or service pack level your SQL Server instance is at. So, here is a handy web page by SQL Security.

It lists all the version/build numbers with the respective patch number alongside. In this case "10.0.1600.22" refers to SQL Server version 10, i.e. 2008 RTM. Once you apply a service pack this number will change to something else, and the new number is usually listed either in the KB article for the patch or on this SQL Security website.
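You can also ask the instance directly; SERVERPROPERTY reports the build and the patch level in one query, so no lookup table is needed:

```sql
-- ProductLevel returns RTM, SP1, SP2, etc. alongside the raw build number
SELECT SERVERPROPERTY('ProductVersion') AS BuildNumber,
       SERVERPROPERTY('ProductLevel')   AS PatchLevel,
       SERVERPROPERTY('Edition')        AS Edition
```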

This should help you figure out the patch level of your SQL Server instance without asking anyone.

How to connect to a SQL Server instance?

This article is for those who are new to SQL Server or new to the Windows platform.

Quite a lot of people go to Google and search for the keywords "SQL Server" + FREE. Top of the listed results is Microsoft SQL Server Express Edition. Yes, SQL Server Express Edition is free to use, but it comes with certain limitations; please read the documentation before you start using it. Curious to learn SQL Server quickly, most of us download and install it. The next question is how to connect and start writing and testing some queries, as it does not come with the standard client connectivity GUI tools.

Unlike most other RDBMS products (Oracle, MySQL, Sybase, etc.), SQL Server has two methods of user authentication. The two authentication methods are:

Windows Authentication and

SQL Server Authentication

In the case of "Windows Authentication" your connection request is authenticated against the Windows operating system. If you are using SQL Server Management Studio or any other GUI tool, and you are the local admin of your system (which most of us are), you just need to make the following selections on the connection screen: the SQL Server name, and Authentication = "Windows Authentication" (or integrated authentication in some cases). No password is needed anywhere.

If you are using OSQL or SQLCMD, the command line client utilities for managing SQL Server, which are installed along with SQL Server (any edition), you just need to type the following command at the command prompt:


osql -E -S computer_name\instance_name

Here computer_name is your desktop/PC name. There is no need to key in any password, because you have specified the -E option, i.e. use Windows authentication. If you (the logged in user) are a valid Windows user with local admin privileges, you will connect to the SQL Server instance successfully.

In the case of "SQL Server Authentication" your connection request is authenticated by SQL Server itself. Here you need to pass one more value, a password, along with the user name, and you also need to select "SQL Server Authentication" if your GUI tool offers it. In this case the OSQL command would be:

osql -S computer_name\instance_name -U <SQL_userId> -P <respective password>
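For SQL Server authentication to work, a SQL login has to exist on the instance, and the instance must allow mixed-mode authentication. A minimal sketch, with a hypothetical login name, password, and database:

```sql
-- Create a SQL login (SQL 2005+ syntax; on SQL 2000 use sp_addlogin instead)
CREATE LOGIN test_user WITH PASSWORD = 'Str0ngP@ss!'

-- Map the login to a user in a database so it can actually query something
USE MyAppDB
CREATE USER test_user FOR LOGIN test_user
```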


Saturday, August 9, 2008

NT Event log analysis (for DBA's)

Have you ever looked "deep" into your NT event logs? Most of us DBAs will say no. Here I describe a simple and easy way to do that trick. Using this trick you are going to generate a report of the types of errors your server is logging, and you will wonder what all this junk on your server is.

We are going to use SQL Server to do this, as it is our strength!

First pick the box to analyze and export its NT event logs in CSV format. Why CSV? Because we are going to import them into SQL Server and do a "deep" analysis of the issues going on on your server (sort of a health report). Here are the steps for your convenience:

Open Event viewer
Right click on the "Application Log"
Choose "Save Log File As" option from the pop up menu
Save dialogue will appear
Type the filename and select file type as "CSV (Comma delimited) (*.csv)"

Simple, isn't it? (This is not all that I wanted to share in this article.) Similarly, you can save the other logs too if you are interested.

OK, once we have the NT event logs saved in CSV format, our next step is to load them into a SQL Server table so that we can query, slice, and dice the data in the log(s). We are going to import the CSV file into SQL Server using a linked server. Why a linked server? Because it's easy :). Here is the simple command to create one:

EXEC sp_addlinkedserver @server = N'EventLogs', @srvproduct = N'Jet 4.0',
@provider = N'Microsoft.Jet.OLEDB.4.0', @datasrc = N'C:\MyEventLogs', @provstr = N'Text'

"C:\MyEventLogs" is the folder where my all the eventlogs are stored, "EventLogs" is my linked server name. Refer to SQL Books online for specifics of sp_addlinkedserver command, if you are interested.

After linked server is created, run the following command,

EXEC sp_tables_ex EventLogs

This will display a recordset, and if you look at the column "TABLE_NAME" you will see that all the CSV files under the folder "C:\MyEventLogs\" are listed. Let's assume your NT event logs were saved to a file named "PefCounter.csv"; the file will be listed as "PefCounter#csv". Now let's query this "PefCounter#csv" table. Here is a simple query:

SELECT * FROM EventLogs...PefCounter#csv

Wasn't that a simple and neat idea!

Now, don't run your analytics directly against the file (the linked server); it will be slow. Instead, import it into a table. Below is the table structure that I use; the datatypes work for me almost all the time. Take a look:

IF OBJECT_ID('EventLog_BeforeQ2') IS NOT NULL
DROP TABLE [dbo].[EventLog_BeforeQ2]

CREATE TABLE [dbo].[EventLog_BeforeQ2](
[LogDate] [datetime] NULL,
[LogTime] [nvarchar](255) NULL,
[LogSource] [nvarchar](255) NULL,
[LogType] [nvarchar](255) NULL,
[LogCategory] varchar(5) NULL,
[LogEvent] [INT] NULL,
[LogUser] [nvarchar](255) NULL,
[LogComputer] [nvarchar](255) NULL,
[LogErrDesc] [ntext] NULL
)

INSERT INTO [dbo].[EventLog_BeforeQ2] SELECT * FROM EventLogs...PefCounter#csv

At this point your data is sitting in the EventLog_BeforeQ2 table. You can now run any query against this table to do whatever analysis you want, but my intention in writing this article was to look "deep" into the event logs. So, the first thing I do is:

SELECT DATEDIFF(DD, MIN(LogDate), MAX(logDate)) FROM EventLog_BeforeQ2

This will tell you how many days' worth of logs you have in your table. Try the query below for more details:

SELECT LogEvent, COUNT(LogEvent) [Count] FROM [EventLog_BeforeQ2]
WHERE LogType = 'Error'
GROUP BY LogEvent
ORDER BY COUNT(LogEvent) DESC

Below is the output that I see on my logs.

LogEvent Count
----------- -----------
6043 2813
4019 823
6013 528
1 381
318 303
17052 217
4045 122

You can also find out which specific applications/services are logging most of the errors using the following query:

SELECT LogSource, COUNT(LogEvent)[Count] FROM [EventLog_BeforeQ2]
WHERE LogType= 'Error'
GROUP BY LogSource
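Once you know the noisy sources, you can drill into a single one and pull the actual error text (the source name here is hypothetical):

```sql
-- Most recent errors from one source, with their descriptions
-- (LogErrDesc is ntext, so cast it before displaying)
SELECT TOP 20 LogDate, LogEvent, CAST(LogErrDesc AS VARCHAR(500)) AS ErrDesc
FROM [EventLog_BeforeQ2]
WHERE LogType = 'Error' AND LogSource = 'Service Control Manager'
ORDER BY LogDate DESC
```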

This is how I target improvements to my SQL Server maintenance, monitoring, and performance. I will leave the rest of the creative log analysis to the readers of this article.

Friday, May 9, 2008

DeFragmenting Heap Tables.

    Ever wanted to know how to deal with fragmentation on tables without any index/primary key, i.e. heap tables? Here is a script that one of my friends forwarded to me when I asked him this question.


            ALTER TABLE [dbo].[XYX_HeapTable] ADD [c] VARCHAR(1) NULL
            CREATE  CLUSTERED  INDEX [i] ON [dbo].[XYX_HeapTable] ([c]) WITH  FILLFACTOR = 100
            DROP INDEX [dbo].[XYX_HeapTable].[i]
            ALTER TABLE [dbo].[XYX_HeapTable] DROP COLUMN [c]

    Here XYX_HeapTable is the name of a table which is a heap (no primary key and no indexes on it). I wanted to test run this script to see its effectiveness, so I wrote down the following batch of scripts:

    USE Tempdb

    CREATE TABLE XYX_HeapTable (
            TrackId INTEGER,
            Fname VARCHAR(200),
            Sname VARCHAR(200)
    )

    DECLARE @iCount INT, @ObjectId INT

    SET @iCount = 0
    WHILE @iCount < 10000
    BEGIN
            INSERT INTO XYX_HeapTable VALUES (@iCount, '234234234234232432as34234232432dfasdfasdfasdfasdfasdfasdfasdfasdf', 'asdfasdfasdfasdfasdfasdfasdfasdfasdfasdfasdfasdfasdfasdf')
            SET @iCount = @iCount + 1
    END

    SELECT @ObjectId = OBJECT_ID('XYX_HeapTable')
    DBCC SHOWCONTIG (@ObjectId)   -- fragmentation before

            ALTER TABLE [dbo].[XYX_HeapTable] ADD [c] VARCHAR(1) NULL
            CREATE CLUSTERED INDEX [i] ON [dbo].[XYX_HeapTable] ([c]) WITH FILLFACTOR = 100
            DROP INDEX [dbo].[XYX_HeapTable].[i]
            ALTER TABLE [dbo].[XYX_HeapTable] DROP COLUMN [c]

    SELECT @ObjectId = OBJECT_ID('XYX_HeapTable')
    DBCC SHOWCONTIG (@ObjectId)   -- fragmentation after

    DROP TABLE XYX_HeapTable

    Conclusion: I saw good improvement in ScanDensity and ExtentFragmentation after running the above script, so it works.
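As a side note, if you are on SQL Server 2008 or later, a heap can be rebuilt directly, without the temporary clustered index trick:

```sql
-- SQL Server 2008+: rebuild the heap in one statement
ALTER TABLE [dbo].[XYX_HeapTable] REBUILD
```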

CSV or TXT file as a database !

Do you have a large CSV (or delimited TXT) file which you cannot open in an Excel sheet because of Excel's 65K row limit? Because you cannot load the file, you cannot do any data analysis on it. Here is what I did.
As a SQL Server DBA/developer I am assuming you have SQL Server installed on your PC, or at least that an instance is available to you with sysadmin privileges on it. Connect to the instance using Query Analyzer and create a linked server using the following command:

EXEC master.dbo.sp_addlinkedserver @server = N'txtsrv', @srvproduct=N'Jet 4.0', @provider=N'Microsoft.Jet.OLEDB.4.0', @datasrc=N'C:\My_CSV_Files\JetDatabase', @provstr=N'Text'

Let's look at each of the parameters in the above command. @server is the linked server name, txtsrv, i.e. "use text files as a database"; @srvproduct and @provider say to use Jet 4.0 as the provider; and @datasrc is the data source, which in this case is the folder path where the text/CSV files I want to access or query are located. Once you add the linked server successfully, try the following command:

EXEC sp_tables_ex txtsrv

The output will show you the list of table(s) available for querying. These tables are nothing but the files located in the folder "C:\My_CSV_Files\JetDatabase". In my case the output shows the following tables: Server_A01#txt and Server_A02#txt. Now let's try querying the data in those files. Try the following command:

select TOP 10 * from txtsrv...Server_A01#txt

Wow, I see the output as if it were a real table served by the SQL Server engine. You can run all sorts of queries on this file data, just like you would on a real table. The performance is somewhat slow, but for ad hoc data analysis and processing this is just great.

Friday, May 2, 2008

Backing up MySQL Database

How is it done:

The Mingle database is hosted on MySQL. To back up this database we use the mysqldump utility, invoked through a VB script. Earlier we were using the "MySQL Administrator" utility, but it lacks some capabilities, and because of that we found that some tables were missing from the backup. After testing we found that mysqldump is good for our purpose.

Here is how the VB script looks:

on error resume next

Dim TodaysFileName, BatchFileTxt

Set WshShell = WScript.CreateObject("WScript.Shell")

TodaysFileName = DatePart("yyyy", Now) & Lpad(DatePart("m", Now), "0", 2) & Lpad(DatePart("d", Now), "0", 2) & _
"_" & Lpad(DatePart("h", Now), "0", 2) & Lpad(DatePart("n", Now), "0", 2) & _
Lpad(DatePart("s", Now), "0", 2)

BatchFileTxt = "mysqldump -uroot -p" & CryptVBS("#^rT4", "mingle") & " mingle > ""E:\Backups\Mysql Daily\Mingle_"& _
TodaysFileName &".sql"""

'*** Add delete old backup files using FORFILES cmd
BatchFileTxt = BatchFileTxt & vbcrlf & "forfiles /P ""E:\Backups\Mysql Daily"" /M *.SQL /D -3 /C ""CMD /c del @file"" "

'*** Add delete "myself" step to batch file
BatchFileTxt = BatchFileTxt & vbcrlf & "del backup.bat"

'*** create a batch file with all the step/commands
OpenTextFileForWrite BatchFileTxt, "backup.bat"

'*** invoke the batch file
WshShell.Run "backup.bat"

'*** Function to LEFT pad the number to create fixed length number string. ***
Function Lpad (MyValue, MyPadChar, MyPaddedLength)
Lpad = string(MyPaddedLength - Len(MyValue),MyPadChar) & MyValue
End Function

'*** Function to DECRYPT the string ***
Function CryptVBS(Text, Key)
KeyLen = Len(Key)
For i = 1 To Len(Text)
KeyPtr = (KeyPtr + 1) Mod KeyLen
sTxtChr = Mid(Text, i, 1)
wTxtChr = Asc(sTxtChr)
wKeyChr = Asc(Mid(Key, KeyPtr + 1, 1))
CryptKey = Chr(wTxtChr Xor wKeyChr)
hold = hold & CryptKey
Next
CryptVBS = hold
End Function

'*** Procedure to WRITE the output to a given file ***
Sub OpenTextFileForWrite (TxtToWrite, strFileName)

Const ForReading = 1, ForWriting = 2, ForAppending = 8
Dim fso, f
Set fso = CreateObject("Scripting.FileSystemObject")
Set f = fso.OpenTextFile( strFileName, ForWriting, True)
f.Write TxtToWrite & VBCRLF

End Sub

This is how the output batch file would look. At the end it deletes itself!

mysqldump -uroot -p****** mingle > "E:\Backups\Mysql Daily\Mingle_20080508_111407.sql"
forfiles /P "E:\Backups\Mysql Daily" /M *.SQL /D -3 /C "CMD /c del @file"
del backup.bat

Currently the VB script Mingle_Bkp.vbs is located in the WINNT system folder on the C:\ drive, and that is where it creates backup.bat and triggers it from.

The backup file retention period is set to 3 days; any files older than 3 days get deleted. We do this using the *FORFILES* utility. forfiles.exe is part of the Windows OS and is available on all Windows systems.

The script lines were not completely written by me; I picked them up from different places and modified them to suit my requirements, but you are free to use them as you like.

Thursday, April 10, 2008

Default database file path

I keep seeing this question in many forums. When you fire the "CREATE DATABASE <database name>" T-SQL command without any parameters, the database gets created and its files are placed in some default path, let's say "C:\Program Files\MS SQL\Data". But if you want the files located elsewhere, let's say "D:\SQL_Data\", then there are two options:

1 - specify the file path in the "CREATE DATABASE <database name>" command, or
2 - set the default path location in the server configuration.
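Option 1 looks roughly like this (database name and file paths here are hypothetical):

```sql
CREATE DATABASE SalesDB
ON PRIMARY (NAME = SalesDB_Data, FILENAME = 'D:\SQL_Data\SalesDB_Data.mdf')
LOG ON (NAME = SalesDB_Log, FILENAME = 'D:\SQL_Data\SalesDB_Log.ldf')
```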

For option 1 refer to BOL; here is how to work out option 2. Again, there are two ways to do this task. One is using the GUI (Management Studio/Enterprise Manager): right-click the server node, select Properties from the pop-up menu, and look for "Database Settings". In that section there are two boxes, one for the LOG and the other for the DATA file path. Key in the appropriate values and save. This change does NOT require a SQL Server restart.
The other way to achieve this configuration change is using T-SQL, as shown below:

USE [master]
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultData', REG_SZ, N'D:\SQL_Data'
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultLog', REG_SZ, N'D:\SQL_Data'

Yes, it is an instance-specific registry entry that sets this property. Here the command uses the extended stored procedure "xp_instance_regwrite" to do the trick. This kind of command is useful where you want to standardize your SQL Server setup; you may want to add this step to a post-installation procedure/batch file, etc.

Update 12:18 PM 3/26/2009

A couple of people have asked me how to find the database file paths, i.e. basically: how do I know which database is linked to which file? Here is the query:

select * from master.dbo.sysaltfiles --where db_name(dbid) = 'x_db_name'

The above query will display the list of all the database files located on that server. But if you want to be specific to one database, then use:

select * from x_db_name.dbo.sysfiles

To understand more about interpreting the results (the meaning of each column displayed) you may want to check the documentation in "Books Online", aka BOL!

Monday, February 25, 2008

Backup Monitoring Across Multiple Servers

Here is a simple backup verification solution. By "backup verification" I am basically looking to answer the following question(s):

1 > When was the last FULL backup done?
2 > When was the last T.LOG backup done?

This solution is very handy if you are monitoring a lot of SQL Servers in your environment. The solution is split into 3 files, as follows:

1 > Batch file
2 > SQL Query file
3 > Server list file

What is in Batch file?

In this batch file I am just calling OSQL with the required parameters. The DOS "FOR" loop is fantastic; I found it on the web (thanks to the author).

for /f "tokens=1,2,3" %%i in (SrvList.txt) do OSQL.exe -E -S"%%i" -q -i"BkpDetails.sql" -n -w 2000 >> %random%log.txt

BkpDetails.sql - T-SQL Query

PRINT ' DB Backup details for server ... ['+@@ServerName+'] as on '+ CONVERT(VARCHAR(30), GETDATE())
SELECT A.NAME [DB_Name], MAX(backup_finish_date) [Bkp Date], MAX(type) [Bkp Type] INTO #Tmp1
FROM master..sysdatabases A
LEFT JOIN msdb..backupset ON A.NAME = database_name AND type = 'L'
GROUP BY A.NAME

SELECT A.NAME [DB_Name], MAX(backup_finish_date) [Bkp Date], MAX(type) [Bkp Type] INTO #Tmp2
FROM master..sysdatabases A
LEFT JOIN msdb..backupset ON A.NAME = database_name AND type = 'D'
GROUP BY A.NAME

SELECT LEFT(#Tmp2.[DB_Name], 30) [DB_Name], LEFT(#Tmp2.[Bkp Date], 30) AS [Last Full Bkp Done At],
LEFT(#Tmp1.[Bkp Date], 30) AS [Last T.Log Bkp Done At]
FROM #Tmp2
LEFT JOIN #Tmp1 ON #Tmp2.[DB_Name] = #Tmp1.[DB_Name]
ORDER BY #Tmp1.[Bkp Date] ASC

PRINT ' ------------------------------ ------------------------------ ------------------------------ '

What is in SrvList.txt ?

In this txt file I just list all the servers that I want to monitor, one server name per line. In this case I have repeated the same server name multiple times to show the sample output.


Sample Output..

DB Backup details for server ... [MSSQL_2K5] as on Feb 25 2008 9:37AM

Last Full Bkp Done At Last T.Log Bkp Done At
------------------------------ ------------------------------ ------------------------------
msdb Feb 25 2008 9:33AM NULL
master Feb 25 2008 9:33AM NULL
model Feb 25 2008 9:33AM Feb 25 2008 9:34AM
TestDB Feb 25 2008 9:33AM Feb 25 2008 9:34AM
TestDB_Prod Feb 25 2008 9:33AM Feb 25 2008 9:35AM
InboxDB Feb 25 2008 9:33AM Feb 25 2008 9:35AM

------------------------------ ------------------------------ ------------------------------

DB Backup details for server ... [MSSQL_2K5] as on Feb 25 2008 9:37AM

Last Full Bkp Done At Last T.Log Bkp Done At
------------------------------ ------------------------------ ------------------------------
msdb Feb 25 2008 9:33AM NULL
master Feb 25 2008 9:33AM NULL
model Feb 25 2008 9:33AM Feb 25 2008 9:34AM
TestDB Feb 25 2008 9:33AM Feb 25 2008 9:34AM
TestDB_Prod Feb 25 2008 9:33AM Feb 25 2008 9:35AM
InboxDB Feb 25 2008 9:33AM Feb 25 2008 9:35AM

------------------------------ ------------------------------ ------------------------------

DB Backup details for server ... [MSSQL_2K5] as on Feb 25 2008 9:37AM

Last Full Bkp Done At Last T.Log Bkp Done At
------------------------------ ------------------------------ ------------------------------
msdb Feb 25 2008 9:33AM NULL
master Feb 25 2008 9:33AM NULL
model Feb 25 2008 9:33AM Feb 25 2008 9:34AM
TestDB Feb 25 2008 9:33AM Feb 25 2008 9:34AM
TestDB_Prod Feb 25 2008 9:33AM Feb 25 2008 9:35AM
InboxDB Feb 25 2008 9:33AM Feb 25 2008 9:35AM

------------------------------ ------------------------------ ------------------------------

Friday, February 22, 2008

SQL Express for Absolute Beginners!

Lesson 1: What is a database?
This introduction covers the questions and terminology SQL Server beginners need to create and manage databases, such as: What is a SQL Server database? How is a SQL Server database different from an Excel spreadsheet? What do SQL Server databases do in my Web site or software applications? What's SQL Server Express' relation to Visual Studio Express? What do terms like "ADO.NET" mean? Download the Video

Lesson 2: Understanding Database Tables and Records
SQL Server beginners will learn about tables and definitions of data types, properties, keys, etc. in this second video. Find out how table rows, columns, and fields interrelate and whether columns can be empty. Download the Video

Lesson 3: More about Column Data Types and Other Properties
SQL Server beginners will explore data integrity, learn which data types to use for what, and avoid common data type pitfalls. Download the Video

Lesson 4: Designing Relational Database Tables
Why are relational databases the most popular? SQL Server beginners will learn to differentiate between primary and foreign keys, design relational database tables, and learn about normalization rules and de-normalization. Download the Video

Lesson 5: Manipulating Database Data
SQL Server beginners will learn about ADO.NET objects and the execution of SQL statements from ADO.NET. Explore SQL and learn about insert, update, and delete statements. Download the Video and Project

Lesson 6: More Structured Query Language
SQL Server beginners will discover more SQL statements, like sub-selects, and learn about stored procedures and their counterparts in C# and VB methods. Download the Video


Monday, January 28, 2008

Notification Service Deployment :: Timeout expired

    While working on one of the Notification Services deployments I was getting the following error message:

        An attempt to run a Transact-SQL CREATE PROCEDURE statement failed.
        SqlCommandText: CREATE PROCEDURE dbo.<procedure name>

        Source: .Net SqlClient Data Provider
        Number: -2
        State: 0
        Class: 11
        Server: myServer
        Message: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

        Line Number: 0

        Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

    A timeout error for just creating a stored procedure? I checked the stored procedure; its T-SQL was about 200 lines long, which should not matter anyway. However, I went ahead and ran the "CREATE PROCEDURE" code in Query Analyzer. It got created fine, in a split second.

    I did some searching on the web for "Message: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." A lot of forums were discussing increasing timeouts (command, connection, everything), and some were counter-arguing: why increase timeout parameters, it's like "passing the buck to the next developer" :), which is also true. Anyway, in my case I didn't have a way to change any parameters, as NSControl.exe (the tool which runs all the T-SQL scripts) is provided by MS and I am not aware of any way to set them. So we just gave rebooting the server a shot, and it worked.

    I did further research and found that this is a known issue with NSControl.exe; here is the link. Per this article, the "problem occurs if the application databases for the instance of Notification Services are large", but in my case the databases are very small, around 50 subscriptions and 50 members in the list (of course this was a TEST environment). Anyway, I have decided to apply the patch in the hope that this issue will not occur again. If it does, I will use my technique: "just reboot".

Saturday, January 26, 2008

Caution about Temporary Tables !

    If you run the following statement, it will create a temporary table with a primary key constraint named "Pk_Col":

        CREATE TABLE #Tmp (
        col_01 INT CONSTRAINT Pk_Col PRIMARY KEY
        )

    Now I open another connection using the same username and run the same statement. You would expect it to run successfully; however, it fails with the following error:

        Msg 2714, Level 16, State 4, Line 1
        There is already an object named 'Pk_Col' in the database.
        Msg 1750, Level 16, State 0, Line 1
        Could not create constraint. See previous errors.

    From the error message it is very clear that there is an object name conflict, not on #Tmp but on Pk_Col. Unlike #Tmp, the name Pk_Col is not treated as a temporary object.

    So, to avoid this issue, and in the spirit of "good practice", don't give a name to the constraint. If you don't name a constraint, SQL Server automatically gives it an arbitrary name like Pk_Col_A2340, which will never conflict no matter how often you rerun the command. So the T-SQL should look like:

        CREATE TABLE #Tmp (
        col_01 INT PRIMARY KEY
        )

    Note that I have not given a name, so the primary key was named PK__#Tmp______________1CF15040.

    Actually, this issue occurred a bit differently in my case. I was troubleshooting some T-SQL which was part of a scheduled job. While troubleshooting, I ran the statements and everything looked fine. So I went to Management Studio, opened the job, pasted the modified T-SQL, and executed the job. The job failed! I looked at the job history and found it was failing due to the same "object name conflict" error as above. At first I was surprised, because the two connections were separate and the primary key was on a temporary table, but suddenly I realized the issue.

    So, the best option is to never give names to constraints on temporary (#) tables. This behaviour is seen in both SQL 2000 and 2005.

Thursday, January 24, 2008

Not enough server storage is available to process this command


    In one of our deployment procedures we detach the original databases and attach them as bak_<Original Name>. While doing this we have to rename the files so that we avoid file name conflicts while creating the new instance of the databases. To do the rename we use \\<ServerName>\I$ as the access path, where I$ is the administrative share. But for some reason this share was not accessible, and an on-screen error message appeared as follows:

      "Not enough server storage is available to process this command"

    And in the system event log the following error message was logged:

      "The server's configuration parameter "irpstacksize" is too small for the server to use a local device. Please increase the value of this parameter".


    As the on-screen error message said "Not enough server storage is available to process this command", we tried rebooting the system once, as the article suggested there could be a memory issue, but that did not resolve it. We then contacted the NT admin team to troubleshoot the issue. The update I got from that team was that they followed the steps given in the article, but they also don't know the root cause of this behaviour. After the change mentioned in the article was implemented, the $ share path was accessible as usual.

    Interim solution steps (this was creative me):

    While the NT admin team was working on this, we decided to run our deployment procedure manually. We were not sure how to go about it, because our deployment procedure is made up of a lot of DOS and SQL scripts, and some of these scripts are created dynamically at run time and executed. Running the steps one by one manually would have been very tedious. Instantly we got an idea: introduce a PAUSE in one of the CMD files, just before the line where it was erroring out. When we triggered the process, it ran until that pause and waited for a key press. At that point we opened the dynamically created files and modified the commands in one of them to use I:\ instead of I$. Then we went back to the process window and hit Enter to continue. We were able to run the deploy process successfully. This was just an interim workaround for that day only.

Thursday, January 17, 2008

SQL 2008 - Backup Compression feature

    All these days we spent time and resources on identifying the best SQL backup utility with a compression mechanism. We heard that MS-IT uses SQL LiteSpeed for its operations, and most people out there bought LiteSpeed; those who couldn't afford it settled for other software like Red Gate etc. We then put time into implementing it, with custom code to fire backup commands and to clean up old backup files, as these tools are not tightly integrated with SQL maintenance plans.

    In SQL 2008 MS has introduced "compressed backups". Yes, all the energy we talked about in the above paragraph is wasted; or instead of saying wasted, assume that you gained some knowledge of the workings of products like LiteSpeed and Red Gate. Note that creating compressed backups is available in Enterprise Edition only, although you can certainly restore compressed backups onto other editions like Standard and Developer.

    Now, let's look at what this new feature in SQL 2008 actually does. It's the same compression technique: basically it compresses the data in memory and writes it to the file in that format, which reduces the disk space used, plus the time taken to complete the backup, because the disk I/O is lower. I don't see an encryption feature here, which is usually available with the third-party tools. Also, there are a couple of restrictions on compressed backups:

    • Creating compressed backups is only available in SQL 2008 Enterprise Edition
    • Compressed and uncompressed backups cannot be in the same media set (SQL 2008 Enterprise Edition will understand the media set, but older versions of SQL Server will fail to read it)
    • NTBackup cannot share a tape with compressed SQL Server backups (I am not sure why; maybe because of the above reason)

    So how do we enable this feature on SQL 2008? This is done via the sp_configure system stored procedure. By default, when an instance is installed this value is set to OFF (0), so to enable it you need to run:

      USE master
      EXEC sp_configure 'backup compression default', '1'
      RECONFIGURE WITH OVERRIDE

    Also, here is the command you can use if you don't want to enable compression at the server level, but want it only for a specific backup command:

              TO DISK='Z:\SQLServerBackups\AdvWorksData.bak'

    After reading all this I was curious: is there any change to the backupset system table in msdb? Yes, there is a new column in backupset named "compressed_backup_size". The value in this column tells you the byte count of the (compressed) file stored on disk. If this value is NULL, then that backup set is NOT compressed.
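A quick way to see this column in action is to query recent backup history yourself (a sketch against msdb; the TOP clause and ordering are just my choices):

```sql
-- Recent backups, showing whether each one was compressed
SELECT TOP (5)
       database_name,
       backup_finish_date,
       backup_size,              -- logical (uncompressed) byte count
       compressed_backup_size    -- bytes written to disk; NULL = not compressed
FROM msdb.dbo.backupset
ORDER BY backup_finish_date DESC;
```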

    Performance-wise, compression certainly demands extra CPU cycles. It's good that MS has provided two more counters to help us deal with the performance of compression: the Device Throughput Bytes/sec counter (in the SQLServer:Backup Device object) and the Backup/Restore Throughput/sec counter (in the SQLServer:Databases object). With these two, I think we should be able to deal with performance issues related to backup compression.

    Compression Ratio - This depends on a couple of factors. If the data contains a lot of repetition, you can achieve a high level of compression. As you might know, binary data does not compress well, and similarly in encrypted data character repetition is rare, so in both cases you get lower compression ratios.
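Since both the logical size and the on-disk size land in msdb.dbo.backupset, you can compute the ratio you actually achieved with a query along these lines (a sketch; the 1.0 multiplier just forces decimal division):

```sql
-- Achieved compression ratio per compressed backup
SELECT database_name,
       backup_finish_date,
       backup_size * 1.0 / compressed_backup_size AS compression_ratio
FROM msdb.dbo.backupset
WHERE compressed_backup_size IS NOT NULL;
```

A ratio of 4.0, for example, means the backup on disk is a quarter of the logical data size.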

Tuesday, January 8, 2008

Capturing PSSDIAG/SQLDump, Right In Time

    Usually when you contact Microsoft PSS about a SQL Server issue, they will ask you to run PSSDIAG if it is a performance issue, or to capture memory dumps if it is a SQL Server hang/restart issue. If the issue is known to occur at a certain time, or if you can reproduce it, that's fine: you can trigger PSSDIAG or SQLDump well in time. But usually that is not the case. The issues are totally random and unpredictable, and in that kind of situation, as a DBA, you face one problem: how do you run PSSDIAG or SQLDump right at the time the issue occurs?

    Understand the symptoms: If you know the symptoms - like heavy blocking, slowdown, or unresponsiveness - you can write a VBScript, or a combination of DOS batch and OSQL, to detect them. Let's take unresponsiveness as an example. In this case I would write a VBScript to ping the SQL Server; if the connection time goes beyond 3-4 seconds, it triggers PSSDIAG or SQLDump, whichever the MS support engineer has asked me to capture. This totally automates the process, and no human intervention is required.
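The same watchdog idea can be sketched in a few lines of modern Python instead of VBScript (the host name, port, threshold, and trigger command below are all placeholders, not anything from the original case):

```python
import socket
import subprocess
import time

def sql_responsive(host, port=1433, timeout=4.0, connect=socket.create_connection):
    """Return True if a TCP connection to the SQL Server port opens within `timeout` seconds."""
    try:
        conn = connect((host, port), timeout)
        conn.close()
        return True
    except OSError:
        return False

def watch(host, trigger_cmd, interval=15, connect=socket.create_connection):
    """Poll the server; when it stops answering, fire the diagnostic command once."""
    while sql_responsive(host, connect=connect):
        time.sleep(interval)
    # Server looks hung or unreachable -- kick off PSSDIAG / SQLDumper here.
    subprocess.call(trigger_cmd, shell=True)
```

A real check would go further (run a trivial query via OSQL/sqlcmd rather than just opening the port), but the shape - poll, detect the symptom, trigger the capture - is the same.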

    In one of my cases there was an issue where SQL Server used to restart automatically. The behaviour was very random; we never knew when the next restart would occur. After initial investigation we found that the MS Clustering service was causing SQL Server to restart. The way the MS Clustering service works is: if a resource (in this case the SQL service) is not responding (an IsAlive check fails), it will attempt to restart it. Once the restart attempts exceed the configured threshold, the Clustering service attempts a failover, which means the SQL service stops and starts on the other node - hence the restart events I used to see. So I whipped up a simple VBScript to detect the SQL Server's situation and trigger the memory dump. Along with this, I also changed the threshold timer and the number of attempts in cluadmin so that the Clustering service would not fail over before my script did its job.

    Note: Please evaluate the pros/cons and risks before making any change to cluster configurations.

    So this is how I succeeded in capturing the SQLDump right in time and sending it to PSS for analysis. We found that SQL Server was running out of worker threads, due to which new application connections were failing. Even the Clustering service was unable to perform its IsAlive checks and interpreted this as a service failure.

    In a similar way to what I explained above, you can trigger PSSDIAG or any other utility of your choice when required.

Saturday, January 5, 2008

Who is in charge?

All the organs of the body were having a meeting, trying to decide who was the one in charge.

'I should be in charge,' said the brain, 'Because I run all the body's systems, so without me nothing would happen.'

'I should be in charge,' said the blood, 'Because I circulate oxygen all over, so without me you'd all waste away.'

'I should be in charge,' said the stomach, 'Because I process food and give all of you energy.'

'I should be in charge,' said the legs, 'Because I carry the body wherever it needs to go.'

'I should be in charge,' said the eyes, 'Because I allow the body to see where it goes.'

'I should be in charge,' said the rectum, 'Because I'm responsible for waste removal.' All the other body parts laughed at the rectum and insulted him, so in a huff, he shut down tight.

Within a few days, the brain had a terrible headache, the stomach was bloated, the legs got wobbly, the eyes got watery, and the blood was toxic. They all decided that the rectum should be the boss. The moral of the story? Even though the others do all the work...

The asshole is usually in charge .

Wednesday, January 2, 2008

SQL Server 2008 - New Security Features

Transparent Data Encryption: What this actually means is that in earlier versions of SQL Server, anybody could restore or attach a database and use its data - assuming, of course, an unauthorized person got hold of the physical backup/data files. SQL 2008 enables encryption of an entire database, data files and log files, without the need for application changes, and you can still search the encrypted data using both range and fuzzy searches while keeping it secure from unauthorized users (developers must be happy). In this scenario, an unauthorized person needs the master key to get access to the data, not just a copy of the physical backup/data file. Once encryption is turned on, the data is secured. This is not available in the current CTP; it should be available to explore soon. I am expecting some kind of performance degradation when accessing encrypted databases, as one extra layer of data transformation is involved, but I am not sure.
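For a feel of the DDL involved, enabling TDE ends up looking roughly like this sketch (the certificate name, password, and database name are placeholders):

```sql
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<UseAStrongPasswordHere>';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO

USE MyDatabase;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_128
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
GO

ALTER DATABASE MyDatabase SET ENCRYPTION ON;
```

Back up the certificate and its private key right away - without them, the encrypted backups and data files are unrestorable.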

Extensible Key Management: This feature existed in SQL Server 2005 but was a kind of basic solution for encryption and key management. SQL Server 2008 also supports third-party key management and HSM (Hardware Security Module) products, and they say HSMs are much more secure. An HSM is a hardware-based security device that generates, stores and protects cryptographic keys, and provides the foundation for a high-level secure certification authority. Certification modules are also available in software, but a hardware device provides a higher level of security.

Auditing: I have been waiting for a feature like this since SQL 2005, so I am very happy to see it. You can create and manage auditing via DDL, which simplifies compliance by providing more comprehensive data auditing. This enables organizations to answer "basic" questions such as "What data was retrieved?".
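The DDL shape looks roughly like this sketch - a server audit writing to a file, plus a database audit specification that records SELECTs on one table (audit name, path, table and principal are all placeholders):

```sql
USE master;
CREATE SERVER AUDIT MyAudit TO FILE (FILEPATH = 'Z:\Audit\');
ALTER SERVER AUDIT MyAudit WITH (STATE = ON);
GO

USE MyDatabase;
CREATE DATABASE AUDIT SPECIFICATION MyAuditSpec
    FOR SERVER AUDIT MyAudit
    ADD (SELECT ON OBJECT::dbo.Customers BY public)
    WITH (STATE = ON);
```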

All these great features above are available through your favourite T-SQL commands - isn't it great?

Why use object type identifier ?

The following is a typical T-SQL statement found in most database object deployment scripts. Let me describe the script a little: it finds the object "[dbo].[fnHasMarketingOptions]" of type FN (scalar function), IF (inline table-valued function) or TF (table-valued function), and if found, drops that function.

    IF EXISTS (SELECT * FROM dbo.sysobjects WHERE id = OBJECT_ID(N'[dbo].[fnHasMarketingOptions]') and xtype in (N'FN', N'IF', N'TF'))

    DROP FUNCTION [dbo].[fnHasMarketingOptions]

I don't understand why the "and xtype in (N'FN', N'IF', N'TF')" part is used, because you can't create a TABLE and a PROCEDURE/FUNCTION with the same name in a database anyway. Below are T-SQL statements illustrating this:

    CREATE TABLE Test (col1 varchar(30))
    GO

    CREATE PROC test
    AS
        SELECT 1
    GO

Here is the error message….

    Msg 2714, Level 16, State 5, Procedure test, Line 4
    There is already an object named 'test' in the database.

If two object types (TABLE/PROCEDURE/TRIGGER) cannot be created with the same name, then why use "and xtype in (N'FN', N'IF', N'TF')"?
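As an aside, OBJECT_ID accepts an object type as an optional second argument, so a single-type existence check can be written more compactly (a sketch; note the two-argument form takes only one type code, so checking FN, IF and TF together would still need the IN list against the catalog):

```sql
-- Drop the scalar function only if it exists as type FN
IF OBJECT_ID(N'[dbo].[fnHasMarketingOptions]', N'FN') IS NOT NULL
    DROP FUNCTION [dbo].[fnHasMarketingOptions];
```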

About Me

By profession I am a Database Administrator (DBA) with a total of 13 years of experience in the field of information technology - 9 years as a SQL DBA and the last 3 years on IBM System i/iSeries and DB2 LUW 9. I have handled both developer and production support roles, and I like them both. I love and live information technology, hence the name "Techonologyyogi". Apart from that I am a small retail investor, with small investments in India and the United States in the form of equity holdings via common stocks. Don't ask me if I have made money - I have been losing money in stocks.