T-SQL function to Get Maximum of values from the same row


Based on the ScottPletcher solution from http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/Q_24204894.html
--SELECT dbo.GetMax(23, 45, 64, 22, 18, 224, 74, 138, 1, 98, 11, 86, 198)
--Naturally adjust data type to match what you actually need for your specific values
I've created a set of functions (e.g. GetMaxOfDates3, GetMaxOfDates13) to find the max of up to 13 date values.
CREATE FUNCTION dbo.GetMaxOfDates13 (
	@value01 DateTime = NULL,
	@value02 DateTime = NULL,
	@value03 DateTime = NULL,
	@value04 DateTime = NULL,
	@value05 DateTime = NULL,
	@value06 DateTime = NULL,
	@value07 DateTime = NULL,
	@value08 DateTime = NULL,
	@value09 DateTime = NULL,
	@value10 DateTime = NULL,
	@value11 DateTime = NULL,
	@value12 DateTime = NULL,
	@value13 DateTime = NULL
) RETURNS DateTime AS BEGIN
	RETURN (
		SELECT TOP 1 value
		FROM (
			SELECT @value01 AS value UNION ALL
			SELECT @value02 UNION ALL
			SELECT @value03 UNION ALL
			SELECT @value04 UNION ALL
			SELECT @value05 UNION ALL
			SELECT @value06 UNION ALL
			SELECT @value07 UNION ALL
			SELECT @value08 UNION ALL
			SELECT @value09 UNION ALL
			SELECT @value10 UNION ALL
			SELECT @value11 UNION ALL
			SELECT @value12 UNION ALL
			SELECT @value13
		) AS [values]
		ORDER BY value DESC
	)
END
CREATE FUNCTION dbo.GetMaxOfDates3 (
	@value01 DateTime = NULL,
	@value02 DateTime = NULL,
	@value03 DateTime = NULL
) RETURNS DateTime AS BEGIN
	RETURN dbo.GetMaxOfDates13(@value01, @value02, @value03,
		NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL)
END

I hadn't considered an UNPIVOT solution at the time of writing these functions, but it would probably be better.
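For comparison, a minimal UNPIVOT sketch (assuming a hypothetical table MyTable with an Id column and date columns Date1..Date3; the names are illustrative):

```sql
-- UNPIVOT turns the date columns into rows, so MAX can be applied
-- per source row; NULL values are skipped by UNPIVOT automatically.
SELECT Id, MAX(d) AS MaxDate
FROM MyTable
UNPIVOT (d FOR col IN (Date1, Date2, Date3)) AS u
GROUP BY Id;
```

Unlike the UNION ALL functions above, this works over the rows of a table rather than over scalar parameters.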

Other solutions can be found at http://stackoverflow.com/questions/71022/sql-max-of-multiple-columns



Use CompiledQuery.Compile to improve LINQ to SQL performance



After reading DLinq (Linq to SQL) Performance and in particular Part 4, I had a few questions.

If CompiledQuery.Compile gives so many benefits, why not do it for all LINQ to SQL queries? Are there any essential disadvantages to compiling all select queries? Under what conditions does compiling improve performance, and by how much? It would be good to have a default at the application config level or at the DBML level to specify whether all select queries should be compiled. And the same questions apply to the Entity Framework CompiledQuery class.

However, in the comments I've found the answer of the author, ricom:

6 Jul 2007 3:08 AM
Compiling the query makes it durable. There is no need for this, nor is there any desire, unless you intend to run that same query many times.
SQL provides regular select statements, prepared select statements, and stored procedures for a reason.  Linq now has analogs.

Also from 10 Tips to Improve your LINQ to SQL Application Performance

If you are using CompiledQuery make sure that you are using it more than once as it is more costly than normal querying for the first time. The resulting function coming as a CompiledQuery is an object, having the SQL statement and the delegate to apply it.  And your delegate has the ability to replace the variables (or parameters) in the resulting query.

However, I feel that many developers are not informed enough about the benefits of Compile.
I think that tools like FxCop and ReSharper should check the queries and suggest when compiling is recommended.
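As a sketch of what compiling looks like in code (CompiledQuery.Compile is the real API; the NorthwindDataContext and Customers names are illustrative assumptions):

```csharp
using System;
using System.Data.Linq;
using System.Linq;

public static class Queries
{
    // Compile once (typically into a static field), execute many times;
    // the expression tree is translated to SQL only on the first call.
    public static readonly Func<NorthwindDataContext, string, IQueryable<Customer>>
        CustomersByCity = CompiledQuery.Compile(
            (NorthwindDataContext db, string city) =>
                db.Customers.Where(c => c.City == city));
}

// Usage:
// using (var db = new NorthwindDataContext())
// {
//     var londonCustomers = Queries.CustomersByCity(db, "London").ToList();
// }
```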

Related Articles for LINQ to SQL:
MSDN How to: Store and Reuse Queries (LINQ to SQL)
10 Tips to Improve your LINQ to SQL Application Performance

Related Articles for Entity Framework:
MSDN: CompiledQuery Class
Exploring the Performance of the ADO.NET Entity Framework – Part 1

Exploring the Performance of the ADO.NET Entity Framework – Part 2
ADO.NET Entity Framework 4.0: Making it fast through Compiled Query

Lookup Tables with fallback support

Our database has a few lookup tables that use an approach similar to the one implemented by the .NET Framework for localized resources:

At run time, ASP.NET uses the resource file that is the best match for the setting of the CurrentUICulture property. If there is no match for the current UI culture, ASP.NET uses resource fallback. It starts by searching for resources for a specific culture. If those are not available, it searches for the resources for a neutral culture. If these are not found, ASP.NET loads the default resource file.

Let's assume that we have a lookup table with time zones for different countries, states within the countries and post codes within states.
To reduce maintenance of the lookup table, if most of a country has one time zone but only a few states (or post codes) have different time zones, we do not store information about all states/post codes; instead we store the default time zone for the country and then the exceptional states/post codes:

| Country Code | State Code | Post Code | Time Zone | Comment (not a part of the database table) |
|---|---|---|---|---|
| CC1 | NULL | NULL | 1 | Simple: the same time zone for the whole country |
| CC2 | ST1 | NULL | 2 | In the country different states have different zones, |
| CC2 | ST2 | NULL | 3 | regardless of post codes |
| CC3 | ST1 | 1001 | 2 | In the same state different post codes have different time zones |
| CC3 | ST1 | 1002 | 3 | |
| CC4 | NULL | 1001 | 2 | Country doesn't have states, but different post codes have different time zones |
| CC4 | NULL | 1002 | 3 | |
| CC5 | NULL | NULL | 2 | Most states in the country have one time zone, but some states have different time zones |
| CC5 | ST55 | NULL | 3 | |
| CC6 | ST5 | NULL | 2 | Most post codes in the state have one time zone, but some post codes have different time zones |
| CC6 | ST5 | 2007 | 3 | |

The algorithm to retrieve the time zone is the following:
1. Try to retrieve the row for the specified country, state and post code. If a record is found, return its time zone.
2. If no record is found, try to retrieve the row for the specified country and state. If a record is found, return its time zone.
3. If no record is found, try to retrieve the row for the specified country. If a record is found, return its time zone.
4. If no record is found, throw an exception: we expect at least one record per country to exist.
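The same fallback can be expressed as a single query (a sketch; the TimeZoneLookup table and column names are assumptions matching the table above):

```sql
-- Pick the most specific matching row: exact post-code match first,
-- then state-level, then the country-level default.
SELECT TOP 1 TimeZone
FROM dbo.TimeZoneLookup
WHERE CountryCode = @CountryCode
  AND (StateCode = @StateCode OR StateCode IS NULL)
  AND (PostCode  = @PostCode  OR PostCode  IS NULL)
ORDER BY
    CASE WHEN PostCode  IS NOT NULL THEN 0 ELSE 1 END,
    CASE WHEN StateCode IS NOT NULL THEN 0 ELSE 1 END;
```

If the query returns no row, the caller raises the "at least one record per country" exception from step 4.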

This approach works quite well, but I have some concerns about performance. Usually the table is stored as a cached singleton, with the table generated and loaded by LINQ to SQL.
I am not sure how well LINQ to SQL generated entities handle queries that retrieve records for specified partial keys.
It would be interesting to see some benchmarks for LINQ to SQL collections loaded into memory, similar to what was done for DataTable.
I would appreciate any suggestions regarding improving performance, as well as regarding the whole fallback approach.

SQL Script to create indexes for Foreign keys


I've used a SQL script similar to paul_nielsen's Create Indexes for Foreign Keys and added an "if not exists" condition:

'if not exists (select * from sys.indexes
 where id = object_id(''' + TableName + ''') and name = ''Ix' + ForeignKeyName + ''')
 CREATE INDEX Ix' + ForeignKeyName
	+ ' ON ' + TableName + '(' + ColumnName + ');'

print @SQL

At the end I would recommend to print @SQL, which shows all new lines correctly in the Messages tab of SSMS,
rather than to SELECT @SQL or execute the SQL directly.
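A fuller sketch of such a generator using the SQL Server 2005+ catalog views (my own sketch under those assumptions, not the original script):

```sql
-- Build one CREATE INDEX statement per foreign key column that is
-- guarded by an "if not exists" check on the index name.
-- Multi-column foreign keys produce one row per column here and
-- would need grouping for a proper composite index.
SELECT 'if not exists (select * from sys.indexes where object_id = '
       + CAST(fk.parent_object_id AS varchar(20))
       + ' and name = ''Ix' + fk.name + ''') '
       + 'CREATE INDEX Ix' + fk.name
       + ' ON ' + OBJECT_SCHEMA_NAME(fk.parent_object_id) + '.'
       + OBJECT_NAME(fk.parent_object_id)
       + ' (' + c.name + ');'
FROM sys.foreign_keys fk
JOIN sys.foreign_key_columns fkc
     ON fkc.constraint_object_id = fk.object_id
JOIN sys.columns c
     ON c.object_id = fkc.parent_object_id
    AND c.column_id = fkc.parent_column_id;
```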


Another script can be found here(login is required): http://www.sqlservercentral.com/scripts/Index+Management/62069/

From http://msdn.microsoft.com/en-us/library/ms175464.aspx
Indexing FOREIGN KEY Constraints

Creating an index on a foreign key is often useful for the following reasons:

  • Changes to PRIMARY KEY constraints are checked with FOREIGN KEY constraints in related tables.
  • Foreign key columns are frequently used in join criteria when the data from related tables is combined in queries by matching the column or columns in the FOREIGN KEY constraint of one table with the primary or unique key column or columns in the other table. An index enables the Database Engine to quickly find related data in the foreign key table.

Restore SQL Server Database from suspect

Below are a few links that can help if you need to restore a SQL Server database from suspect status.

MyITforum.com: How to recover user databases from a "Suspect" status

You could also try detaching the DB, then do a single file attach only using
the db, dropping the log file. That solved the suspect issue for one of
our DBs. Somewhere in the SQL message logs it indicated that the LOG file
was corrupted

Here is the MSSQL command needed to attach an .mdf SQL database file with no .ldf file, enjoy!

sp_attach_single_file_db @dbname = 'databasename', @physname = 'C:\Users\Blake\Documents\Databases\databasefile.mdf'


Server: Msg 823, Level 24, State 6, Line 1

I/O error 2 (The system cannot find the file specified.) detected during read at offset 0000000000000000 in file 'D:\OnePointDB\OnePointLog1.log'.

Run sp_resetstatus with the @dbname parameter (i.e. sp_resetstatus @dbname = 'pubs').
USE Master

-- Determine the original database status
SELECT [Name], DBID, Status
FROM master.dbo.sysdatabases

-- Enable system changes
EXEC sp_configure 'allow updates', 1
RECONFIGURE WITH OVERRIDE

-- Update the database status
UPDATE master.dbo.sysdatabases
SET Status = 24
WHERE [Name] = 'YourDatabaseName'

-- Disable system changes
EXEC sp_configure 'allow updates', 0
RECONFIGURE WITH OVERRIDE

-- Determine the final database status
SELECT [Name], DBID, Status
FROM master.dbo.sysdatabases


SQL’s “like” patterns to compare in .Net

I wanted to use SQL's "like" patterns for comparisons in .NET.
I found a good C# implementation of such a function in the thread Using Regex to create a SQL's "like" like function.
The function IsSqlLikeMatch works fine, but I've noticed that the search is case-sensitive.
It also doesn't match % across multiple lines.
Both were easy to change by modifying the IsMatch call to

return Regex.IsMatch(input, pattern, RegexOptions.IgnoreCase | RegexOptions.Singleline);

Note that there could be a very serious performance hit if the pattern starts with %.


E.g. for pattern '%Part1%Part2%' and a long message (length 30720) the time spent was 40 seconds,

but for pattern 'Begin%Part1%Part2%' and the same long message the time spent was almost 0. The test code with time measured:



DateTime start = DateTime.Now;

DebugOutputHelper.TracedLine("IsSqlLikeMatch sPattern=" + sPattern + " errorMsg.Length " + errorMsg.Length + " Started " + start.ToString());

bRet = errorMsg.IsSqlLikeMatch(pattern);

TimeSpan timeDifference = DateTime.Now - start;

DebugOutputHelper.TracedLine("IsSqlLikeMatch sPattern=" + sPattern + " errorMsg.Length " + errorMsg.Length +
    " time spent " + timeDifference.ToString());
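For reference, a minimal sketch of such a LIKE-to-regex translation (my own sketch, not the implementation from the thread; it handles only the % and _ wildcards):

```csharp
using System.Text.RegularExpressions;

public static class SqlLikeExtensions
{
    // Escape regex metacharacters in the pattern, then map SQL wildcards:
    // % -> .* and _ -> . ; anchor the whole pattern with ^ and $.
    // IgnoreCase and Singleline address the two issues noted above.
    public static bool IsSqlLikeMatch(this string input, string pattern)
    {
        string regex = "^" + Regex.Escape(pattern)
                                  .Replace("%", ".*")
                                  .Replace("_", ".") + "$";
        return Regex.IsMatch(input, regex,
            RegexOptions.IgnoreCase | RegexOptions.Singleline);
    }
}
```

Regex.Escape leaves % and _ untouched, so the two Replace calls can safely run after escaping.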

Custom alert on the SQL Database

I wanted to create a custom alert on a SQL database for when the number of records with certain values (considered invalid) exceeds the expected limit.
First of all you need to set up Database Mail for SQL 2005 and follow the procedure.
IMPORTANT: don't forget to restart SQL Server Agent to activate the settings.

Similar to the article Define custom error messages in SQL Server 2005
I’ve defined the error
EXEC sp_addmessage 60001, 1, N'Number of not-processed tasks %d exceeds the limit on %s.'
and SP:

ALTER PROCEDURE [dbo].[CheckNotProcessed]
	@limit int = 10
AS
declare @StartDate datetime
declare @EndDate datetime
declare @NotProcessedCount int

set @EndDate = GetDate()
set @StartDate = DATEADD(day, -1, @EndDate)
print @StartDate
print @EndDate

select @NotProcessedCount = count(*)
  from dbo.[MyTBL]
 where [ProcessState] <> 99999
   and createdDate between @StartDate and @EndDate

if (@NotProcessedCount > @limit)
begin
	declare @CurrentDBName varchar(60)
	set @CurrentDBName = DB_NAME()
	RAISERROR (60001, 10, 1, @NotProcessedCount, @CurrentDBName) WITH LOG
end
/*
-- =============================================
-- Example to execute the stored procedure
-- =============================================
EXECUTE DBO.CheckNotProcessed 10
*/
Note that if you have more than one similar database on the same server, it's important to specify DB_NAME().
I've also created a job to call the SP on a regular basis.
Then I've created an alert using the error number (in Enterprise Manager).
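The same alert can also be scripted instead of created through the UI (a sketch using the real msdb.dbo.sp_add_alert and sp_add_notification procedures; the alert and operator names are illustrative):

```sql
-- Create an alert that fires whenever error 60001 is written to the log,
-- then notify an existing operator by e-mail.
EXEC msdb.dbo.sp_add_alert
    @name = N'NotProcessedTasksAlert',
    @message_id = 60001,
    @severity = 0,
    @enabled = 1;

EXEC msdb.dbo.sp_add_notification
    @alert_name = N'NotProcessedTasksAlert',
    @operator_name = N'DBA Operator',
    @notification_method = 1;  -- 1 = e-mail
```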

I found that this approach is very powerful and allows monitoring regular business processes, as well as getting notifications about data conditions that require investigation/debugging.