JavaScript: dynamically change the width of one table based on the size of another table.

I need to ensure that the width of one table on my HTML page is calculated from the width of another table (less a fixed amount). Currently the size is hardcoded in the server user control code, but ideally it could be done using JavaScript, to dynamically change the width of one table based on the size of the other table.

The task looks very straightforward, but I decided to post a few links that I found interesting:

Dynamic Content with DOM-2
JavaScript | Table, and in particular Q1594: How can I detect the width of a table?

JavaScript DOM Examples

Controlling Table Size 

Watin Test Recorder helps to navigate to the page required for debugging.

It is quite common that a developer needs to perform a few (sometimes time-consuming) steps to navigate to the pages that require debugging. And if you make a non-trivial change, Edit & Continue doesn't work, and you have to start debugging again.
Yes, I know that small unit tests should cover all the code paths and make debugging redundant. Unfortunately, in real life debugging is sometimes unavoidable.

Watin Test Recorder allows you to record the navigation steps and save them as a console executable (or a Windows Forms test harness).
Then run the executable, and the Watin script will fill in all the forms and navigate through multiple pages. You can just set breakpoints in your ASP.NET application and wait until your code is reached.
I want to confirm that v1.0 of Watin Test Recorder is quite stable and produces workable scripts (the beta versions didn't do the job well).

DataView.Sort doesn't accept long names for the sort direction

According to the documentation, DataView.Sort is a string that contains the column name followed by "ASC" (ascending) or "DESC" (descending). Columns are sorted ascending by default. Multiple columns can be separated by commas.

If you pass the long name, Descending or Ascending (from the System.Web.UI.WebControls.SortDirection enum), the following error is reported:

System.IndexOutOfRangeException: Cannot find column LastName Descending.
   at System.Data.DataTable.ParseSortString(String sortString)
   at System.Data.DataView.UpdateIndex(Boolean force, Boolean fireEvent)
   at System.Data.DataView.UpdateIndex(Boolean force)
   at System.Data.DataView.SetIndex2(String newSort, DataViewRowState newRowStates, DataExpression newRowFilter, Boolean fireEvent)
   at System.Data.DataView.SetIndex(String newSort, DataViewRowState newRowStates, DataExpression newRowFilter)
   at System.Data.DataView..ctor(DataTable table, String RowFilter, String Sort, DataViewRowState RowState)
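If the direction arrives as a SortDirection value, a minimal sketch of the workaround is to map it back to the short keyword (SortHelper below is a hypothetical helper, not part of the framework):

using System.Web.UI.WebControls;

public static class SortHelper
{
    // Map SortDirection to the short keyword that DataView.Sort expects.
    public static string ToSortKeyword(SortDirection direction)
    {
        return direction == SortDirection.Descending ? "DESC" : "ASC";
    }
}

// Usage (column name is illustrative):
// view.Sort = "LastName " + SortHelper.ToSortKeyword(SortDirection.Descending); // "LastName DESC" works
// view.Sort = "LastName Descending";                                            // throws IndexOutOfRangeException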

EntitySpaces: the same Query can be used to save multiple tables

I wanted to minimize the number of calls to the database, but get back results of a dynamic query both for a small number of columns and for detailed data (e.g. the Itinerary table and Itinerary joined with Itinerary items). In EntitySpaces
I created a where condition, filled the Query and loaded the detailed data, then saved the returned table to a DataSet.

Then for the same query I specified distinct select columns and loaded it again (when LINQ becomes available, this could be done in memory without an extra database call). A different table was created, which I added to the same dataset. Below is the code snippet:

                VwItinerariesCollection collPendingList = new VwItinerariesCollection();
                VwItinerariesQuery queryFilter = collPendingList.Query;
                queryFilter.AddWhereConditions();//custom code
                collPendingList.Query.Load();
                DataTable tbl = collPendingList.ConvertToDataTable();
                tbl.TableName = "ListWithDetailsTable";
                DataSet itineraries = new DataSet();
                itineraries.Tables.Add(tbl);
                queryFilter.DistinctSelectForList();//custom code
                collPendingList.Query.Load();
                tbl = collPendingList.ConvertToDataTable();
                tbl.TableName = "ListTable";
                itineraries.Tables.Add(tbl);

SQL Server Seminar presented by Kevin Kline

Last week I attended a SQL Server seminar presented by Kevin Kline.
Performance, Baselining, Benchmarking and Monitoring Presentation
Stored Procedure Best Practices Presentation
Surviving the Data Avalanche Presentation
Top 10 Mistakes on SQL Server Presentation

There were a few interesting points.
Causes of performance problems: 5% hardware, 15% bad database design, 80% bad application code.
Do not interleave DDL and DML: no CREATE/DROP TABLE in the same stored procedure as INSERT/SELECT.
Keep transactions as short as possible. Use @@TRANCOUNT or sys.dm_tran_active_transactions to look for orphaned transactions when entering a new routine.

Set theory is better than row-by-row processing. Cursors are deprecated.
Use table variables; minimize use of tempdb.
Again: SET NOCOUNT ON.
Again: owner-qualify all names (two-part).

Tiered storage:
Active (e.g. RAID10, high RPM speeds).
Near-term older data (e.g. RAID5, middle RPM speeds)
Old, long-term data (e.g. SATA drives – high volume, low RPM speeds)

Related white papers from the Quest site (login will probably be required):

The Top 10 Mistakes on SQL Server
Tuning SQL Statements on Microsoft SQL Server
Finding and Fixing Bad SQL in Microsoft SQL Server 2000
Performance Baselining, Benchmarking, and Monitoring for SQL Server 2000

Microsoft T-SQL Performance Tuning Part 1: Analyzing and Optimizing T-SQL Query Performance on Microsoft SQL Server using SET and DBCC 

Microsoft T-SQL Performance Tuning Part 2: Analyzing and Optimizing T-SQL Query Performance on Microsoft SQL Server using Indexing Strategies  

Microsoft T-SQL Performance Tuning Part 3: Analyzing and Optimizing T-SQL Query Performance on Microsoft SQL Server using Query Optimizer Strategies

Microsoft T-SQL Performance Tuning Part 4: Analyzing and Optimizing T-SQL Query Performance on Microsoft SQL Server using SHOWPLAN Output and Analysis
Virtualization and SQL Server in the Enterprise

Issues using binary serialization to store object state for unit tests

To save object state for unit tests we use binary serialization. The object state is saved to a data file and loaded from the file at the start of the test. It works OK until the object is changed in a new version. Fortunately, there is no problem when a new field is added, but if a field is removed, it will most likely cause a deserialization exception (see the
Version Tolerant Serialization, .Net Serialization FAQ and Solving Binary serialization versioning conflicts articles for details).
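For context, a minimal sketch of this save/load approach (the ObjectStateStore name and file handling here are illustrative, not our actual test code):

using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static class ObjectStateStore
{
    // Save the object graph to a data file once.
    public static void Save(object state, string fileName)
    {
        using (FileStream stream = File.Create(fileName))
        {
            new BinaryFormatter().Serialize(stream, state);
        }
    }

    // Load the saved state at the start of a test.
    // Throws SerializationException if the stored layout no longer matches the current type.
    public static T Load<T>(string fileName)
    {
        using (FileStream stream = File.OpenRead(fileName))
        {
            return (T)new BinaryFormatter().Deserialize(stream);
        }
    }
}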

Also, changing a return type will break deserialization. E.g. I changed a return type from ArrayList to the type-safe List<MyType> and got errors.

I wanted to change the storage to use SoapFormatter, to be able to read and optionally modify the data file if it is broken
(a similar approach is described by Mauro Sturaro in the discussion here).

But SoapFormatter does not support generics, and beginning with the .NET Framework version 3.5 this class is obsolete.
Dino Esposito suggests considering the LosFormatter class, but it is not human-readable, just like the binary format.

Dictionary with keys from an enum

One of our classes keeps state in a dictionary of attributes, with the keys taken from an enum. Adding new values to the enum does not cause deserialization exceptions, but the problem was that the enum didn't have explicit value assignments and relied on automatic increment. When someone inserted a new key into the enum, the actual enum values shifted and the whole dictionary was corrupted.
The highly recommended way is to explicitly assign values to the enum members. Do not forget to have a default value for 0, e.g. NonInitialized.
I've created helper functions AreValuesUnique<EnumType, ValueType>() and List<ValueType> FindDuplicates<EnumType, ValueType>() to ensure that explicit values are not duplicated, and I am calling them from a unit test.
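For example, a sketch of an enum with explicit values (the member names are illustrative):

public enum AttributeKey
{
    NonInitialized = 0, // default value for 0
    FirstName = 1,
    LastName = 2,
    BirthDate = 3
    // New members get new explicit numbers; existing numbers never change,
    // so dictionary keys serialized with an older version stay valid.
}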

Based on these problems, I don't like storing objects for unit tests using binary serialization.
I am going to research which other options are viable.

Ensure that JavaScript files or CSS files are refreshed for each new version

We can add a version parameter to the references to JS or CSS files in the HTML:

<script type="text/javascript" src="FileName.js?v=<%=AssemblyVersionNumber()%>"></script>

The query string does nothing other than trick the browser into thinking that the .js file must be retrieved from the server for a new version instead of using the cached copy.

You can write your own AssemblyVersionNumber() function or use the function from my SystemHelper class, like the following:

SystemHelper.GetAssemblyVersion(Assembly.GetExecutingAssembly())
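A minimal sketch of such a helper (the actual implementation in SystemHelper may differ):

using System.Reflection;

public static class SystemHelper
{
    // Returns the assembly version, e.g. "1.2.3456.0", suitable as a cache-busting query-string value.
    public static string GetAssemblyVersion(Assembly assembly)
    {
        return assembly.GetName().Version.ToString();
    }
}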

Idea from http://www.thescripts.com/forum/thread165240.html

Note that there could be problems if the head element has the runat="server" attribute.

Also consider reducing the number of CSS and JS files.

Generic function to Find Duplicates in Generic List

I've created a new function to find duplicates in a generic list, similar to the Generic function to removeDuplicates from Generic List, as well as a bool AreValuesUnique function.


 public static List<GenericType> FindDuplicates<GenericType>(List<GenericType> inputList)
        {
            Dictionary<GenericType, int> uniqueStore = new Dictionary<GenericType, int>();
            List<GenericType> finalList = new List<GenericType>();
 
            foreach (GenericType currValue in inputList)
            {
                if (uniqueStore.ContainsKey(currValue))
                {
                    finalList.Add(currValue);
                }
                else
                {
                    uniqueStore.Add(currValue, 0);
                }
            }
            return finalList;
        }
     public static bool AreValuesUnique<GenericType>(List<GenericType> inputList)
        {
            foreach (GenericType currValue in inputList)
            {
                if (inputList.IndexOf(currValue) != inputList.LastIndexOf(currValue))
                    return false;
            }
            return true;
        }
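A quick illustrative usage (assuming the methods above are in scope):

            List<int> ids = new List<int> { 1, 2, 2, 3, 3 };
            List<int> duplicates = FindDuplicates(ids);   // { 2, 3 }
            bool unique = AreValuesUnique(ids);           // false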

EntitySpaces API: How to get esQueryItem column name.

In the constructor, columnName is passed as a parameter.

public esQueryItem (
	esDynamicQuery query,
	string columnName
)
However, I wasn't able to find a ColumnName property or anything similar.
After some time I understood that they use an implicit string operator
public static implicit operator string  (
	esQueryItem item
)
which does the magic.
However, an explicit read-only ColumnName property would be useful.
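In other words, the column name can be obtained simply by assigning the query item to a string (MyTableQuery and MyColumn below are placeholders for a generated EntitySpaces query class and one of its columns):

MyTableQuery query = new MyTableQuery();
// The implicit operator converts the esQueryItem to its column name.
string columnName = query.MyColumn;        // e.g. "MyColumn"
string sameName = (string)query.MyColumn;  // an explicit cast also works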

By the way, the help shows two different operators as the same:
static memberImplicit(esQueryItem)
ToString() (to use in GroupBy/OrderBy and such ....)

static memberImplicit(esQueryItem)
ToString() (to use in GroupBy/OrderBy and such ....)

DataSetHelper Class that I am using

Previously I've posted a few helper classes. This post describes the DataSetHelper class, which is mostly based on a series of MSDN articles. However, I made some modifications (e.g. see the posts DataSetHelper.SelectDistinct method for multiple columns and "Handling missing source columns in DataSetHelper.InsertInto method").
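For example, typical usage based on the sample calls in the comments below (table and column names are illustrative):

DataSet ds = GetEmployeesDataSet();   // hypothetical method returning a populated DataSet
DataSetHelper dsHelper = new DataSetHelper(ref ds);

// Copy selected, filtered and sorted columns into a new table that is added to the DataSet.
DataTable dt = dsHelper.SelectInto("TestTable", ds.Tables["Employees"],
      "FirstName FName,LastName LName,BirthDate", "EmployeeID<5", "BirthDate");

// Distinct values of a single column.
DataTable dtLastNames = dsHelper.SelectDistinct("LastNames", ds.Tables["Employees"], "LastName");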

using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Collections;
using Microsoft.ApplicationBlocks.Data;
using System.Collections.Generic;
 
namespace FSHelperLib
{
      ///<summary>
      /// Created based on HOW TO: Implement a DataSet SELECT INTO Helper Class in Visual C# .NET
      ///</summary>
      public class DataSetHelper
      {
            public DataSet m_ds;
 
            public DataSetHelper(ref DataSet DataSet)
            {
                  m_ds = DataSet;
            }
            public DataSetHelper()
            {
                  m_ds = null;
            }
        private List<FieldInfo> m_FieldInfo;//System.Collections.ArrayList
        private string m_FieldList;
 
            public class FieldInfo
            {
                  public string RelationName;
                  public string FieldName;      //source table field name
                  public string FieldAlias;     //destination table field name
                  //public string Aggregate;
            }
            #region “private Helper methods”
            private void ParseFieldList(string FieldList, bool AllowRelation)
            {
                  /*
                   * This code parses FieldList into FieldInfo objects and then
                   * adds them to the m_FieldInfo private member.
                   *
                   * FieldList syntax: [relationname.]fieldname[ alias], …
                  */
                  if (m_FieldList == FieldList) return;
            m_FieldInfo = new List<FieldInfo>();// System.Collections.ArrayList();
                  m_FieldList = FieldList;
            m_FieldInfo=ParseToFieldInfoList(FieldList, AllowRelation);
            }
///<summary>
        ///              * FieldList syntax: [relationname.]fieldname[ [AS] alias], …
///</summary>
///<param name=”FieldList”></param>
///<param name=”AllowRelation”></param>
///<returns></returns>
        public static List<FieldInfo> ParseToFieldInfoList(string FieldList, bool AllowRelation)
        {
            // Doesn't support parsing a complicated function like the following: ", Left(ma_compdate,10) as Closed,"
            List<FieldInfo> lstFieldInfo = new List<FieldInfo>();
            FieldInfo Field; string[] FieldParts;
            string[] Fields = FieldList.Split(',');
            int i;
            for (i = 0; i <= Fields.Length - 1; i++)
            {
                Field = new FieldInfo();
                //Parse FieldAlias.
                FieldParts = Fields[i].Trim().Split(new char[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
                switch (FieldParts.Length)
                {
                    case 1:
                        //to be set at the end of the loop
                        break;
                    case 2:
                        Field.FieldAlias = FieldParts[1];
                        break;
                    default: //if more than 2, it should end with " AS Alias "
                        if (FieldParts[FieldParts.Length-2].ToUpper().Trim() == "AS")
                            Field.FieldAlias = FieldParts[FieldParts.Length - 1];
                        else
                            throw new Exception("Too many parts in field definition: '" + Fields[i] + "'.");
                        break;
                }
                if (FieldParts.Length <= 3)
                {                    //Parse FieldName and RelationName.
                    FieldParts = FieldParts[0].Split('.');
                    switch (FieldParts.Length)
                    {
                        case 1:
                            Field.FieldName = FieldParts[0];
                            break;
                        case 2:
                            if (AllowRelation == false)
                                throw new Exception("Relation specifiers not permitted in field list: '" + Fields[i] + "'.");
                            Field.RelationName = FieldParts[0].Trim();
                            Field.FieldName = FieldParts[1].Trim();
                            break;
                        default:
                            throw new Exception("Invalid field definition: '" + Fields[i] + "'.");
                    }
                }
                else //In case of calculated field (e.g. cl_Addr1 + cl_Addr2 + cl_Addr3 as Address)
                {
                  Field.FieldName = StringHelper.LeftBeforeLast(Fields[i].Trim().ToUpper(), " AS ");//not always reliable
                }
                if (Field.FieldAlias == null)
                    Field.FieldAlias = Field.FieldName;
                lstFieldInfo.Add(Field);
            }
            return lstFieldInfo;
        }
            #endregion //”private Helper methods”
           
            #region “CreateTable,INSERTInto and SelectINTO”
            ///<summary>
            /// This code creates a DataTable and creates the fields (not specified types) in the
            ///   order that is specified in the FieldList.
            /// See Also CreateTable overload with SourceTable parameter />
            ///</summary>
            ///<example>
            /// Sample of call
            ///<code>dt = dsHelper.CreateTable(“TestTable”, “FirstName FName,LastName LName,BirthDate”);
            ///</code></example>
            public DataTable CreateTable(string TableName, string FieldList)
            {
                  /*
                  */
                  DataTable dt;
                  {
                        dt = new DataTable(TableName);
                        ParseFieldList(FieldList,false);
                        DataColumn dc;
                        foreach (FieldInfo Field in m_FieldInfo)
                        {
                              dc = new DataColumn(Field.FieldName);//not specified types
                              dt.Columns.Add(Field.FieldAlias, dc.DataType);
                        }
                  }
                  if (m_ds!=null)
                        m_ds.Tables.Add(dt);
                  return dt;
            }
            ///<summary>
            /// This code creates a DataTable by using the SourceTable as a template and creates the fields in the
            ///   order that is specified in the FieldList. If the FieldList is blank, the code uses DataTable.Clone().
            /// See Also CreateTable overload without SourceTable parameter />
            ///</summary>
            ///<example>
            /// Sample of call
            ///<code>dt = dsHelper.CreateTable(“TestTable”, ds.Tables[“Employees”], “FirstName FName,LastName LName,BirthDate”);
            ///</code></example>
            public DataTable CreateTable(string TableName, DataTable SourceTable, string FieldList)
            {
                  DataTable dt;
            if (FieldList.Trim() == "")
            {
                dt = SourceTable.Clone();
                dt.TableName = TableName;
            }
            else
            {
                dt = new DataTable(TableName);
                ParseFieldList(FieldList, false);//not static
                DataColumn dc;
                foreach (FieldInfo Field in m_FieldInfo)
                {
                    dc = SourceTable.Columns[Field.FieldName];
                    dt.Columns.Add(Field.FieldAlias, dc.DataType);
                }
            }
            if (m_ds != null)
                        m_ds.Tables.Add(dt);
                  return dt;
            }
            ///<summary>
            /// Sample of call
            /// dsHelper.InsertInto(ds.Tables[“TestTable”], ds.Tables[“Employees”], “FirstName FName,LastName LName,BirthDate”, “EmployeeID&lt;5”, “BirthDate”) ;
            ///</summary>
            public void InsertInto(DataTable DestTable, DataTable SourceTable,
                  string FieldList, string RowFilter, string Sort)
            {
                  //
                  // This code copies the selected rows and columns from SourceTable and inserts them into DestTable.
                  //
                  ParseFieldList(FieldList, false);
                  DataRow[] Rows = SourceTable.Select(RowFilter, Sort);
                  DataRow DestRow;
                  foreach(DataRow SourceRow in Rows)
                  {
                        DestRow = DestTable.NewRow();
                        if (DataHelper.IsNullOrEmpty(FieldList))
                        {
                    foreach (DataColumn dc in DestRow.Table.Columns)
                    {
                        if (dc.Expression == "")
                        {
                            if (SourceTable.Columns.Contains(dc.ColumnName))//source can miss some target columns
                                DestRow[dc] = SourceRow[dc.ColumnName];
                            else
                                DebugHelper.LineWithTrace("The column is missing in the source: " + dc.ColumnName);
                        }
                    }
                        }
                        else
                        {
                              foreach(FieldInfo Field in m_FieldInfo)
                              {
                                    DestRow[Field.FieldAlias] = SourceRow[Field.FieldName];
                              }
                        }
                        DestTable.Rows.Add(DestRow);
                  }
            }
            //MNF 10/9/2004
            ///<summary>
            /// Sample of call
            /// dsHelper.ImportInto(sTestTable, rSourceRow) ;
            ///</summary>
            ///<returns>true if inserted, false if primary key is already in the table</returns>
            public bool ImportInto(string sDestTableName, DataRow rSourceRow)
            {
                  //
                  // This code copies the specified row and inserts it into DestTable.
                  //
                 
                  Debug.Assert(m_ds!=null);
                  DataTable tbl=m_ds.Tables[sDestTableName];
                  if (tbl==null)
                  {
                        tbl = CreateTable(sDestTableName, rSourceRow.Table, "");
                  }
            bool bRet=RowInTableExists( tbl, rSourceRow );
                  //DataColumn[] colKeys=tbl.PrimaryKey;
            //object[] keyValues=PrimaryKeyValues(rSourceRow);
 
//                if (tbl.Rows.Find(keyValues)==null)
                  if (bRet==false)
                  {
                        tbl.ImportRow(rSourceRow); //TODO can be change to use foreach as in InsertInto
//                      bRet=true;
                  }
            //else //already exists
            //    bRet=false;
                  return bRet;
            }
           
            ///<summary>
            /// Sample of call
            /// dt = dsHelper.SelectInto(“TestTable”, ds.Tables[“Employees”], “FirstName FName,LastName LName,BirthDate”, “EmployeeID&lt;5”, “BirthDate”) ;
            ///</summary>
            public DataTable SelectInto(string TableName, DataTable SourceTable,
                  string FieldList, string RowFilter, string Sort)
            {
                  /*
                   * This code selects values that are sorted and filtered from one DataTable into another.
                   * The FieldList specifies which fields are to be copied.
                  */
                  DataTable dt = CreateTable(TableName, SourceTable, FieldList);
                  InsertInto(dt, SourceTable, FieldList, RowFilter, Sort);
                  return dt;   
            }
            #region “SelectDistinct”
            //It is used to compare field values (including NULL).
            private static bool ColumnEqual(object A, object B)
            {
 
                  // Compares two values to see if they are equal. Also compares DBNULL.Value.
                  // Note: If your DataTable contains object fields, then you must extend this
                  // function to handle them in a meaningful way if you intend to group on them.
 
                  if ( A == DBNull.Value && B == DBNull.Value ) // both are DBNull.Value
                        return true;
                  if ( A == DBNull.Value || B == DBNull.Value ) // only one is DBNull.Value
                        return false;
                  return ( A.Equals(B) ); // value type standard comparison
            }
                             
 
      //    This method copies unique values of the field that you select into a new DataTable. If the field contains NULL values, a record in the destination table will also contain NULL values.
            public DataTable SelectDistinct(string TableName, DataTable SourceTable, string FieldName)
            {
            DataTable dt = SelectDistinct(SourceTable, FieldName);
            dt.TableName = TableName;
                  if (m_ds != null)
                        m_ds.Tables.Add(dt);
                  return dt;
            }
 
        public static DataTable SelectDistinct(DataTable SourceTable, string FieldName)
        {
            DataTable dt = new DataTable();
            dt.Columns.Add(FieldName, SourceTable.Columns[FieldName].DataType);
 
            object LastValue = null;
            foreach (DataRow dr in SourceTable.Select("", FieldName))
            {
                if (LastValue == null || !(ColumnEqual(LastValue, dr[FieldName])))
                {
                    LastValue = dr[FieldName];
                    dt.Rows.Add(new object[] { LastValue });
                }
            }
            return dt;
        }
            public static DataTable SelectDistinct( DataTable SourceTable, string[] FieldNames)
            {// select distinct on multiple fields.
//From: Nageswara Reddy http://www.dotnet247.com/247reference/msgs/43/218182.aspx
            FieldNames = StringArrayHelper.ToLower(FieldNames);
            DataTable DistinctTable = SourceTable.Clone();
                  DataColumn [] constraintColumns = new DataColumn[FieldNames.Length];
            int nFound=0;
                  for (int i =0 ; i< DistinctTable.Columns.Count ; i++)
            {
                if (Array.IndexOf(FieldNames, DistinctTable.Columns[i].ColumnName.ToLower()) >= 0)
                {
                    if (nFound >= FieldNames.Length)
                    {
                        throw new ApplicationException("Too many fields are similar to the passed FieldNames " + string.Join(",", FieldNames));
                    }
                    constraintColumns[nFound++] = DistinctTable.Columns[i];
                }
                  }
            //Report if passed field names are not found in the table
            if (nFound < FieldNames.Length)
            {
                throw new ApplicationException("Some of the fields " + string.Join(",", FieldNames) + " were not found in the table");
            }
                  UniqueConstraint _UniqueConstraint = new UniqueConstraint(constraintColumns);
                  DistinctTable.Constraints.Add(_UniqueConstraint);
 
                  for (int i =0 ; i< SourceTable.Rows.Count ; i++)
                  {
                        try
                        {
                              DistinctTable.ImportRow(SourceTable.Rows[i]);
                        }
                        catch(Exception ex)
                        { // Keep quiet
                              Debug.WriteLine(ex.ToString());
                        }
                  }
                  return DistinctTable;
            }
      #endregion “SelectDistinct”
 
 
            //MNF 10/12/2004
            ///<summary>
            /// Merge consider merged items as new
        /// dsHelper.MergeAsNew(tblSource,      MissingSchemaAction.AddWithKey) ;
            ///</summary>
            ///<remarks> function clones source table to modify status of records as add </remarks>       
            ///<returns>the merged dataset</returns>
            public DataSet MergeAsNew( DataTable tblSource, MissingSchemaAction missingSchemaAction)
            {
                  Debug.Assert(m_ds!=null);
                  DataTable tblClone=tblSource.Clone();
                  InsertInto(tblClone, tblSource, "", "", "");
                  bool preserveChanges=true;//hardcoded
                  m_ds.Merge(tblClone, preserveChanges,missingSchemaAction);
                  return m_ds;
            }
        ///<summary>
        /// The function expects that original table exist in the dataset.
        /// New records from tblToAdd will have DataRowState.Added and existing records will have DataRowState.Modified
        /// dsHelper.MergeAsNewOrModified(tblToAdd,   MissingSchemaAction) ;
        ///</summary>
        ///<remarks> function clones source table to modify status of records as add </remarks>       
        ///<returns>modified dataset
        /// Newly added rows are marked as insertions, and changes to existing rows are marked as modifications.
        ///</returns>
        public DataSet MergeAsNewOrModified(DataTable tblToAdd,MissingSchemaAction missingSchemaAction)
        {
                  Debug.Assert(m_ds!=null);
            string sTableName = tblToAdd.TableName;
            if (m_ds.Tables[sTableName] == null)
            {
                MergeAsNew(tblToAdd, missingSchemaAction);
            }
            else
            {
                DataTable tblOrig = m_ds.Tables[sTableName];
                foreach (DataRow rowToAdd in tblToAdd.Rows)
                      {
                   //Finds and updates a specific row. If no matching row is found, a new row is created using the given values.
                  DataRow row= DataHelper.LoadDataRow(tblOrig, rowToAdd, true);
                      }
            }
            return m_ds;
        }
        #endregion //”CreateTable,INSERTInto and SelectINTO”
            //MNF 10/9/2004
            ///<summary>
            /// Sample of call
            /// dsHelper.RemoveFromTable(sTestTable, rSourceRow) ;
            ///</summary>
            ///<returns>true if delete, false if primary key is not in the table</returns>
            public bool RemoveFromTable(string sTableName, DataRow rSourceRow)
            {
                  Debug.Assert(m_ds!=null);
                  DataTable tbl=m_ds.Tables[sTableName];
                  if (tbl==null)
                  {
                              return false;
                  }
            //bool bRet=RowInTableExists( tbl, rSourceRow );
            //object[] keyValues=PrimaryKeyValues(rSourceRow);
            return RemoveFromTable( tbl,rSourceRow);
            }
        ///<summary>
        ///
        ///</summary>
        ///<param name=”tbl”></param>
        ///<param name=”rSourceRow”></param>
        ///<returns>true if delete, false if primary key is not in the table</returns>
        public static bool RemoveFromTable(DataTable tbl,DataRow rSourceRow )
        {
            DataRow row = FindRow(tbl, rSourceRow);
            if (null != row)
            {
                tbl.Rows.Remove(row);
                //bRet=true;
            }
            //else //not exist
            //    bRet=false;
            return (null != row);
        }
        ///<summary>
        /// Remove Rows with the specified state, e.g during refresh it may be required to delete Unchanged rows
        ///</summary>
        ///<param name=”sTableName”></param>
        ///<param name=”RowState”></param>
        ///<returns></returns>
        public static bool RemoveRowsFromTable( DataTable tbl, DataRowState RowState)
        {
            if (tbl == null) { throw new ArgumentNullException("DataTable tbl"); }
            foreach (DataRow row in tbl.Rows)
            {
              if(row.RowState==RowState)
              {
                  row.Delete();  
              }
            }
            return true;
        }
 
        ///<summary>
        /// return table(not attached to dataset) with row copies with specified keys 
///</summary>
///<param name=”tbl”></param>
        ///<param name=”keys”>if Keys is null,return copy of full table not attached to dataset</param>
///<returns></returns>
        public static DataTable FilterTableByKeys(DataTable tbl, string[] keys)
        {
            if (tbl == null) { throw new ArgumentNullException("DataTable tbl"); }
            if (keys == null) { throw new ArgumentNullException("string[] keys"); }
            DataTable newTbl = null;
            if (keys != null)
            {
                newTbl = tbl.Clone();
                foreach (string key in keys)
                {
                    DataRow row = tbl.Rows.Find(key);
                    if (null != row)
                        newTbl.ImportRow(row);
                    else
                    {
                        DebugHelper.TracedLine("key not found: " + key);
                        Debug.Assert(false, "key not found: " + key);
                    }
                }
            }
            return newTbl;
        }
        ///<summary>
        /// return table(not attached to dataset) with row copies with specified keys 
        ///</summary>
        ///<param name=”tbl”>Not Null</param>
        ///<param name=”lstKeys”>Not Null</param>
        ///<returns></returns>
        public static DataTable FilterTableByKeys(DataTable tbl, List<string> lstKeys)
        {
            string[] keys = null;
            if (null!=lstKeys)
            {
                keys = lstKeys.ToArray();
            }
            return FilterTableByKeys(tbl, keys);
        }
        ///<summary>
        /// Remove all rows that have keys different to listed in lstKeys
        ///</summary>
        ///<param name=”tbl”></param>
        ///<param name=”lstKeys”>if lstKeys is null, the unchanged table is returned</param>
        ///<returns></returns>
        public static DataTable ReplaceTableWithFilteredByKeys(DataTable tbl, List<string> lstKeys)
        {
            if (null != lstKeys)
            {
                DataTable tblNew = FilterTableByKeys(tbl, lstKeys);
                tbl.Clear();
               tbl.Merge(tblNew);
            }
            return tbl;
        }
 
        //MNF 21/10/2004
            ///<summary>
            /// Sample of call
            /// dsHelper.FindInTable(sTestTable, rSourceRow) ;
            ///</summary>
            ///<returns>true if found, false if primary key is not in the table</returns>
            public bool FindInTable(string sTableName, DataRow rSourceRow)
            {
                  //bool bRet;
                  Debug.Assert(m_ds!=null);
                  DataTable tbl=m_ds.Tables[sTableName];
                  if (tbl==null)
                  {
                        return false;
                  }
            return RowInTableExists( tbl, rSourceRow);
            }
        public static bool RowInTableExists(DataTable tbl,DataRow rSourceRow )
        {
            //    bool bRet;
            if (tbl == null) { throw new ArgumentNullException("tbl"); }
            DataRow row = FindRow(tbl, rSourceRow);
            return (null!=row) ;
        }
        public static DataRow FindRow(DataTable tbl, DataRow rSourceRow)
        {
           // bool bRet;
            object[] keyValues = PrimaryKeyValues(rSourceRow);
            return tbl.Rows.Find(keyValues);
 
        }
 
            ///<summary>
            /// Returns the array of primary key values for the given row.
            ///</summary>
            ///<returns>array of the row's primary key values</returns>
            public static object[] PrimaryKeyValues(DataRow rSourceRow)
            {
                  DataColumn[] colKeys=rSourceRow.Table.PrimaryKey;
                  ArrayList values = new ArrayList();
                  foreach(DataColumn dc in colKeys)
                  {
                              values.Add( rSourceRow[dc.ColumnName]);
                  }
                  return values.ToArray();
            }
 
            ///<summary>
            /// Checks whether RowIndex is a valid row index for the given table.
            ///</summary>
            ///<returns>true if RowIndex is within the table's row count, false otherwise</returns>
            public static bool IsIndexValid(DataTable tbl,int RowIndex)
            {
                  return ((RowIndex>=0) && (RowIndex< tbl.Rows.Count) ) ;
            }
            public static bool HasChanges(DataTable tbl)
            { //from http://www.groupsrv.com/dotnet/viewtopic.php?t=30224&view=previous
                  //thisDataSet.Tables[tableName]
                  foreach (DataRow dr in tbl.Rows) 
                  {
                        if (dr.RowState != DataRowState.Unchanged)
                              return true;
                  }
                  return false;
            }
 
 
            //use SqlCommandBuilder
        public static DataTable UpdateDataSet(string connString, DataSet ds, string sSelectSQL, string TableName)
            { return UpdateDataSet( connString, ds, sSelectSQL, TableName,false);
            }
            public static DataTable UpdateDataSet(string connString, DataSet ds,string sSelectSQL,string TableName,bool ContinueUpdateOnError )
            {
                  //SqlCommandBuilder commandBuilder;
            DataTable dt = null;
            if (null == TableName)
            {
                dt = ds.Tables[0];//assume that only the first table (with default name "Table") is updated
            }
            else
            {
                dt = ds.Tables[TableName];
            }
            DataTable dtChanges = dt.GetChanges();
                  if (!(dtChanges == null))
                  {
                dtChanges=UpdateDataTable(connString, dt, sSelectSQL, ContinueUpdateOnError);
                //SqlDataAdapter da = new SqlDataAdapter(sSelectSQL, connString);
                //commandBuilder = new SqlCommandBuilder(da);
                //commandBuilder.QuotePrefix =”[“;//TODO for all SqlCommandBuilder
                //commandBuilder.QuoteSuffix =”]”;
                //da.ContinueUpdateOnError = ContinueUpdateOnError;
                //DebugHelper.PrintSqlCommandBuilder(commandBuilder,”UpdateDataSet:”);
                //if (null==TableName)
                //{
                //    da.Update(dsChanges);//assume that only the first table(with defalt name “Table”) is updated
                //    ds.AcceptChanges();
                //}
                //else
                //{
                //    da.Update( dsChanges, TableName);
                //    ds.Tables[TableName].AcceptChanges();
                //    //                                    LogErrors(dsChanges, “UpdateKeywordTables ” + tbl + ” SQL=” + sSQL);
                //}
                  }
            return dtChanges; //important GetErrors
        }
        ///<summary>
        ///
        ///</summary>
        ///<param name=”connString”></param>
        ///<param name=”dt”></param>
        ///<param name=”sSelectSQL”></param>
        ///<param name=”ContinueUpdateOnError”></param>
        ///<returns>can return null if there are no changes</returns>
        public static DataTable UpdateDataTable(string connString, DataTable dt, string sSelectSQL, bool ContinueUpdateOnError)
        {
            DataTable dtChanges = dt.GetChanges();
            if (!(dtChanges == null))
            {
                SqlDataAdapter da = CreateSqlDataAdapterWithBuilder(sSelectSQL, connString, ContinueUpdateOnError);
                da.Update(dtChanges);//assume that only the first table (with default name "Table") is updated
                dtChanges.AcceptChanges();//not sure it is required
                dt.AcceptChanges(); //alternatively dt.Merge(dtChanges) 5/4/2006
                //                                    LogErrors(dsChanges, “UpdateKeywordTables ” + tbl + ” SQL=” + sSQL);
            }
            return dtChanges;
        }
        public static SqlDataAdapter CreateSqlDataAdapterWithBuilder(string sSelectSQL, string connString, bool ContinueUpdateOnError)
        {
            SqlCommandBuilder commandBuilder;
            SqlDataAdapter da = new SqlDataAdapter(sSelectSQL, connString);
            commandBuilder = new SqlCommandBuilder(da);
            commandBuilder.QuotePrefix = "[";//TODO for all SqlCommandBuilder
            commandBuilder.QuoteSuffix = "]";
            da.ContinueUpdateOnError = ContinueUpdateOnError;
            DebugHelper.PrintSqlCommandBuilder(commandBuilder, "UpdateDataSet:");
            return da;
        }
 
        #region “DataView operations”  
 
            public static DataTable CreateTable(DataView vwSource)
            {
                  // short circuiting out here
                  int nRowCount = vwSource.Count;
                  if (0 == nRowCount) return null;
                  DataTable tableNew = vwSource.Table.Clone();// clone the schema
                  // copy the values to the new table
                  foreach(DataRowView rv in vwSource)
                  {
                     tableNew.ImportRow(rv.Row);
                  }
                  return tableNew;
            }
        public static string SqlTableForNewRows(string TableName)
        {
            // short circuiting out here
            string sSQL = "Select * From " + TableName + " where 0=1";
            return sSQL;
        }
        public static DataTable OpenTableForNewRows(string TableName,string connString)
        {
            // short circuiting out here
            string sSQL = SqlTableForNewRows(TableName );
            return ExecuteTable(connString, CommandType.Text, sSQL);
            //DataSet ds = SqlHelper.ExecuteDataset(connString, CommandType.Text, sSQL);
            //DataTable tableNew = ds.Tables[0];
            //return tableNew;
        }
        public static DataTable ExecuteTable(string connectionString, CommandType commandType, string commandText)
        {
            DataSet ds = SqlHelper.ExecuteDataset(connectionString, commandType, commandText);
            DataTable tableFirst = ds.Tables[0];
            return tableFirst;
        }
        public static void Delete(DataView view, int startRecord, int maxRecords)
            {
                  view.RowStateFilter = DataViewRowState.CurrentRows | DataViewRowState.Deleted;
                  int nLastRecord = startRecord + maxRecords - 1;
                  for (int num1 = startRecord; num1 <= nLastRecord; num1++)
            {
                if (num1<view.Count)//5/4/2006
                {
                              view[num1].Delete();
                }
            }
                  view.Table.AcceptChanges();
            }
        public static DataView CloneDataView(DataView vwSource,string newSort)
        {
            return new DataView(vwSource.Table ,vwSource.RowFilter,newSort,vwSource.RowStateFilter) ;
        }
        #endregion “DataView operations”
 
    }//public class DataSetHelper
 
}//namespace