Why does UpdateDataSet duplicate the inserts n times?

Topics: Data Access Application Block
Mar 19, 2009 at 8:50 PM

I am running the following code

DataSet ds = db.ExecuteDataSet( CommandType.Text, "select mpid from " + dbAndTableName );

// add rows into the table
foreach (var mp in mpidList) {
    DataRow newRow = ds.Tables[0].NewRow();
    newRow["mpid"] = mp.mpid;
    ds.Tables[0].Rows.Add( newRow );
}

string insertSql = "insert into " + dbAndTableName
    + " (mpid) values (@mpid)";
dbCommand = db.GetSqlStringCommand( insertSql );
db.AddInParameter( dbCommand, "@mpid", DbType.String, "mpid", DataRowVersion.Current );

db.UpdateDataSet( ds, ds.Tables[0].TableName, dbCommand, null, null,
    UpdateBehavior.Continue, 1000 );
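For comparison, the same batched insert can be sketched in plain ADO.NET, where SqlDataAdapter.UpdateBatchSize plays the role of the batch-size argument above. The connection string name here is illustrative; the key detail is that the parameter's SourceColumn ("mpid") is supposed to be re-read from each DataRow per batched statement, and UpdatedRowSource must be None for batching to work:

```csharp
// Hypothetical ADO.NET equivalent of the UpdateDataSet call above,
// assuming connString points at the same SQL Server database.
using (var conn = new SqlConnection(connString))
using (var adapter = new SqlDataAdapter())
{
    var insert = new SqlCommand(
        "insert into " + dbAndTableName + " (mpid) values (@mpid)", conn);

    // The fourth argument is the SourceColumn: it binds @mpid to the
    // "mpid" column of each DataRow, so every row in the batch should
    // supply its own value.
    insert.Parameters.Add("@mpid", SqlDbType.NVarChar, 50, "mpid");

    // Required when UpdateBatchSize != 1; the adapter cannot refresh
    // the DataRow from a batched statement.
    insert.UpdatedRowSource = UpdateRowSource.None;

    adapter.InsertCommand = insert;
    adapter.UpdateBatchSize = 1000;
    adapter.Update(ds.Tables[0]);
}
```

If the provider mishandles the per-row parameter binding when batching, every statement in a batch can end up reusing one row's value, which matches the symptom described below.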
   
   
When I check the table I see every 1,000th value duplicated 1,000 times. Assume we have 4,000
rows: it picks rows 1000, 2000, 3000 and 4000 and inserts each of them 1,000 times.
Changing the batch size from 1000 to 1 gives the correct result.
Why is this happening? I would like to use the batch update size.

Mar 20, 2009 at 2:54 AM

Hi Georgio,

Are the rows in mpidList the same rows from dataset ds, being inserted back into the same table 1,000 times? If that were the case, it really would end up with each mpid duplicated 1,000 times. :)

Mar 20, 2009 at 3:29 AM
The data in the DataTable are unique; say we have the values 1 - 4000. The BatchUpdateSize is set to 1000. The code picks the values 1000, 2000, 3000 and 4000 and inserts each of them 1,000 times into the db. The data will look something like this after the insert:
mpid
1000
1000
1000
..... 1,000 copies of 1000, like the above
2000
2000
2000
..... another 1,000 copies of 2000
3000
3000
..... another 1,000 copies of 3000
and so on.

I want to see the unique values of 1-4000 in the table ie
1
2
3
....
4000
but I would like to use a batch size of 1000 so I can save round trips to the SQL db.

How do I accomplish the above? What am I doing wrong?

Thanks





Mar 20, 2009 at 4:04 AM
Will the ID be inserted into the same table? If so, I think you will encounter a uniqueness violation in the process. I'm trying to replicate what you're doing and hope I can provide an answer. :)
Apr 1, 2009 at 7:22 PM
This may be an issue with the data provider I am using. Looking into it.

Thanks
Apr 9, 2009 at 2:53 PM
This is an issue with the provider.
Apr 13, 2009 at 9:13 AM
Thanks for the Info.