Multi-record Saves
One of the best ways to improve performance is simply to reduce the number of server round-trips. When a transaction needs to save a list of order items, customer locations, or other related data, the entire data set should be loaded and committed as a batch. Unfortunately, ADO.NET makes this a little more difficult because it doesn't offer a consistent approach across platforms.
Although SQL Server can accept multi-valued parameters via XML, JSON, or even parsable strings, the standard, recommended approach is to use Table Valued Parameters (TVPs). ArgentSea offers a Mapper for TVPs too.
The TVP Mapper accepts a Model object and returns a SqlDataRecord built from the mapping attributes. The Table Valued Parameter accepts a collection (list, array, etc.) of these SqlDataRecord objects. Like all Mapper processes, the conversion logic is extracted and compiled on the first call, offering native-compilation performance on subsequent calls.
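For context, a Model decorated for the Mapper might look like the sketch below. The class name, properties, and attribute arguments here are illustrative assumptions; only the attribute pattern itself reflects ArgentSea's SQL Server mapping conventions.

```csharp
using ArgentSea.Sql;

// Hypothetical Model: property names and attribute arguments are
// assumptions for illustration, not taken from a real schema.
public class ContactModel
{
    [MapToSqlInt("@ContactId")]
    public int ContactId { get; set; }

    [MapToSqlNVarChar("@ContactName", 255)]
    public string ContactName { get; set; }
}
```

The TVP Mapper reads these attributes to build the SqlDataRecord columns, so the attribute parameter names should match the columns of the user-defined table type on the server.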
An implementation, then, simply iterates the list of Model objects, creating the corresponding list of SqlDataRecord objects. The list of records is assigned to a parameter like any other value. Like this:
```csharp
var contactRecords = new List<SqlDataRecord>();
customer.Contacts.ForEach(contact =>
{
    contactRecords.Add(TvpMapper.ToTvpRecord<ContactModel>(contact, _logger));
});
var prms = new ParameterCollection()
    // input parameters require a value; the property name here is assumed
    .AddSqlIntInputParameter("@CustomerId", customer.CustomerId)
    .AddSqlTableValuedParameter("@Contacts", contactRecords);
```
For higher performance with very large data sets, the Batch functionality would allow you to implement a BatchStep that uses the SqlBulkCopy class. This would be similar to the approach used by the PostgreSQL implementation.
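A bulk-load step along those lines might wrap SqlBulkCopy as in the sketch below. This is not ArgentSea's Batch API; the method, table, and parameter names are assumptions, and only the SqlBulkCopy usage itself is standard.

```csharp
using System.Data;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

public static class ContactBulkLoader
{
    // Sketch only: bulk-copies pre-built contact rows to the server.
    // The destination table name is an assumption for illustration.
    public static async Task BulkLoadContactsAsync(
        SqlConnection connection, DataTable contactRows)
    {
        using var bulkCopy = new SqlBulkCopy(connection)
        {
            DestinationTableName = "dbo.ContactStaging",
            BatchSize = 10_000  // tune to the data volume
        };
        await bulkCopy.WriteToServerAsync(contactRows);
    }
}
```

Because SqlBulkCopy streams rows with minimal logging, it typically outperforms TVPs once the row count grows into the tens of thousands, at the cost of bypassing stored-procedure validation.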
Next: Sharding