Accelerating DataTable to Database Imports: Best Practices
Importing large datasets from a DataTable into a database requires efficient techniques to minimize processing time. While row-by-row insertion using a foreach loop is a common approach, it is inefficient for substantial datasets: every row pays the cost of its own INSERT statement and server round trip.
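For contrast, here is a minimal sketch of that row-by-row pattern, assuming a hypothetical Products destination table (with Id, Name, and Price columns) and an already-open SqlConnection:
<code class="language-csharp">using System.Data;
using System.Data.SqlClient; // or Microsoft.Data.SqlClient in newer projects

static class RowByRowImporter
{
    // One INSERT statement (and one server round trip) per DataRow -- slow for large tables.
    public static void InsertRowByRow(DataTable table, SqlConnection connection)
    {
        foreach (DataRow row in table.Rows)
        {
            using (var cmd = new SqlCommand(
                "INSERT INTO Products (Id, Name, Price) VALUES (@id, @name, @price)",
                connection))
            {
                cmd.Parameters.AddWithValue("@id", row["Id"]);
                cmd.Parameters.AddWithValue("@name", row["Name"]);
                cmd.Parameters.AddWithValue("@price", row["Price"]);
                cmd.ExecuteNonQuery();
            }
        }
    }
}</code>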
A superior solution is SqlBulkCopy, which streams all rows to the server in a single bulk operation, offering significantly better performance without requiring stored procedures.
Implementing SqlBulkCopy is relatively simple. Start by constructing a SqlBulkCopy instance from a connection string, passing SqlBulkCopyOptions.KeepIdentity if you want the source's identity values preserved rather than regenerated by the destination table.
Next, map DataTable columns to their corresponding database table columns using the ColumnMappings property.
Finally, specify the destination table via DestinationTableName and call the WriteToServer method to transfer the DataTable's contents.
Here's a code example illustrating SqlBulkCopy usage:
<code class="language-csharp">using (var bulkCopy = new SqlBulkCopy(_connection.ConnectionString, SqlBulkCopyOptions.KeepIdentity)) { // Map columns between DataTable and SQL table foreach (DataColumn col in table.Columns) { bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName); } bulkCopy.BulkCopyTimeout = 600; // Increased timeout for large datasets bulkCopy.DestinationTableName = destinationTableName; bulkCopy.WriteToServer(table); }</code>
Using SqlBulkCopy dramatically improves insertion speed when dealing with large DataTables. Remember to raise BulkCopyTimeout (specified in seconds) as needed for exceptionally large datasets.