One little-known fact about SqlBulkCopy is that its WriteToServer method accepts not only a DataTable but also any object that implements IDataReader. Since SqlDataReader implements the IDataReader interface, it is a perfect candidate for use with a SqlBulkCopy object. One place I found this extremely useful is transferring data between databases using C#.

When this requirement first came to me, I simply used a SqlDataAdapter to populate a DataTable and then used SqlBulkCopy to push that data into a separate database. This worked well until I began moving millions of records at once and started running out of memory. That is when the idea of using a SqlDataReader instead of a DataTable came about, and it fixed the out-of-memory issues I was experiencing.
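For contrast, the original DataTable-based approach looked roughly like this (a sketch, not the exact original code; the connection strings and table names match the example below):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch of the DataTable approach. Adapter.Fill materializes EVERY row
// in memory before the bulk copy begins, which is what caused the
// out-of-memory failures on multi-million-row tables.
using (SqlConnection source = new SqlConnection(connectionStringSource))
using (SqlDataAdapter adapter = new SqlDataAdapter(
    "SELECT * FROM Production.TransactionHistory", source))
{
    DataTable table = new DataTable();
    adapter.Fill(table); // entire result set buffered here

    using (SqlConnection destination = new SqlConnection(connectionStringDestination))
    {
        destination.Open();
        using (SqlBulkCopy cpy = new SqlBulkCopy(destination))
        {
            cpy.DestinationTableName = "TransactionHistory";
            cpy.WriteToServer(table); // copies from the in-memory table
        }
    }
}
```

The only structural difference from the reader-based version is that single Fill call, but it is the difference between buffering the whole result set and streaming it row by row.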

See below for an example:

using System;
using System.Data.SqlClient;

namespace DataReader_Streaming
{
    class Program
    {
        private static string connectionStringSource = "Data Source=SOMESERVER\\SQLEXPRESS;Initial Catalog=AdventureWorks;Integrated Security=SSPI;";
        private static string connectionStringDestination = "Data Source=SOMESERVER\\SQLEXPRESS;Initial Catalog=Development;Integrated Security=SSPI;";

        static void Main(string[] args)
        {
            // Open a reader against the source database and hand it off
            // so rows can be streamed directly into the destination.
            using (SqlConnection conn = new SqlConnection(connectionStringSource))
            {
                conn.Open();
                using (SqlCommand cmd = new SqlCommand("SELECT * FROM Production.TransactionHistory", conn))
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    performBulkCopy(reader);
                }
            }
        }

        private static void performBulkCopy(SqlDataReader r)
        {
            using (SqlConnection conn = new SqlConnection(connectionStringDestination))
            {
                conn.Open();
                using (SqlBulkCopy cpy = new SqlBulkCopy(conn))
                {
                    // WriteToServer accepts any IDataReader, so rows are
                    // pulled from the source as they are written here,
                    // without ever being buffered in a DataTable.
                    cpy.DestinationTableName = "TransactionHistory";
                    cpy.WriteToServer(r);
                }
            }
        }
    }
}

This code is straightforward. The Main method opens a connection to the AdventureWorks database, creates a SqlCommand for the query SELECT * FROM Production.TransactionHistory, calls ExecuteReader on that command, and passes the returned SqlDataReader to the private performBulkCopy method.

Once we are in the performBulkCopy method, we again open a database connection, this time to a development database I created, which has a table called TransactionHistory with the same schema as the source table. Once the connection is open, we create a new SqlBulkCopy object, set the destination table name accordingly, and call WriteToServer with the SqlDataReader passed into the method.
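When the destination schema does not line up exactly with the source, SqlBulkCopy also supports explicit column mappings via its ColumnMappings collection. A sketch of what performBulkCopy would look like with mappings, using a few column names from the AdventureWorks TransactionHistory table for illustration:

```csharp
using (SqlBulkCopy cpy = new SqlBulkCopy(conn))
{
    cpy.DestinationTableName = "TransactionHistory";

    // Map source columns to destination columns by name. Useful when the
    // destination table has a different column order, or fewer columns,
    // than the SELECT feeding the reader. Unmapped columns are ignored
    // once any explicit mapping is added.
    cpy.ColumnMappings.Add("TransactionID", "TransactionID");
    cpy.ColumnMappings.Add("ProductID", "ProductID");
    cpy.ColumnMappings.Add("TransactionDate", "TransactionDate");
    cpy.ColumnMappings.Add("Quantity", "Quantity");

    cpy.WriteToServer(r);
}
```

In the example above no mappings are needed because both tables share the same schema, so SqlBulkCopy maps columns by ordinal position automatically.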

That’s all there is to streaming data between databases using SqlBulkCopy and a SqlDataReader. It comes in very handy when you need to move large amounts of data between databases. Most of the time an in-memory DataTable is fine, but when you run up against memory limits, any object that implements IDataReader is an option.
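For very large transfers, a few SqlBulkCopy settings are worth knowing about. A sketch of a tuned version of the copy (the specific values are illustrative, not recommendations):

```csharp
using (SqlBulkCopy cpy = new SqlBulkCopy(conn))
{
    cpy.DestinationTableName = "TransactionHistory";

    cpy.BatchSize = 10000;    // send rows to the server in batches rather
                              // than as one giant operation
    cpy.BulkCopyTimeout = 0;  // disable the timeout; multi-million-row
                              // copies can easily exceed the default

    // Report progress periodically via the SqlRowsCopied event.
    cpy.NotifyAfter = 50000;
    cpy.SqlRowsCopied += (sender, e) =>
        Console.WriteLine("Copied {0} rows so far.", e.RowsCopied);

    cpy.WriteToServer(r);
}
```

BatchSize in particular matters for restartability: each batch is committed separately, so a failure midway does not roll back everything already copied (unless the copy is wrapped in an external transaction).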

Stay tuned to learn how you can implement IDataReader in your own classes so you can stream custom objects in bulk into your database.

Streaming Data Between Databases using SqlBulkCopy and SqlDataReader