
Export multiple SQL Server tables to CSV files

by Vlad Drumea

In this post I cover two ways to export and import multiple SQL Server tables to and from CSV in one go with the help of a little PowerShell magic and tools like dbatools or bcp.

If you’ve ever needed to export multiple tables from SQL Server to CSV files you might have noticed that SSMS’s Import and Export Wizard only lets you export one table at a time.

That’s fine if you only have a few tables to export, but it becomes pretty tedious with more than a handful.

Intro

In my example I’ll be exporting the results from the following 3 views found in the AdventureWorks2019 database:

  • HumanResources.vEmployee
  • HumanResources.vEmployeeDepartment
  • HumanResources.vEmployeeDepartmentHistory

Note that even though my example exports data from views, the same approach applies to tables.

Using dbatools

Export

This requires dbatools to be installed, so if you don’t already have it, install it first (for example with Install-Module dbatools).

Note that I’m using the backtick character to split a command into multiple lines.

I’m using the -TrustServerCertificate flag because my instance isn’t configured for encrypted connections. If you want other ways of avoiding encryption-related errors when using newer versions of dbatools, check this post out.

In this example I’m using dbatools’ Invoke-DbaQuery to execute a query built from each table name, and piping the result into Export-Csv, which writes it to a Unicode-encoded CSV file with | as the delimiter.
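Here’s a sketch of what that export loop can look like. The instance name and output folder are placeholders, and I’m handling -TrustServerCertificate once via Connect-DbaInstance and reusing the connection:

```powershell
# Views to export; the same list works for tables
$objects = @(
    'HumanResources.vEmployee'
    'HumanResources.vEmployeeDepartment'
    'HumanResources.vEmployeeDepartmentHistory'
)

# Trust the server certificate since the instance
# isn't configured for encrypted connections
$server = Connect-DbaInstance -SqlInstance 'localhost' -TrustServerCertificate

foreach ($object in $objects) {
    # -As PSObject keeps DataRow bookkeeping columns
    # (RowError, RowState, etc.) out of the CSV
    Invoke-DbaQuery -SqlInstance $server -Database 'AdventureWorks2019' `
        -Query "SELECT * FROM $object;" -As PSObject |
        Export-Csv -Path "C:\temp\$object.csv" `
            -Delimiter '|' -Encoding Unicode -NoTypeInformation
}
```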


No output is a good sign in this case.

And this is what the contents of one of the CSV files look like.

It contains the column headers, as well as all the values enclosed in quotation marks and separated by |.

Import

To import the data I’ve used a lazy trick to create tables with the same column names and data types as the views from which I’ve exported the data.

I’m pre-creating the tables because, in my specific use case, the tables that I wanted to import to already existed.

Plus I don’t want the import process to assume the data type and width of the columns based on the csv contents alone.
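A minimal version of that trick, run through Invoke-DbaQuery; the dbo.* target table names here are made up for this example:

```powershell
# SELECT TOP (0) ... INTO creates an empty table that inherits
# the column names and data types of the source view
$ddl = @"
SELECT TOP (0) * INTO dbo.Employees FROM HumanResources.vEmployee;
SELECT TOP (0) * INTO dbo.EmployeeDepartment FROM HumanResources.vEmployeeDepartment;
SELECT TOP (0) * INTO dbo.EmployeeDepartmentHistory FROM HumanResources.vEmployeeDepartmentHistory;
"@

Invoke-DbaQuery -SqlInstance $server -Database 'AdventureWorks2019' -Query $ddl
```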

I then used dbatools’ Import-DbaCsv combined with Get-ChildItem to import all the CSV files from a specific path.

The -UseFileNameForSchema switch tells Import-DbaCsv to treat each file name as schema.table and load the data into the matching table.

This spares me the effort of having to pass the target table names.
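Pieced together, the import might look like this. It assumes the CSV files were named after the target tables in schema.table form (e.g. dbo.Employees.csv), since that’s what -UseFileNameForSchema keys on:

```powershell
# Import every CSV from the folder; the target schema and table
# are derived from each file's name
Get-ChildItem -Path 'C:\temp\*.csv' |
    Import-DbaCsv -SqlInstance $server -Database 'AdventureWorks2019' `
        -Delimiter '|' -UseFileNameForSchema
```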


Checking a small sample of the newly populated Employees table, everything looks good.
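Something like this is enough for a quick spot-check (using the hypothetical dbo.Employees table from earlier):

```powershell
# Pull a few rows to eyeball the imported data
Invoke-DbaQuery -SqlInstance $server -Database 'AdventureWorks2019' `
    -Query 'SELECT TOP (5) * FROM dbo.Employees;' | Format-Table
```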

BCP

bcp stands for Bulk Copy Program, a command-line utility used to bulk export and import large numbers of records from and to SQL Server tables.

Export

The logic of the commands is similar to the dbatools example (I create a list of tables and iterate through it); the only part that changes is the bcp-specific command.
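A sketch of that loop, with the server name and output folder as placeholders; the switches are -T for a trusted connection, -w for Unicode, and -t to set | as the field terminator:

```powershell
# Same list of views as in the dbatools example
$objects = @(
    'HumanResources.vEmployee'
    'HumanResources.vEmployeeDepartment'
    'HumanResources.vEmployeeDepartmentHistory'
)

foreach ($object in $objects) {
    # queryout exports the result of an arbitrary SELECT to a file
    bcp "SELECT * FROM AdventureWorks2019.$object" queryout "C:\temp\$object.csv" `
        -S 'localhost' -T -w -t '|'
}
```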

And this is what the output looks like:

Starting copy…
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 17 for SQL Server]Warning: BCP import with a format file will convert empty strings in delimited columns to NULL.

290 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 16 Average : (18125.00 rows per sec.)

Starting copy…
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 17 for SQL Server]Warning: BCP import with a format file will convert empty strings in delimited columns to NULL.

290 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (290000.00 rows per sec.)

Starting copy…
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 17 for SQL Server]Warning: BCP import with a format file will convert empty strings in delimited columns to NULL.

296 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (296000.00 rows per sec.)

The warning about empty strings being converted to NULL is just a warning and can be safely ignored.

And this is what a sample of one of the resulting files looks like.


Note that, in this case, there are no headers and the values aren’t enclosed in quotation marks.

Import

First, I truncate the test tables from the previous import.
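For example (table names as assumed earlier):

```powershell
# Empty the target tables so the bcp import starts clean
foreach ($table in 'dbo.Employees', 'dbo.EmployeeDepartment', 'dbo.EmployeeDepartmentHistory') {
    Invoke-DbaQuery -SqlInstance $server -Database 'AdventureWorks2019' `
        -Query "TRUNCATE TABLE $table;"
}
```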

The import using bcp is a bit more complex than with dbatools, since it needs to run in a loop that pairs each file with its target table.
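A sketch of such a loop; the file-to-table mapping is hypothetical and depends on how the exported files were named:

```powershell
# Map each CSV file to its target table (names are hypothetical)
$map = @{
    'HumanResources.vEmployee.csv'                  = 'dbo.Employees'
    'HumanResources.vEmployeeDepartment.csv'        = 'dbo.EmployeeDepartment'
    'HumanResources.vEmployeeDepartmentHistory.csv' = 'dbo.EmployeeDepartmentHistory'
}

foreach ($file in $map.Keys) {
    # 'in' loads a file into a table; same -T/-w/-t switches as the export
    bcp "AdventureWorks2019.$($map[$file])" in "C:\temp\$file" `
        -S 'localhost' -T -w -t '|'
}
```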

And the output looks like this:

Starting copy…

290 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (290000.00 rows per sec.)

Starting copy…

290 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (290000.00 rows per sec.)

Starting copy…

296 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 16 Average : (18500.00 rows per sec.)

Comparing the data again to make sure it looks good.

Conclusion

A little bit of PowerShell goes a long way when it comes to speeding up data export and import to and from CSV files involving multiple SQL Server tables.
