In this post I cover two ways to export/import multiple SQL Server tables to/from CSV in one go with the help of a little PowerShell magic and tools like dbatools or bcp.
If you’ve ever needed to export multiple tables from SQL Server to CSV files, you might have noticed that SSMS’s Import and Export Wizard only lets you export one table at a time.
That’s fine for a small number of tables, but it gets pretty tedious once you need more than a handful.
Intro
In my example I’ll be exporting the results from the following 3 views found in the AdventureWorks2019 database:
- HumanResources.vEmployee
- HumanResources.vEmployeeDepartment
- HumanResources.vEmployeeDepartmentHistory
Note that even though in my example I’m exporting data from views, the same applies to tables.
Using dbatools to export and import multiple CSV files into SQL Server
Export SQL Server data to CSV files
This requires dbatools to be installed, so if you don’t already have it, you should get it from here.
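If you prefer to install it from the command line, it’s available in the PowerShell Gallery:

#Install dbatools from the PowerShell Gallery for the current user
Install-Module -Name dbatools -Scope CurrentUser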
#Initiate a connection
$MyConnection = Connect-DbaInstance -SqlInstance LOCALHOST\VSQL2019 -TrustServerCertificate
#Define a list of tables
$Tables = @("HumanResources.vEmployee",
    "HumanResources.vEmployeeDepartment",
    "HumanResources.vEmployeeDepartmentHistory")
#Iterate through the list
foreach ($Table in $Tables) {
    #Build the query
    $Query = "SELECT * FROM $Table;"
    #Build the output file path and remove the schema, dot, and v from the file name
    #and append the .csv extension
    $CsvPath = "F:\CSVOut\" + $Table.replace('HumanResources.v', '') + ".csv"
    #Run the query and export results to CSV
    Invoke-DbaQuery -SqlInstance $MyConnection -Database AdventureWorks2019 `
        -Query "$Query" | Export-Csv -Path "$CsvPath" `
        -Delimiter "|" -NoTypeInformation -Encoding unicode
}
Note that I’m using the backtick character to split a command into multiple lines.
I’m using the -TrustServerCertificate flag because my instance isn’t configured for encrypted connections. If you want other ways of avoiding encryption-related errors when using newer versions of dbatools, check this post out.
So, in this example, I’m using dbatools’ Invoke-DbaQuery to execute the query built for each table name, and the result is piped into Export-Csv, which writes it to a Unicode-encoded CSV file using | as the delimiter.

No output is a good sign in this case.
And this is what the contents of one of the CSV files look like.

It contains the column headers as well as all the values, enclosed in quotation marks and separated by |.
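For reference, the first couple of lines of Employee.csv come out looking roughly like this (most columns trimmed to ... here for readability):

"BusinessEntityID"|"Title"|"FirstName"|"MiddleName"|"LastName"|...
"1"|""|"Ken"|"J"|"Sánchez"|...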
Import CSV files into SQL Server
To import the data, I’ve used a lazy trick to create tables with the same column names and data types as the views from which I exported the data.
I’m pre-creating the tables because, in my specific use case, the tables that I wanted to import into already existed.
Plus, I don’t want the import process to guess the data types and widths of the columns based on the CSV contents alone. Keep in mind that this trick only copies column names, data types, and nullability; indexes and constraints don’t carry over.
USE [AdventureWorks2019]
GO
SELECT TOP 0 * INTO Employee
FROM HumanResources.vEmployee;
GO
SELECT TOP 0 * INTO EmployeeDepartment
FROM HumanResources.vEmployeeDepartment;
GO
SELECT TOP 0 * INTO EmployeeDepartmentHistory
FROM HumanResources.vEmployeeDepartmentHistory;
GO
Then I used dbatools’ Import-DbaCsv combined with Get-ChildItem to import all the CSV files from a specific path.
Get-ChildItem -Path F:\CSVOut\*.csv | Import-DbaCsv -SqlInstance $MyConnection `
    -Database AdventureWorks2019 -UseFileNameForSchema -Delimiter "|"
Since I’m not passing a -Table parameter, Import-DbaCsv loads the data into tables named after the CSV files, which spares me the effort of having to specify the target table names. The -UseFileNameForSchema switch additionally treats anything before a dot in a file name as the schema name; my file names contain no dots, so the tables resolve to the default schema.
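If the file names didn’t match the target table names, a loop that derives the table name and passes it explicitly via -Table would work too. A minimal sketch, assuming each file’s base name maps cleanly to a table:

foreach ($CsvFile in Get-ChildItem -Path F:\CSVOut\*.csv) {
    #Use the file name without the extension as the target table name
    $TargetTable = $CsvFile.BaseName
    Import-DbaCsv -Path $CsvFile.FullName -SqlInstance $MyConnection `
        -Database AdventureWorks2019 -Table $TargetTable -Delimiter "|"
}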

Checking a small sample of the newly populated Employee table against the source view, everything looks good.
USE [AdventureWorks2019]
GO
SELECT TOP (10) *
FROM Employee;
GO
SELECT TOP (10) *
FROM HumanResources.vEmployee;
GO

Using BCP to export and import multiple CSV files into SQL Server
bcp stands for Bulk Copy Program; it’s a command-line utility used to export or import large numbers of records from and to SQL Server tables.
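Its basic anatomy is: the object to copy, a direction keyword (in, out, or queryout for ad hoc queries), the data file, and then connection and format switches. A hypothetical minimal example (instance, database, and table names are placeholders):

#General form: bcp <object> {in | out | queryout} <datafile> <switches>
#Copy dbo.SomeTable out to a Unicode file, using Windows authentication
bcp dbo.SomeTable out C:\Temp\SomeTable.csv -w -S SERVERNAME -d SomeDatabase -T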
Export SQL Server data to CSV files
The logic of the commands is similar to the dbatools version (I create a list of tables and iterate through it); the only part that changes is the bcp-specific command.
$Tables = @("HumanResources.vEmployee",
    "HumanResources.vEmployeeDepartment",
    "HumanResources.vEmployeeDepartmentHistory")
foreach ($Table in $Tables) {
    $CsvPath = "F:\CSVOut\" + $Table.replace('HumanResources.v', '') + ".csv"
    #-w = Unicode output, -t = field terminator, -T = Windows authentication, -k = keep NULLs
    bcp "$Table" out "$CsvPath" -w -t"|" -S LOCALHOST\VSQL2019 -d AdventureWorks2019 -T -k
}
And this is what the output looks like.
Starting copy…
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 17 for SQL Server]Warning: BCP import with a format file will convert empty strings in delimited columns to NULL.
290 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 16 Average : (18125.00 rows per sec.)

Starting copy…
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 17 for SQL Server]Warning: BCP import with a format file will convert empty strings in delimited columns to NULL.
290 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (290000.00 rows per sec.)

Starting copy…
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 17 for SQL Server]Warning: BCP import with a format file will convert empty strings in delimited columns to NULL.
296 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (296000.00 rows per sec.)
Despite being labeled as an error, the message about empty strings being converted to NULL is just a warning and can be safely ignored.
And this is what a sample of one of the resulting files looks like.

Note that, in this case, there are no headers and the values aren’t enclosed in quotation marks.
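As a side note, if you’d rather load header-less files like these back in with dbatools instead of bcp, Import-DbaCsv expects a header row by default, so you’d need its -NoHeaderRow switch; something along these lines:

#Import a header-less, pipe-delimited file produced by bcp into an existing table
Import-DbaCsv -Path F:\CSVOut\Employee.csv -SqlInstance $MyConnection `
    -Database AdventureWorks2019 -Table Employee -Delimiter "|" -NoHeaderRow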
Import CSV files into SQL Server
First, I truncate the test tables from the previous import.
USE [AdventureWorks2019]
GO
TRUNCATE TABLE [Employee];
GO
TRUNCATE TABLE [EmployeeDepartment];
GO
TRUNCATE TABLE [EmployeeDepartmentHistory];
GO
The import using bcp is a bit more involved than with dbatools, since it has to run in a loop and the target table name has to be derived from each file name. Note that, because the bare table names carry no schema, they resolve to the default schema (dbo in this case).
$CSVPath = "F:\CSVOut"
#Grab just the file names of all the CSV files in the folder
$CsvFiles = Get-ChildItem -Path $CSVPath\*.csv | Select-Object -ExpandProperty Name
foreach ($CSV in $CsvFiles) {
    $FilePath = "$CSVPath\$CSV"
    #Strip the extension to get the target table name
    $Target = $CSV.replace('.csv', '')
    bcp "$Target" in "$FilePath" -w -t"|" -S LOCALHOST\VSQL2019 -d AdventureWorks2019 -T -k
}
And the output looks like this:
Starting copy…
290 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (290000.00 rows per sec.)

Starting copy…
290 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (290000.00 rows per sec.)

Starting copy…
296 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 16 Average : (18500.00 rows per sec.)
Comparing the data again to make sure it looks good.
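If you’d rather do the check from PowerShell than from SSMS, comparing row counts between each table and its source view is a quick sanity test; a minimal sketch reusing the connection from earlier:

#Compare imported row counts against the source views
$Pairs = [ordered]@{
    "Employee"                  = "HumanResources.vEmployee"
    "EmployeeDepartment"        = "HumanResources.vEmployeeDepartment"
    "EmployeeDepartmentHistory" = "HumanResources.vEmployeeDepartmentHistory"
}
foreach ($Pair in $Pairs.GetEnumerator()) {
    $Query = "SELECT '$($Pair.Key)' AS TableName,
        (SELECT COUNT(*) FROM $($Pair.Key)) AS ImportedRows,
        (SELECT COUNT(*) FROM $($Pair.Value)) AS SourceRows;"
    Invoke-DbaQuery -SqlInstance $MyConnection -Database AdventureWorks2019 -Query $Query
}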

Conclusion
A little bit of PowerShell goes a long way when you need to export or import multiple SQL Server tables to and from CSV files.
If you’re importing CSV files with Linux line endings, you might run into the issue covered in this post.
2 comments
Thanks for the great article.
I’m new to using DbaTools, but got it installed, and I’ve been using the modified version of your “export to csv” script from this page. Upon pasting my script to the posh cmd line, I don’t get any output and no csv is created in the directory specified.
I pasted my code (below) and added -Verbose, but still no output; the cmd line does not return to the initial directory. These are small tables, so I wouldn’t think it would still be processing? Would there be a good test to run to see what the problem is? Thanks
#Initiate a connection
$MyConnection = Connect-DbaInstance -SqlInstance LOCALHOST\MSSQLSERVER -TrustServerCertificate
#Define a list of tables
$Tables = @("cop.aaaa1_table4", "cop.aaaa1_table5", "cop.newspic")
#Iterate through the list
foreach ($Table in $Tables) {
    #Build the query
    $Query = "SELECT * FROM $Table;"
    #Build the output file path and remove the schema, dot, and v from the file name
    #and append the .csv extension
    $CsvPath = "E:\CSVOut\" + $Table.replace('cop.', '') + ".csv"
    #Run the query and export results to CSV
    Invoke-DbaQuery -SqlInstance $MyConnection -Database cop `
        -Query "$Query" | Export-Csv -Path "$CsvPath" `
        -Delimiter "|" -NoTypeInformation -Encoding unicode -Verbose 4>&1
}
Hi Louis,
I’m away from my computer, so I have no way to troubleshoot this for you, but it seems like something specific to your environment.
The small number of records shouldn’t be an issue.
If no files are generated in the E:\CsvOut directory, I’d start from the Invoke-DbaQuery command to make sure data is actually being retrieved, and then proceed from there.