mine_detector

There’s no shortage of options for importing data into (or exporting it from) Microsoft SQL Server databases. Whether you use the explicit “Import” tools or SSIS, it’s become old hat.

One customer I worked with a little while back has a “mission critical” SQL Server Express 2014 database that they want to update from an hourly CSV file drop provided by an outside partner. Don’t get me started on “mission critical” and “Express” in the same sentence, but indulge me for a minute, please.

One option was SSIS, but their particular environment and staffing assignments don’t lend themselves to running SSIS from a “real” SQL Server instance, nor did they wish to deploy SQL Server Management Studio (SSMS) to a machine that does nothing but hourly data imports. Something to do with SSMS licensing; they were in need of Xanax and a bottle of wine, so I went with the path of least anxiety.

Solution A

  • SQL Server 2014 Express (already in place)
  • CSV data file (already in place)
  • PowerShell script (new)
  • Task Scheduler (new)
  • Coffee (I came prepared)

The table (dbo.Products)

(screenshot: the dbo.Products table design)

A Sample CSV file

ProductName,Category,SubCategory,UnitPrice
"Desktop, Dell E300A",2,1,349.99
"Desktop, HP B7550",2,1,449.99

(Yes, this is bogus data, just for the demo.)

Note that the table defines “ProductID” as an identity (auto-increment) column, and “DateAdded” auto-populates via a GETDATE() default constraint. So we only need to supply values for the other four (4) columns: ProductName, Category, SubCategory, and UnitPrice.
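Since the table design is only shown as a screenshot, here’s a rough T-SQL sketch of what dbo.Products probably looks like. Aside from decimal(18,2), the identity column, and the GETDATE() default (all mentioned above), the column types and sizes are my assumptions:

```sql
-- Approximate reconstruction of dbo.Products (types other than decimal(18,2) are assumptions)
CREATE TABLE dbo.Products (
    ProductID   INT IDENTITY(1,1) PRIMARY KEY,       -- auto-populated
    ProductName NVARCHAR(255)  NOT NULL,
    Category    INT            NOT NULL,
    SubCategory INT            NOT NULL,
    UnitPrice   DECIMAL(18,2)  NOT NULL,             -- loose and sloppy, as noted
    DateAdded   DATETIME       NOT NULL DEFAULT GETDATE()  -- auto-populated
);
```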

Yes, decimal(18,2) is kind of loose and sloppy, but so was I.

The PowerShell script

The script is very basic and provides no documentation, input parameters, or exception handling. After all, it’s “cheap” (the “2 out of 3” rule applies). Jeff Hicks, Don Jones, or Jeff Snover would probably throw up after looking at it, but hey: I carry Tums.

# read the hourly CSV drop and build one multi-row INSERT statement
$InputFile = "D:\imports\products.csv"
$DataSet = Import-Csv $InputFile
$sql = "insert into dbo.Products (ProductName,Category,SubCategory,UnitPrice) values "
foreach ($row in $DataSet) {
  $pn = $row.ProductName
  $c1 = $row.Category
  $c2 = $row.SubCategory
  $up = $row.UnitPrice
  $sql += "('$pn',$c1,$c2,$up),"
}

# trim the trailing comma off because I'm lazy :)
$sql = $sql.TrimEnd(',')

$Conn = New-Object System.Data.SqlClient.SqlConnection
$Conn.ConnectionString = "server=SDB4;database=INVX;trusted_connection=true;"
$Conn.Open()

# open up and say AH!
$Cmd = New-Object System.Data.SqlClient.SqlCommand
$Cmd.Connection = $Conn
$Cmd.CommandText = $sql
$Cmd.ExecuteNonQuery() | Out-Null  # an INSERT returns a row count, not a result set
$Conn.Close()
# burp!
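One caveat with string-built SQL: a product name containing a single quote (say, “Bob’s Desktop”) will break the statement, and it’s a SQL injection vector besides. A minimal hardening sketch, assuming the same table and connection string as above, swaps the concatenation for a parameterized SqlCommand per row. Slower (one round trip per insert), but immune to embedded quotes:

```powershell
# Sketch: parameterized inserts instead of string concatenation.
# Assumes the same server/database/CSV as the script above.
$Conn = New-Object System.Data.SqlClient.SqlConnection
$Conn.ConnectionString = "server=SDB4;database=INVX;trusted_connection=true;"
$Conn.Open()

$Cmd = New-Object System.Data.SqlClient.SqlCommand
$Cmd.Connection  = $Conn
$Cmd.CommandText = "insert into dbo.Products (ProductName,Category,SubCategory,UnitPrice) values (@pn,@c1,@c2,@up)"

foreach ($row in (Import-Csv "D:\imports\products.csv")) {
    $Cmd.Parameters.Clear()
    [void]$Cmd.Parameters.AddWithValue("@pn", $row.ProductName)
    [void]$Cmd.Parameters.AddWithValue("@c1", [int]$row.Category)
    [void]$Cmd.Parameters.AddWithValue("@c2", [int]$row.SubCategory)
    [void]$Cmd.Parameters.AddWithValue("@up", [decimal]$row.UnitPrice)
    [void]$Cmd.ExecuteNonQuery()
}
$Conn.Close()
```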

The Scheduled Task

The script, the CSV import file, and the scheduled task are all configured on a server named Server1. The scheduled task is named “Import Products” and runs under a domain service account which has permission to read the CSV file and insert rows into the target SQL Server database “INVX” on server “SDB4”. The script resides in D:\scripts, while the CSV file lives under D:\imports. A separate process drops (overwrites) products.csv every hour.

powershell.exe -ExecutionPolicy Bypass -File "D:\scripts\import_products.ps1"
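For reference, a task like this could also be registered from an elevated command prompt with something along these lines. The service account name and the hourly trigger details are assumptions (the Task Scheduler GUI works just as well):

```
schtasks /Create /TN "Import Products" /SC HOURLY ^
  /TR "powershell.exe -ExecutionPolicy Bypass -File D:\scripts\import_products.ps1" ^
  /RU CONTOSO\svc_import /RP *
```

The `/RP *` switch prompts for the service account password instead of putting it on the command line.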

Total time to write the script, test it, create the scheduled task, test again, and activate: one hour (actually less, but I stopped to talk a lot of shit and spill my coffee whenever possible).

Solution B

Bribe a real DBA to build a simple SSIS task.
