Someone please explain this to me. I have two scripts reading from the same Excel 2013 spreadsheet, and I get dramatically different performance results. Not even close.



The two computers on which this has been tested are as follows:


  • HP Elite 8200 Desktop
  • 16 GB RAM
  • Windows 8.1 Enterprise 64-bit
  • Microsoft Office 2013 Professional, SP1, updated to 10/2/2014


  • HP Elitebook 9470M Folio Laptop
  • 16 GB RAM
  • Windows 8.1 Professional 64 bit
  • Microsoft Office 2013 Professional, SP1, updated to 10/2/2014

And for the script code… first, the PowerShell version:


$time1 = Get-Date
write-host "start: $time1"

$strPath = "\\server\scripts\ps\Create-BulkADusers-Input.xlsx"

$objExcel = New-Object -ComObject Excel.Application
$objExcel.Visible = $false

$workbook = $objExcel.Workbooks.Open($strPath)
$worksheet = $workbook.Sheets.Item("UsersList")

$intRowMax = ($worksheet.UsedRange.Rows).Count
$intColMax = ($worksheet.UsedRange.Columns).Count

for ($intRow = 2; $intRow -le $intRowMax; $intRow++) {
    for ($intCol = 1; $intCol -le $intColMax; $intCol++) {
        $name = $worksheet.Cells.Item($intRow, $intCol).Value2
        write-host $name
    }
}

$time2 = Get-Date
write-host "finish: $time2"
write-host $($time2 - $time1).TotalSeconds " seconds"

# Close the workbook and quit so the Excel COM instance does not linger
$workbook.Close($false)
$objExcel.Quit()


And the VBScript equivalent:

time1 = Timer
wscript.echo "start: " & time1

Const strPath = "\\server\scripts\ps\Create-BulkADusers-Input.xlsx"

Set objExcel = CreateObject("Excel.Application")
objExcel.Visible = False

Set workBook = objExcel.Workbooks.Open(strPath)
Set workSheet = workBook.Sheets.Item("UsersList")

intRowMax = workSheet.UsedRange.Rows.Count
intColMax = workSheet.UsedRange.Columns.Count

For intRow = 2 To intRowMax
    For intCol = 1 To intColMax
        name = workSheet.Cells.Item(intRow, intCol).Value2
        wscript.echo name
    Next
Next

time2 = Timer
wscript.echo "finish: " & time2
wscript.echo Round(time2 - time1, 2) & " seconds"

' Close the workbook and quit so the Excel instance does not linger
workBook.Close False
objExcel.Quit

The script logic is nearly identical between the two. Each instantiates an Excel Application object, opens the spreadsheet file, and gets the specified worksheet object from the workbook's sheets collection. It then queries the used range to determine the row and column extents. Finally, it iterates the rows, printing the value in each column of each row.

PowerShell nerds: Yes, I know “write-host” is “bad” to use, but I also ran it with “write-verbose” and with no “write-” statements at all. The performance impact in each case was insignificant.

The spreadsheet itself is a standard, boring, Active Directory account bulk-loader layout. It has a heading row, 50 data rows, and 9 columns. The values in most columns are concatenations (e.g. =CONCATENATE()) or calculations using string-parsing functions (e.g. =LEFT(), =MID(), =LOWER()).

Test 1 (desktop) reads the spreadsheet file from a server share which resides in the same LAN with only one switch in between the client and server. Test 2 (laptop) reads the spreadsheet from a local folder. In both cases, the script files are stored in, and executed from, the same folder where the spreadsheet resides.

PowerShell on both test computers is 4.0, running on .NET 4.5.1, with the 4.5.1 Multi-Targeting Pack and SDK installed.

I was a bit surprised (not really, really surprised, but a little bit surprised) at how similar the code looks with such different language constructs. I suppose they’re not that different in some situations.

Anyhow, in both scripts, I bookend the code with before and after time snapshots to derive the total runtime. After running the tests at least a dozen times on both computers, the results are consistent:

  • VBScript: 2-3 seconds
  • PowerShell: 68-75 seconds
  • Average: VBScript+WSH is 22 to 37 times faster
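As an aside, the Get-Date bookends on the PowerShell side could also be expressed with Measure-Command, which times a script block and returns a TimeSpan directly. A minimal sketch, assuming $worksheet, $intRowMax, and $intColMax are already set up as in the script above:

```powershell
# Sketch only: wrap the read loop in Measure-Command instead of taking
# manual Get-Date snapshots. The loop body is the same per-cell read.
$elapsed = Measure-Command {
    for ($intRow = 2; $intRow -le $intRowMax; $intRow++) {
        for ($intCol = 1; $intCol -le $intColMax; $intCol++) {
            $null = $worksheet.Cells.Item($intRow, $intCol).Value2
        }
    }
}
write-host "$($elapsed.TotalSeconds) seconds"
```

This doesn't change the measured behavior; it just makes the timing harness a one-liner and keeps the console output out of the measurement if you suppress it.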

(Screenshots: VBScript timing output; PowerShell timing output)

My intention here is NOT to bash PowerShell. Sorry for the “bash” pun, er, uh, ha ha. I like PowerShell, and I use it as often as I can, but in my line of work I am required to maintain code in many languages, ranging from JavaScript, VBScript, PHP, ASP, SQL, and KiXtart to PowerShell. I want to know if I’m missing something in how I should have PowerShell or .NET configured which may be impacting performance this dramatically.

I’m open to suggestions regarding things I can do to make the PowerShell side run faster on my computers.
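One thing worth noting about the loop itself: Cells.Item(row, col).Value2 crosses the COM interop boundary once for every cell, and that per-call round-trip is typically where the overhead accumulates. Reading the entire used range in a single COM call and then walking the returned 2-D array in memory avoids almost all of those round-trips. A hedged sketch, assuming $worksheet is the same "UsersList" sheet opened as in the script above (Excel returns the range as a 1-based two-dimensional array):

```powershell
# Sketch only: one COM call returns the whole used range as a 2-D array,
# so the nested loops run against in-memory data instead of COM proxies.
$data = $worksheet.UsedRange.Value2   # single interop round-trip

for ($intRow = 2; $intRow -le $data.GetUpperBound(0); $intRow++) {
    for ($intCol = 1; $intCol -le $data.GetUpperBound(1); $intCol++) {
        write-host $data[$intRow, $intCol]
    }
}
```

The same bulk-read pattern works from VBScript, which suggests the gap in my numbers is the cost of the per-cell interop calls rather than Excel itself.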

7 thoughts on “Uh Oh? PowerShell vs. VBScript”

    1. BIG difference! Thank you! After installing the preview and doing a reboot, I ran the script against the same spreadsheet six times. The first two completed in 2.89 and 2.78 seconds. The rest in 1.4 to 1.65 seconds.

    1. Yes. I ran the same script about a dozen times on each computer. CSV is not an option in this particular situation due to customer requirements and not wanting to add another conversion step to their process. It's not ideal anyway; I'd much rather pull from SQL Server than a spreadsheet or other file. Regardless, I wouldn’t expect there to be an inherent performance hit as a default between .NET and flagship Microsoft products like Office.

      Granted. They’d prefer I use VSA or something else, but this much of a performance hit is surprising to me.

      Anyhow, I'm going to try the preview that Lee suggested and post my results. Thank you for your comment!
