Category Archives: general

Copy a Formula Down a Column for Thousands of Rows in Microsoft Excel

For whatever reason, I had never figured out the easy way to do this until today.  There are two approaches, one with the mouse, and one with the keyboard:

Mouse Method

  1. Click to select the formula in the top cell.  If you’re still editing the formula, press Ctrl-Enter to save and stay in the cell.
  2. Double-click the “fill handle” in the lower-right corner of the cell … make sure you’re hovering with the “narrow cross” (thin black cross), not the “fat cross” (thick white selection cross).

Keyboard Method

  1. Navigate to the top cell containing the formula.  If you’re still editing the formula, press Ctrl-Enter to save and stay in the cell.
  2. Press Ctrl-C to copy the formula.
  3. Press the down arrow to move to the cell directly below the copied cell.
  4. Press Ctrl-Shift-End to extend the selection from the target cell down through the end of the sheet’s used range.
  5. Press Ctrl-V to paste the formula into the selected range of cells.

That’s it!  With the mouse method, the formula is copied down until an entirely blank row is reached, so make sure you have at least one cell in every row filled in, or the fill will stop at the empty row.  (The keyboard method instead pastes through the end of the used range.)  With either method, you can verify the result by pressing Esc to clear the copy selection, then Ctrl-End to jump to the last used cell.
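The stop-at-a-blank-row behavior is easy to mimic if you ever need it outside Excel. Here’s a minimal Python sketch over a plain list-of-lists “sheet” (the grid and the formula string are stand-ins, not a real Excel API):

```python
# Mimic Excel's double-click fill-handle behavior: copy the top cell's
# value (here, a formula string) down the column until an entirely
# blank row is reached.
def fill_down(grid, col):
    """grid: list of rows (lists of cell values); col: index of the column to fill."""
    formula = grid[0][col]
    for row in grid[1:]:
        if all(cell in (None, "") for cell in row):
            break  # stop at the first entirely blank row, like Excel
        row[col] = formula
    return grid

sheet = [
    ["=B1*2", "10"],
    [None,    "20"],
    [None,    "30"],
    [None,    None],   # entirely blank row: the fill stops here
    [None,    "40"],
]
fill_down(sheet, 0)
```

After running, rows 2 and 3 get the formula, while everything at and below the blank row is left untouched.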

ASCII Delimited Text: Why the “Right” Answer Isn’t Very Useful

To overcome some issues with the SQL Server bulk import/export process, I went shopping for an unquoted, delimited format this morning.  There’s a very old, very simple solution that would have worked fine in my specific circumstance.  Here’s why I’m not going to use it:

The problem I had was that I need to export using the SQL Server bcp utility and re-import using BULK INSERT.  While you can specify the separators with these tools, they don’t handle quoted fields at all, i.e. quoting, if present, doesn’t actually protect the separators should they appear in data.  Per the documentation:

If a terminator character occurs within the data, it is interpreted as a terminator, not as data, and the data after that character is interpreted as belonging to the next field or record. Therefore, choose your terminators carefully to make sure that they never appear in your data.

This was a problem, because my data actually could contain tabs, carriage returns and line feeds, which were the default separators.  It also definitely contained commas.  That puts all the common delimiters out of commission.  “Choose your terminators carefully” is pretty vague, but I figured why not give it a shot?

The approach I almost took was to use the ASCII “record separator” and “unit separator” (ASCII 30 and 31, HEX 0x1E and 0x1F respectively).   Here’s a blog post that nicely sums up the format and why it’s historically and logically valid.
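The format itself is trivial: join fields with the unit separator (0x1F) and records with the record separator (0x1E), with no quoting needed as long as the data never contains those bytes. A quick Python illustration:

```python
US = "\x1f"  # ASCII 31, unit (field) separator
RS = "\x1e"  # ASCII 30, record separator

rows = [["a,b", "line1\nline2"], ["c\td", "e"]]

# Encode: no quoting or escaping needed; commas, tabs, and newlines
# pass through untouched because they aren't the separators.
blob = RS.join(US.join(fields) for fields in rows)

# Decode round-trips exactly.
decoded = [record.split(US) for record in blob.split(RS)]
assert decoded == rows
```

Which is exactly the appeal: the separators do all the work, and embedded commas, tabs, and newlines survive untouched.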

Though it’s not well-documented (somewhat contra-documented), SQL Server supports this.   Even though they suggest that only “printable” characters and a small set of control characters are valid, I had no problem exporting an ASCII-delimited text file using a command like this:

bcp "SELECT x,y FROM t" queryout "c:\filename.asc" -c -r 0x1e -t 0x1f -UTF8 -T -Slocalhost

I didn’t get as far as trying the BULK INSERT on the other end, though, and here’s why…

Once I had that “ASCII delimited” file, I opened it in Notepad++ to verify that the format was readable and correct.  It was, but the effect wasn’t pretty.  I immediately realized that if I wanted to do anything else with this data–extract a column as text, import into Excel–I was going to have problems.  Excel’s text import wizard, for example, doesn’t support control characters other than tab.  This wasn’t really news to me, as I see weird control characters in ASCII all the time working with magnetic stripes, bar codes, RFID, and other encoding and storage mechanisms.  Yes, you can eventually get the data into a usable form with search and replace or, worst case, regular expressions, but why make it hard to manage if you don’t have to?

In my case, the whitespace control characters in the data improved readability but weren’t functionally required–the data payload itself was code.  Plus, I had comment blocks available as a “quote-like” protection structure.  So, I ended up compromising on replacing the whitespace control characters in such a way that I can get them back if I want to, or leave them alone for the automated process.  What I ended up doing was this:

bcp "SELECT x,REPLACE(REPLACE(REPLACE(y,CHAR(13),'/*CR*/'),CHAR(10),'/*LF*/'),CHAR(9),'/*TAB*/') FROM t" queryout "c:\filename.tsv" -c -UTF8 -T -Slocalhost

That produces a “normal” tab-separated file with CRLF as the record separator.  I knew that “x” couldn’t contain those characters, so by replacing them out of “y” I have a file that safely exports and imports while being viewable, editable and importable using normal tools without excessive processing.
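The replacement is also reversible if I ever want the original whitespace back. A Python sketch of the round trip (the token names match the REPLACE() calls above; purely illustrative):

```python
# Map whitespace control characters to comment-style tokens, as in the
# REPLACE() calls in the bcp query, and back again.
TOKENS = [("\r", "/*CR*/"), ("\n", "/*LF*/"), ("\t", "/*TAB*/")]

def escape(text):
    for ch, token in TOKENS:
        text = text.replace(ch, token)
    return text

def unescape(text):
    for ch, token in TOKENS:
        text = text.replace(token, ch)
    return text

original = "if (x) {\r\n\treturn;\r\n}"
safe = escape(original)
assert "\n" not in safe and "\t" not in safe   # now safe for a .tsv
assert unescape(safe) == original              # lossless round trip
```

This assumes, as in my case, that the literal token strings never appear in the data themselves; otherwise you’re back to the same separator-collision problem one level up.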

I wish we had kept better support for ASCII control characters in our tools as we moved forward with new technologies–it would have been useful to have distinct separators that never appeared in data (until, inevitably, your data payloads themselves contained ASCII-delimited data, at which point you’re back to quoting or replacing or choosing new separators… rinse… repeat).  Of course another solution would have been making the SQL Server utilities just slightly more robust so they could process modern quoted .csv and .tsv formats.  There’s always next time, right?

Fix Missing Boot Loader After DISKPART UNIQUEID DISK on Windows Server 2008R2 VM Under Hyper-V

Just had a scary one… After setting up Windows Server 2012 R2 to back up all its Hyper-V VMs through the host’s nightly Windows Backup I found that only one of the three Windows Server 2008 R2 VM clients was being backed up.  The other two were failing, one with a disk CRC error, the other for no listed reason.  Based on some searching, I suspected a problem with disk ID clashes–all the VMs were from the same template .vhdx file, so they all had the same “unique” ID.  Oops.  This is actually supposed to be handled okay by the current version of Windows Backup on a Hyper-V host, but I figured I’d eliminate it as a potential issue anyway.  So, I had run DISKPART UNIQUEID DISK to reset the IDs on all the partitions (virtual disks) in the pool of VMs to unique values.   This morning I checked the logs and saw the backups of the two VMs again failed.

Next I tried a chkdsk on all the volumes on the affected VMs.  However, after scheduling the chkdsks and rebooting the VMs I got the dreaded black screen of “no boot device can be found.”  This happened on both the VMs I had tried (the third I hadn’t attempted to reboot yet), for which I, of course, had no backup.  Oh no.  Unlike Windows 7, Windows Server 2008 R2 lacks any automatic repair tool.  Fortunately, a quick Google search turned up a Tom’s Hardware thread that indicated that the “repair” command prompt available from the original install DVD would allow the repair of the boot record with this command:

Bootrec /RebuildBcd

That command actually will scan all the mounted partitions and offer to add any Windows installations to the boot loader.  In my case, there was only one installation offered per VM and by selecting it I was back in business.  The scheduled disk checks even kicked in and now all the machines are up and running.  Whether this actually fixes the backup issue we’ll see tomorrow…



Visual Studio 2010 SP1 Slow to Start with “Loading toolbox content” Status

I still install VS2010 on all new machines for a number of reasons.  It seems that inevitably over the course of the life of the install I will eventually run into a problem where Visual Studio becomes slow to start, getting “stuck” for many seconds with this message in the status bar:

Loading toolbox content from package Microsoft.VisualStudio.IDE.Toolbox.ControlInstaller.ToolboxInstallerPackage ‘{2C298B35-07DA-45F1-96A3-BE55D91C8D7A}’

I initially theorized this had something to do with the Telerik control suite updating (via its outside “Control Panel” installer), but even completely uninstalling that didn’t solve it.  Since I keep doing a Google search to remember the real solution, I wanted to permalink the answer.  Basically:

  1. Close all instances of Visual Studio
  2. Back up registry (regedit, File|Export, Export Range: All)
  3. Delete registry key: [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\10.0\Packages\{2c298b35-07da-45f1-96a3-be55d91c8d7a}] (I also back up the specific key I’m deleting before I do it, just in case I click the wrong one)
  4. Navigate to C:\Users\WindowsUserAccount\AppData\Local\Microsoft\VisualStudio\10.0\
  5. Move toolbox*.tbd (typically four files) out of this folder and to a backup location.
  6. Restart Visual Studio

The toolbox*.tbd files will be immediately recreated on launch, probably at a much smaller size.  After the first launch, during which the toolbox is being rebuilt, Visual Studio should start much more quickly.
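Steps 4–5 are just a glob-and-move, so if you hit this often they’re easy to script. A small Python sketch (the paths are the defaults from the list above; the registry deletion in step 3 is deliberately left manual):

```python
from pathlib import Path

def backup_toolbox_files(vs_local_dir, backup_dir):
    """Move toolbox*.tbd files out of the VS 10.0 local data folder.

    Visual Studio recreates these caches on its next launch.
    """
    backup = Path(backup_dir)
    backup.mkdir(parents=True, exist_ok=True)
    moved = []
    for tbd in Path(vs_local_dir).glob("toolbox*.tbd"):
        target = backup / tbd.name
        tbd.replace(target)  # move, preserving the file as a backup
        moved.append(target)
    return moved
```

Typical usage would be something like `backup_toolbox_files(os.path.expandvars(r"%LOCALAPPDATA%\Microsoft\VisualStudio\10.0"), r"C:\temp\tbd-backup")`, run with all Visual Studio instances closed.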


Calculate MD5, SHA-1, SHA-2, etc. Hash for a File Under Windows

[Updated for 2017]

This post originally was about FCIV, which is no longer the right choice.  Today we use CertUtil:

certUtil -hashfile pathToFileToCheck [HashAlgorithm]

HashAlgorithm choices: MD2 MD4 MD5 SHA1 SHA256 SHA384 SHA512
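If you’d rather not shell out to CertUtil, the same hashes are available from Python’s standard hashlib (algorithm names are lowercase there, and MD2 isn’t generally available):

```python
import hashlib

def file_hash(path, algorithm="sha256", chunk_size=1 << 20):
    """Hash a file incrementally so large files never need to fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# e.g. file_hash("Filename.iso", "md5")
```

The chunked read matters for the ISO-sized files you’re usually verifying.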

via SuperUser

Legacy Info

Because I keep forgetting…

fciv Filename.iso -both

Resides in C:\Windows\System32 and requires an elevated command prompt (or the Visual Studio Command Prompt).

If you don’t have fciv, it can be downloaded here:

Dropbox Doubles Referral Bonus and Ceiling

Dropbox is a great, easy-to-use service for automatically synchronizing files—documents, music, photos, video, whatever!—between computers and mobile devices. As well as being available on all your devices, your files are also backed up “in the cloud” and are available securely through the Dropbox website. The standard account includes 2GB of storage and is completely free. You can install the Dropbox client on as many computers and mobile devices as you wish.


On your computer, Dropbox acts just like a regular folder. Drag in files and they’re automatically synchronized with the cloud and your other devices. That’s it! Dropbox does all the hard work in the background, keeping your files in sync and keeping backup copies when files are changed or deleted.

Dropbox allows you to earn additional free storage through referrals. If you’re already a Dropbox user, you’ll notice they doubled the referral bonus and ceiling (including retroactively for existing referrals). That means you now get 500MB per referral for up to 32 referrals (16GB additional space).

If you don’t already have Dropbox, get it now! If you use my referral link, you’ll get an extra 500MB right off the bat (meaning you’ll start with 2.5GB instead of 2GB). You can also earn extra space for completing a few simple steps on their web site (taking the video tour, inviting at least one friend, etc.).

I’ve been using Dropbox for the last two and a half years and I’m not sure I could get by without it now. If you haven’t tried Dropbox yet, try it now!

Impact on Smartphone Data Usage of Getting a WiFi Router that Works

I upgraded to a Netgear N900 Wireless Dual Band Gigabit Router (WNDR4500) on 11/23/2011 because my (not that) old WNR3500 simply couldn’t maintain connections on mobile clients and Apple products.  The effect on my 3G data usage via my Sprint phone was dramatic:


I did try two other routers first:

Back in May I had tried an ASUS Black Diamond Dual-Band Wireless-N 600 Router (RT-N56U) briefly, but it ran super-hot and the settings required to make it compatible with the IPSEC VPNs I use resulted in noticeably slow throughput over the WAN port.  That went back within the 30-day return window.

With the holidays on the horizon, and the attendant flood of Apple devices that inevitably brings, in early November I became desperate enough to try the ZyXEL ZyWALL USG20W 802.11n Wireless Internet Security Firewall with 4 Gigabit LAN/DMZ Ports, 2 IPSec VPN, SSL VPN, and 3G WAN Support.  That required me to re-develop a level of expertise in TCP/IP networking that I hadn’t exhibited in years.  As much fun as that was, even with the help of ZyXEL tech support neither I nor they were ever able to get the unit to function the way I wanted (a main sticking point being the impossibility of bridging the wired and wireless networks while continuing to support the other functions of the router, in particular NAT loopback).  But what finally made me take it back was the horrendous WiFi coverage.  Maybe I’m just spoiled by MIMO, but the ZyXEL was essentially useless at WiFi—it would have required me to run a separate access point just to cover the whole house, and that’s not even addressing the low throughput from the main unit.

The WNR3500 had worked well until the arrival of smartphones and Apple products in the house, so I decided to give NetGear another chance.  I’ve been completely satisfied with the WNDR4500, and, going by the graph, so has my smartphone.

Gmail “Safe Mode” – Log In with Labs Disabled

This morning, on first attempting to connect to Gmail using Chrome, I began seeing this message:

You have been signed out of this account.

This may have happened automatically because another user signed in from the same browser. To continue using this account, you will need to sign in again. This is done to protect your account and ensure the privacy of information.

Upon clicking OK, I was immediately logged out of my account. Clearing the cache let me log in once, but the problem recurred on subsequent log-out/log-in cycles. Per this thread at Google support, the answer was to start Gmail in “safe mode” with Labs disabled using this URL:

From there, I went into settings and turned off the “Super Stars” lab, though other users have reported the problem from other labs as well.

Update: The problem recurred, even with “Super Stars” turned off, so I disabled all Labs.

Invalid Viewstate errors in ScriptResource.axd and WebResource.axd

I’ve done the requisite Google searching on this one and can’t find a solution, so I’m posting about it in case anyone has an answer…

The basic problem is, I have an ASP.NET Dynamic Data web site that is experiencing occasional “Invalid viewstate” errors in the ScriptResource.axd or WebResource.axd javascript includes on the page. In almost all cases these are due to the URL of the include being corrupted with HTML or other text from the page. That is, instead of the normal “d” and “t” parameters in the querystring, I see something like this (as reported by my custom error handler in global.asax, which emails me a report for every Application_Error event):

http://domain-omitted/ScriptResource.axd?d=w0PmGeT7CMNAKpp-4k5pBNkf0bD1315QQFyhtn4142pWyyhkp5lhqQR6XcoT1lgNSX5Cdw8Q6uPcgDBOu4CSX-e3-cB38lN5S211HGUect<span class=
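One way I triage these reports automatically: legitimate “d” and “t” values look like URL-safe base64 tokens, so any character outside that set (like the `<span class=` fragment above) indicates corruption. A Python sketch of that check (the character set is my observation from valid requests, not a documented contract):

```python
import re
from urllib.parse import urlsplit, parse_qs

# Legitimate d/t values appear to be URL-safe base64-style tokens.
VALID_PARAM = re.compile(r"^[A-Za-z0-9_\-]+$")

def axd_url_is_corrupted(url):
    """Flag ScriptResource/WebResource URLs whose d/t params contain
    characters that can't belong to a valid resource token."""
    parts = urlsplit(url)
    params = parse_qs(parts.query, keep_blank_values=True)
    for key in ("d", "t"):
        for value in params.get(key, []):
            if not VALID_PARAM.match(value):
                return True
    return False
```

Running something like this over the error-handler emails at least separates the corrupted-URL cases from any genuinely distinct viewstate failures.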


I have not been able to replicate this in testing, but in the field, where we have the application running exclusively on IE8 clients, we are seeing this behavior. I’ve read that this could have something to do with IE8’s parsing engine (Trident/4.0). I’ve validated my page using W3C (document type: XHTML 1.0 Transitional). There are a couple of attribute validations I can’t solve, but they’re for stuff like border="0" in something generated by the .NET platform (which isn’t actually sending border="0" but rather 'style="border-width:0px;"' as part of an asp:Button… more weirdness). I’ve made sure that all my local javascript is enclosed in XML data blocks as was suggested by some of the forum posts on this topic.

Server is up to date with Windows Server 2003 Standard x64, .NET Framework 3.5 SP1 with family updates installed.

Anyone have any further suggestions?