Things I’ve learned, published for the public benefit


What you should know about Volume Shadow Copy/System Restore in Windows 7 & Vista (FAQ)

What is volume shadow copy?

Volume Shadow Copy is a service that creates and maintains snapshots (“shadow copies”) of disk volumes in Windows 7 and Vista. It is the back-end of the System Restore feature, which enables you to restore your system files to a previous state in case of a system failure (e.g. after a failed driver or software installation).

Does volume shadow copy protect only my system files?

No. Volume Shadow Copy maintains snapshots of entire volumes. By default, it is turned on for your system volume (C:) and protects all the data on that volume, including all the system files, program files, user settings, documents, etc.

How is this different from what’s in Windows XP?

In Windows XP, System Restore does not use the Volume Shadow Copy service. Instead, it uses a much simpler mechanism: the moment a program attempts to overwrite a system file, Windows XP makes a copy of it and saves it in a separate folder. In Windows XP, System Restore does not affect your documents – it only protects files with certain extensions (such as DLL or EXE), the registry, and a few other things (details). It specifically excludes all files in the user profile and the My Documents folder (regardless of file extension).

When are the shadow copies created?

Volume shadow copies (restore points) are created before the installation of device drivers, system components (e.g. DirectX), Windows updates, and some applications.

In addition, Windows automatically creates restore points at hard-to-predict intervals. The first thing to understand here is that the System Restore task on Vista and 7 will only execute if your computer is idle for at least 10 minutes and is running on AC power. Since the definition of “idle” is “0% CPU usage and 0% disk input/output for 90% of the last 15 minutes, plus no keyboard/mouse activity” (source), it could take days for your machine to become idle, especially if you have a lot of programs running in the background.

As you can see, the frequency with which automatic restore points are created is hard to estimate, but if you use your machine every day on AC power and nothing prevents it from entering an idle state, you can expect automatic restore points to be created every 1-2 days on Windows Vista and every 7-8 days on Windows 7. Of course, the actual frequency will be higher if you count the restore points created manually by you and those created before software installations.

Here’s a more precise description: By default, the System Restore task is scheduled to run every time you start your computer and every day at midnight, as long as your computer is idle and on AC power. The task will wait for the right conditions for up to 23 hours. These rules are specified in Scheduled Tasks and can be changed by the user. If the task is executed successfully, Windows will create a restore point, but only if enough time has passed since the last restore point (automatic or not) was created. On Windows Vista the minimum interval is 24 hours; on Windows 7 it is 7 days. As far as I know, this interval cannot be changed.

What cool things can I do with Volume Shadow Copy?

  • If your system malfunctions after installing a new video card driver or firewall software, you can launch System Restore and roll back to a working system state from before the installation. If you can’t get your system to boot, you can also do this from the Windows Setup DVD. This process is reversible, i.e. your current state will be automatically saved as a restore point, to which you can later go back. (Note: System Restore will not roll back your documents and settings, just the system files.)
  • If you accidentally delete 10 pages of your dissertation, you can right-click the document, choose Restore previous versions, and access a previous version of it. You can open it (in read-only mode) or copy it to a new location.
  • If you accidentally delete a file or folder, you can right-click the containing folder, choose Restore previous versions, and open the folder as it appeared at the time a shadow copy was made (see screenshot below). All the files and folders that you deleted will be there!

[Screenshot: a folder opened from the Previous versions tab, showing its contents as they appeared when the shadow copy was made]

Note: While the Volume Shadow Copy service and System Restore are included in all versions of Windows Vista, the Previous versions user interface is only available in Vista Business, Enterprise and Ultimate. On other Vista versions, the previous versions of your files are still there; you just cannot access them easily. The Previous versions UI is available in all versions of Windows 7. It is not available in any version of Windows 8.

Is Volume Shadow Copy a replacement for versioning?

No. A versioning system lets you access all versions of a document; every time you save a document, a new version is created. Volume Shadow Copy only allows you to go back to the moment when a restore point was made, which could be several days ago. So if you do screw up your dissertation, you might have to roll back to a very old version.

Is Volume Shadow Copy a replacement for backups?

No, for the following reasons:

  • Shadow copies are not true snapshots. When you create a restore point, you’re not making a new copy of the drive in question — you’re just telling Windows: start tracking the changes to this drive; if something changes, back up the original version so I can go back to it. Unchanged data will not be backed up. If the data on your drive gets changed (corrupted) for some low-level reason like a hardware error, VSC will not know that these changes happened and will not back up your data. (see below for a more detailed description of how VSC works)
  • The shadow copies are stored on the same volume as the original data, so when that volume dies, you lose everything.
  • With the default settings, there is no guarantee that shadow copies will be created regularly. In particular, Windows 7 will only create an automatic restore point if the most recent restore point is more than 7 days old. On Windows Vista, the minimum interval is 24 hours, but remember that the System Restore task will only run if your computer is on AC power and idle for at least 10 minutes, so it could take days before the conditions are right, especially if you run a lot of background processes or do not use your computer frequently.
  • There is no guarantee that a suitable shadow copy will be there when you need it. Windows deletes old shadow copies without a warning as soon as it runs out of shadow storage. With a lot of disk activity, it may even run out of space for a single shadow copy. In that case, you will wind up with no shadow copies at all; and again, there will be no message to warn you about it.

How much disk space do Volume Shadow Copies take up?

By default, the maximum amount of storage available for shadow copies is 5% (on Windows 7) or 15% (on Vista), though only some of this space may be actually allocated at a given moment.

You can change the maximum amount of space available for shadow copies in Control Panel | System | System protection | Configure.

How efficient is Volume Shadow Copy?

It’s quite efficient. The 5% of disk space that it gets by default is usually enough to store several snapshots of the disk in question. How is this possible?

The first thing to understand is that volume shadow copies are not true snapshots. When a restore point is created, Volume Shadow Copy does not create a full image of the volume. If it did, it would be impossible to store several shadow copies of a volume using only 5% of that volume’s capacity.

Here’s what really happens when a restore point is created: VSC starts tracking the changes made to all the blocks on the volume. Whenever anyone writes data to a block, VSC makes a copy of that block and saves it on a hidden volume. So blocks are “backed up” only when they are about to get overwritten. The benefit of this approach is that no backup space is wasted on blocks that haven’t changed at all since the last restore point was created.

Notice that VSC operates on the block level, that is below the file system level. It sees the disk as a long series of blocks. (Still, it has some awareness of files, as you can tell it to exclude certain files and folders.)

The second important fact is that shadow copies are incremental. Suppose it’s Wednesday and your system has two shadow copies, created on Monday and Tuesday. Now, when you overwrite a block, a backup copy of the block is saved in the Tuesday shadow copy, but not in the Monday shadow copy. The Monday copy only contains the differences between Monday and Tuesday. More recent changes are only tracked in the Tuesday copy.

In other words, if we were to roll back an entire volume to Monday, we would take the volume as it is now, “undo” the changes made since Tuesday (using the blocks saved in the Tuesday shadow copy), and finally “undo” the changes made between Monday and Tuesday. So the oldest shadow copy is dependent on all the more recent shadow copies.
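
To make the bookkeeping concrete, here is a toy model of the copy-on-write scheme in JavaScript. It is only a sketch of the idea described above, not Microsoft’s implementation (the real service operates on blocks of the raw volume and stores the saved blocks in a hidden diff area):

// Toy model of copy-on-write snapshots; for illustration only.
class Volume {
  constructor(blocks) {
    this.blocks = blocks.slice(); // current contents, one value per block
    this.snapshots = [];          // oldest first: Monday, Tuesday, ...
  }
  createRestorePoint(label) {
    // A new snapshot starts out empty: nothing is copied up front.
    this.snapshots.push({ label, saved: new Map() });
  }
  write(index, data) {
    const latest = this.snapshots[this.snapshots.length - 1];
    // Copy-on-write: the old contents are backed up into the *most recent*
    // snapshot only, and only the first time the block is overwritten.
    if (latest && !latest.saved.has(index)) {
      latest.saved.set(index, this.blocks[index]);
    }
    this.blocks[index] = data;
  }
  restore(label) {
    // Undo changes newest-first until we reach the requested snapshot;
    // this is why an old snapshot depends on all the newer ones.
    const result = this.blocks.slice();
    for (let i = this.snapshots.length - 1; i >= 0; i--) {
      for (const [index, data] of this.snapshots[i].saved) {
        result[index] = data;
      }
      if (this.snapshots[i].label === label) break;
    }
    return result;
  }
}

For example, overwriting a block on Wednesday saves its old contents into the Tuesday snapshot only; restoring the volume to Monday then replays the Tuesday diffs before the Monday ones, exactly as described above.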

When I delete a 700 MB file, does VSC add 700 MB of data to the shadow copy?

No. When you delete a file, all that Windows does is remove the corresponding entry (file name, path, properties) from the Master File Table. The blocks (units of disk space) that contained the file’s contents are marked as unused, but they are not actually deleted. So all the data that was in the file is still there in the same blocks, until the blocks get overwritten (e.g. when you copy another file to the same volume).

Therefore, if you delete a 700 MB movie file, Volume Shadow Copy does not have to back up 700 MB of data. Because it operates on the block level, it does not have to back up anything, as the blocks occupied by the file are unchanged! The only thing it has to back up is the blocks occupied by the Master File Table, which has changed.

If you then start copying other files to the same disk, some of the blocks formerly occupied by the 700 MB file will get overwritten. VSC will make backups of these blocks as they get overwritten.
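
Replayed with the toy Volume model above (one block standing in for the Master File Table, the others for the movie’s contents; this is grossly simplified, of course):

const disk = new Volume(["MFT entry: movie.avi", "movie data", "movie data"]);
disk.createRestorePoint("RP");
// Deleting the "file" only changes the MFT block, so only that one block
// gets a copy-on-write backup; the movie-data blocks cost nothing.
disk.write(0, "MFT entry: (removed)");
// Copying new files over the freed blocks later would trigger backups of
// those blocks, as described above.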

If VSC is constantly backing up blocks as they get overwritten, what actually happens when a restore point is created?

Not much — VSC simply starts backing up the data to a new place, while leaving the “old place” there (at least until it runs out of space). Now you have two places to which you can restore your system, each representing a different point in time. When you create a restore point, you’re simply telling VSC: “I want to be able to go back to this point in time”.

Note that it’s a mistake to think that VSC is backing up every change you make! It only backs up enough to enable you to go back to a specific point in time. Here’s an example scenario to clear things up:

  1. You create a file (version #1)
  2. You create a restore point
  3. You change the file (resulting in version #2) — VSC backs up version #1
  4. A week later, you change the file again (resulting in version #3) — VSC doesn’t back anything up, because it already has version #1 backed up. As a result, you can no longer go back to version #2. You can only go back to version #1 — the one that existed when the restore point was created.

(Note that VSC actually operates on blocks, not files, but the principle is the same.)
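
The same scenario, replayed with the toy Volume model from earlier (treating the file as a single block for simplicity):

const v = new Volume(["version #1"]);
v.createRestorePoint("RP");
v.write(0, "version #2");     // "version #1" is saved into the snapshot
v.write(0, "version #3");     // nothing is saved; the block is already backed up
console.log(v.restore("RP")); // ["version #1"]; version #2 is gone for good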

What are the security implications of Volume Shadow Copy?

Suppose you decide to protect one of your documents from prying eyes. First, you create an encrypted copy using an encryption application. Then, you “wipe” (or “secure-delete”) the original document, which consists of overwriting it several times and deleting it. (This is necessary, because if you just deleted the document without overwriting it, all the data that was in the file would physically remain on the disk until it got overwritten by other data. See question above for an explanation of how file deletion works.)

Ordinarily, this would render the original, unencrypted document irretrievable. However, if the original file was stored on a volume protected by the Volume Shadow Copy service and it was there when a restore point was created, the original file will be retrievable using Previous versions. All you need to do is right-click the containing folder, click Restore previous versions, open a snapshot, and, lo and behold, you’ll see the original file that you tried so hard to delete!

The reason wiping the file doesn’t help, of course, is that before the file’s blocks get overwritten, VSC will save them to the shadow copy. It doesn’t matter how many times you overwrite the file, the shadow copy will still be there, safely stored on a hidden volume.

Is there a way to securely delete a file on a volume protected by VSC?

No. Shadow copies are read-only, so there is no way to delete a file from all the shadow copies.

A partial solution is to delete all the shadow copies (by choosing Control Panel | System | System protection | Configure | Delete) before you wipe the file. This prevents VSC from making a copy of the file right before you overwrite it. However, it is quite possible that one of the shadow copies you just deleted already contained a copy of the file (for example, because it had recently been modified). Since deleting the shadow copies does not wipe the disk space that was occupied by them, the contents of the shadowed file will still be there on the disk.

So, if you really wanted to be secure, you would also have to wipe the blocks that used to contain the shadow copies. This would be very hard to do, as there is no direct access to that area of the disk.

Some other solutions to consider:

  • You could make sure you never save any sensitive data on a volume that’s protected by VSC. Of course, you would need a separate VSC-free volume for such data.
  • You could disable VSC altogether. (After disabling VSC, you may want to wipe the free space on your drive to overwrite the blocks previously occupied by VSC, which could contain shadow copies of your sensitive data.) However, if you disable VSC, you also lose System Restore functionality. Curiously, Windows offers no option to enable VSC only for system files: if you want to protect your system, you also have to enable Previous versions.
  • The most secure approach is to use an encrypted system volume. That way, no matter what temporary files, shadow copies, etc. Windows creates, it will all be encrypted.

Notice that VSC only lets you recover files that existed when a restore point was created. So if the sequence of events is as follows:

create file → create restore point → make encrypted copy → overwrite original file

the original file will be recoverable. But if the sequence is:

create restore point → create file → make encrypted copy → overwrite original file

you are safe. If you make sure to encrypt and wipe files as soon as you create them, so that no restore point gets created after they are saved on disk in unencrypted form, there will be no way to recover them with VSC. However, it is not easy to control when Windows creates a restore point; for example, it can do it at any time, just because your computer happens to be idle.

Can I prevent VSC from keeping snapshots of certain files and folders?

Yes, but you have to edit the registry to do that. Here are detailed instructions from MSDN.

What happens when VSC runs out of space?

Most of the time, most of the data on your disk stays unchanged. However, suppose you uninstall a 5 GB game and then install another 5 GB game in its place. This means that 5 GB worth of blocks got overwritten and had to be backed up by VSC.

In such “high-churn” scenarios, VSC can run out of space pretty quickly. What happens then? VSC deletes as many previous shadow copies as necessary, starting from the oldest, until it has enough space for the latest copy. In the rare event that there isn’t enough space even for the one most recent copy, all the shadow copies will be deleted. There are no partial copies.
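
In terms of the toy Volume model from earlier, the cleanup policy might look something like the sketch below. (This illustrates the oldest-first, whole-copies-only behavior, not the actual algorithm, which reclaims space inside the hidden diff area.)

// Drop whole snapshots, oldest first, until usage fits under the cap.
function evict(snapshots, maxSavedBlocks) {
  let used = snapshots.reduce((sum, s) => sum + s.saved.size, 0);
  while (snapshots.length > 0 && used > maxSavedBlocks) {
    used -= snapshots[0].saved.size; // the oldest snapshot goes first
    snapshots.shift();               // deleted whole; no partial copies
  }
  return snapshots;
}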

Thanks to Adi Oltean, who was one of the engineers of Volume Shadow Copy at Microsoft, for answering my questions on the subject.


The Hidden Shadow

The flower delivery van had been parked across the street for far too long. Cahey peered outside through the window blinds for the third time. By now he was certain they had him under surveillance. He had been careful not to discuss the subject matter of his current project with anyone, but there were a few souls at the Tribune who knew he was working on a major investigative piece. Apparently that was enough to pique the government’s interest.

Cahey lit a cigarette and reflected on the van’s relatively conspicuous location. Sloppy surveillance work or a deliberate attempt to scare him into silence? There was no way to know. He was, however, sure of one thing: if they came here, they would find nothing. Knowing that digital content was much easier to protect from prying eyes than papers, photographs and recordings, he had disposed of every physical record of his investigation, leaving only a digitized copy on the hard drive of his laptop computer. Two days ago, he had encrypted all this data using an open-source application called TrueCrypt, making sure to overwrite the original files several times before deletion. Now his data was unrecoverable without the password, and there was nothing anybody could do about it, not even the NSA with their army of PhDs and their supercomputers. The spooks would be in for a surprise.

“Drrrrrt” — the sound of the doorbell pierced the smoke-infused air. Cahey glanced through the window. The van was gone. As he walked towards the door, he contemplated logging out of his Windows account, but decided against it. Bypassing that layer of security would be a trivial exercise, and it wouldn’t do the government much good anyway, given the fact that everything of interest was now encrypted. He opened the door. On his porch stood five serious-looking men in suits. “Stephen Cahey? We have a warrant to search the premises.”

———-

Agent Jack Trallis looked at the machine he had been ordered to process. It was a pretty standard Dell laptop with a dual-core CPU and a 15-inch screen that was covered with fingerprints. “God, do I hate those glossy displays”, he muttered to himself. He was alone in the room; the other agents were in the living room questioning the suspect. Trallis noticed the prominent TrueCrypt icon on the machine’s desktop. “Uh oh. Strong encryption.” He fixed his eyes on the taskbar at the bottom of the screen. There was a row of oversized, unlabeled icons that reminded him of the Hackintosh he had once built for his girlfriend. The guy’s laptop was running Windows 7. There was still a chance.

He located the Documents folder, opened its Properties window, and clicked on the “Previous Versions” tab. Just as he thought, there were five previous versions of the folder – “shadow copies” created regularly by the operating system as part of the System Restore mechanism. As these snapshots were prepared silently in the background and stored on a hidden disk volume, few users were aware of them. Agent Trallis was smiling. The good guys from Redmond were going to make his job easy again.

He selected one of the snapshots and clicked Open. An Explorer window popped up, showing the contents of the Documents folder exactly as it had appeared three days ago. “This is too funny”, he thought. There was a subfolder labeled Project Foxhunt full of scanned documents and audio files. Trallis grabbed his radio. “Sir”, he called out to his commanding officer, “I’ve got something you might want to have a look at.”

For technical information on Volume Shadow Copy, read What you should know about Volume Shadow Copy/System Restore in Windows 7 & Vista


An audiophile’s look at the audio stack in Windows Vista and 7

If you are an audiophile who uses a PC as a source in your audio system, you’re probably aware of the fact that Windows Vista introduced a brand-new audio engine to replace the much hated KMixer of Windows XP. In my opinion, there are a few reasons why audiophiles should be happy with this change:

  • The new audio stack automatically upconverts all streams to a 32-bit floating-point sample depth (the same format used in professional studios) and mixes them with the same precision. Because of the amount of headroom that comes with using 32-bit floats, there is no more clipping when playing two sounds at the same time. There is also no loss of resolution when you lower the volume of a stream (see below).
  • The Vista/Win7 audio engine automatically feeds your sound card with the highest-quality output stream that it can handle, which is usually 24 bits per sample. Perhaps you’re wondering why you should care, given that most music uses only 16 bits per sample. Suppose you’re playing a 16-bit song with a digital volume control set to 10%. This corresponds to dividing each sample by 10. Now let’s assume the song contains the following two adjacent samples: 41 and 48. In an ideal world, after the volume control we would get 4.1 and 4.8. However, if the output stream has a 16-bit depth just like the input stream, then both output samples will have to be truncated to 4. There is now no difference between the two samples, which means we have lost some resolution. But if we can have an output stream with 24 bits per sample, for each 16-bit level we get 2^8 = 256 additional (“fractional”) levels, so we can still preserve the difference between the two attenuated samples. In fact, we can have ≈4.1016 and ≈4.8008, which is within 0.04% of the “ideal” samples of 4.1 and 4.8 (see the sketch after this list).
  • Don’t you hate it when you change the volume in your movie player or instant messaging software and instead of changing its own volume, it changes your system volume? Or have you ever used an application with its own poorly implemented volume control (iTunes, I’m pointing at you!)? Well, these abominations should now be behind us. In Vista and Win7, each application gets its own audio stream (or streams) and a separate high-quality volume control, so there should no longer be any reason for application vendors to mess with the system volume or roll their own and botch the job.
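
Here is the arithmetic from the volume-control example above as a small JavaScript sketch. It only illustrates the resolution argument; it is not the actual Windows mixer code:

// Attenuating two adjacent 16-bit samples to 10% volume.
const samples = [41, 48];
const gain = 0.1;

// 16-bit output: results are truncated to whole sample levels,
// and the difference between the two samples is lost.
console.log(samples.map(s => Math.trunc(s * gain)));   // [4, 4]

// 24-bit output: 8 extra bits give 2^8 = 256 fractional levels per
// 16-bit level, so the attenuated samples remain distinguishable.
console.log(samples.map(s => Math.round(s * gain * 256) / 256));
// [4.1015625, 4.80078125]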

So Windows Vista and Windows 7 upconvert all your samples to 32-bit floats and mix them with 32-bit precision into an output stream that, by default, has the highest bit depth that your hardware can handle. The output bit depth is customizable; you can change it in the properties of your audio device. If you change it e.g. to 16 bits, the audio engine will still use 32-bit floats for internal processing — it will just downconvert the resulting stream to 16 bits before sending it to your device.

Now, what about the sample rate? You can set the output sample rate in the audio device properties window, but is there also some internal sample rate that the Windows audio engine uses regardless of your setting? For example, does it upsample your 44.1 kHz songs to 96 or 192 kHz? Unlike the upconverting from 16-bit integers to 32-bit floats (which should be completely lossless), this could potentially introduce some distortion, as going from 44.1 kHz to 96 or 192 kHz requires at least some interpolation.

I couldn’t find the answer to this question anywhere, so I wrote to Larry Osterman, who developed the Vista and Win7 audio stacks at Microsoft. His answer was that the sample rate that the engine uses is the one that the user specifies in the Properties window. The default sample rate is chosen by the audio driver (44.1 kHz on most devices). So if your music has a sample rate of 44.1 kHz, you can choose that setting and no sample rate conversion will take place. (Of course, any 48 kHz and higher samples will then be downsampled to 44.1 kHz.)

There is some interesting technical information on the Windows Vista audio stack in this Channel9 video.


Setting up cross-domain tracking of e-commerce transactions with Google Analytics and FastSpring

The problem

You have a website running Google Analytics (with the latest tracking code). You are selling widgets using a 3rd party online store provided by FastSpring.

You would like to open Google Analytics, display a list of all your sales, and be able to see where your paying customers are coming from. You want to know what websites they are being referred from, what search keywords they are using to find your site, and which pages on your site they are landing on. You also want to know which keywords (organic or paid) and which landing pages are earning you the most money.

The non-solution

FastSpring boasts easy Google Analytics integration, so getting your e-commerce transactions to show up in your Analytics reports is a piece of cake. Pretty much all you need to do is enter your Analytics profile number in the FastSpring management panel and set “E-commerce website” to “Yes” in your profile settings, if you haven’t done so already.

You can now see all your sales in Analytics. However, when you start looking at the referrals and keywords that led to the sales, you will get a nasty surprise: All your sales appear as though they were referred from your site! Instead of “cheap widgets”, or whatever keywords your customers use to find your site, you’ll see “(direct)”. Instead of Google, you’ll see “yourwidgetsite.com”, or whatever you called your silly widget site. This will not stand!

Here’s why Analytics loses track of the referral information: The Google Analytics code on your website’s pages recognizes visitors by a cookie which stores a unique visitor ID. However, this cookie is assigned to your domain (yourwidgetsite.com). When a visitor leaves your site and enters your FastSpring store (at sites.fastspring.com), the Google Analytics code FastSpring added to your store pages does not have access to that cookie. So what it does is create a brand-new tracking cookie. And the only information it has is the referring page on yourwidgetsite.com, which is why the transaction will be shown as having been referred from yourwidgetsite.com.

The proper solution

1.

First, change the Google Analytics code on all your pages to:

<script type="text/javascript">
// Load ga.js from the correct host, depending on whether the page uses HTTPS.
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
var pageTracker = _gat._getTracker("UA-XXXXXX-1");
pageTracker._setAllowHash(false);  // don't tie the cookie to a hash of the domain, so it works across domains
pageTracker._setAllowLinker(true); // accept visitor data appended to the URL by _link() (see step 3)
pageTracker._trackPageview();
</script>

Of course, you should replace “UA-XXXXXX-1” with your actual profile ID. If your website consists of multiple subdomains that you want to track under a single Analytics profile, you should also add this line:

pageTracker._setDomainName("yourwidgetsite.com");

This line will ensure that GA will use only one cookie (for yourwidgetsite.com) rather than a separate cookie for each subdomain.

It is critical that you use the exact same code on all your pages. Google Analytics is very sensitive and will set a new cookie when it detects the slightest change in the tracking code from one page to another. When this happens, you lose information on what your customer was doing before.

Google’s official help tells you to use pageTracker._setDomainName(“none”) instead of pageTracker._setAllowHash(false), but this solution is not recommended by the gurus at the Analytics Help forum. Caleb Whitmore says:

setDomainName(“none”) will make GA set the cookies to the full, EXPLICIT domain of the current hostname. Thus, if someone is on “mysite.com” and then clicks to a page on “www.mysite.com” they’ll get new cookies on the “www” domain, or vice-versa.

2.

Add the exact same code (with the exception of the _setDomainName line) to your FastSpring store pages. First, you need to disable Google Analytics tracking in the FastSpring management panel. FastSpring does not let you customize the tracking code and they use a very old version of the code anyway (from back when Google Analytics was known as Urchin).

You will need to insert the proper GA code into your store page template. In the FastSpring panel, go to “Styles” and create a new style (if you haven’t created one before). You will need a ZIP file with your current style files. You can get this file if you e-mail FastSpring. Unzip the file, then open a file named “window.xhtml”. Add the tracking code described above (with the exception of the _setDomainName line) right after the <body> tag.

3.

Change all the links from yourwidgetsite.com to your store at FastSpring to look like this:

<a href="https://sites.fastspring.com/..."
onClick="pageTracker._link(this.href, true); return false;">Buy my widget</a>

This code will take all the information stored in the GA tracking cookie for yourwidgetsite.com and append it to the URL query string used to navigate to your online store. The GA tracking code on the target store page will retrieve all this data from the query string and put it into the new cookie (assigned to sites.fastspring.com) that it will create. That way, all the information about your visitor’s previous activities will be preserved.
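
For illustration, a link processed this way ends up looking roughly like the following (a hypothetical store URL with abbreviated parameter values; the exact set of __utm parameters depends on the tracking-code version):

https://sites.fastspring.com/...?__utma=...&__utmb=...&__utmc=...&__utmz=...utmcsr=google|utmcmd=organic|utmctr=cheap%20widgets

The __utmz parameter is the one that carries the source, medium and keyword information that would otherwise be lost.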

Remember that for the new cookie to get this information, every single link to your store must look as shown above.

4.

Now the GA code on your store pages has access to all the interesting information on search keywords, landing pages, etc. The only remaining problem is that no transactions are being recorded, since you disabled FastSpring’s built-in Analytics support in step 2. We will fix that now.

In the FastSpring management panel, go to “External Tracking”. This is where you previously disabled Google Analytics tracking. Now you are going to turn the tracking back on, but with the updated Analytics code.

Click “Add Custom Tracking Method”. Name it something like “Google Analytics (new code)”. Choose “Free-form XHTML Fragment”. Paste the following code into the text box:

<script type="text/javascript">
try {
pageTracker._addTrans(
"#{order.reference}",          // Order ID
"",                        // Affiliation
"#{order.subTotal.textValue}", // Total
"#{order.tax.textValue}",      // Tax
"",                            // Shipping
"#{order.address.city}",       // City
"#{order.address.region}",     // State
"#{order.address.country}"     // Country
);


<repeat var="item" value="#{order.allItems}">
pageTracker._addItem(
"#{order.reference}",     // Order ID
"#{item.productName}",   // SKU
"#{item.productName}",    // Product Name
"",                      // Category
"#{item.priceTotal.textValue}",   // Price
"#{item.quantity}"        // Quantity
);
</repeat>

pageTracker._trackTrans();
} catch(err) { }
</script>

Whereas the code you added to window.xhtml (your store Style) is inserted into every page of your FastSpring store, the above code appears only on the order confirmation page (near the bottom). It sends some basic transaction data to Analytics, then it takes each order item and sends information about it to Analytics as well (the <repeat> loop).

A few comments are in order:

  • FastSpring has no SKUs or product IDs, but Analytics needs this parameter — otherwise it does not record all the line items in an order correctly. So the best solution is to pass the product name as the SKU.
  • #{order.subTotal.textValue} is passed as the “Total”, because the order amount before taxes and shipping is probably what you are interested in. If you’d rather see the total amount paid by the customer in your Analytics reports, you can change it to #{order.total.textValue}.
  • “Shipping” is left empty because FastSpring does not provide that variable.

Neither the above variables nor the <repeat> construct are documented anywhere in the manuals that FastSpring provides to sellers. I got this information from Ryan Dewell, the VP of Software Development at FastSpring. He has mentioned to me that FastSpring will be updating its Google Analytics support, so it is possible that full Analytics integration will eventually be possible without all the hacking described here.



How to clean eyeglasses

[Photo: a bottle of Ludwik dishwashing liquid]

I’ve worn eyeglasses since I was 3 years old. A few years ago, I started getting annoyed with the dust and grease that keep building up on my glasses. Maybe it’s old-age grumpiness kicking in, or maybe it’s because I started to use LCD displays whose immaculate picture quality sensitized me to any blurriness between the LCD matrix and my retina.

Anyway, I started cleaning my glasses regularly. The problem was that I couldn’t figure out a good cleaning technique. First, I tried washing my glasses under running water and then drying them with towels. That didn’t work so well on the grease, and the towels (whether cloth or paper) would leave tons of lint on my glasses. So I bought a professional microfiber cloth, the same kind that I use for cleaning photographic lenses, and some isopropyl alcohol (isopropanol), the stuff they put in those overpriced “lens cleaning kits” you’ll find in the photography section of your electronics store. That was a lot better than my previous technique, but the alcohol would not dissolve all the grease, which was impossible to remove completely with the cloth.

Well, I’ve finally figured it out. (Actually, I wish I had. I learned about this technique from my optician.) The answer is dishwashing liquid (AKA dish soap).

  1. Rinse your glasses under running water.
  2. Put a bit of dishwashing liquid on one of the lenses, then use your fingers to gently rub the liquid on both sides of both lenses.
  3. Rinse glasses again to remove the dish soap. You don’t need to use your fingers to get the dish soap off – just use running water. You should be looking at perfectly clean lenses with a few drops of water on them. If there’s any grease or other spots, repeat steps 2 and 3.
  4. Use a microfiber cloth to gently clean off remaining water drops. Use light touches – there might be small pieces of dirt on the cloth and if you rub it too hard, they might scratch the lenses. The microfiber cloth leaves no fluff, so your glasses should be perfectly clean.

It’s really a perfect combination. The dish soap dissolves all the grease, so you don’t get any smudges when you use the microfiber cloth. The microfiber cloth removes the remaining water drops and (non-greasy) stains made by evaporating water, and leaves no lint. The result: pristine-looking glasses in one minute.

What’s more, this technique is fairly convenient to use. Many online how-tos recommend special eyeglass-cleaning sprays or vinegar, which may be expensive or unavailable. On the other hand, most people have dish soap in their kitchen, so the only special accessory you need is a microfiber cloth, which costs $7 (for a top-quality one) and can be re-used for years. And even that isn’t really necessary, as paper towels or tissues work almost as well.


Google Toolbar shows incorrect PageRank for HTTPS pages

If you are a webmaster looking to improve your Google ranking by arranging links from high-PageRank sites, you should be wary of pages whose URL starts with “https” (secure pages). As I have found, the Google Toolbar reports incorrect (usually inflated) PageRank for such pages, so that a secure page which appears to have a PageRank of 8 may in fact have a much lower PR or may not even be indexed at all.

When you visit a secure (https) page, the PageRank you see in the Google Toolbar is not the PageRank of the page — instead, the toolbar shows you the PR of the root page of the domain.

  • https://addons.mozilla.org/en-US/firefox/user/1754094 is the user info page for Jose Bolanos, one of the many developers at the Mozilla add-on community website. In both IE and Firefox, it appears to have a PageRank of 8. Now, if that were true, Jose would truly have a reason to smile. Unfortunately for him, what the toolbar is actually showing is the PageRank of https://addons.mozilla.org, the homepage of the add-ons site. If you try any other page on the site (e.g. this one or this one), you will see that they all have a PR of 8.
  • The Google Adsense login page (https://www.google.com/adsense/login/en_US/) appears to have a PR of 10 (higher than adobe.com) not because it has so many incoming links or because Google has given it an artificial boost — it’s just that the Google Toolbar is reporting the PR of google.com.
  • This obscure page at Michigan State University and all its subpages appear to have a PR of 8. Of course, by now we know that what we’re really seeing is the PR of the MSU homepage (www.msu.edu) and the actual PR of the page is unknown.

The bug seems to affect the latest versions of the Google Toolbar for IE and Firefox. However, I have seen an earlier version of the toolbar that did not suffer from this problem, so I believe the issue is version-dependent.


Should we care about ABX test results?

Policy no. 8 in the Terms of Service of the respected audiophile community Hydrogenaudio states:

8. All members that put forth a statement concerning subjective sound quality, must — to the best of their ability — provide objective support for their claims. Acceptable means of support are double blind listening tests (ABX or ABC/HR) demonstrating that the member can discern a difference perceptually, together with a test sample to allow others to reproduce their findings.

What a breath of fresh air. Other audio forums are full of snake-oil-peddling and Kool-Aid-drinking evangelists who go on and on about how replacing $200 speaker wires with $400 speaker wires “really opened up the soundstage and made the upper-midrange come alive”. The people at Hydrogenaudio know that such claims demand proper scientific evidence. How nice to see that they dismiss subjective nonsense and rely instead on the ultimate authority of ABX tests, which really tell us what makes a difference and what doesn’t.

Except that ABX tests don’t measure what really matters to us. ABX tests tell us whether we can hear a difference between A and B. What we really want to know, however, is whether A is as good as B.

1.

“Wait a second!”, I hear you exclaim. “Surely if I cannot tell A from B, then for all intents and purposes, A is as good as B and vice versa. If you can’t see the difference, why pay more?”

Actually, there could be tons of reasons. To take a somewhat contrived example, suppose I magically replaced the body of your car with one that was less resistant to corrosion, leaving all the other features of your vehicle intact. Looking at the car and driving it, you would not notice any difference. Even if I gave you a chance to choose between your original car and the doctored one, they would seem identical to you and you could choose either of them. However, if you were to choose the one I tampered with, five years later your vehicle’s body would be covered in spots of rust.

The obvious lesson here is that “not seeing a difference” does not guarantee that A is as good as B. Choosing one thing over another can have consequences that are hard to detect in a test because they are delayed, subtle, or so odd-ball that no one even thinks to record them during the test.

But how is this relevant to listening tests? Assuming that music affects us through our hearing, how could we be affected by differences that we cannot hear?

In his fascinating book The Burning House: Unlocking the Mysteries of the Brain, Jay Ingram describes the case of a 49-year-old woman suffering from a condition called hemispatial neglect (the case was researched by neuropsychologists John Marshall and Peter Halligan). Patients with hemispatial neglect are unable to perceive one (usually the left) side of the objects they see. When asked to copy drawings, they draw only one side; when reading words aloud, they read only one half (e.g. they read simile as mile).

In Marshall and Halligan’s experiment, the woman was given two simple drawings showing two houses. In one of the drawings, the left side of the house was covered in flames and smoke; the houses looked the same otherwise. Since the flames were located on the left side, the patient was unable to see them and claimed to see no difference between the drawings. When Marshall and Halligan asked her which of the houses she would rather live in, she replied — rather unsurprisingly — that it was a silly question, given that the houses were identical.

However, when the experimenters persuaded her to make a choice anyway, she picked the flameless house 14 out of 17 times, all the while insisting that both houses looked the same.

Marshall and Halligan’s experiment shows (as do other well-known psychological experiments, including those pertaining to subliminal messages) that it is possible for information to be in a part of the brain where it is inaccessible to conscious processes. This information can influence one’s state of mind and even take part in decision-making processes without one realizing it.

If people can be affected by information that they don’t even know is there, then who says they cannot be affected by inaudible differences between an MP3 and a CD? Failing an ABX test tells you that you are unable to consciously tell the difference between two music samples. It does not mean that the information isn’t in your brain somewhere — it just means that your conscious processes cannot access it.

So the fact that you cannot tell the difference between an MP3 and a CD in an ABX test does not mean that an MP3 is as good as a CD. Who knows? Maybe listening to MP3s causes more fatigue in the long run. Maybe it makes you get bored with your music more quickly. Or maybe the opposite is true and MP3s are actually better. We can formulate and test all sorts of plausible hypotheses — the point is, an ABX test which shows no audible difference is not the end of the discussion.

2.

I have shown that the lack of audible differences between A and B in an ABX test does not imply that A is as good as B. Before you read this post as an apology for lossless audio formats, here is a statement that will surely upset hard-core audiophiles:

The fact that you can tell the difference between an MP3 and a CD in an ABX test does not mean that the MP3 is worse than a CD.

First of all, the differences between MP3s encoded at mainstream bitrates (128 kbps and 192 kbps) and original recordings are really subtle and can be detected only under special conditions (quiet environment, good equipment, full listener concentration, direct comparisons of short samples). Because the differences are so tiny, we cannot automatically assume that it is the uncompressed version that sounds better. Subtle compression artifacts such as slightly reduced sharpness of attacks on short, loud sounds may in fact be preferred by some listeners in a direct comparison.

Secondly, even if we found that the uncompressed version is preferred by listeners, that wouldn’t necessarily mean that it is better. People prefer sitting in front of the TV to exercising, but the latter might make them feel much better overall. If it were discovered, for example, that compressed music is less tiring to listen to (this is of course pure speculation), then that fact might outweigh any preference for uncompressed sound in blind tests.

Summary

The relevance of ABX tests to the lives of music lovers is questionable. Neither does the absence of audible differences imply equal quality, nor does the presence of audible differences imply that the compressed version is inferior. Rather than being the argument to end all debate, the results of ABX tests are just one data point and the relative strengths of various audio formats may well be put in a new light by further research.


Blind-testing MP3 compression

Among music listeners, the use of lossy audio compression technologies such as MP3 is a controversial topic. On one side, we have the masses who are glad to listen to their favorite tunes on $20 speakers connected to their PC’s onboard audio device and couldn’t care less what bitrate MP3s they get as long as the sound quality is better than FM radio. On the other side, we have the quasi-audiophiles (not true audiophiles, of course, as those would never touch anything other than a high-quality CD or LP player properly matched to the amplifier) who stick to lossless formats like FLAC due to MP3’s alleged imperfections.

If I considered myself part of either group, my life would be easy, as I would know exactly what to do. Unfortunately, I fall somewhere in between. I appreciate music played through good equipment and I own what could be described as a budget audiophile system. On the other hand, I am not prepared to follow the lead of the hard-core lossless format advocates, who keep repeating how bad MP3s sound, yet do not offer anything in the way of objective evidence.

So, me being me, I had to come to my own conclusions about MP3 compression. Is it okay for me to listen to MP3s and if so, what bitrate is best? To answer these questions, I spent many hours doing so-called ABX listening tests.

What is an ABX test?

An ABX test works like this: You get four samples of the same musical passage: A, B, X and Y. A is the original (uncompressed) version. B is the compressed version. With X and Y, one is the original version (same as A), the other is the compressed version (same as B), and you don’t know which is which. You can listen to each version (A, B, X or Y) as many times as you like. You can select a short section of the passage and listen to it in each version. Your objective is to decide whether X = A (and Y = B) or X = B (and Y = A). If you can get a sufficient number of right answers (e.g. 7 times out of 7 or 9 times out of 10), you can conclude that there is an audible difference between the compressed sample and the original sample.
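
What counts as a “sufficient number” comes down to how unlikely your score would be if you were just guessing. Since each blind guess is a 50/50 coin flip, the score follows a binomial distribution; here is a small JavaScript sketch of the calculation:

// Probability of getting at least k of n trials right by pure guessing.
function binomial(n, k) {
  let c = 1;
  for (let i = 1; i <= k; i++) c = c * (n - i + 1) / i;
  return c;
}
function pByChance(n, k) {
  let p = 0;
  for (let i = k; i <= n; i++) p += binomial(n, i);
  return p / 2 ** n;
}
console.log(pByChance(7, 7));  // ~0.0078: 7/7 is very unlikely by chance
console.log(pByChance(10, 9)); // ~0.0107: so is 9/10
console.log(pByChance(10, 7)); // ~0.1719: 7/10 is easily explained by chance

This is why scores like the 7/10 and 6/10 mentioned under “What I found” below are put down to chance, while 7/7 and 9/10 are treated as evidence of an audible difference.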

What I found

  1. The first thing I found was that telling the difference between a well-encoded 128 kbps MP3 and a WAV file is pretty damn hard. Since 128 kbps is really the lowest of the popular MP3 bitrates and it gets so much bad rap on forums like Head-Fi, I expected that it would fail miserably when confronted with the exquisite work of artists like Pink Floyd or Frank Sinatra. Not so. Amazingly, the Lame encoder set at 128 kbps (ABR, high quality encoding) held its own against pretty much anything I’d throw at it. The warm, deeply human quality of Gianna Nannini’s voice in Meravigliosa Creatura, the measured aggression of Metallica’s Blitzkrieg, the spacious guitar landscapes of Pink Floyd’s Pulse concert — it all sounded exactly the same after compression. There were no changes to the ambiance of the recording, the quality of the vocals, the sound of vowels and consonants, the spatial relationships between the instruments on the soundstage, or the ease with which individual instruments could be picked out.
  2. That said, MP3s at 128 kbps are not truly transparent. With some training, it is possible to distinguish them from original recordings in blind listening tests. My trick was to look for brief, sharp, loud sounds like beats or certain types of guitar sounds — I found that compression takes some of the edge off them. Typically, the difference is so subtle that successful identification is only possible with very short (a few seconds long) samples, a lot of concentration and a lot of going back and forth between the samples. Even then, the choice was rarely obvious for me; more often, making the decision felt like guessing. Which of the identical bass riffs I just heard seemed to carry more energy? A few times I was genuinely surprised that I was able to get such high ABX scores after being so unsure of my answers.
  3. With some effort, it is possible to find passages that make the difference between 128 kbps MP3 and uncompressed audio quite obvious. For me, it was just a matter of finding a sound that was sharp enough and short enough. In David Bowie’s Rock ‘n Roll Suicide, I used a passage where Bowie sings the word “song” in a particular, Dylanesque way (WAV file). Another example is a 1.2-second-long sample from Thom Yorke’s Harrowdown Hill (WAV file). The second beat in the sample is accompanied by a static-like click (clipping) that is considerably quieter in the compressed version. More samples that are “difficult” for the MP3 format can be found on the Lame project page (I found the “Castanets” sample especially revealing).
  4. What about higher bitrates? As I increased the bitrate, the differences that were barely audible at 128 kbps became inaudible and the differences that were obvious became less obvious.
    • At 192 kbps, the Bowie and Yorke samples were still too much of a challenge for the encoder: I was able to reliably tell the MP3 from the original, though with much less confidence and with more going back and forth between the two versions.
    • At 256 kbps (the highest bitrate I tested), I was not able to identify the MP3 version reliably — my ABX results were 7/10, 6/10 and 6/7, which can be put down to chance.

Caveats

Obviously, the results I got apply to my particular situation. If you have better equipment or better hearing, it is perfectly possible that you will be able to identify 256 kbps MP3s in a blind test. Conversely, if your equipment and/or hearing is worse, 192 kbps or even 128 kbps MP3s may sound transparent to you, even on “difficult” samples.

Test setup

  • Lame MP3 encoder version 3.98.2. I used Joint Stereo, High Quality, and average bitrate encoding (ABR).
  • Foobar2000 player with ABX plugin. I used ReplayGain to equalize the volume between the MP3 and the original file — otherwise I found it too easy to tell the difference in ABX tests, since MP3 encoding seems to change the volume of the track somewhat.
  • Auzentech X-Meridian 7.1 — a well-respected audiophile-quality sound card with upgraded LM4562 op-amps.
  • RealCable copper jack-RCA interconnect.
  • Denon PMA-350SE amplifier — an entry-level audiophile receiver designed in England.
  • Sennheiser HD 25-1 II, top-of-the-line closed headphones with stock steel cable.

When I write that there was an audible difference in an ABX test, I mean that I got 7/7 or 9/10 correct answers without repeating the test.

Conclusions

If my goal was to use an MP3 bitrate that is indistinguishable from the original in a blind listening test, I would use 256 kbps, since that is the bitrate which I was unable to identify in a reliable way, despite repeated attempts on a variety of samples (including the “difficult” samples posted on the Lame website).

Whether I will actually standardize on 256 kbps, I’m not sure. The fact that a 192 kbps MP3 can be distinguished from the original in a contrived test (good equipment, quiet environment, high listener concentration, specially selected samples) does not mean it is unsuitable for real-world scenarios. Sure, at 192 kbps the music is not always identical to the original, but judging by my experiments, the difference affects less than 1% of my music (in a 100-second sample, more than 99 seconds would probably be transparent). Even if all I did was listen to this tiny proportion of my music, I would be in a position to perceive the difference less than 1% of the time (what percent of the time do I listen to music in a quiet environment? what percent of the time am I really focused on the music as opposed to other things I’m doing?). Besides, there is the rarely-posed question of whether “different” necessarily means “inferior” — it is quite possible that subtle compression artifacts might actually improve the perceived quality of music in some cases.


Heartburn remedy: Ground flaxseed

About a month ago, I had a bad case of heartburn (AKA “acid indigestion”). After two days of having my stomach “burp” hydrochloric acid up my digestive tract, my esophagus (the part of the digestive tract that’s directly above the stomach) got inflamed and the pain would persist even when there was no acid.

I had to do something about it, and, of course, I turned to the Internet for advice. Here’s what I tried based on my research:

  • First, I started taking Maalox (mixture of magnesium hydroxide and aluminum hydroxide) chewable tablets. I also considered Manti, but it is 50% more expensive (in Poland at least) and the only difference is that it also contains simethicone, which helps relieve excess gas. Maalox helped neutralize the acid, providing instant relief, but I found I had to take it at least once every 2 hours. That didn’t look like a good remedy to me.
  • Some obscure sites recommended eating Jonagold apples for heartburn and GERD (gastroesophageal reflux disease), which is a kind of chronic heartburn. The idea is that apples contain pectin, which neutralizes stomach acid, and Jonagolds have the most pectin. I could find no reputable sources recommending apples as a remedy, but I thought it was worth a try. What I found was that the apples worked well — they actually provided longer-lasting relief than Maalox.

Even though I had found some ways to temporarily relieve my symptoms, my esophagus kept hurting. My mother suggested that I drink ground flaxseed, which she considers a good remedy for indigestion. Obviously, I scoffed at this advice. After all, I spent many hours researching heartburn on the Web and there was not one place that listed flaxseed as a possible remedy. Just to be sure, I ran a quick Google search, and — sure enough — absolutely no link between heartburn and flaxseed. Was I going to trust the whole of the world’s medical knowledge or my mother’s uneducated guesses?

Later that day, my mom made me some ground flaxseed with hot water, which I drank reluctantly, and — you guessed it — now the Internet shall have a page linking flaxseed and heartburn, because the flaxseed worked like a charm. Not only did it instantly kill the burning pain in my esophagus, it also seemed to neutralize the acid (or somehow shield my digestive tract from it). After days of popping Maalox, eating apples and watching my diet, I finally felt I was getting to the underlying cause of my condition. As an added bonus, I find the taste much better than Maalox. [Update: For a few more weeks, I still had the burping and an acidic taste in my mouth, especially after eating a considerable meal on a relatively empty stomach. But I had no burning pain, and I credit flaxseed with this change. Also, I did not have to take flaxseed every couple of hours, like antacids. In my case, the effects persisted for a really long time.]

Certainly I could be an isolated case, or I guess it could be some weird coincidence that I got better immediately after drinking the flaxseed. I don’t know if flaxseed will relieve your heartburn. But the results I experienced were too dramatic for me to keep this to myself.

The specific product that I used was Len mielony (podwójnie odolejony) made by Herbapol Lublin, a Polish manufacturer of herbal products. The name translates as Ground flaxseed (doubly de-oiled) and it is a sort of “diet” version of regular ground flaxseed, with less flaxseed oil and fewer calories. It is a pretty fine powder with a faint smell reminiscent of pumpkin seeds. You prepare it by simply pouring a glass of hot water over a teaspoon of the flaxseed powder.

In Poland, ground flaxseed is readily available in most drugstores. I’m not sure how easily it can be purchased in other countries. You can probably get it in health food stores and the like. If you suffer from heartburn and you can find it, I suggest you give it a shot.

About heartburn causes: I don’t know what caused my episode. The Internet tells me that heartburn is commonly caused by an incompetence of the lower esophageal sphincter (LES), which is a kind of valve that connects the stomach with the esophagus. There is a long list of possible causes of this “incompetence” (eating too much, eating wrong, smoking, hiatus hernia, etc.). About the only thing that I was able to identify with was eating shortly before bedtime, and I have adjusted my habits accordingly. So far this seems to have helped, although I sometimes feel some acidity, which seems to occur when I make long breaks between meals.


Review of the Dell 2209WA 22″ LCD monitor

So, the Eizo S2231W went back to the store (kudos to Proline.pl for a problem-free returns policy) and my search for a new 22″ LCD continued. After some more Web research, my attention turned to the recently released Dell 2209WA. According to the well-informed posters at hardforum.com, this display is supposed to be a big deal for three reasons:

  • It is a 22″ panel that doesn’t use cheap TN technology. Up to now, the only non-TN 22″ displays available had been the Eizo S2231W, the Lenovo L220x (which has ultra-tiny pixels), and the HP LP2275W.
  • It uses an IPS panel. IPS panels are currently considered superior to TN and PVA/MVA panels, with better color fidelity and wider viewing angles.
  • It only costs $380 (in Poland), half the price of the Eizo S2231W!

Spurred by rave user reviews, I quickly found a store that carries these monitors in my city and made sure that the store would accept a return if I didn’t like the screen for whatever reason. Having spent the equivalent of $750 on the Eizo a few days before, I almost felt like I was getting the screen for free this time!

My expectations were pretty high. Everyone and their mother says that IPS panels are better than S-PVA, so despite the budget price of the Dell 2209WA, I expected the image quality to be close to that of the Eizo S2231W. I was concerned about quality-control issues like uneven backlighting or bad pixels. But what was really on my mind was text clarity: would the Dell be as horrible as the S2231W? I hoped it would at least be in the same league as my trusted Eizo S1910.

Text quality

As soon as I unpacked the Dell 2209WA, I placed it next to my S1910 to check how the text looked. I could barely believe my eyes. Not only was the Dell in the same league as my old display — it was actually better! All the letters looked sharper even after I dialed down the sharpening control (the default setting of 50 amounts to subtle sharpening; to disable sharpening altogether, you need to set it at 40).

The explanation for the difference in favor of the Dell 2209WA turned out to be quite simple. The subpixels on the S1910 are not rectangular. Their shape looks a bit like this: “>”.  On the other hand, the Dell 2209WA has rectangular subpixels, which are better suited for the subpixel rendering used by ClearType.

Here is a photo comparing the pixel structure of the Eizo S1910 with that of the Dell 2209WA (sorry for the slightly out-of-focus S1910 photo):

ClearType text on Eizo S1910 and Dell 2209WA
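
Since we’re on the subject, here is a minimal sketch of the idea behind subpixel rendering. To be clear, this illustrates only the general principle, not Microsoft’s actual ClearType algorithm (which additionally filters the result to suppress color fringing); the subpixel_render function and the 3× sampling scheme are my own simplified assumptions.

```python
# Each pixel on an RGB-striped panel consists of three vertical subpixel
# stripes. Driving each stripe from its own sample of the letterform
# triples the effective horizontal resolution of text. The trick assumes
# the subpixels are uniform rectangular stripes -- which is why the
# Dell's rectangular subpixels suit ClearType better than the S1910's
# ">"-shaped ones.

def subpixel_render(coverage, width):
    """coverage: ink-coverage samples (0.0-1.0) taken at 3x the horizontal
    pixel resolution. Returns one (r, g, b) intensity tuple per pixel,
    for black text on a white background."""
    pixels = []
    for x in range(width):
        r_ink, g_ink, b_ink = coverage[3 * x : 3 * x + 3]
        # Full ink coverage switches that subpixel fully off.
        pixels.append((1.0 - r_ink, 1.0 - g_ink, 1.0 - b_ink))
    return pixels

# A vertical stem one third of a pixel wide, landing on the green stripe
# of the first pixel: only that subpixel darkens.
print(subpixel_render([0.0, 1.0, 0.0, 0.0, 0.0, 0.0], 2))
# [(1.0, 0.0, 1.0), (1.0, 1.0, 1.0)]
```

If the subpixels are bent like a “>”, the carefully positioned samples smear across neighboring stripes, and the crispness advantage is lost.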

Image quality

Now that the text sharpness concern is out of the way, let’s take a look at other issues:

  • Color reproduction: Side-by-side comparison with the Eizo S1910 shows that the Dell 2209WA displays slightly less saturated reds. The subjective color quality when viewing photographs in full-screen mode can only be described as very good.
  • Black level: Compared with the Eizo monitors that I have owned or tested, the black level is quite poor. Dark photographs and dark scenes in movies and games appear flooded with a dark shade of grey, which makes for a washed-out, bland image. Eizo monitors, while incapable of displaying true blacks, still manage a good contrast ratio when showing dark images. The Eizos show a clear difference between e.g. RGB (0,0,0) and RGB (2,2,2), so the picture retains a lot of “punch”. On the Dell, the near-black shades are still distinguishable from each other; however, the perceived contrast is lower. From my point of view, the difference between the Dell 2209WA and an Eizo (S-)PVA screen is so large that I would rather watch a movie on a smaller 4:3 screen with a good black level than on a 16:10 screen with the black level of the 2209WA. Perhaps I will get used to the lower quality over time, but it’s certainly a big step backwards. (If you used a TN panel before, you will likely consider it a step forward. It’s all relative.) [Added Nov 2009: I have to confess I watch movies on the Dell most of the time, despite the poor black level. It still bothers me, but it turns out I’m too lazy to move my displays around every time I want to watch a movie. Plus I like the bigger size.]
  • Brightness: The display is definitely too bright. Most hardforum.com users seem to set the brightness at 15 out of 100; I set it at zero. Even at 0 brightness, I find the monitor unusable in an unlit room at night — I have to put an Ikea Grönö lamp with a 40-watt-equivalent bulb right behind it. This is a perfect example of unethical marketing trickery. Pumping up the brightness allows Dell to advertise an impressive contrast ratio, which looks good to ignorant consumers but means nothing in terms of picture quality (see the worked example after this list). The real quality of an LCD is measured by its black level. If the black level is good, you don’t need dazzling, eye-burning whites to maintain a high contrast ratio. In fact, a critical user will quickly find two things: (1) that bright whites are very hard on the eyes, and (2) that the “turbo-brightness” trick does not work anyway when dark images are being displayed.
  • “Wet screen” / Sparkle effect: Visible if you take a close look. Any more of it, and it would be a problem for me. I still don’t understand why manufacturers no longer make a proper anti-glare coating like that of my Eizo S1910 (made in 2005), which displays perfectly matte solid-color areas that don’t sparkle as if they were covered in hair gel.
  • Color shifting: I expected my Eizo S2231W to have the infamous “Rainbow White” effect. It didn’t. But the Dell 2209WA seems to be a textbook example of it: the left side of the screen has a greenish tint, whereas the right side is slightly reddish. The effect was present on both the units that I tested. I don’t think it will be too much of a problem for me, but if you are sensitive to hue shifting, be warned. [A week later: I can just barely notice the color shifting, even on grey areas taking up the entire screen. I think it was more noticeable when I first tested it. Did the monitor “burn in” or something? Anyway, it didn’t bother me before, and it bothers me even less now.]
  • White glow / silvery shimmer: Along with the poor black level, this is a major problem with the Dell 2209WA. As I’m writing this, I’m smiling at all the people on the Internet who have written that IPS panels have better viewing angles than PVA panels. Sure, the colors don’t change as you change the viewing angle; instead, all the dark areas of the screen start to glow. For example, I normally sit about 70 centimeters from the screen. When playing a game with dark areas (e.g. BioShock or Mass Effect), I can always see bright, shimmering patches in the lower left and lower right corners of the screen. (That’s the best-case scenario, after adjusting the monitor tilt.) The effect goes away only when I sit 1 meter from the monitor, but then I have to be looking at the exact center of the panel. If I move my head 3 centimeters to the left or right, the white glow reappears. When the screen is completely black, the white glow is impossible to eliminate unless I sit 1.5 meters or farther from the screen. Considering that the blacks on the 2209WA are pretty grey anyway, this is a screen that has serious problems displaying dark images.
    Angle-dependent white glow on the Dell 2209WA (from a distance of about 70 cm)

  • Backlight uniformity: There is a small bright patch adjacent to the top edge of the screen, about 13 cm from the right edge. I had my unit replaced because of this. The replacement unit has a slightly less visible patch in exactly the same location. I will not be replacing this unit, as it looks like this problem is present in the entire batch. Besides, this minor glitch is completely overpowered by the angle-dependent white glow described above.
  • Bad pixels: No bad pixels on either of the units I’ve tested. Pretty good.
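
To see why an advertised contrast ratio tells you next to nothing about black level, consider the arithmetic below. The luminance figures are made-up illustrative values, not measurements of the 2209WA or of any Eizo.

```python
def contrast_ratio(white_cd_m2, black_cd_m2):
    """Contrast ratio is simply white luminance divided by black luminance."""
    return white_cd_m2 / black_cd_m2

# An eye-searing panel with greyish blacks...
print(f"{contrast_ratio(400.0, 0.40):.0f}:1")  # 1000:1
# ...and a comfortably dim panel with far deeper blacks:
print(f"{contrast_ratio(160.0, 0.16):.0f}:1")  # 1000:1
```

Both panels put “1000:1” on the box, but only the second one will show you a dark movie scene that actually looks dark.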

Ergonomics

  • Heat / power issues: Internet wisdom says IPS panels consume more power than PVA panels, and the Dell 2209WA seems to confirm this. The back of the monitor can get quite hot in normal operation — in contrast, the back of my other display (Eizo S1910) gets barely warm. On a warm summer day, the Dell can really add to the temperature in my small room. This could be the result of the absurdly bright backlight that Dell used in this display (see above).
  • Noise: This display is quite noisy. There is a pretty annoying high-pitched whine (1) right after you turn it on (for at least a few minutes), and (2) when it is displaying resolutions and refresh rates other than the native 1680 x 1050 @ 60 Hz. For example, there is a loud whine during the BIOS POST sequence, which uses character mode. During normal work in the native resolution and refresh rate, I can only hear a slight buzzing inverter noise at a distance of about 70 cm. This noise is not particularly annoying, but I can hear the buzzing go away when I turn off the monitor. This is in a quiet room at 2 am, when the loudest sounds are my (quiet) hard drive and the near-silent fans in my PC (I’m a bit of a silent-PC enthusiast). While at other times of day it is a complete non-issue, I sure don’t like the fact that my new monitor is a source of noise in my system. If you use stock cooling in your PC, don’t worry about the noise — you probably won’t hear it, as long as you stick to the native resolution.
  • Startup time: The backlight takes a long time to reach its proper brightness. When I turn on the Dell and my Eizo at the same time, the Dell is initially much darker than the Eizo and takes about 20 minutes to “catch up”.
  • Response time: Subjectively better than my old Eizo, but not by much.
  • 75 Hz refresh rate: The 2209WA supports a 75 Hz refresh rate, which means that it can display 25% more frames per second than a standard 60 Hz LCD (see the quick arithmetic after this list). This makes for noticeably more fluid motion in almost every game. However, turning on 75 Hz is tricky: you need to add a custom display mode in the nVidia control panel (if you have an nVidia card) or in PowerStrip (if you have an ATI card). Besides, when I set my monitor to anything over 60 Hz, the usual power-supply noise it emits becomes much louder and turns into a high-pitched whine. It is quieter at some refresh rates than at others (in my case, 73 Hz was relatively quiet), but it never completely goes away. In the end, I have decided to stick with 60 Hz.
  • Input lag: I did not notice anything troubling.
  • Anti-glare coating: Apart from the slight sparkle mentioned before, the anti-glare coating reflects a bit too much light. When I put it side-by-side with my Eizo, it is obvious that the Eizo is “more black”.
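
Here is the quick arithmetic behind the 25% figure mentioned in the refresh-rate item above. Refresh rate is frames per second, so the time budget per frame is just its reciprocal:

```python
# Frame time is the reciprocal of the refresh rate.
for hz in (60, 75):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 75 Hz -> 13.3 ms per frame
# 75 / 60 = 1.25, i.e. 25% more frames in the same second.
```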

Summary

  • Text/office work: Very good. Text is very crisp, though brightness needs to be set close to zero for comfortable work. If you like working with the lights out, I would recommend putting a 40-watt lamp right behind the monitor. You might also need to dial down the contrast, sacrificing color quality.
  • Photo viewing/editing: Satisfactory. Colors are well-reproduced and photos look basically the same as on much more expensive Eizo FlexScan LCDs. However, dark images look worse due to the combination of the mediocre black level and the white glow effect (you have to sit at a distance of 1 m or more to eliminate the latter defect).
  • Movie viewing/editing: Poor. Movies have a lot of dark, moody scenes, which look washed out on the 2209WA, even if you sit far away from the screen and position the monitor carefully to avoid the white glow effect. I’m afraid grey isn’t the new black…
  • Gaming: Mixed bag. Shooters and RPGs often use shadows to create atmosphere. On this monitor, these scenes have little depth. Brighter, more cheerful games look fine. On the other hand, the possibility of using a 75 Hz refresh rate (despite its relative noisiness) means that this monitor can display more fluid motion than typical LCDs.

My search for a 22-inch widescreen LCD that would match my 19″ Eizo S1910 has turned out to be a disappointment. Four years after I bought that display, there appears to be no 22″ monitor that is good at everything: text work, photo editing, movies and games.

That said, I will be keeping the Dell 2209WA. Why? First, I’m running out of options. The only non-TN 22″ that I have not yet tested is the HP LP2275W, but since it is equipped with an S-PVA panel, I expect it to have the same ClearType defect that made me return the Eizo S2231W.

Second, if I have to choose between a monitor that is suitable only for text work and a monitor that is only good for photos, movies and games (Eizo S2231W), I’m going to choose the former. After all, the amount of time I spend reading websites and working with documents far outweighs the combined time I spend working with Photoshop, watching movies or gaming. Last but not least, the Dell’s low price more than makes up for its shortcomings. If you cannot get a monitor that does it all, at least get an inexpensive monitor that satisfies most of your needs.
