Things I’ve learned, published for the public benefit

Entries categorized as 'Technology'

iTunes 10.2.1 fails to decode MP3 files properly

It is generally assumed that all major MP3 playback software produces the same output. The reason for this thinking is that the MPEG standard defines a decoder in a strict way, allowing only small deviations due to rounding.

A few years ago, I was disabused of that idea when I did an informal test to compare several well-known music players (iTunes 7, Winamp, Foobar2000, Windows Media Player). The test revealed iTunes 7 to be the outlier producing different output from the rest of the pack.

Today, I will present the results of a more rigorous test using the latest version of iTunes (10.2.1).

Test setup

  • Windows 7 Professional SP1 (32-bit) with all the latest updates
  • Auzentech X-Meridian 7.1 sound card
  • Cool Edit Pro 2.0 audio editing software

Tested players

  • Windows Media Player 12.0.7601.17514
  • Winamp 5.56
  • Foobar2000 1.1.5
  • iTunes 10.2.1

Methodology

I played two 10-second MP3 clips in each player, recording the output digitally with Cool Edit Pro 2.0 using the S/PDIF loopback mechanism provided by the sound card driver.

All postprocessing options (crossfade, sound check, etc.) were turned off. Both application and system volume were at 100%.

I used the following MP3 files:

  1. a 10-second clip from Wszystko Ch. by Elektryczne Gitary encoded with LAME 3.97 at 256 kbps ABR with high encoding quality (download file)
  2. a 10-second clip from Time by Pink Floyd encoded with LAME 3.98 at 256 kbps ABR with high encoding quality (download file)

After recording in Cool Edit Pro (as a 16-bit, 44.1 kHz file which matched the source material), I saved each stream as a text file, which looked like this:

-354	-172
1	203
-447	-443
-2490	-3088
-3504	-3676
-3233	-2944
-3206	-3867
-2829	-4348
-2391	-4461
-2196	-4165...

I also opened each of the two MP3 files directly in Cool Edit Pro 2.0 and then saved it as a text file. This file was used as a reference: the output of each player was compared against it. Cool Edit Pro 2.0 uses a Fraunhofer MP3 decoder (Fraunhofer IIS is the institute where MP3 was developed).

I opened the text files in Notepad++ and synchronized them by discarding the initial silence in each file. The goal was to make sure that the first sample in each file corresponded to the start of the clip to enable direct sample-by-sample comparison.

After synchronization, each text file was opened in Cool Edit Pro 2.0 again. Each waveform was subtracted from the reference waveform to reveal the differences.
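
To make the comparison procedure concrete, here is a rough sketch (in Python) of the same subtraction done outside Cool Edit Pro. It assumes two text dumps in the tab-separated left/right format shown above; the file names are placeholders and the silence threshold is an arbitrary choice.

    # Rough sketch: align two Cool Edit text dumps and compare them sample by sample.

    def load_samples(path):
        """Read a text dump into a list of (left, right) integer pairs."""
        samples = []
        with open(path) as f:
            for line in f:
                parts = line.split()
                if len(parts) >= 2:
                    samples.append((int(parts[0]), int(parts[1])))
        return samples

    def trim_silence(samples, threshold=4):
        """Drop the leading samples whose absolute value stays below the threshold."""
        for i, (left, right) in enumerate(samples):
            if abs(left) > threshold or abs(right) > threshold:
                return samples[i:]
        return []

    reference = trim_silence(load_samples("reference_cooledit.txt"))   # placeholder names
    player = trim_silence(load_samples("player_output.txt"))

    # Subtract the streams sample by sample and report the largest deviation.
    diffs = [abs(r[0] - p[0]) for r, p in zip(reference, player)]
    print("max difference (left channel):", max(diffs))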

Results

Each waveform below shows the difference between the reference output stream (Cool Edit Pro 2.0 with Fraunhofer decoder) and the output stream produced by an MP3 player.

Wszystko Ch. – Windows Media Player 12

wszystko-wmp

Wszystko Ch. – Winamp 5.56

wszystko-winamp

Wszystko Ch. – Foobar2000 1.1.5

wszystko-foobar

Wszystko Ch. – iTunes 10.2.1

wszystko-itunes10

As you can see, Windows Media Player, Winamp and Foobar2000 all produced output that matched the reference stream very closely. A review of the text files showed that all three players produced virtually identical bitstreams: the differences between individual samples and the reference stream did not exceed 1, or in rare cases, 2. These differences were not large enough to register on the waveform view, even with magnification.

iTunes 10.2.1, however, added significant distortion that can be seen in the waveform above. In some cases, the samples deviated from the reference values by as much as 5 percent (e.g. 1811 instead of 1719). You can also download the above waveform as a WAV file to hear the “enhancement” added by iTunes. It is essentially a very high-pitched sound (> 15000 Hz) with uneven volume. Whether you can hear it will depend on your age: younger listeners will find it more prominent. (Of course, during normal music listening this sound would be very hard to hear.)

The output generated by iTunes 10.2.1 did not depend on the output setting in QuickTime (which iTunes uses to play audio). DirectSound, WaveOut and Windows Audio Sessions all produced the same output.

Time – Windows Media Player 12

time-wmp

Time – Winamp 5.56

time-winamp

Time – Foobar2000 1.1.5

time-foobar

Time – iTunes 10.2.1

time-itunes10

Again, Windows Media Player, Winamp and Foobar2000 match the reference stream, while iTunes engages in creative decoding. In this sample, the distortion is smaller: personally, I cannot hear anything when I play the above waveform.

Conclusions

Cool Edit Pro 2.0, Windows Media Player 12, Winamp 5.56 and Foobar2000 1.1.5 all decoded the MP3 clips in virtually the same way. iTunes 10.2.1, on the other hand, produced a distorted output stream. While the distortion is probably inaudible in normal listening situations, it seems to mean that the latest version of iTunes fails to conform to the MP3 standard and is probably best avoided by users who care about audio fidelity.

Notes

In further tests using the same samples, I found that iTunes 9.2.1 matched the reference stream as well as WMP, Winamp and Foobar2000 – it would therefore seem that it decodes MP3 files properly. I also evaluated MediaMonkey 3 and detected very significant distortion (much larger than iTunes 10.2.1), even after disabling as many postprocessing options as I could find (volume leveling, clipping protection, crossfade, smooth stop/seek/pause, remove silence – did I miss anything?).

Check out the thread at Hydrogenaudio for an interesting discussion and independent measurements which confirm my findings.

Added November 2012: I made some quick measurements with the latest version of iTunes (10.7.0.21 for Windows) and it seems that it decodes MP3 files properly.

Added November 2014: I tested iTunes 11.4 for Windows (with DirectSound playback enabled) on one of the files and it is close enough to the reference waveform.


If someone swapped out your CPU for a slower one, would you notice?

This morning, I was checking the specs of my CPU when I noticed something weird. Intel Core 2 CPUs are supposed to slow down when they are not under load. Intel calls this feature “Enhanced SpeedStep technology” and it’s designed to conserve energy. But my CPU – a 3 GHz Core 2 Duo – was running at its full clock speed at all times.

When I launched the Task Manager, the cause of the problem became obvious. The CPU was under load: both cores were fully utilized by ORTHOS, a simple program used to stress-test CPUs and RAM. In fact, ORTHOS had been running for 58 hours. I had started it two days ago to heat up my room in the night, and forgotten to shut it down the next morning. (My Radeon HD4850 is a much better heater, but I wanted to effect a gentle increase in temperature, not turn my room into a sauna.)

Chew on this: Over the past two days, I had been using my machine almost continuously and hadn’t realized I had two computationally intensive processes sucking the life out of both CPU cores! (In the interest of full disclosure, there was a brief moment yesterday evening when I thought that skipping forward and backward in an HD video clip took a bit too long, but I put it down to normal differences between video formats.) If I hadn’t checked my CPU parameters this morning, which I did for a completely random reason, who knows how much longer it would have taken me to realize something was amiss.

Now, that wasn’t the first time that I’d had ORTHOS running in the background while using my computer. Those other times, it was a different experience altogether. Applications took a long time to launch, websites took much too long to load – the lack of responsiveness was simply unacceptable. I would have sworn to you that ORTHOS was crippling my PC.

Of course, the only difference between those other times and the last two days was in my head. Back then, I knew I had ORTHOS stressing my CPU, so I expected poor performance, which is why every single operation seemed slow to me. Without that knowledge and that expectation, my PC was, it seems, perfectly responsive.

Today’s experience will make me think long and hard before I decide to spend money on a new CPU. And every time I hear someone say how much snappier their new Intel i7 rig feels next to a Core 2, I will wonder: would they even notice if I secretly swapped out their i7 for their old Core 2?


Windows 7: The almost-there operating system

One thing that struck me soon after I upgraded my main computer from Windows XP to Windows 7 is how many things it gets almost right. The OS is full of well-engineered features that seem awesome, yet – upon closer inspection – turn out to have some hidden flaw that renders them useless or at least very frustrating.

Math Input Panel

I’ll start with the Math Input Panel. This is a feature so awesome that you want to show it to your friends. You scribble a mathematical expression with your mouse, touch screen or graphics tablet, and it is magically converted into proper typographical form.

Screenshot of the Math Input Panel

But then you want to insert your formula into a document. You open the built-in (and greatly improved) WordPad editor of Windows 7. You click “Insert”. Nothing happens. You open Paint (also improved in Windows 7) and try again. Nothing. OpenOffice Writer? Nothing. Word 2003? Nada. Does this thing even work?

Then you read the small print. The Math Input Panel only works with applications that support MathML. As of this writing, the only popular application with MathML support is Word 2007. There are no other output options. The Math Input Panel cannot generate code in LaTeX, which is the de facto standard in the mathematical community and has been adopted by projects such as Wikipedia, WordPress and jsMath. It cannot generate OLE objects for older versions of Word. It does not even let you paste the damn equation as an image. How can something so ingenious be so useless?

Windows Firewall

On its face, the Windows Firewall has everything you need to say goodbye to third-party firewalls like Comodo. It’s lean, well-integrated with the OS, and the new “Windows Firewall with Advanced Security” console lets you specify detailed rules for inbound and outbound connections to/from specific programs and ports:

Screenshot of the Windows Firewall control panel

Perfect, isn’t it? Unfortunately, it has two fatal shortcomings:

  • Any application can add its own exceptions to it by means of a simple API call. Why? The official rationale is that it is not the firewall’s job to block malicious applications from accessing the network – once you have executed malicious code on your computer, it can pretty much do whatever it wants, including sending data via a trusted process in a way that is invisible to the firewall. There is some truth to this, but a less permissive firewall would still make things that much harder for wrongdoers. More importantly, however, this reasoning misses the use case where you want to prevent legitimate applications from “phoning home”. If I block Adobe Photoshop from using my Internet connection, it probably won’t go so far as to hijack another process, but it will make use of an official Windows API to add an outbound rule for itself. (A sketch of how little this takes follows right after this list.)
  • There is no way to get pop-up notifications about outbound connections. In a typical software firewall, when a new application attempts to establish an outbound connection, you get a pop-up window which enables you to allow or block the connection, and add a permanent rule for this application. The Windows Firewall does not have this functionality. The only thing you can enable is a notification about blocked incoming connections, which gives you a chance to unblock an application. What about outbound connections? The best you can do is block all unknown applications, but then you will never know that an application wanted to access the Net. It will just silently fail.
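
To make the first point concrete: here is a minimal sketch of how little it takes for a program running with administrative rights to add a firewall exception for itself. The underlying mechanism is a COM-based firewall API; the netsh command used below drives the same rule store and merely stands in for it. The rule name and program path are made up.

    import subprocess

    # Illustration only: a process with admin rights silently adds an outbound
    # "allow" rule for an executable -- no prompt, no notification to the user.
    # The rule name and path are made up for this sketch.
    subprocess.run([
        "netsh", "advfirewall", "firewall", "add", "rule",
        "name=ExampleSelfException",
        "dir=out",                         # outbound traffic
        "action=allow",                    # let it through
        r"program=C:\Example\phonehome.exe",
    ], check=True)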

Sticky Notes

Screenshot showing two sticky notes on the desktop

The Sticky Notes feature looks really useful at first. For someone who stares at his screen for most of the day, the Windows desktop seems to be a logical place for “notes to self”. The UI is pretty straightforward and has some nice touches, such as the fact that every note has a little plus button that lets you quickly add another note.

Unfortunately, for some unknown reason Sticky Notes is not a gadget, like the weather thingy you can see on the screenshot above. It’s a separate application. One that cannot be minimized to the system tray. And I don’t know about you, but I don’t like tiny utilities like this taking up space on my taskbar. I need the space so I can comfortably switch between my productivity applications.

Windows Backup

The final “almost perfect” Windows 7 feature I’m going to talk about is Windows Backup. Now this is a seriously exciting utility that promises to replace third-party backup applications like Acronis True Image. On the face of it, it has everything you need. Scheduled and on-demand backups? Check. System drive snapshots? Check. Backups of selected folders? Check. Incremental backups? Check. Restore from bootable CD/DVD? Check. Time needed to back up 500 GB of data to an external USB hard drive? 35 hours. That’s right. Thirty-five freaking hours. (If you suspect there is something wrong with my setup, read these other reports.) Try it once and you’ll never try it again.

It’s as if Microsoft developed a perfectly good backup application and then decided to cripple it on purpose, just to let ISVs make a buck. I don’t want to give my money to Acronis again, especially after reading their official response to a compression bug in TrueImage Home 11 (“just turn off compression”), but it seems I’m going to have to.


Why you should use English versions of your OS and other software

Even though I’m writing this blog in English, I know I have a considerable number of readers in non-English-speaking countries, such as my native Poland. This post is for them. If you are American, British, Australian or a New Zealander (Kiwi?) — sorry, there’s nothing for you here. See you next week.

Now for the rest of you. As you can probably figure out from the title, I’m going to try to convince you to use English versions of your software. Now, I am the webmaster of a site which tells you how to learn English, so you might expect I would tell you how daily exposure to English menu items, system messages, help files, and all the other textual UI elements will program your brain with correct English. (Which, by the way, would all be true.)

But today I’m not going to write about the importance of getting English input every chance you get. Instead, I will give you a very practical reason to install English versions of your operating system and other software rather than versions localized in your native language.

Suppose you have just updated the drivers for your nVidia card. Unfortunately, something has gone wrong and every time you reboot your machine you see the following error message:

Sterownik ekranu przestał działać, ale odzyskał sprawność.

(The error message is in Polish because, in this example, we will assume you are Polish and use the Polish version of Windows.) “Motyla noga”, you curse to yourself while opening your Web browser. If there’s one thing you’ve learned online, it’s that the Internet has the answer to your computer question. Other people must have had the same problem and there must be a forum post somewhere which has the solution.

But what are you going to type into Google? What keywords would be likely to occur in this forum post you want to find? In all likelihood, the poster would have quoted the error message itself.

Except they would have quoted it in English, not Polish. Let’s face it — it is much more probable that the solution to your problem is posted on one of the many English-language tech forums than on one of the few Polish-language ones. A Google Groups search on “nVidia” turns up 17,000,000 group threads in English and only 211,000 in Polish (1/80 of the English figure).

So now you’re stuck with your Polish error message, trying to figure out the exact words the English version might have used. “The screen driver has failed?” “Malfunctioned?” “Stopped working?”

Of course, I have an English-language version of Windows, so if I am having computer issues, I can simply read the English error message off the screen (in our example it’s “The display driver has stopped responding and has successfully recovered”), type that magic phrase into Google together with the name of the malfunctioning device or application and boom! — within minutes I’m reading about the secret registry setting that makes it all okay.

Now that I think about it, having an English-language version of Windows probably accounts for something like 30% of my troubleshooting ability. Moreover, using English-language software is useful not only when troubleshooting — I find it equally helpful when I just want to learn how to do something in Windows, Office, Photoshop or even a Web app like GMail. I can just search on the names I see instead of wondering what is the English name for warstwy dopasowania (adjustment layers). And I can apply the solution more easily because I don’t have to translate all the names back into Polish.

It would perhaps behoove me to give you “the other side” of the argument, but the matter seems pretty clear-cut to me: If you want to get help with your software (and who doesn’t?), it helps to use the same version that most of the potential helpers use. And with this, I leave you.


What you should know about Volume Shadow Copy/System Restore in Windows 7 & Vista (FAQ)

What is volume shadow copy?

Volume Shadow Copy is a service that creates and maintains snapshots (“shadow copies”) of disk volumes in Windows 7 and Vista. It is the back-end of the System Restore feature, which enables you to restore your system files to a previous state in case of a system failure (e.g. after a failed driver or software installation).

Does volume shadow copy protect only my system files?

No. Volume Shadow Copy maintains snapshots of entire volumes. By default, it is turned on for your system volume (C:) and protects all the data on that volume, including all the system files, program files, user settings, documents, etc.

How is this different from what’s in Windows XP?

In Windows XP, System Restore does not use the Volume Shadow Copy service. Instead, it uses a much simpler mechanism: the moment a program attempts to overwrite a system file, Windows XP makes a copy of it and saves it in a separate folder. In Windows XP, System Restore does not affect your documents – it only protects files with certain extensions (such as DLL or EXE), the registry, and a few other things (details). It specifically excludes all files in the user profile and the My Documents folder (regardless of file extension).

When are the shadow copies created?

Volume shadow copies (restore points) are created before the installation of device drivers, system components (e.g. DirectX), Windows updates, and some applications.

In addition, Windows automatically creates restore points at hard-to-predict intervals. The first thing to understand here is that the System Restore task on Vista and 7 will only execute if your computer is idle for at least 10 minutes and is running on AC power. Since the definition of “idle” is “0% CPU usage and 0% disk input for 90% of the last 15 minutes, plus no keyboard/mouse activity” (source), it could take days for your machine to be idle, especially if you have a lot of programs running in the background.

As you see, the frequency with which automatic restore points are created is hard to estimate, but if you use your machine every day on AC power and nothing prevents it from entering an idle state, you can expect automatic restore points to be created every 1-2 days on Windows Vista and every 7-8 days on Windows 7. Of course, the actual frequency will be higher if you count in the restore points created manually by you and those created before software installations.

Here’s a more precise description: By default, the System Restore task is scheduled to run every time you start your computer and every day at midnight, as long as your computer is idle and on AC power. The task will wait for the right conditions for up to 23 hours. These rules are specified in Scheduled Tasks and can be changed by the user. If the task is executed successfully, Windows will create a restore point, but only if enough time has passed since the last restore point (automatic or not) was created. On Windows Vista the minimum interval is 24 hours; on Windows 7 it is 7 days. As far as I know, this interval cannot be changed.
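
If you want to look at these rules on your own machine, the task can be inspected from the command line. A small sketch, under the assumption that (as on typical Vista/Windows 7 installations) the task sits at \Microsoft\Windows\SystemRestore\SR in Task Scheduler:

    import subprocess

    # Print the System Restore task's triggers, conditions and last run time.
    # The task path is an assumption; adjust it if your system differs.
    subprocess.run([
        "schtasks", "/Query", "/V", "/FO", "LIST",
        "/TN", r"\Microsoft\Windows\SystemRestore\SR",
    ])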

What cool things can I do with Volume Shadow Copy?

  • If your system malfunctions after installing a new video card driver or firewall software, you can launch System Restore and roll back to a working system state from before the installation. If you can’t get your system to boot, you can also do this from the Windows Setup DVD. This process is reversible, i.e. your current state will be automatically saved as a restore point, to which you can later go back. (Note: System Restore will not roll back your documents and settings, just the system files.)
  • If you accidentally delete 10 pages of your dissertation, you can right-click the document, choose Restore previous versions, and access a previous version of it. You can open it (in read-only mode) or copy it to a new location.
  • If you accidentally delete a file or folder, you can right-click the containing folder, choose Restore previous versions, and open the folder as it appeared at the time a shadow copy was made (see screenshot below). All the files and folders that you deleted will be there!

previous_folder

Note: While the Volume Shadow Copy service and System Restore are included in all versions of Windows Vista, the Previous versions user interface is only available in Vista Business, Enterprise and Ultimate. On other Vista versions, the previous versions of your files are still there; you just cannot access them easily. The Previous versions UI is available in all versions of Windows 7. It is not available in any version of Windows 8.

Is Volume Shadow Copy a replacement for versioning?

No. A versioning system lets you access all versions of a document; every time you save a document, a new version is created. Volume Shadow Copy only allows you to go back to the moment when a restore point was made, which could be several days ago. So if you do screw up your dissertation, you might have to roll back to a very old version.

Is Volume Shadow Copy a replacement for backups?

No, for the following reasons:

  • Shadow copies are not true snapshots. When you create a restore point, you’re not making a new copy of the drive in question — you’re just telling Windows: start tracking the changes to this drive; if something changes, back up the original version so I can go back to it. Unchanged data will not be backed up. If the data on your drive gets changed (corrupted) for some low-level reason like a hardware error, VSC will not know that these changes happened and will not back up your data. (see below for a more detailed description of how VSC works)
  • The shadow copies are stored on the same volume as the original data, so when that volume dies, you lose everything.
  • With the default settings, there is no guarantee that shadow copies will be created regularly. In particular, Windows 7 will only create an automatic restore point if the most recent restore point is more than 7 days old. On Windows Vista, the minimum interval is 24 hours, but remember that the System Restore task will only run if your computer is on AC power and idle for at least 10 minutes, so it could take days before the conditions are right, especially if you run a lot of background processes or do not use your computer frequently.
  • There is no guarantee that a suitable shadow copy will be there when you need it. Windows deletes old shadow copies without a warning as soon as it runs out of shadow storage. With a lot of disk activity, it may even run out of space for a single shadow copy. In that case, you will wind up with no shadow copies at all; and again, there will be no message to warn you about it.

How much disk space do Volume Shadow Copies take up?

By default, the maximum amount of storage available for shadow copies is 5% (on Windows 7) or 15% (on Vista), though only some of this space may be actually allocated at a given moment.

You can change the maximum amount of space available for shadow copies in Control Panel | System | System protection | Configure.
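
You can also inspect and resize the shadow storage from an elevated command prompt with the vssadmin tool. A short sketch (the drive letter and size limit are just examples):

    import subprocess

    # Show how much space shadow copies currently use and are allowed to use.
    subprocess.run(["vssadmin", "list", "shadowstorage"])

    # Cap the shadow storage for drive C: (requires an elevated prompt;
    # the drive letter and the 10 GB limit are just examples).
    subprocess.run(["vssadmin", "resize", "shadowstorage",
                    "/for=C:", "/on=C:", "/maxsize=10GB"])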

How efficient is Volume Shadow Copy?

It’s quite efficient. The 5% of disk space that it gets by default is usually enough to store several snapshots of the disk in question. How is this possible?

The first thing to understand is that volume shadow copies are not true snapshots. When a restore point is created, Volume Shadow Copy does not create a full image of the volume. If it did, it would be impossible to store several shadow copies of a volume using only 5% of that volume’s capacity.

Here’s what really happens when a restore point is created: VSC starts tracking the changes made to all the blocks on the volume. Whenever anyone writes data to a block, VSC makes a copy of that block and saves it on a hidden volume. So blocks are “backed up” only when they are about to get overwritten. The benefit of this approach is that no backup space is wasted on blocks that haven’t changed at all since the last restore point was created.

Notice that VSC operates on the block level, that is below the file system level. It sees the disk as a long series of blocks. (Still, it has some awareness of files, as you can tell it to exclude certain files and folders.)

The second important fact is that shadow copies are incremental. Suppose it’s Wednesday and your system has two shadow copies, created on Monday and Tuesday. Now, when you overwrite a block, a backup copy of the block is saved in the Tuesday shadow copy, but not in the Monday shadow copy. The Monday copy only contains the differences between Monday and Tuesday. More recent changes are only tracked in the Tuesday copy.

In other words, if we were to roll back an entire volume to Monday, we would take the volume as it is now, “undo” the changes made since Tuesday (using the blocks saved in the Tuesday shadow copy), and finally “undo” the changes made between Monday and Tuesday. So the oldest shadow copy is dependent on all the more recent shadow copies.
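
Here is a toy model of that copy-on-write behaviour, just to make the mechanism concrete. It is not how the real VSS driver is implemented; it only mimics the two ideas described above: a block is backed up into the most recent shadow copy the first time it is overwritten, and restoring an older state means walking back through the copies from newest to oldest.

    # Toy model of block-level, incremental copy-on-write shadow copies.

    class Volume:
        def __init__(self, blocks):
            self.blocks = list(blocks)   # current contents, one value per block
            self.shadow_copies = []      # each copy maps block index -> old value

        def create_restore_point(self):
            # A new shadow copy starts out empty: nothing is saved until a block changes.
            self.shadow_copies.append({})

        def write_block(self, index, value):
            if self.shadow_copies:
                # Back up the original value only once per restore point, and only
                # for blocks that are actually about to be overwritten.
                self.shadow_copies[-1].setdefault(index, self.blocks[index])
            self.blocks[index] = value

        def restore(self, copy_number):
            # Roll back by "undoing" changes recorded in the newest copies first.
            state = list(self.blocks)
            for copy in reversed(self.shadow_copies[copy_number:]):
                for index, old_value in copy.items():
                    state[index] = old_value
            return state

    vol = Volume(["a", "b", "c", "d"])
    vol.create_restore_point()            # "Monday"
    vol.write_block(0, "a2")
    vol.create_restore_point()            # "Tuesday"
    vol.write_block(0, "a3")
    vol.write_block(2, "c2")

    print(vol.blocks)                     # ['a3', 'b', 'c2', 'd'] -- current state
    print(vol.restore(1))                 # ['a2', 'b', 'c', 'd']  -- as of Tuesday
    print(vol.restore(0))                 # ['a', 'b', 'c', 'd']   -- as of Monday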

When I delete a 700 MB file, does VSC add 700 MB of data to the shadow copy?

No. When you delete a file, all that Windows does is remove the corresponding entry (file name, path, properties) from the Master File Table. The blocks (units of disk space) that contained the file’s contents are marked as unused, but they are not actually deleted. So all the data that was in the file is still there in the same blocks, until the blocks get overwritten (e.g. when you copy another file to the same volume).

Therefore, if you delete a 700 MB movie file, Volume Shadow Copy does not have to back up 700 MB of data. Because it operates on the block level, it does not have to back up anything, as the blocks occupied by the file are unchanged! The only thing it has to back up is the blocks occupied by the Master File Table, which has changed.

If you then start copying other files to the same disk, some of the blocks formerly occupied by the 700 MB file will get overwritten. VSC will make backups of these blocks as they get overwritten.

If VSS is constantly backing up blocks of data as they get overwritten, what actually happens when a restore point is created?

Not much — VSS simply starts backing up the data to a new place, while leaving the “old place” there (at least until it runs out of space). Now you have two places to which you can restore your system, each representing a different point in time. When you create a restore point, you’re simply telling VSS: “I want to be able to go back to this point in time”.

Note that it’s a mistake to think that VSS is backing up every change you make! It only backs up enough to enable you to go to a specific point in time. Here’s an example scenario to clear things up:

  1. You create a file (version #1)
  2. You create a restore point
  3. You change the file (resulting in version #2) — VSS backs up version #1
  4. A week later, you change the file again (resulting in version #3) — VSS doesn’t back anything up, because it already has version #1 backed up. As a result, you can no longer go back to version #2. You can only go back to version #1 — the one that existed when the restore point was created.

(Strictly speaking, VSS operates on blocks, not files, but the principle is the same.)

What are the security implications of Volume Shadow Copy?

Suppose you decide to protect one of your documents from prying eyes. First, you create an encrypted copy using an encryption application. Then, you “wipe” (or “secure-delete”) the original document, which consists of overwriting it several times and deleting it. (This is necessary, because if you just deleted the document without overwriting it, all the data that was in the file would physically remain on the disk until it got overwritten by other data. See question above for an explanation of how file deletion works.)

Ordinarily, this would render the original, unencrypted document irretrievable. However, if the original file was stored on a volume protected by the Volume Shadow Copy service and it was there when a restore point was created, the original file will be retrievable using Previous versions. All you need to do is right-click the containing folder, click Restore previous versions, open a snapshot, and, lo and behold, you’ll see the original file that you tried so hard to delete!

The reason wiping the file doesn’t help, of course, is that before the file’s blocks get overwritten, VSC will save them to the shadow copy. It doesn’t matter how many times you overwrite the file, the shadow copy will still be there, safely stored on a hidden volume.

Is there a way to securely delete a file on a volume protected by VSC?

No. Shadow copies are read-only, so there is no way to delete a file from all the shadow copies.

A partial solution is to delete all the shadow copies (by choosing Control Panel | System | System protection | Configure | Delete) before you wipe the file. This prevents VSC from making a copy of the file right before you overwrite it. However, it is quite possible that one of the shadow copies you just deleted already contained a copy of the file (for example, because it had recently been modified). Since deleting the shadow copies does not wipe the disk space that was occupied by them, the contents of the shadowed file will still be there on the disk.

So, if you really wanted to be secure, you would also have to wipe the blocks that used to contain the shadow copies. This would be very hard to do, as there is no direct access to that area of the disk.

Some other solutions to consider:

  • You could make sure you never save any sensitive data on a volume that’s protected by VSC. Of course, you would need a separate VSC-free volume for such data.
  • You could disable VSC altogether. (After disabling VSC, you may want to wipe the free space on your drive to overwrite the blocks previously occupied by VSC, which could contain shadow copies of your sensitive data.) However, if you disable VSC, you also lose System Restore functionality. Curiously, Windows offers no option to enable VSC only for system files. If you want to protect your system, you also have to enable Previous versions (see screenshot to the right).
  • The most secure approach is to use an encrypted system volume. That way, no matter what temporary files, shadow copies, etc. Windows creates, it will all be encrypted.

Notice that VSC only lets you recover files that existed when a restore point was created. So if the sequence of events is as follows:

create file → create restore point → make encrypted copy → overwrite original file

the original file will be recoverable. But if the sequence is:

create restore point → create file → make encrypted copy → overwrite original file

you are safe. If you make sure to encrypt and wipe files as soon as you create them, so that no restore point gets created after they are saved on disk in unencrypted form, there will be no way to recover them with VSC. However, it is not easy to control when Windows creates a restore point; for example, it can do it at any time, just because your computer happens to be idle.

Can I prevent VSC from keeping snapshots of certain files and folders?

Yes, but you have to edit the registry to do that. Here are detailed instructions from MSDN.
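
As a sketch of what such an edit looks like: the key below is the one described in the MSDN article on excluding files from shadow copies, while the value name and path are made up, so follow the linked instructions rather than this snippet.

    import winreg

    # Assumption (per MSDN "Excluding Files from Shadow Copies"): paths listed as
    # REG_MULTI_SZ values under this key are skipped when shadow copies are made.
    # "MyApp" and the path are illustrative; the trailing "/s" makes it recursive.
    key_path = r"SYSTEM\CurrentControlSet\Control\BackupRestore\FilesNotToSnapshot"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "MyApp", 0, winreg.REG_MULTI_SZ,
                          [r"C:\Users\Me\Private\* /s"])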

What happens when VSC runs out of space?

Most of the time, most of the data on your disk stays unchanged. However, suppose you uninstall a 5 GB game and then install another 5 GB game in its place. This means that 5 GB worth of blocks got overwritten and had to be backed up by VSC.

In such “high-churn” scenarios, VSC can run out of space pretty quickly. What happens then? VSC deletes as many previous shadow copies as necessary, starting from the oldest, until it has enough space for the latest copy. In the rare event that there isn’t enough space even for the one most recent copy, all the shadow copies will be deleted. There are no partial copies.

Thanks to Adi Oltean, who was one of the engineers of Volume Shadow Copy at Microsoft, for answering my questions on the subject.


The Hidden Shadow

The flower delivery van had been parked across the street for far too long. Cahey peered outside through the window blinds for the third time. By now he was certain they had him under surveillance. He had been careful not to discuss the subject matter of his current project with anyone, but there were a few souls at the Tribune who knew he was working on a major investigative piece. Apparently that was enough to pique the government’s interest.

Cahey lit a cigarette and reflected on the van’s relatively conspicuous location. Sloppy surveillance work or a deliberate attempt to scare him into silence? There was no way to know. He was, however, sure of one thing: if they came here, they would find nothing. Knowing that digital content was much easier to protect from prying eyes than papers, photographs and recordings, he had disposed of every physical record of his investigation, leaving only a digitized copy on the hard drive of his laptop computer. Two days ago, he had encrypted all this data using an open-source application called TrueCrypt, making sure to overwrite the original files several times before deletion. Now his data was unrecoverable without the password, and there was nothing anybody could do about it, not even the NSA with their army of PhD’s and their supercomputers. The spooks would be in for a surprise.

“Drrrrrt” — the sound of the doorbell pierced the smoke-infused air. Cahey glanced through the window. The van was gone. As he walked towards the door, he contemplated logging out of his Windows account, but decided against it. Bypassing that layer of security would be a trivial exercise, and it wouldn’t do the government much good anyway, given the fact that everything of interest was now encrypted. He opened the door. On his porch stood five serious-looking men in suits. “Stephen Cahey? We have a warrant to search the premises.”

———-

Agent Jack Trallis looked at the machine he had been ordered to process. It was a pretty standard Dell laptop with a dual-core CPU and a 15-inch screen that was covered with fingerprints. “God, do I hate those glossy displays”, he muttered to himself. He was alone in the room; the other agents were in the living room questioning the suspect. Trallis noticed the prominent TrueCrypt icon on the machine’s desktop. “Uh oh. Strong encryption.” He fixed his eyes on the taskbar at the bottom of the screen. There was a row of oversized, unlabeled icons that reminded him of the Hackintosh he had once built for his girlfriend. The guy’s laptop was running Windows 7. There was still a chance.

He located the Documents folder, opened its Properties window, and clicked on the “Previous Versions” tab. Just as he thought, there were five previous versions of the folder – “shadow copies” created regularly by the operating system as part of the System Restore mechanism. As these snapshots were prepared silently in the background and stored on a hidden disk volume, few users were aware of them. Agent Trallis was smiling. The good guys from Redmond were going to make his job easy again.

He selected one of the snapshots and clicked Open. An Explorer window popped up, showing the contents of the Documents folder exactly as it had appeared three days ago. “This is too funny”, he thought. There was a subfolder labeled Project Foxhunt full of scanned documents and audio files. Trallis grabbed his radio. “Sir”, he called out to his commanding officer, “I’ve got something you might want to have a look at.”

For technical information on Volume Shadow Copy, read What you should know about Volume Shadow Copy/System Restore in Windows 7 & Vista


An audiophile’s look at the audio stack in Windows Vista and 7

If you are an audiophile who uses a PC as a source in your audio system, you’re probably aware of the fact that Windows Vista introduced a brand-new audio engine to replace the much hated KMixer of Windows XP. In my opinion, there are a few reasons why audiophiles should be happy with this change:

  • The new audio stack automatically upconverts all streams to a 32-bit floating-point sample depth (the same that is used in professional studios) and mixes them with the same precision. Because of the amount of headroom that comes with using 32-bit floats, there is no more clipping when playing two samples at the same time. There is also no loss of resolution when you lower the volume of a stream (see below).
  • The Vista/Win7 audio engine automatically feeds your sound card with the highest-quality output stream that it can handle, which is usually 24 bits per sample. Perhaps you’re wondering why you should care, given that most music uses only 16 bits per sample. Suppose you’re playing a 16-bit song with a digital volume control set to 10%. This corresponds to dividing each sample by 10. Now let’s assume the song contains the following two adjacent samples: 41 and 48. In an ideal world, after the volume control we would get 4.1 and 4.8. However, if the output stream has a 16-bit depth just like the input stream, then both output samples will have to be truncated to 4. There is now no difference between the two samples, which means we have lost some resolution. But if we can have an output stream with 24 bits per sample, for each 16-bit level we get 2^8 = 256 additional (“fractional”) levels, so we can still preserve the difference between the two attenuated samples. In fact, we can have ≈4.1016 and ≈4.8008, which is within 0.04% of the “ideal” samples of 4.1 and 4.8. (A short sketch of this arithmetic follows after this list.)
  • Don’t you hate it when you change the volume in your movie player or instant messaging software and instead of changing its own volume, it changes your system volume? Or have you ever used an application with its own poorly implemented volume control (iTunes, I’m pointing at you!)? Well, these abominations should now be behind us. In Vista and Win7, each application gets its own audio stream (or streams) and a separate high-quality volume control, so there should no longer be any reason for application vendors to mess with the system volume or roll their own and botch the job.
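
Here is a small sketch of the arithmetic from the volume-control example above: the two attenuated samples are written once to a 16-bit grid (whole steps only, as in the example) and once to a 24-bit grid, where the extra 8 bits provide 256 fractional steps per 16-bit level.

    import math

    samples = [41, 48]                    # two adjacent 16-bit samples
    ideal = [s / 10 for s in samples]     # 10% digital volume -> 4.1 and 4.8

    as_16bit = [math.floor(x) for x in ideal]            # truncated to whole steps
    as_24bit = [round(x * 256) / 256 for x in ideal]     # nearest 1/256 step

    print(as_16bit)   # [4, 4] -- the difference between the samples is gone
    print(as_24bit)   # [4.1015625, 4.80078125] -- within 0.04% of the ideal values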

So Windows Vista and Windows 7 upconvert all your samples to 32-bit floats and mix them with 32-bit precision into an output stream that, by default, has the highest bit depth that your hardware can handle. The output bit depth is customizable; you can change it in the properties of your audio device. If you change it e.g. to 16 bits, the audio engine will still use 32-bit floats for internal processing — it will just downconvert the resulting stream to 16 bits before sending it to your device.

Now, what about the sample rate? You can set the output sample rate in the audio device properties window, but is there also some internal sample rate that the Windows audio engine uses regardless of your setting? For example, does it upsample your 44.1 kHz songs to 96 or 128 kHz? Unlike the upconverting from 16-bit integers to 32-bit floats (which should be completely lossless), this could potentially introduce some distortion as going from 44.1 kHz to 96 or 128 kHz requires at least some interpolation.

I couldn’t find the answer to this question anywhere, so I wrote to Larry Osterman, who developed the Vista and Win7 audio stacks at Microsoft. His answer was that the sample rate that the engine uses is the one that the user specifies in the Properties window. The default sample rate is chosen by the audio driver (44.1 kHz on most devices). So if your music has a sample rate of 44.1 kHz, you can choose that setting and no sample rate conversion will take place. (Of course, any 48 kHz and higher samples will then be downsampled to 44.1 kHz.)

There is some interesting technical information on the Windows Vista audio stack in this Channel9 video.


Should we care about ABX test results?

Policy no. 8 in the Terms of Service of the respected audiophile community Hydrogenaudio states:

8. All members that put forth a statement concerning subjective sound quality, must — to the best of their ability — provide objective support for their claims. Acceptable means of support are double blind listening tests (ABX or ABC/HR) demonstrating that the member can discern a difference perceptually, together with a test sample to allow others to reproduce their findings.

What a breath of fresh air. Other audio forums are full of snake-oil-peddling and Kool-Aid-drinking evangelists who go on and on about how replacing $200 speaker wires with $400 speaker wires “really opened up the soundstage and made the upper-midrange come alive”. The people at Hydrogenaudio know that such claims demand proper scientific evidence. How nice to see that they dismiss subjective nonsense and rely instead on the ultimate authority of ABX tests, which really tell us what makes a difference and what doesn’t.

Except that ABX tests don’t measure what really matters to us. ABX tests tell us whether we can hear a difference between A and B. What we really want to know, however, is whether A is as good as B.

1.

“Wait a second!”, I hear you exclaim. “Surely if I cannot tell A from B, then for all intents and purposes, A is as good as B and vice versa. If you can’t see the difference, why pay more?”

Actually, there could be tons of reasons. To take a somewhat contrived example, suppose I magically replaced the body of your car with one that was less resistant to corrosion, leaving all the other features of your vehicle intact. Looking at the car and driving it, you would not notice any difference. Even if I gave you a chance to choose between your original car and the doctored one, they would seem identical to you and you could choose either of them. However, if you were to choose the one I tampered with, five years later your vehicle’s body would be covered in spots of rust.

The obvious lesson here is that “not seeing a difference” does not guarantee that A is as good as B. Choosing one thing over another can have consequences that are hard to detect in a test because they are delayed, subtle, or so odd-ball that no one even thinks to record them during the test.

But how is this relevant to listening tests? Assuming that music affects us through our hearing, how could we be affected by differences that we cannot hear?

In his fascinating book Burning House: Unlocking the Mysteries of the Brain, Jay Ingram describes the case of a 49-year-old woman suffering from a condition called hemispatial neglect (the case was researched by neuropsychologists John Marshall and Peter Halligan). Patients with hemispatial neglect are unable to perceive one (usually the left) side of the objects they see. When asked to copy drawings, they draw only one side; when reading out words, they read them only in half (e.g. they read simile as mile).

In Marshall and Halligan’s experiment, the woman was given two simple drawings showing two houses. In one of the drawings, the left side of the house was covered in flames and smoke; the houses looked the same otherwise. Since the flames were located on the left side, the patient was unable to see them and claimed to see no difference between the drawings. When Marshall and Halligan asked her which of the houses she would rather live in, she replied — rather unsurprisingly — that it was a silly question, given that the houses were identical.

However, when the experimenters persuaded her to make a choice anyway, she picked the flameless house 14 out of 17 times, all the while insisting that both houses looked the same.

Marshall and Halligan’s experiment shows (as do other well-known psychological experiments, including those pertaining to subliminal messages) that it is possible for information to be in a part of the brain where it is inaccessible to conscious processes. This information can influence one’s state of mind and even take part in decision-making processes without one realizing it.

If people can be affected by information that they don’t even know is there, then who says they cannot be affected by inaudible differences between an MP3 and a CD? Failing an ABX test tells you that you are unable to consciously tell the difference between two music samples. It does not mean that the information isn’t in your brain somewhere — it just means that your conscious processes cannot access it.

So the fact that you cannot tell the difference between an MP3 and a CD in an ABX test does not mean that an MP3 is as good as a CD. Who knows? Maybe listening to MP3s causes more fatigue in the long run. Maybe it makes you get bored with your music more quickly. Or maybe the opposite is true and MP3s are actually better. We can formulate and test all sorts of plausible hypotheses — the point is, an ABX test which shows no audible difference is not the end of the discussion.

2.

I have shown that the lack of audible differences between A and B in an ABX test does not imply that A is as good as B. Before you read this post as an apology for lossless audio formats, here is a statement that will surely upset hard-core audiophiles:

The fact that you can tell the difference between an MP3 and a CD in an ABX test does not mean that the MP3 is worse than a CD.

First of all, the differences between MP3s encoded at mainstream bitrates (128 kbps and 192 kbps) and original recordings are really subtle and can be detected only under special conditions (quiet environment, good equipment, full listener concentration, direct comparisons of short samples). Because the differences are so tiny, we cannot automatically assume that it is the uncompressed version that sounds better. Subtle compression artifacts such as slightly reduced sharpness of attacks on short, loud sounds may in fact be preferred by some listeners in a direct comparison.

Secondly, even if we found that the uncompressed version is preferred by listeners, that wouldn’t necessarily mean that it is better. People prefer sitting in front of the TV to exercising, but the latter might make them feel much better overall. If it were discovered, for example, that compressed music is less tiring to listen to (this is of course pure speculation), then that fact might outweigh any preference for uncompressed sound in blind tests.

Summary

The relevance of ABX tests to the lives of music lovers is questionable. Neither does the absence of audible differences imply equal quality, nor does the presence of audible differences imply that the compressed version is inferior. Rather than being the argument to end all debate, the results of ABX tests are just one data point and the relative strengths of various audio formats may well be put in a new light by further research.


Blind-testing MP3 compression

Among music listeners, the use of lossy audio compression technologies such as MP3 is a controversial topic. On one side, we have the masses who are glad to listen to their favorite tunes on $20 speakers connected to their PC’s onboard audio device and couldn’t care less what bitrate MP3s they get as long as the sound quality is better than FM radio. On another side, we have the quasi-audiophiles (not true audiophiles, of course, as those would never touch anything other than a high-quality CD or LP player properly matched to the amplifier) who stick to lossless formats like FLAC due to MP3’s alleged imperfections.

If I considered myself part of either group, my life would be easy, as I would know exactly what to do. Unfortunately, I fall somewhere in between. I appreciate music played through good equipment and I own what could be described as a budget audiophile system. On the other hand, I am not prepared to follow the lead of the hard-core lossless format advocates, who keep repeating how bad MP3s sound, yet do not offer anything in the way of objective evidence.

So, me being me, I had to come to my own conclusions about MP3 compression. Is it okay for me to listen to MP3s and if so, what bitrate is best? To answer these questions, I spent many hours doing so-called ABX listening tests.

What is an ABX test?

An ABX test works like this: You get four samples of the same musical passage: A, B, X and Y. A is the original (uncompressed) version. B is the compressed version. With X and Y, one is the original version (same as A), the other is the compressed version (same as B), and you don’t know which is which. You can listen to each version (A, B, X or Y) as many times as you like. You can select a short section of the passage and listen to it in each version. Your objective is to decide whether X = A (and Y = B) or X = B (and Y = A). If you can get a sufficient number of right answers (e.g. 7 times out of 7 or 9 times out of 10), you can conclude that there is an audible difference between the compressed sample and the original sample.
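
To see why scores like 7 out of 7 or 9 out of 10 are treated as meaningful, while something like 7 out of 10 is not, here is a quick sketch that computes the probability of doing at least that well by pure guessing (each trial being a 50/50 guess):

    from math import comb

    def p_by_chance(correct, trials):
        """Probability of getting at least `correct` answers right by guessing."""
        return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

    for correct, trials in [(7, 7), (9, 10), (7, 10), (6, 10), (6, 7)]:
        print(f"{correct}/{trials}: {p_by_chance(correct, trials):.3f}")

    # 7/7 (0.008) and 9/10 (0.011) are unlikely to be luck;
    # 7/10 (0.172), 6/10 (0.377) and 6/7 (0.063) can easily be chance.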

What I found

  1. The first thing I found was that telling the difference between a well-encoded 128 kbps MP3 and a WAV file is pretty damn hard. Since 128 kbps is really the lowest of the popular MP3 bitrates and it gets so much bad rap on forums like Head-Fi, I expected that it would fail miserably when confronted with the exquisite work of artists like Pink Floyd or Frank Sinatra. Not so. Amazingly, the Lame encoder set at 128 kbps (ABR, high quality encoding) held its own against pretty much anything I’d throw at it. The warm, deeply human quality of Gianna Nannini’s voice in Meravigliosa Creatura, the measured aggression of Metallica’s Blitzkrieg, the spacious guitar landscapes of Pink Floyd’s Pulse concert — it all sounded exactly the same after compression. There were no changes to the ambiance of the recording, the quality of the vocals, the sound of vowels and consonants, the spatial relationships between the instruments on the soundstage, or the ease with which individual instruments could be picked out.
  2. That said, MP3s at 128 kbps are not truly transparent. With some training, it is possible to distinguish them from original recordings in blind listening tests. My trick was to look for brief, sharp, loud sounds like beats or certain types of guitar sounds — I found that compression takes some of the edge off them. Typically, the difference is so subtle that successful identification is only possible with very short (a few seconds long) samples, a lot of concentration and a lot of going back and forth between the samples. Even then, the choice was rarely obvious for me; more often, making the decision felt like guessing. Which of the identical bass riffs I just heard seemed to carry more energy? A few times I was genuinely surprised that I was able to get such high ABX scores after being so unsure of my answers.
  3. With some effort, it is possible to find passages that make the difference between 128 kbps MP3 and uncompressed audio quite obvious. For me, it was just a matter of finding a sound that was sharp enough and short enough. In David Bowie’s Rock ‘n Roll Suicide, I used a passage where Bowie sings the word “song” in a particular, Dylanesque way (WAV file). Another example is a 1.2-second-long sample from Thom Yorke’s Harrowdown Hill (WAV file). The second beat in the sample is accompanied by a static-like click (clipping) that is considerably quieter in the compressed version. More samples that are “difficult” for the MP3 format can be found on the Lame project page (I found the “Castanets” sample especially revealing).
  4. What about higher bitrates? As I increased the bitrate, the differences that were barely audible at 128 kbps became inaudible and the differences that were obvious became less obvious.
    • At 192 kbps, the Bowie and Yorke samples were still too much of a challenge and I was able to reliably tell the MP3 from the original, though with much less confidence and with more going back and forth between the two versions.
    • At 256 kbps (the highest bitrate I tested), I was not able to identify the MP3 version reliably — my ABX results were 7/10, 6/10 and 6/7, which can be put down to chance.

Caveats

Obviously, the results I got apply to my particular situation. If you have better equipment or better hearing, it is perfectly possible that you will be able to identify 256 kbps MP3s in a blind test. Conversely, if your equipment and/or hearing is worse, 192 kbps or even 128 kbps MP3s may sound transparent to you, even on “difficult” samples.

Test setup

  • Lame MP3 encoder version 3.98.2. I used Joint Stereo, High Quality, and average bitrate encoding (ABR).
  • Foobar2000 player with ABX plugin. I used ReplayGain to equalize the volume between the MP3 and the original file — otherwise I found it too easy to tell the difference in ABX tests, since MP3 encoding seems to change the volume of the track somewhat.
  • Auzentech X-Meridian 7.1 — a well-respected audiophile-quality sound card with upgraded LM4562 op-amps.
  • RealCable copper jack-RCA interconnect.
  • Denon PMA-350SE — an entry-level audiophile amplifier designed in England.
  • Sennheiser HD 25-1 II, top-of-the-line closed headphones with stock steel cable.

When I write that there was an audible difference in an ABX test, I mean that I got 7/7 or 9/10 correct answers without repeating the test.

Conclusions

If my goal was to use an MP3 bitrate that is indistinguishable from the original in a blind listening test, I would use 256 kbps, since that is the bitrate which I was unable to identify in a reliable way, despite repeated attempts on a variety of samples (including the “difficult” samples posted on the Lame website).

Whether I will actually standardize on 256 kbps, I’m not sure. The fact that a 192 kbps MP3 can be distinguished from the original in a contrived test (good equipment, quiet environment, high listener concentration, specially selected samples) does not mean it is unsuitable for real-world scenarios. Sure, at 192 kbps the music is not always identical to the original, but judging by my experiments, the difference affects less than 1% of my music (in a 100-second sample, more than 99 seconds would probably be transparent). Even if all I did was listen to this tiny proportion of my music, I would be in a position to perceive the difference less than 1% of the time (what percent of the time do I listen to music in a quiet environment? what percent of the time am I really focused on the music as opposed to other things I’m doing?). Besides, there is the rarely-posed question of whether “different” necessarily means “inferior” — it is quite possible that subtle compression artifacts might actually improve the perceived quality of music in some cases.


Review of the Dell 2209WA 22″ LCD monitor

So, the Eizo S2231W went back to the store (kudos to Proline.pl for a problem-free returns policy) and my search for a new 22″ LCD continued. After some more Web research, my attention turned to the recently released Dell 2209WA. According to the well-informed posters at hardforum.com, this display is supposed to be a big deal for three reasons:

  • It is a 22″ panel that doesn’t use cheap TN technology. Until now, the only non-TN 22″ displays available were the Eizo S2231W, the Lenovo L220x (which has ultra-tiny pixels), and the HP LP2275W.
  • It uses an IPS panel. IPS panels are currently considered better than TN and PVA/MVA panels. They are supposed to have better color fidelity and better viewing angles than other panel types.
  • It only costs $380 (in Poland), half the price of the Eizo S2231W!

Spurred by rave user reviews, I quickly found a store that carries these monitors in my city and made sure that the store would accept a return if I didn’t like the screen for whatever reason. Having spent the equivalent of $750 on the Eizo a few days before, I almost felt like I was getting the screen for free this time!

My expectations were pretty high. Everyone and their mother says that IPS panels are better than S-PVA, so despite the budget price of the Dell 2209WA, I expected the image quality to be close to that of the Eizo S2231W. I was concerned about quality-control issues like uneven backlighting or bad pixels. But what was really on my mind was text clarity: would the Dell be as horrible as the S2231W? I hoped it would at least be in the same league as my trusted Eizo S1910.

Text quality

As soon as I unpacked the Dell 2209WA, I placed it next to my S1910 to check how the text looked. I could barely believe my eyes. Not only was the Dell in the same league as my old display — it was actually better! All the letters looked sharper even after I dialed down the sharpening control (the default setting of 50 amounts to subtle sharpening; to disable sharpening altogether, you need to set it at 40).

The explanation for the difference in favor of the Dell 2209WA turned out to be quite simple. The subpixels on the S1910 are not rectangular. Their shape looks a bit like this: “>”.  On the other hand, the Dell 2209WA has rectangular subpixels, which are better suited for the subpixel rendering used by ClearType.
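
For the curious, here is a toy Python sketch of the idea behind subpixel rendering (ClearType’s real algorithm adds color filtering to suppress fringes, which I omit): each pixel’s R, G and B stripes are treated as three separately addressable columns, which only works cleanly when the stripes are plain vertical rectangles.

import numpy as np

def subpixel_downsample(coverage):
    """Map a glyph rasterized at 3x horizontal resolution onto RGB stripes.
    Each output pixel takes its R, G and B values from three adjacent
    columns of the coverage map (no color filtering, unlike real ClearType)."""
    height, width3 = coverage.shape
    assert width3 % 3 == 0, "width must be a multiple of 3"
    rgb = coverage.reshape(height, width3 // 3, 3)  # group columns into R, G, B triples
    return 1.0 - rgb                                # ink darkens a subpixel on a white background

# Toy one-row "glyph": a stem covering the G and B stripes of the first pixel.
coverage = np.array([[0, 1, 1, 0, 0, 0]], dtype=float)
print(subpixel_downsample(coverage))  # first pixel keeps only its red stripe lit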

Here is a photo which compares the pixel structure of the Eizo S1910 with the Dell 2209WA (sorry for the slightly out-of-focus S1910 photo):

ClearType text on Eizo S1910 and Dell 2209WA

Image quality

Now that the text sharpness concern is out of the way, let’s take a look at other issues:

  • Color reproduction: Side-by-side comparison with the Eizo S1910 shows that the Dell 2209WA displays slightly less saturated reds. The subjective color quality when viewing photographs in full-screen mode can only be described as very good.
  • Black level: Compared with the Eizo monitors that I have owned or tested, the black level is quite poor. Dark photographs and dark scenes in movies and games appear flooded with a dark shade of grey, which makes for a washed out, bland image. Eizo monitors, while incapable of displaying true blacks, still manage a good contrast ratio when showing dark images. The Eizos show a clear difference between e.g. RGB (0,0,0) and RGB (2,2,2), so the picture retains a lot of “punch”. On the Dell, the near-black shades are still distinguishable from each other; however, the perceived contrast is lower. From my point of view, the difference between the Dell 2209WA and an Eizo (S-)PVA screen is so large that I think I would rather watch a movie on a smaller 4:3 screen with a good black level than on a 16:10 screen with a black level of the 2209WA. Perhaps I will get used to the lower quality over time, but it’s certainly a big step backwards. (If you used a TN panel before, you will likely consider it a step forward. It’s all relative.) [Added Nov 2009: I have to confess I watch movies on the Dell most of the time, despite the poor black level. It still bothers me, but it turns out I’m too lazy to move my displays around every time I want to watch a movie. Plus I like the bigger size.]
  • Brightness: The display is definitely too bright. Most hardforum.com users seem to set the brightness at 15 out of 100; I set it at zero. Even at 0 brightness, I find the monitor unusable in an unlit room at night — I have to put an Ikea Grönö lamp with a 40-watt-equivalent bulb right behind it. This is a perfect example of unethical marketing trickery. Pumping up the brightness allows Dell to advertise an impressive contrast ratio, which looks good to uninformed consumers but means nothing in terms of picture quality (see the small arithmetic sketch after this list). The real quality of an LCD is measured by its black level. If the black level is good, you don’t need dazzling, eye-burning whites to maintain a high contrast ratio. In fact, a critical user will quickly find two things: (1) bright whites are very hard on the eyes, and (2) the “turbo-brightness” trick does not work anyway when dark images are being displayed.
  • “Wet screen” / Sparkle effect: Visible if you take a close look. Any more of it and it would be a problem for me. I don’t understand why manufacturers can no longer make a proper anti-glare coating like the one on my Eizo S1910 (made in 2005), which displays perfectly matte solid-color areas that don’t sparkle as if they were covered in hair gel.
  • Color shifting: I expected my Eizo S2231W to have the infamous “Rainbow White” effect. It didn’t have it. But the Dell 2209WA seems to be a textbook example of it. The left side of the screen has a greenish tint, whereas the right side is slightly reddish. The effect was present on both the units that I tested. I don’t think it will be too much of a problem for me, but if you are sensitive to hue shifting, be warned. [A week later: I can just barely notice the color shifting, even on grey areas taking up the entire screen. I think it was more noticeable when I first tested it. Did the monitor “burn in” or something? Anyway, it didn’t bother me before, it bothers me even less now.]
  • White glow / silvery shimmer: Along with the poor black level, this is a major problem with the Dell 2209WA. As I’m writing this, I’m smiling at all the people on the Internet who have written that IPS panels have better viewing angles than PVA panels. Sure, the colors don’t change as you change the viewing angle; instead, all the dark areas of the screen start to glow. For example, I normally sit about 70 centimeters from the screen. When playing a game with dark areas (e.g. BioShock or Mass Effect), I can always see bright, shimmering patches in the lower left and lower right corners of the screen. (That’s the best-case scenario after adjusting the monitor tilt.)  The effect goes away only when I sit 1 meter from the monitor, but then I have to be looking at the exact center of the panel. If I move my head 3 centimeters to the left or right, the white glow reappears. When the screen is completely black, the white glow is impossible to eliminate completely, unless I sit 1.5 meters or farther from the screen. Considering the fact that the blacks on the 2209WA are pretty grey anyway, this is a screen that has serious problems displaying dark images.
    Angle-dependent white glow on the Dell 2209WA (from a distance of about 70 cm)

  • Backlight uniformity: There is a small bright patch adjacent to the top edge of the screen, about 13 cm from the right edge. I had my unit replaced because of this. The replacement unit has a slightly less visible patch in exactly the same location. I will not be replacing this unit, as it looks like this problem is present in the entire batch. Besides, this minor glitch is completely overpowered by the angle-dependent white glow described above.
  • Bad pixels: No bad pixels on either of the units I’ve tested. Pretty good.
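
As promised above, a back-of-the-envelope illustration of why an advertised contrast ratio says little by itself. The figures are invented for the example, not measurements of any monitor mentioned here:

def contrast_ratio(white_cd_m2, black_cd_m2):
    """Contrast ratio is simply peak white luminance divided by black luminance."""
    return white_cd_m2 / black_cd_m2

# Invented figures:
print(contrast_ratio(400, 0.40))  # 1000:1 with an eye-searing backlight and washed-out blacks
print(contrast_ratio(120, 0.12))  # the same 1000:1 at a comfortable brightness
print(contrast_ratio(120, 0.06))  # 2000:1: a genuinely deeper black, no dazzling whites required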

Ergonomics

  • Heat / power issues: Internet wisdom says IPS panels consume more power than PVA panels and the Dell 2209WA seems to confirm this. The back of the monitor can get quite hot in normal operation — in contrast, the back of my other display (Eizo S1910) gets barely warm. On a warm summer day, the Dell can really add to the temperature in my small room. This could be the result of the absurdly bright backlight that Dell used in this display (see above).
  • Noise: This display is quite noisy. There is a pretty annoying high-pitched whine (1) right after you turn it on (for at least a few minutes), and (2) when it is displaying resolutions and refresh rates other than the native 1680 x 1050 @ 60 Hz. For example, there is a loud whine during the BIOS POST sequence, which uses character mode. During normal work in the native resolution and refresh rate, I can only hear a slight buzzing inverter noise at a distance of about 70 cm. This noise is not particularly annoying, but I can hear the buzzing go away when I turn off the monitor. This is in a quiet room at 2 am, when the loudest sound is the sound of my (quiet) hard drive and the near-silent fans in my PC (I’m a bit of a silent-PC enthusiast).  While at other times of day it is a complete non-issue, I sure don’t like the fact that my new monitor is a source of noise in my system. If you use stock cooling in your PC, don’t worry about the noise — you probably won’t hear it, as long as you stick to the native resolution.
  • Startup time: The backlight takes a long time to reach its proper brightness. When I turn on the Dell and my Eizo at the same time, the Dell is initially much darker than the Eizo and takes about 20 minutes to “catch up”.
  • Response time: Subjectively better than my old Eizo, but not by much.
  • 75 Hz refresh rate: The 2209WA supports a 75 Hz refresh rate, which means that it can display 25% more frames per second than a standard 60 Hz LCD. This makes for much more fluid motion in videogames. It really makes a difference in almost every game. However, turning on 75 Hz is tricky. You need to add a custom display mode in the nVidia control panel (if you have an nVidia card) or in Powerstrip (if you have an ATI card). Besides, when I set my monitor to anything over 60 Hz, the usual power supply noise it emits becomes much louder and turns into a high-pitched whine. It is quieter at some refresh rates than at others (in my case 73 Hz was relatively quiet), but it never completely goes away. In the end, I have decided to stick to 60 Hz.
  • Input lag: I did not notice anything troubling.
  • Anti-glare coating: Apart from the slight sparkle mentioned before, the anti-glare coating reflects a bit too much light. When I put it side-by-side with my Eizo, it is obvious that the Eizo is “more black”.

Summary

  • Text/office work: Very good. Text is very crisp, though brightness needs to be set at close to zero for comfortable work. If you like working with the lights out, I would recommend putting a 40 watt lamp right behind the monitor. You might also need to dial down the contrast, sacrificing color quality.
  • Photo viewing/editing: Satisfactory. Colors are well-reproduced and photos look basically the same as on much more expensive Eizo FlexScan LCDs. However, dark images look worse due to the combination of the mediocre black level and the white glow effect (you have to sit at a distance of 1 m or more to eliminate the latter defect).
  • Movie viewing/editing: Poor. Movies have a lot of dark, moody scenes, which look washed out on the 2209WA, even if you sit far away from the screen and position the monitor carefully to avoid the white glow effect. I’m afraid grey isn’t the new black…
  • Gaming: Mixed bag. Shooters and RPGs often use shadows to create atmosphere. On this monitor, these scenes have little depth. Brighter, more cheerful games look fine. On the other hand, the possibility of using a 75 Hz refresh rate (despite its relative noisiness) means that this monitor can display more fluid motion than typical LCDs.

My search for a 22-inch widescreen LCD that would match my 19″ Eizo S1910 has turned out to be a disappointment. Four years after I bought that display, there appears to be no 22″ monitor that is good at everything: text work, photo editing, movies and games.

That said, I will be keeping the Dell 2209WA. Why? First, I’m running out of options. The only non-TN 22″ that I have not yet tested is the HP LP2275W, but since it is equipped with an S-PVA panel, I expect it to have the same ClearType defect that made me return the Eizo S2231W.

Second, if I have to choose between a monitor that is suitable only for text work and a monitor that is only good for photos, movies and games (Eizo S2231W), I’m going to choose the former. After all, the amount of time I spend reading websites and working with documents far outweighs the combined time I spend working with Photoshop, watching movies or gaming. Last but not least, the Dell’s low price more than makes up for its shortcomings. If you cannot get a monitor that does it all, at least get an inexpensive monitor that satisfies most of your needs.
