Trying To Be Helpful
Things I’ve learned, published for the public benefit

Entries categorized as 'Technology'

Review of the Dell U2415 LCD monitor

It was 7 years ago that I last bought a display for my desktop PC. The display I picked then – the Dell 2209WA – is still kicking, although it has begun to show signs of advanced age. The CCFL backlight has yellowed and dimmed, and the panel now has a weird dark smear in its left half. It is a display that doesn’t believe in climate change, happily guzzling 50 watts of power and giving off enough heat to make the entire backplate hot to the touch.

So far, my display strategy has been to hold off on an upgrade until something really great comes along. “Maybe this year we’ll finally get affordable OLED displays? Maybe this is the year that Windows gets proper support for Retina screens?”, I would think, every year – until I got tired of waiting and decided to order the Dell U2415, a display that Wirecutter has singled out as the best 24″ monitor. I admit one of my reasons was curiosity – I wanted to see how much display technology has improved in 7 years. With so many commenters gushing over the picture quality on the U2415, maybe I was missing out?



Technicolor TC7200 router freezes under load – solution

I have a Technicolor TC7200 cable modem/router which was forced on me by my cable provider (UPC). After one of the automatic firmware updates last year, I started having intermittent problems with stability. The router would suddenly “freeze” with the following symptoms:

  • No websites can be opened
  • DNS requests fail
  • Existing connections keep working (all downloads started before the freeze just keep going)
  • Other machines connected to the same router work fine
  • If you do nothing, connectivity will come back after a while (10 minutes?)
  • Right-clicking the network adapter in Windows and choosing Diagnose successfully restores connectivity, even though Windows reports no issues.

The issue would mostly arise under heavy load. Whenever I opened a lot of new connections (for example, many concurrent downloads, streaming video in several tabs, multiple torrents downloading), I could be reasonably sure that a freeze would occur within 1-20 minutes. It would also occur 1-2 times a day regardless of the load.

Solution

The Technicolor TC7200 does not work properly if you change its default IP address. I had changed the router’s IP address from the default (192.168.0.1) to 192.168.1.1 because that was the address of my previous router and I wanted to spare myself some reconfiguration.

Changing the address of the router back to 192.168.0.1 (and moving all the devices on my LAN back to the 192.168.0.xxx subnet) has completely eliminated the issue. For the past month or so, I’ve had zero freezes despite my attempts to trigger one. Just to be sure, I briefly changed the subnet to 192.168.1.xxx, and – you guessed it – the issue came back.

Polish version

Router Technicolor TC7200 instalowany standardowo przez UPC nie obsługuje prawidłowo sieci lokalnych, których podsieć jest inna niż domyślna (192.168.0.xxx). W przypadku ustawienia w panelu administracyjnym adresu routera innego niż 192.168.0.1 (czyli np. 192.168.1.1), urządzenie zacznie od czasu do czasu się “zawieszać”, a dokładniej blokować możliwość nawiązywania nowych połączeń z danego komputera (wygląda to tak, że nie otwierają się nowe strony, nie działa DNS). Charakterystyczne jest to, że dotychczasowe połączenia trwają nadal (czyli np. pobieranie pliku rozpoczęte przed “zwisem” będzie kontynuowane) oraz że inne urządzenia w tym samym LAN-ie działają w tym czasie bez zarzutu. Problem pojawia się zwłaszcza pod obciążeniem – w przypadku otwarcia wielu połączeń, pobierania/wysyłania dużej ilości danych. Po powrocie do domyślnej podsieci (192.168.0.xxx) problem znika.

Więcej szczegółów powyżej w wersji angielskiej.


Sublime Text 3 doesn’t play well with Equalizer APO and other applications that monitor file changes

I just spent a day investigating a really baffling bug that occurred when I installed Equalizer APO on a new Windows machine. (Equalizer APO is a free equalizer that plugs into the Windows Audio subsystem, which lets you, among other things, correct the acoustic flaws of your room to dramatically improve the quality of all audio playing on your computer. Maybe I’ll write a post about it some day — it’s awesome.)

Equalizer APO uses a plain text config file (config.txt) that it constantly monitors for changes using a mechanism provided by Windows. Normally, you can hear the sound playing on your PC change within a few seconds of saving the file. Soon after installing Equalizer APO on a fresh Windows installation, I noticed that the changes I was making in config.txt were often being ignored. For example, the first two changes after reboot might get applied, but subsequent changes would be ignored. I tried fiddling around with folder permissions and moving the config.txt file to various locations on the hard drive, which seemed to fix the issue for some time (usually until the next reboot).

The weirdest thing was that whenever I changed config.txt, Equalizer APO wrote the following message to its log:

Error while reading configuration file:
The system cannot find the file specified.

This did not make any sense at all. How could Equalizer APO not find a file that was clearly there? And why was it unable to find the file only some of the time? After spending an hour studying the source code for Equalizer APO, I grew convinced that the only possible explanation was a bug in Equalizer APO that somehow blocked access to the config.txt file (after all, weird contention issues are not unheard of in multithreaded apps), combined with an obscure Windows bug that made CreateFile() report a sharing violation as a missing file.

After submitting a lengthy and detailed bug report to the author of Equalizer APO, I accidentally opened config.txt in Notepad instead of Sublime Text 3, my go-to text editor…

Impossible. The problem was gone. I could edit the file as much as I wanted, and every change was applied. Back to Sublime Text 3 — it stopped working. I tried opening the file in Komodo Edit — it worked just like it did in Notepad.

A-ha! Clearly Sublime Text 3 was doing something weird with the file. Could it somehow be hiding the file from Equalizer APO?

It turns out, when you save a file in Sublime Text 3, in its default configuration, it doesn’t simply overwrite it like all other editors. Instead, it does the following:

  1. Write the modified text into a temporary file.
  2. Delete the original file.
  3. Rename the temporary file so that it looks like the original file.

At the exact moment when ST3 deleted the original file, Windows would notify Equalizer APO about the “change” and cause it to re-read the file. If the read operation was quick enough (which would have depended on things like the overall disk load), Equalizer APO would find the file missing.

Why does Sublime Text 3 save your files in such a weird way? It’s supposed to be a safety feature. If ST3 simply overwrote the original file and something really bad happened during the overwrite, you could lose data. Making a temporary file first guarantees that you will always be able to get your data back.

However, this roundabout way of modifying files can cause problems with software that monitors file changes. I’m not just talking about the scenario that gave me the headache which occasioned this post, but about other scenarios as well. For example, there are backup and versioning apps which monitor filesystem changes. To such an app, a save operation in ST3 will look like a file got deleted and a new file got created, which may ruin the association between the current version of the file and its earlier versions. For real-life reports of problems like that, see the previously linked thread on the ST forum and this StackOverflow question.

According to the above sources, the “atomic save” feature can be disabled in ST3 by editing user preferences, but I could not get it to work (in build 3065). In the end I simply downgraded to ST2.
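For reference, the setting described in those sources is a single entry in the Sublime Text user preferences file (named atomic_save in the linked discussions):

```json
{
    "atomic_save": false
}
```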


Web Audio API – things I learned the hard way

Firefox recently dropped support for the Mozilla Audio Data API, so I started porting my two audio-enabled Web apps (Plasticity and Online Tone Generator) to the Web Audio API, which is the W3C-blessed standard way to work with audio in the browser.

In the process, I ran into a few problems, which I thought I’d share here for the benefit of other developers making their first steps with the Web Audio API.

AudioBufferSourceNodes and OscillatorNodes are single-use entities

Suppose we want to generate two 440 Hz tones, each with a duration of 1 second, separated with a 1-second pause. This is not the way to do it:

oscillator = context.createOscillator();
oscillator.frequency.value = 440;
oscillator.connect(context.destination);
currentTime = context.currentTime;
oscillator.start(currentTime);
oscillator.stop(currentTime + 1); //stop after 1 second
oscillator.start(currentTime + 2); //resume after 2 seconds
oscillator.stop(currentTime + 3); //stop again after 3 seconds

What’s wrong? We cannot call .start() on an OscillatorNode or AudioBufferSourceNode more than once. The second call in the above code will result in an error. Both OscillatorNodes (which are used to generate simple tones) and AudioBufferSourceNodes (which are used to play back short samples like sound effects) are meant to be thrown away after each use.

Instead, we should create a separate node for every time we want to play a sound. Every time, we must also connect it to the audio graph:

oscillator = context.createOscillator();
oscillator.frequency.value = 440;
oscillator.connect(context.destination);
currentTime = context.currentTime;
oscillator.start(currentTime);
oscillator.stop(currentTime + 1);

oscillator2 = context.createOscillator(); //create 2nd oscillator
oscillator2.frequency.value = 440;
oscillator2.connect(context.destination);
oscillator2.start(currentTime + 2);
oscillator2.stop(currentTime + 3);
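Since every sound needs this create-connect-schedule boilerplate, it is convenient to wrap it in a small helper that makes a fresh node per call (a sketch; `playTone` is my own name, not part of the Web Audio API):

```javascript
// Create, connect and schedule a throwaway OscillatorNode per call
// (playTone is an illustrative helper, not part of the Web Audio API):
function playTone(context, frequency, startTime, duration) {
  var osc = context.createOscillator(); // fresh single-use node every time
  osc.frequency.value = frequency;
  osc.connect(context.destination);
  osc.start(startTime);
  osc.stop(startTime + duration);
  return osc;
}

// Two 1-second 440 Hz tones separated by a 1-second pause:
// playTone(context, 440, context.currentTime, 1);
// playTone(context, 440, context.currentTime + 2, 1);
```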

ChannelMergerNode inputs don’t map to channels

Warning: This behavior will be fixed in the next Working Draft of the W3C spec; the text below will become obsolete once browsers implement the new spec.

What do you do when you have two mono sources – like OscillatorNodes, which are always mono, or AudioBufferSourceNodes connected to a mono buffer – and you want to mix them into a stereo signal, for example, play one sample in the left channel, and the other in the right? You use a ChannelMergerNode.

A ChannelMergerNode has a number of inputs, but only one output. It takes the input audio signals and mixes them into a single multichannel signal. Sounds pretty simple, but it’s easy to fall into the trap of assuming that inputs correspond to channels in the output signal. For example, take a look at the following code, which tries to play a tone on the right channel only:

oscillatorR = context.createOscillator();
oscillatorR.frequency.value = 440;
mergerNode = context.createChannelMerger(2);
//create mergerNode with 2 inputs
mergerNode.connect(context.destination);

oscillatorR.connect(mergerNode, 0, 1);
//connect output #0 of the oscillator to input #1 of the mergerNode
//we're leaving input #0 of the mergerNode empty
currentTime = context.currentTime;
oscillatorR.start(currentTime);
oscillatorR.stop(currentTime + 2);

The result of running this code is a tone playing in both channels at the same time. Why? Because inputs of a ChannelMergerNode do not map to channels in the output signal. If an input is not connected, ChannelMergerNode will ignore it. In this case, the first input (numbered 0) is not connected. The only connected input is #1, and it has a mono signal. ChannelMerger merges all the channels on all the connected inputs into a single output. Here, it receives only a single mono signal, so it will output a mono signal, which you will hear coming from both speakers, as you always do with mono sounds.

The right way to have a sound playing only in one channel is to create a “dummy” source node and connect it to the ChannelMergerNode:

context = makeAudioContext();
oscillatorR = context.createOscillator();
oscillatorR.frequency.value = 440;
mergerNode = context.createChannelMerger(2); //create mergerNode with 2 inputs
mergerNode.connect(context.destination);

silence = context.createBufferSource();
silence.connect(mergerNode, 0, 0);
//connect dummy source to input #0 of the mergerNode
oscillatorR.connect(mergerNode, 0, 1);
//connect output #0 of the oscillator to input #1 of the mergerNode
currentTime = context.currentTime;
oscillatorR.start(currentTime);
oscillatorR.stop(currentTime + 2);

You create a silence node by creating an AudioBufferSourceNode, just like you would for any sample, and then not initializing the buffer property. The W3C spec guarantees that this produces a single channel of silence. (As of April 2014, this works in Chrome, but does not work in Firefox 28. In Firefox, the input is ignored and the result is the tone playing on both channels.)

Update: I’m happy to say that my feedback to the W3C Audio Working Group has resulted in changes to the spec. Per the latest Editor’s Draft, ChannelMergerNodes accept up to 6 mono inputs (other types of inputs will be downmixed to mono), with empty inputs treated as silence rather than omitted. Now the changes have to be published in the next Working Draft, and then the browsers have to implement them.

Unused nodes get removed automatically

You might think that if you want to have two sounds playing in different channels – one sound in left, another in right – you don’t need to create dummy nodes. After all, the ChannelMergerNode will have two input channels.

In the code below, we want to play a 440 Hz tone in the left channel for 2 seconds, and a 2400 Hz tone in the right channel for 4 seconds. Both tones start at the same time.

oscillatorL = context.createOscillator();
oscillatorL.frequency.value = 440;
oscillatorR = context.createOscillator();
oscillatorR.frequency.value = 2400;
mergerNode = context.createChannelMerger(2); //create mergerNode with 2 inputs
mergerNode.connect(context.destination);

oscillatorL.connect(mergerNode, 0, 0);
//connect output #0 of the oscillator to input #0 of the mergerNode
oscillatorR.connect(mergerNode, 0, 1);
//connect output #0 of the oscillator to input #1 of the mergerNode
currentTime = context.currentTime;
oscillatorL.start(currentTime);
oscillatorL.stop(currentTime + 2); //stop "left" tone after 2 s
oscillatorR.start(currentTime);
oscillatorR.stop(currentTime + 4); //stop "right" tone after 4 s

This code works as expected for the first 2 seconds – each tone is audible only on one channel. But then the left tone stops playing, and the right tone starts playing on both channels. What’s going on?

  1. When oscillatorL stops playing, it gets disconnected from mergerNode and deleted. The browser is allowed to do this because – as you recall – an OscillatorNode or AudioBufferSourceNode can only be used once, so after we call oscillatorL.stop(), oscillatorL becomes unusable.
  2. The ChannelMergerNode notices that it is left with only one channel of input, and starts outputting a mono signal.

As you can see, the most stable solution, if you want to access individual audio channels, is to always have a dummy node (or several, if you’re dealing with 5.1 or 7.1 audio) connected to your ChannelMergerNode. What’s more, it’s probably best to make sure the dummy nodes remain referenceable for as long as you need them. If you assign them to local variables in a function, and that function returns, the browser may remove those nodes from the audio graph:

function playRight()
{
    var oscillatorR = context.createOscillator();
    oscillatorR.frequency.value = 440;
    var mergerNode = context.createChannelMerger(2);
    mergerNode.connect(context.destination);
    var silence = context.createBufferSource();
    silence.connect(mergerNode, 0, 0);
    oscillatorR.connect(mergerNode, 0, 1);
    currentTime = context.currentTime;
    oscillatorR.start(currentTime);
    oscillatorR.stop(currentTime + 2);    
}

playRight();

Consider what happens when playRight() finishes. oscillatorR won’t get removed because it’s playing (scheduled to stop in 2 seconds). But the silence node is not doing anything, and when the function exits it won’t be referenceable, so the browser might decide to get rid of it. This would, of course, switch the output of the ChannelMergerNode into mono mode.

It’s worth noting that the above code currently works in Chrome, but it might not in the future. The W3C spec gives browsers a lot of leeway when it comes to removing AudioNodes:

An implementation may choose any method to avoid unnecessary resource usage and unbounded memory growth of unused/finished nodes. (source)
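A defensive pattern, then, is to hold explicit references to the dummy nodes outside the function, for at least as long as the sound plays (a sketch; `liveNodes` and `playRightSafe` are my own names):

```javascript
var liveNodes = []; // module-level: keeps nodes reachable after the function returns

function playRightSafe(context, durationSec) {
  var oscillatorR = context.createOscillator();
  oscillatorR.frequency.value = 440;
  var mergerNode = context.createChannelMerger(2);
  mergerNode.connect(context.destination);
  var silence = context.createBufferSource(); // dummy mono source for input #0
  silence.connect(mergerNode, 0, 0);
  oscillatorR.connect(mergerNode, 0, 1);
  var t = context.currentTime;
  oscillatorR.start(t);
  oscillatorR.stop(t + durationSec);
  liveNodes.push(silence, mergerNode); // prevent the browser from collecting them
}
// Once the sound has finished, the entries can be removed from liveNodes.
```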

In Firefox, you cannot modify an AudioBuffer after you’ve assigned it to AudioBufferSourceNode.buffer

The following code attempts to generate and play a 440 Hz tone over a single channel:

SAMPLE_RATE = 44100;
buffer = context.createBuffer(1, SAMPLE_RATE*2, SAMPLE_RATE);
//create a mono 44.1 kHz buffer, 2 seconds long
bufferSource = context.createBufferSource();
bufferSource.buffer = buffer;

soundData = buffer.getChannelData(0);
for (var i = 0; i < soundData.length; i++)
  soundData[i] = Math.sin(2*Math.PI*i*440/SAMPLE_RATE);
bufferSource.connect(context.destination);
bufferSource.start(0);

It works in Chrome, but fails in Firefox (28) without any error message. Why? The moment you assign buffer to bufferSource.buffer, Firefox makes the buffer immutable. Further attempts to write to the buffer are ignored.

This behavior is not covered by the W3C spec (at least I couldn’t find any relevant passage), but here’s a Mozilla dev explaining it on StackOverflow.
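The fix that follows from this is simply to fill the buffer before assigning it to the source node. Here is the same code reordered (wrapped in a function so the context can be passed in; `playSine` is my own name):

```javascript
// Reordered version: write to the buffer BEFORE assigning it to .buffer
// (playSine is an illustrative name; pass in your AudioContext):
function playSine(context) {
  var SAMPLE_RATE = 44100;
  var buffer = context.createBuffer(1, SAMPLE_RATE * 2, SAMPLE_RATE);

  var soundData = buffer.getChannelData(0); // fill the buffer first...
  for (var i = 0; i < soundData.length; i++)
    soundData[i] = Math.sin(2 * Math.PI * i * 440 / SAMPLE_RATE);

  var bufferSource = context.createBufferSource();
  bufferSource.buffer = buffer;             // ...then hand it to the source node
  bufferSource.connect(context.destination);
  bufferSource.start(0);
  return bufferSource;
}
```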


Windows bug: AltGr key stops working after you open a “Save As” or “Open File” window

While working on the next release of my Windows app for typing foreign characters and phonetic symbols, I stumbled on a pretty serious bug that affects the use of multiple keyboard layouts on Windows Vista, Windows 7 and (to a lesser extent) Windows 8.

What the bug looks like to the end user

Suppose your default Windows keyboard layout is US English, but you also want to use other keyboard layouts occasionally – maybe you’re an American learning French or a Pole who lives in the US, but wants to write in Polish every now and then. The Language Bar in Windows allows you to set a separate layout for each window, so let’s say you fire up Microsoft Word and change your layout to Polish.

Quick note: The Polish keyboard layout, like most non-English keyboard layouts, makes use of the right-hand Alt key, also known as “AltGr”, which is short for “Alt Graph” (alternate graphic). For example, to type the letter ł (pronounced like English w), you press AltGr+L.

After typing for a while, you decide to save your document. You bring up the Save As dialog box, type in a filename, and press OK. When you start typing again, you notice that you can no longer type any AltGr characters.

What’s going on? The act of opening the default Windows “Save As” (or “Open File”) dialog disables the AltGr key. The AltGr characters are not accessible in the dialog (you cannot use them in file names). More seriously, once you close the dialog, AltGr will remain broken in your application, even though the Language Bar will report that you are using the correct layout. At first, I thought it was lying, but it’s not – technically, the layout is still in effect. It’s just that the AltGr key is no longer working; it becomes a regular Alt key.

The same happens if your default Windows layout uses AltGr and you set just one window to a non-AltGr layout such as US English. If you open the “Save As” or “Open File” dialog, the right Alt key will turn into AltGr. This means, for example, that hitting right-Alt+F will no longer bring up the File menu.

Which versions of Windows are affected?

I’ve reproduced this bug on Windows Vista, Windows 7, and Windows 8. Windows XP seems immune.

Note that the default setting in Windows 8 is to have a single keyboard layout for all applications. This bug will affect you only if you specifically go to the language settings and choose the option to enable per-app keyboard layouts.


The right way to stress-test an overclocked PC

Consider the following situation: You’ve overclocked your CPU, set the core voltage (Vcore) to some reasonable number, and then it’s time for some stress testing to make sure your rig is stable. You follow all the standard recommendations that you can find on overclocking forums: you run Prime95 for 12-24 hours, do a few hours of FurMark, and a few hours of IntelBurnTest for good measure. All the tests complete without a hitch. Congratulations! Your system is now considered stable.

But then you run Crysis or Battlefield 3 and you get random lockups and reboots. What’s going on?

The problem is that you stress-tested the CPU and GPU separately. That doesn’t guarantee that your system will be stable when both the CPU and the GPU are under load. Interestingly, loading the GPU can make your CPU unstable!

In other words, to be stable under combined CPU+GPU loads, your CPU may need a higher Vcore than it does for isolated CPU loads. That’s why the Vcore you arrived at using Prime95 or IBT may be too low to guarantee CPU stability when the GPU is under stress.

Here’s the story of how I realized it:

I had overclocked an i5 3570K to 4.3 GHz at 1.25 Vcore. The system had successfully completed the following stress tests:

  1. Prime95 Blend (all cores) for 14 hours
  2. Prime95 Small FFT (all cores) for 8.5 hours
  3. FurMark for 3 hours
  4. IntelBurnTest on Standard for 2 hours
  5. IntelBurnTest on High for 1 hour

Most overclockers would agree that these results look rock-solid. However, when I ran IntelBurnTest and FurMark simultaneously, I was shocked to see FurMark fail almost immediately. I repeated the experiment several times and the time to failure was always between 5 and 30 minutes. I started coming up with hypotheses:

Hypothesis #1: The PSU can’t supply enough power for the combined CPU+GPU load. The Corsair HX520W supplies 480 watts on the 12V line; an overclocked 3570K + HD7850 shouldn’t draw more than 300 W, but there’s also the motherboard, and the PSU is kind of old – who knows, maybe the capacitors have aged?

Refutation: After replacing the PSU with the Be Quiet! Straight Power E9 580 W (564 W on the 12 V line), the symptoms were exactly the same. It wasn’t a PSU problem!

Hypothesis #2: The increased heat production is causing the GPU, CPU, or video card VRMs to overheat.

Refutation: (1) GPU and CPU core temperatures were carefully monitored during stress testing and did not exceed 90°C. (2) During isolated CPU/GPU stress testing, CPU/GPU core temps were the same, yet there were no crashes. (3) With an open case and all fans set to maximum, the CPU+GPU test failed just as quickly. It was not a heat issue.

That left me with only one hypothesis.

Hypothesis #3: Loading the GPU is somehow causing the CPU to lose enough power to become unstable. In other words, the CPU Vcore is set high enough for isolated CPU load, but not high enough for combined CPU+GPU load. This was a novel hypothesis; I hadn’t seen it discussed in any of the overclocking resources that I had read.

Confirmation: In the words of Colonel Hans Landa, that’s a bingo! Upping the Vcore from 1.25 V to 1.275 V improved the stability by a large margin. Instead of failing after 5-30 minutes (as verified in multiple tests), IBT+FurMark failed after 108 minutes. (Notice that if the crashes had been due to overheating, the test would have failed more quickly than before, as increasing the CPU Vcore made the CPU hotter.)

Still, the system was not fully stable. To make it IBT+FurMark stable for several hours, I would probably have had to increase the Vcore to around 1.3 V. Incidentally, this is close to the Vcore that the CPU chooses automatically for a 4.3 GHz clock speed if you select the so-called “offset mode” instead of forcing a fixed Vcore. The lesson here would be that Intel knows what they’re doing when choosing these automatic settings.

The right way to ensure your overclock is fully stable

Run FurMark and IntelBurnTest at the same time for a few hours – at least as long as you expect to operate with combined CPU+GPU load. For example, if your typical gaming session is 3 hours, you should run IBT+FurMark for at least 3 hours. Of course more is better.

One tricky part of combined CPU+GPU stress testing is that FurMark needs some free CPU cycles in order to stress the GPU properly. If you fully load the CPU with IBT, FurMark will be bottlenecked and the GPU load will be much less than 100% – on my system it was around 75%.

The solution is to find the right settings for IntelBurnTest that will result in stressing the CPU while leaving just enough free CPU cycles to allow FurMark to get GPU load above 95%. In my case, the highest combined load was produced with IBT set to “High” and 3 threads.

Another piece of advice is to stick to “offset mode”, i.e. let the CPU set the Vcore automatically depending on the clock frequency. This may result in rather high voltages and high power draw (for example, a 3570K @ 4.3 GHz under full load gets a Vcore of 1.32 V, which is higher than usually reported by overclockers at this clock speed), but at least you’re using a Vcore that has been blessed (and presumably extensively tested) by Intel.

My personal choice was to scale back my overclock to 4.1 GHz and choose offset mode. This translates to a Vcore of only 1.192 volts under load and, as far as I can tell, complete stability (IBT+FurMark for 5 hours with no errors) – though of course you can never be 100% sure…

FAQ

Isn’t it overkill to stress-test the CPU and GPU at the same time? That depends. If you’re absolutely sure you’re never going to reach full or near-full CPU+GPU load, then I guess you can stick to standard, isolated stress testing. However, bear in mind that there are games which can place a very high load on the CPU and the GPU at the same time.

Are you saying that most overclocked systems out there are not truly stable? It would seem so. Most overclockers determine a “stable” CPU Vcore based on isolated CPU testing with something like Prime95 or IntelBurnTest. My experience shows that you need a much higher Vcore to ensure stability under combined CPU+GPU load. I suspect most OC’d systems would fail the IBT+FurMark test outlined here rather quickly; they might also crash when running certain games. Of course, my findings would be more trustworthy if someone replicated them – please post your experiences in the comments!

Why would loading the GPU make the CPU unstable? I’m no hardware engineer, but here’s a wild guess: Loading the GPU drains some of the power away from the CPU (or causes small voltage dips from time to time). Therefore, when the GPU is loaded, the CPU will need a higher Vcore to remain stable.

What was your exact setup?

  • Motherboard: Asus P8Z77-V Pro
  • CPU: Intel i5 3570K
  • CPU cooling: Noctua NH-U12P SE2, 120mm Enermax T.B. Silence fan
  • Video card: AMD Radeon HD7850 (MSI Twin Frozr III, 2 GB)
  • GPU cooling: Accelero S1 Plus, 120mm Enermax T.B. Silence fan
  • 16 GB RAM (two 8 GB sticks, Kingston HyperX)
  • SSD: Crucial M4 256 GB, HDD: Western Digital Red 2 TB (WD20EFRX)
  • Case: Fractal Design Define R4
  • Case cooling: 1 rear 140mm Fractal Design Silent Series R2 fan
  • PSUs tested: Corsair HX520W / Be Quiet! Straight Power E9 580W


In search of a quiet PSU

Something’s amiss with the current crop of power supply units (PSUs) for desktop PCs. It seems all of them whine, squeal, buzz, or emit some other type of electrical noise.

As I researched the purchase of a PSU for my new PC, I was surprised to notice a large number of user complaints about “coil whine”, an umbrella term that encompasses various kinds of noise given off by vibrating transformer coils. Annoyed users described purchasing a PSU only to find that their new component emitted an awful sound – for example, it buzzed when the CPU was under load, or squealed when the system was in standby mode.

I want to make a few things clear up front:

  • We’re not talking about el cheapo PSUs – these people bought high-quality midrange or high-end models from the most reputable vendors like Seasonic or Corsair.
  • We’re not talking about some sort of faint sound that is only audible if you put your ear to the PSU. We’re talking about noise that is clearly audible over quiet fans and hard drives, in a quiet room, from 1-5 meters away.
  • It does not look like these users were simply unlucky to receive faulty units. Many users exchanged their units, but their symptoms did not go away (though sometimes they were alleviated somewhat). In fact, after hours spent reading through forum threads and customer reviews, I can’t recall a single case where the replacement unit was completely quiet.

After reading all those troubling reports, I approached the purchase of a PSU with some trepidation. It appeared there was a significant chance that my new PSU would be noisy.

I crossed my fingers and ordered my first PSU:

Be Quiet! Dark Power Pro 10 650W

Vendor page

The Be Quiet! Dark Power Pro 10 is the top-end model from a relatively new German vendor. Be Quiet!’s slogan is “It’s not absolutely silent until it’s absolutely silent”. Their PSUs are manufactured by FSP, a well-known PSU manufacturer based in Taiwan. The Be Quiet! Dark Power Pro 10 has earned the Editor’s Choice award from Silent PC Review, which is the best online resource dedicated to silent computing. SPCR’s review describes the Dark Power Pro as “extraordinarily quiet”. I was unable to find any user reports about electrical noise, though it has to be noted that Be Quiet’s PSUs are not the most popular choice and are unavailable in many markets.

The Dark Power Pro 10 looks solid and well-made, and the 135-mm fan was totally inaudible in my system (which probably consumes something like 300 watts under full load). However, the unit buzzed under light and medium loads — for example, while booting Windows, scrolling the browser window, or stress-testing only one CPU core. The buzz was (barely) audible from 2 meters away in a closed, well-dampened case (Fractal Define R4). I would compare the sound to that of a fly caught inside the PSU, or the seek noise of a hard drive. There was no buzzing under full load or at idle.

Putting the CPU (i5 3570K) on fixed voltage dulled the sound a bit, as did disabling the C1E power saving mode, but the improvement was small. I tried 20 or 30 other BIOS settings related to CPU power management, but these had no discernible effect on the noise.

Since I do not enjoy listening to an insect-like sound every time I scroll a webpage, I returned the unit and got another one:

Be Quiet! Straight Power E9 580W CM

Vendor page

This is Be Quiet’s midrange model, with very similar characteristics and a few missing gimmicks (such as less flashy packaging). I bought it because this review (in German) says it has absolutely no electrical noise.

Initially, it seemed that the Straight Power had very little buzzing — the loudest buzz I was able to generate was audible from 30-50 cm away (the Dark Power’s buzz was audible from 2 meters away). It wasn’t as silent as my old Corsair HX520W, but I was willing to live with a small amount of buzzing, especially since I keep my PC under a desk.

However, my evaluation changed when I plugged in my Dell monitor (2209WA): the PSU started buzzing quite loudly. It was a constant sound, independent of system load. I tried plugging other devices (monitors, a laptop PSU, etc.) into the same AC circuit, but only the Dell display made it buzz. Which was a problem, because the Dell is my main display!

Now, I don’t know if the buzzing is the “fault” of the PSU or the Dell display, and I don’t really care. All I know is, this symptom is not present in my old PSU, so the Straight Power had to go.

Seasonic X-560

Vendor page

I had mixed feelings about ordering a Seasonic PSU. On the one hand, Seasonic products are a long-time favorite of Silent PC Review; on the other, Seasonic-made PSUs are the ones which appear most often in reports of coil whine (though it could be a side effect of their popularity).

The Seasonic X-560 was, by a large margin, the loudest PSU I have tried so far. It was pretty quiet in idle and under medium load – apart from a bit of a high-pitched squeal that was only audible up close. Under high load (specifically, when stress-testing the GPU in FurMark), there was a clearly audible crackling sound (audible from about 1 meter away). But the Seasonic X-560 would show its true acoustic power only after I launched Mass Effect 3. As soon as the welcome screen loaded, it started giving off a constant, ridiculously loud hiss that sounded like releasing compressed air or perhaps running water. The noise was as surprising as it was loud – it was the loudest sound I had ever heard from a PSU, and it was easily audible from 5 meters away.

I also have to mention that the power connectors are poorly made. Plugging the ATX cable into the motherboard was quite an adventure – I nearly gave up, and at one point I was afraid that I had damaged the socket! The plug only went in after I figured out a way to slide it in at a specific angle with the right amount of sideways pressure. Unplugging the cable was equally “exciting”. This does not appear to be an isolated case – here’s another user report (in Polish). The CPU power cable was also somewhat difficult to plug in. Poorly fitted cables are something that simply should not happen with a high-end, expensive product like a Seasonic X-Series PSU.

Right now, Seasonic still has a good reputation, but I can’t imagine it will last very long.

Enermax Triathlor 550W

Vendor page

(This is the version without modular cables, as opposed to the Enermax Triathlor 550W FC. The tech specs of both models are identical, though it’s possible that the FC has a different fan profile.)

After trying three “80 Plus Gold” PSUs, I decided to try an “80 Plus Bronze” model. The first thing that I noticed after installing the Enermax Triathlor 550W was how loud the fan was. While the PSU is fitted with the Enermax T.B. Silence 120mm fan, which is a high-quality fan capable of silent operation, the rotation speed is way too high. Judging by the noise level, the idle rotation speed is around 900 rpm. This produces an unpleasant whine and makes the PSU easily the loudest component in my PC.

To make things worse, the PSU ramps up the fan speed quite aggressively – around 40% load (220 W), the noise becomes much louder as the fan exceeds 1000 rpm. Such intense cooling appears unnecessary – when I put my palm next to the exhaust vent, I noticed that the air blowing out of the PSU wasn’t warm in the least. It’s as if the fan controller of the Triathlor 550W is trying to keep the PSU at room temperature at all times.

What about electrical noise? As you can imagine, testing for electrical noise in a unit with a loud fan isn’t exactly easy. However, my standard testing procedure revealed a buzz when running Mass Effect 3. The buzz is audible at a distance of 1-2 meters, even with the fan whooshing like crazy. There was also a softer buzz when running FurMark.

In short, even if the Enermax Triathlor didn’t have an outrageously loud fan, it could not be considered quiet due to the buzz.

Be Quiet! Pure Power L8 530W CM

Vendor page

My adventures with the above four PSUs revealed a surprising relationship: the more expensive and power-efficient a PSU was, the more electrical noise it had. An expensive and “reputable” Seasonic X-series was much worse than the less expensive BeQuiet! Dark Power, which in turn was worse than a middle-tier BeQuiet! Straight Power. On the other hand, my experience with the Enermax Triathlor showed that cheapness often means an unacceptably loud fan.

So, for my fifth PSU, I was looking for a relatively cheap unit with a quiet fan profile. I was thinking about Nexus PSUs, but I couldn’t find them in Poland, so I chose another BeQuiet! PSU: the Pure Power L8 530W CM. The Pure Power is quite cheap: about 40% cheaper than the Seasonic X-Series!

The first thing I noticed was that the fan on the Pure Power always starts at full speed, so when you turn on your computer, there is a loud whooshing sound that subsides after 2-3 seconds as the fan slows down to its normal operating speed. How loud is it at normal operating speed? Well, it seems slightly louder than the BeQuiet! Straight Power, BeQuiet! Dark Power or Corsair HX 520W, but the difference is subtle and, with the PSU mounted at the bottom of the case, I daresay immaterial. At a distance of over 1 meter, the noise simply blended in with that of other components (very quiet 120-mm fans and a super-quiet WD Red hard drive). The noise quality was quite broadband, with none of the whine that some fans produce.

The fan profile is very quiet — I couldn’t get the fan to speed up even after fully loading my i5-3570K CPU (@ 4.1 GHz) and my Radeon HD7850 at the same time, which probably resulted in a power draw exceeding 250 W on the 12V line, which is more than 50% of the maximum output of the PSU on that line. The Be Quiet! stayed quiet no matter what I did.

There was a clicking sound coming from the fan (which I verified by using the SPCR technique of forcibly stopping it with a plastic straw). The sound was inaudible at a distance of over 1 meter; however, there is a risk it could get worse with time. Unfortunately, the fan on the Pure Power is a sleeve-bearing fan, which is not ideal for horizontal operation.

Now for the most important part: electrical noise. Unlike its more expensive brothers, the BeQuiet! Pure Power 530W exhibited virtually no electrical noise (whining, buzzing, clicking etc.) under normal usage or artificial load. The Dark Power buzzed when scrolling web pages and during other light tasks, the Straight Power buzzed when I plugged in my Dell monitor — here, there is no noise (unless you put your ear to the PSU). There was likewise no electrical noise when stress testing under Prime95, IntelBurnTest, FurMark or IntelBurnTest+FurMark. I could get the PSU to emit a slight buzz when playing Mass Effect 3 (which is, for some reason, the noisiest application I could find). However, I could not hear the buzz from more than 1 meter away when the PSU was in a closed case.

To sum up, the BeQuiet! Pure Power 530W CM is certainly the quietest PSU I have tested — and the only one which could be described as completely inaudible in typical usage (closed case under a desk). If you believe a PSU should be neither seen nor heard, you should definitely check out the BeQuiet! Pure Power 530W CM.

Is the Pure Power as good as my trusty old Corsair HX520W? No, it has a tiny bit of buzzing and the fan is probably a tiny bit louder. But of the PSUs you can buy today, it’s one of the quietest, if not the quietest.

FAQ

How do you know the noise was made by the PSUs, and not your motherboard, video card, etc.? I put my ear to the PSU, the motherboard and the video card. There’s no doubt that the PSUs were the source.

What’s your setup?

  • Motherboard: Asus P8Z77-V Pro
  • CPU: Intel i5 3570K (overclocked to 4.1 GHz)
  • Video card: AMD Radeon HD7850 (MSI R7850 Twin Frozr III, 2 GB)
  • 1 SSD drive and 1 very quiet mechanical hard drive (WD Red 2 TB)
  • Case: Fractal Design Define R4
  • Case fan: 140 mm at 500-600 rpm
  • CPU fan: 120 mm at 600-700 rpm
  • GPU fan: 120 mm at 600-700 rpm

Why do you think current PSUs are prone to electrical noise? I have a hunch that it has something to do with rising efficiency. (This thread quotes others who feel the same way.) A few years ago, when nobody worried about PSU efficiency, you’d have to be very unlucky to get a buzzing or whining PSU. But today, manufacturers are scrambling to squeeze every ounce of efficiency from their designs. They have persuaded consumers (with the aid of reviewers) that it really matters whether their next PSU is an “80 Plus Bronze” or an “80 Plus Gold” (in reality, of course, nobody will notice a difference in their electric bill). As in all situations where everybody is fixated on a single parameter — whether it’s GPAs in schools, display contrast ratios or camera megapixel counts – the one parameter goes up while the overall quality suffers. In this case, silence is sacrificed in order to gain the prestige associated with an “80 Plus Gold” or “80 Plus Platinum” badge.

Tags:

I am now afraid of browser plugins

Yesterday night I got my first malware infection from the Internet. Here’s what happened, step by step:

  1. I was reading a discussion on LinkedIn, trying to get user opinions on a particular ISP in Poland.
  2. Some user had posted a link to a website which maintains a ranking of Polish ISPs. I followed it.
  3. I was transported to a pretty normal-looking website with information on ISPs, user ratings, etc. (I’m not going to post the link here, but I did report it to Google and BadwareBusters.)
  4. Soon after I started reading the page, I noticed that my browser (Firefox) started downloading some PDF file. I attempted to cancel the download, but the file was small, so I didn’t manage to do it in time.
  5. The file automatically opened in Adobe Reader 8.2, my PDF viewer of choice. (more on that later)
  6. A few seconds later, I was greeted with this:

This wasn’t just one of those annoying popups that you can simply close. My PC was completely taken over by “Live Security Platinum”, which, as I later found out, is a fake antivirus which tries to convince you that your computer is infected with a bazillion viruses and get you to pay for the “full version” of the “software”. It adds itself to the Windows Registry, sends unknown data over the network, keeps displaying annoying balloons in the notification area, and when you run Task Manager, it immediately closes it so you can’t kill it. It also reconfigures your proxy settings to keep you from running Google searches to find a solution and prevents you from running certain antivirus applications. Most annoyingly, it permanently removes several crucial Windows services like Windows Update.

Lessons learned / Advice

  1. It’s quite possible to get your PC infected just by browsing the Web – even if you never click on any suspicious links, download suspicious software, or visit shady websites. (Incidentally, I had visited tons of what would be considered shady websites before – be they porn sites, hacking sites, pirate sites – you name it, I’ve been there – and never got infected. Go figure.)
  2. Disable all the plugins that are not absolutely necessary. You may need Flash, but do you really need Quicktime, Silverlight, Java and Adobe Reader? (Remember that plugins can be temporarily re-enabled as needed.) Each plugin is one additional way for malware to gain control over your PC. All it takes is a plugin that hasn’t been updated recently and a malicious (or simply hacked) website that redirects you to a cleverly written Flash animation, Java applet, video clip or PDF document.
    • Disable Adobe Reader and replace it with a built-in PDF viewer (both Chrome and Firefox have one). If you can’t do that, at least replace it with a less popular viewer like Foxit Reader (less popular software is safer because the bad guys always focus on software with the largest user base).
  3. If you have to use a plugin, at least make sure it gets updated regularly. Don’t ignore those Flash updates!
  4. Make sure your browser doesn’t automatically open files in external applications, especially popular and rarely updated applications. With automatic opening of files, a web page can easily make your browser download a file and then immediately open it in an external application. If that external application has a security hole, it could allow the attacker to install malware on your system. Set up your browser to always ask you before opening a file. (In Firefox, you can change the behavior in Options|Applications.)

FAQ

Why the hell were you using Adobe Reader?

Well, I had been using Foxit Reader, but I switched to Adobe because Foxit occasionally had difficulty rendering PDF files.

Why were you using such an old version of Adobe Reader?

Because it worked faster than the newer versions. I guess I didn’t think of the fact that old Adobe Reader versions have vulnerabilities that could be exploited by a website I visit.

How did you remove Live Security Platinum from your computer?

Well, it wasn’t easy. When I typed “live security platinum remove” into Google, all I got was a ton of shady-looking keyword-stuffed sites, all of which tried to get me to purchase or at least download some malware removal software. I literally couldn’t find a single reputable-looking resource on the topic. Not a word from McAfee, Kaspersky or Avast. How was I to know that some app called MalwareRemover or TrojanKiller wasn’t going to mess up my system even more?

In the end, I followed the manual removal instructions given here (near the bottom of the page). That way, at least I could verify each step separately.

Once the malware was removed, I was in for a nasty surprise. Live Security Platinum had removed a number of crucial Windows services like Windows Firewall and Windows Update. By “removed”, I don’t mean that it simply disabled them – I mean that they did not even appear in the Services console. How do you reinstall a basic Windows component? I decided that the simplest way out was to use System Restore to bring my system back to the state it was in 4 days earlier. Thank goodness I had a restore point that I could use.

Tags:

Review of the Panasonic TX-P42-ST50-E plasma TV

I bought a new TV for my mom to replace her 21-year-old Sanyo CRT TV that my parents bought when I was a pre-teen. By the way, the Sanyo is still working, though the picture isn’t what it used to be. Initially, I was going to do it the polite way and wait for the old TV to die before I got a new one, but as the old TV has shown no signs of decline in the past two years, I lost hope of that ever happening. I guess I should have known better than to count on the death of a Japanese product from the early 90’s. (BTW, how’s your Game Boy working? Mine’s still going strong.)

image

Why plasma?

When you ask a non-techie shopper what TV they are going to buy, the answer is most likely going to be “LED”. In the minds of most consumers, LED TVs are the state of the art in TV technology. A major reason for this association is that, of all the TV types available today, LED TVs simply appeared the latest. They are also a bit thinner and much more energy-efficient than other types of TVs, which contributes to the modern impression they make on shoppers. For these reasons, an LED TV was my default choice as well. I had to be convinced to choose otherwise.

I was persuaded away from LEDs because I noticed a funny thing during my occasional visits to TV stores in Wroclaw: every time I noticed a TV with a picture that impressed me, I would glance at the label and it would read “plasma”.

Plasma displays have two big advantages over LCD and LED displays. One is black level. The basic way LCDs work is by having a huge white backlight that is always on. The white light from the backlight is then blocked with an array of tiny colored filters and liquid crystals. To display black, an LCD has to block the white light with liquid crystals that turn black. (It is important to understand that “LED displays” are actually LCD displays with an LED backlight. In a “classic” LCD display, the backlight is a white CCFL; in an LED display, the backlight is composed of a large number of LEDs that give off a white light.) Since the backlight is very bright, light from the backlight tends to “leak” through the liquid crystals. This means that it is very hard for an LCD to display true black. In most cases, the best it can do is very dark grey. (LCD/LED TVs in stores are always set to show bright, colorful images, so you won’t notice this deficiency.)

On the other hand, a plasma display has no backlight. It is more like a matrix of tiny neon lights – small cells filled with a mixture of gases that turn into glowing plasma when you apply voltage to them. Turn off the voltage and the light is gone. Because of this, it is relatively easy for a plasma display to reproduce true black.

Another advantage of plasma displays is uniformity. LCD and LED displays commonly suffer from “backlight bleed” – light from the backlight “bleeds” around the edges of the panel, which means that when looking at a dark scene in a movie (e.g. a night sky or a dark interrogation room), you can notice patches of light instead of a uniform black area. This is not an issue with plasma displays.

The main advantage of LED displays over plasma displays is power consumption. An LED display will use about 50% less power than a plasma display of the same size. This may seem like a lot in relative terms, but for a 40” display the difference is about 60 watts. If you watch TV for 4 hours every single day, you will save 87 kilowatt-hours over a period of one year. This works out to about $15 per year. You decide whether it’s worth paying $15 per year for superior picture quality.
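For the curious, the arithmetic behind these figures can be sketched in a few lines of Python. Note that the $0.17/kWh electricity price is my own assumption, chosen to roughly match the ~$15 figure; your rate will differ:

```python
# Back-of-the-envelope check of the savings claim above.
# Assumed inputs: 60 W difference in power draw, 4 hours of viewing per day,
# and an (assumed) electricity price of $0.17 per kWh.

def annual_savings_kwh(watts_saved, hours_per_day, days=365):
    """Energy saved per year, in kilowatt-hours."""
    return watts_saved * hours_per_day * days / 1000

kwh = annual_savings_kwh(60, 4)   # 87.6 kWh per year
cost = kwh * 0.17                 # roughly $15 per year at the assumed rate
print(f"{kwh:.1f} kWh/year, about ${cost:.0f}/year")
```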

(More detailed comparison of display technologies from Wikipedia.)

My impressions of the Panasonic TX-P42-ST50-E

I’m not going to write a complete review of the TV. For that, I’ll refer you to the glowing reviews at HDTVtest.co.uk and AVForums. The consensus among the Internet experts is that this is a mid-range plasma TV with better picture quality than last year’s high-end models.

Below, I’ll report my general impressions and include some information that I wasn’t able to find anywhere, despite extensive research. I hope you will find it useful in making your purchasing decision.

Note: The information presented here refers to the European version of the Panasonic ST50. The North American version has different firmware and slightly different features.

  • Black level. This TV produces incredibly deep, inky blacks – much better than I’ve seen on any LCD display. If I may say something slightly controversial, I think there’s little point in improving the black level any further. It’s already better than what you get in a movie theater. (In case you haven’t noticed, movie projectors cannot reproduce 100% black, although they are very close.) Watching dark content such as concert footage or movies with night-time scenes (such as Drive, with night-time car chases and aerial shots of the L.A. skyline) is pure pleasure on this TV. I can’t wait to re-watch Star Wars on it. Space is going to look amazing on this set.
  • Color fidelity. Even without calibration, color accuracy on the Panasonic TX-P42-ST50-E (in the “True Cinema” mode) seems subjectively very good, as checked by inspecting several familiar photographs. This is a very good TV for browsing photos. (You can use the calibration settings posted by David Mackenzie of HDTVtest.co.uk to make the colors even more accurate.)
  • Motion interpolation. Like most modern TVs, the Panasonic TX-P42-ST50-E can make motion more fluid by inserting additional frames between the actual frames in the source. This is a controversial feature. Some people enjoy the extra fluidity; others feel it makes movies feel less “film-like”. I have found I am in the first camp. The standard movie frame rate is quite jerky (just try playing a game at 24 fps – it will be playable, but not fluid by any means). For me, that jerkiness is simply tiring on a large screen, especially when sitting close to the TV. So I have set “Intelligent Frame Creation” to “Mid” and so far I’ve been quite happy.
  • [Added Jun 18 2012:] Burn-in. After about a month of watching a 24-hour news channel for 2-3 hours a day, the channel logo is now permanently visible in light grey when you display a uniform background (especially on white). To be fair, I only noticed the image when I tried the built-in “burn-in remover” feature (which shows a solid white bar moving across the screen). I don’t think this is classic burn-in because no image is visible on a black background (I checked in a very dark room). I tried running the burn-in remover for 5 hours. It has made the channel logo more faint, but it hasn’t removed it. It’s not terribly annoying because the TV is almost never used to display solid colors. The biggest concern is that it will keep getting worse.
  • Sound quality. Customers like slim, sexy TVs. Slim TVs mean tiny speakers. Tiny speakers mean crappy sound. Panasonic has crammed 8 teeny speakers and a miniature subwoofer into this TV. The 8 speakers sound worse than the single speaker in the old 14” Grundig TV in my kitchen. As an experiment, I plugged in an external amplifier and a very cheap set of Onkyo speakers and the difference was night and day. Don’t get me wrong – you can still watch stuff on this TV; it’s just that the sound seems to be coming from far away (maybe because the speakers are facing down and not forward?) and thus the clarity is not very good. From what I’ve seen, this is pretty much the norm with slim TVs.
  • Reflections. The display is glossy, but I was actually surprised at how well the anti-reflective coating absorbs reflections on this TV. There’s certainly no “mirror effect”, such as you can see on many laptop screens. As long as your display is not facing a window, you should be totally fine.
  • Remote control. From the point of view of someone who’s interested in user interfaces, the remote control is a disappointment. It’s an array of buttons, just like the remote that came with my mom’s 21-year-old Sanyo. Have we really learned nothing about user interfaces in 21 years? Why can’t I have a volume knob to make precise and quick adjustments to volume instead of having to press a button many times? Where’s my pointing device to quickly select options on the screen instead of using the awkward arrow keys? On the plus side, the infrared LED on the remote seems to be pretty powerful, so the remote works pretty much whichever way you point it (at least with fresh batteries). This is a welcome feature, as it can be quite annoying to lift the remote and point it at the TV every time you want to do something, especially if you’re the kind of viewer who often rewinds a movie in order to catch every line of dialogue.
  • Multimedia playback from USB devices. There have been reports that this TV has difficulty playing MKV and AVI files. I have found this to be largely untrue, at least on my (European) version of the TV. So far, I have tried about 20 different movies in MKV, MP4 and AVI containers, and the TV has played all but one of them back without any problems. The one failure was a very large (8 GB) MKV movie encoded in 1080p, which worked on my PC but not on the TV. It is possible to seek back and forward by about 10 seconds by pressing OK and then the Left or Right arrow, which is useful if you miss a line. External subtitles in SRT format are supported (the name of the subtitle file has to be the same as the name of the movie file) as well as internal MKV subtitles. Polish characters are supported (there is a menu option to set the character set). The font used to render subtitles is good-looking and very readable, so no problems there, even if your eyesight is not perfect. Subtitles can be turned on and off with the Subtitle button on the remote.
  • DLNA playback. The Panasonic TX-P42-ST50 can access media stored on your PC, eliminating the need to plug in a USB device. It’s simple enough on the TV end: I connected the TV to a router (both using Wi-Fi and an Ethernet cable) and it just worked. However, it took me two hours to get Windows 7 to share my media with the TV – the process is totally unintuitive and involves both reconfiguring your network sharing options and enabling some non-obvious settings in Windows Media Player. Even though I eventually succeeded in this, I was having constant trouble whenever I added a new folder to the shared library: it would take a long time for it to appear on the TV. Finally, I installed a free application called Serviio (by Petr Nejedly) which works perfectly. It allows you to share any folder on your PC with your TV.
  • Subtitles with DLNA playback. One drawback to DLNA playback on the Panasonic TX-P42-ST50 is that external subtitles in SRT files are not supported – only internal MKV subtitles work. (This might be a limitation of the DLNA standard, not of the TV.) To remedy this problem, you can download a free application called MKVmerge (part of MKVToolNix), which can combine a video file (MKV or MP4) and an SRT file into a single MKV file with built-in subtitles. Initially, I was reluctant to take this route, but it turned out to be pretty painless: just drag and drop two files, click “Start”, wait 30-60 seconds and you’re done. Another quirk is that with DLNA playback you cannot turn on subtitles with the Subtitles button on the remote, like you can with USB playback. Instead, you have to press the Option button and choose Subtitles.
  • Internet access. The TV can run third-party apps. Several apps are included with the TV. The selection depends on your country. (In Poland, the best one is iPlex.pl, which allows you to watch older and less popular movies free of charge in exchange for watching commercials for about 5 minutes.) One troubling thing I noticed about the third-party apps is that each one seems to have a different interface. Instead of using the pause button on the remote to pause a video, they’ll force you to press OK, then use the arrow keys to find the “pause” option on their own special menu. Of course the special menu is different in each app. Good luck explaining to a non-techie why something as simple as a pause button cannot work the same way everywhere. The situation is even worse with more “advanced” features like rewinding or finding a show to watch. A couple of words come to mind when contemplating this, the mildest being “idiocy”. As a side-effect of this “explosion of creativity”, many apps simply feel slow, probably because the TV’s CPU is unable to cope with all those custom animations. Panasonic should just enforce common interface guidelines for all apps; otherwise, we’ll have braindead developers each doing their own dance and thinking they look sexy.
  • YouTube app. There is a YouTube application (of course, with its own way of doing things), but I’m not quite sure if it will be of use to anybody. People usually find YouTube videos on various websites, blogs etc. Since no one in their right mind is going to browse the Web on this TV, there is the question of how exactly the YouTube app is supposed to be used. Perhaps it could be useful as a kind of “view later” feature, where you add videos to a playlist on your PC, and then watch them later on a big screen. However, even this limited functionality is not supported well due to the poor user interface of the app and the fact that, amazingly, it does not support the playback of HD videos. You can probably see why I tried it once and then never launched it again.
  • Recording (PVR). The Panasonic TX-P42-ST50-E has a PVR feature, which allows you to record digital TV. You can record on demand or you can tell the TV to record a later show for you (you can even choose the show to record on the TV Guide screen on digital TV, which is pretty convenient). There are, however, several important limitations to this feature:
    • It only works on USB hard drives (USB flash memory is not supported). I was able to successfully use this feature with a USB-powered WD Passport drive.
    • Before you can use it, you have to allow the TV to reformat the drive. This means you have to give up the whole drive for recorded programs.
    • Only digital broadcasts can be recorded. On the plus side, the quality of the recording is great. I was able to record digital terrestrial TV (DVB-T) broadcasts in HD and the quality is indistinguishable from the original. I think the TV simply saves the original MPEG stream.
    • The recorded videos are inaccessible on other devices, even on other units of the same TV. The drive is formatted with the UFS file system, which means it is inaccessible on Windows. The video files themselves are encoded and have not been successfully reverse-engineered so far. So there is no way to view the recorded videos on your PC.
    • You cannot watch one channel and record another at the same time. You can only watch the channel which is being recorded. To watch another channel, you have to stop the recording. This means the PVR feature will be useless if you want to record a football game while your wife watches Downton Abbey.
    • The TV has to be left on standby. If you set it to record a show and then another person turns the TV off completely, you’re out of luck.
  • Noise. First, the good news. Unlike last year’s Panasonic 42” plasmas, this model has no fans (bigger versions do have them). The bad news is that there is a buzzing sound (voltage transformers?) which ranges from inaudible to easily audible (this is when the TV is displaying bright images). At its worst, I can easily hear it in a quiet room at a distance of about 3 meters. The noise is not very high-pitched, so the annoyance factor is not high, in my opinion. If this was a computer display, the noise would of course be totally unacceptable, but, seeing as you rarely use a TV to watch things with no sound, it does not seem a big problem because in most cases whatever you’re watching will mask the sound completely. In any case, I’ve seen DVD drives that made more noise, not to mention the ambient noise you get when watching a movie in a cinema (I’m talking about you, popcorn lovers!). Since the buzzing sound has been reported by other users (usually as not very annoying), I believe it’s a property of the model and not an issue with my specific unit.
  • Heat. The back of the display gets warm in the upper left and right corners. The amount of generated heat seems modest for a display of this size. Certainly this is no heater.
  • Build quality. The set, manufactured in the Czech Republic, feels very solid and well-finished. Nothing feels cheap or flimsy. The build quality certainly inspires confidence in the reliability of the product.
  • Power consumption. Because this is a plasma display, it uses about twice as much power as an LED display of the same size.
  • Minor annoyances. When watching with headphones on, it’s hard to control the volume because the headphone volume control is buried deep in the menu system.
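The mkvmerge remuxing step mentioned in the DLNA subtitles bullet above can also be batched with a short script, which is handy if you have a whole folder of movies with matching SRT files. This is only a sketch: it assumes mkvmerge (from MKVToolNix) is installed and on your PATH, and the "movies" folder and "_subbed" naming scheme are made up for illustration:

```python
# Combine each video with its same-named .srt file into a new MKV,
# using mkvmerge from MKVToolNix (assumed to be on the PATH).
import subprocess
from pathlib import Path

def mux_command(video: Path, subtitles: Path) -> list:
    """Build the mkvmerge command line that muxes a video and an SRT file."""
    output = video.with_name(video.stem + "_subbed.mkv")
    return ["mkvmerge", "-o", str(output), str(video), str(subtitles)]

if __name__ == "__main__":
    # Hypothetical folder name; adjust to taste.
    for video in Path("movies").glob("*.mkv"):
        srt = video.with_suffix(".srt")
        if srt.exists():
            subprocess.run(mux_command(video, srt), check=True)
```

The resulting `_subbed.mkv` files carry the subtitles internally, so they play over DLNA with subtitles intact.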

Conclusions

When I bought the Panasonic TX-P42-ST50-E, I had already read Internet reviews of it, so I was not surprised to see the flawless cinematic picture quality of this TV. I was, however, pleasantly surprised at how well it handles digital content. Having read user comments about the limited video format support in previous Panasonic models, I was fully prepared to shell out $100 for an external media player such as the WD TV Live. This has turned out to be completely unnecessary, which is great because it eliminates the need for another device with yet another remote control.

If you are looking for a modern TV with top-notch picture quality and good support for digital media playback at a reasonable price, you’ll be doing yourself a disservice if you don’t at least check out the Panasonic TX-P42-ST50-E.

My only reservation concerns the buzzing noise that can occasionally be heard when displaying bright images. If you are highly sensitive to noise, it’s probably a good idea to make sure you will be able to return the TV in case you find the buzz too irritating. Personally, I don’t have a problem with it, and I’m someone who has spent significant amounts of money and time silencing my PC, so chances are you won’t either.

(For tons of discussion and nerdy details, check out the official threads at AVSForum and HighDefJunkies.)


iTunes 10.2.1 fails to decode MP3 files properly

It is generally assumed that all major MP3 playback software produces the same output. The rationale is that the MPEG standard defines the decoder strictly, allowing only small deviations due to rounding.

A few years ago, I was disabused of that idea when I did an informal test to compare several well-known music players (iTunes 7, Winamp, Foobar2000, Windows Media Player). The test revealed iTunes 7 to be the outlier producing different output from the rest of the pack.

Today, I will present the results of a more rigorous test using the latest version of iTunes (10.2.1).

Test setup

  • Windows 7 Professional SP1 (32-bit) with all the latest updates
  • Auzentech X-Meridian 7.1 sound card
  • Cool Edit Pro 2.0 audio editing software

Tested players

  • Windows Media Player 12.0.7601.17514
  • Winamp 5.56
  • Foobar2000 1.1.5
  • iTunes 10.2.1

Methodology

I played two 10-second MP3 clips in each player, recording the output digitally with Cool Edit Pro 2.0 using the S/PDIF loopback mechanism provided by the sound card driver.

All postprocessing options (crossfade, sound check, etc.) were turned off. Both application and system volume were at 100%.

I used the following MP3 files:

  1. a 10-second clip from Wszystko Ch. by Elektryczne Gitary encoded with LAME 3.97 at 256 kbps ABR with high encoding quality (download file)
  2. a 10-second clip from Time by Pink Floyd encoded with LAME 3.98 at 256 kbps ABR with high encoding quality (download file)

After recording in Cool Edit Pro (as a 16-bit, 44.1 kHz file which matched the source material), I saved each stream as a text file, which looked like this:

-354	-172
1	203
-447	-443
-2490	-3088
-3504	-3676
-3233	-2944
-3206	-3867
-2829	-4348
-2391	-4461
-2196	-4165...

I also opened each of the two MP3 files directly in Cool Edit Pro 2.0 and then saved it as a text file. This file was used as a reference: the output of each player was compared against it. Cool Edit Pro 2.0 uses a Fraunhofer MP3 decoder (Fraunhofer IIS is the institute where MP3 was developed).

I opened the text files in Notepad++ and synchronized them by discarding the initial silence in each file. The goal was to make sure that the first sample in each file corresponded to the start of the clip to enable direct sample-by-sample comparison.

After synchronization, each text file was opened in Cool Edit Pro 2.0 again. Each waveform was subtracted from the reference waveform to reveal the differences.
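The comparison steps described above – load the two-column text exports, discard the leading silence, then look at per-sample differences – can be sketched in a short script. This is a minimal illustration under my assumptions about the workflow, not the exact tooling I used; the file format matches the Cool Edit Pro text export shown earlier (one line per frame, left and right channel values separated by whitespace), and all function names are made up for this sketch.

```python
def load_samples(path):
    """Read a two-column Cool Edit Pro text export into (left, right) int pairs."""
    frames = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2:
                frames.append((int(parts[0]), int(parts[1])))
    return frames

def trim_silence(frames, threshold=0):
    """Discard leading frames whose samples are all within +/- threshold,
    so that both streams start at the same point in the clip."""
    for i, (left, right) in enumerate(frames):
        if abs(left) > threshold or abs(right) > threshold:
            return frames[i:]
    return []

def max_sample_diff(ref, test):
    """Largest absolute per-sample deviation over the overlapping region."""
    n = min(len(ref), len(test))
    return max(
        max(abs(ref[i][0] - test[i][0]), abs(ref[i][1] - test[i][1]))
        for i in range(n)
    )
```

With this, a player whose output never deviates from the reference by more than 1 or 2 (as with WMP, Winamp and Foobar2000 below) is easy to tell apart from one producing real distortion.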

Results

Each waveform below shows the difference between the reference output stream (Cool Edit Pro 2.0 with Fraunhofer decoder) and the output stream produced by an MP3 player.

Wszystko Ch. – Windows Media Player 12

[waveform image: wszystko-wmp]

Wszystko Ch. – Winamp 5.56

[waveform image: wszystko-winamp]

Wszystko Ch. – Foobar2000 1.1.5

[waveform image: wszystko-foobar]

Wszystko Ch. – iTunes 10.2.1

[waveform image: wszystko-itunes10]

As you can see, Windows Media Player, Winamp and Foobar2000 all produced output that matched the reference stream very closely. A review of the text files showed that all three players produced virtually identical bitstreams: the differences between individual samples and the reference stream did not exceed 1, or in rare cases, 2. These differences were not large enough to register on the waveform view, even with magnification.

iTunes 10.2.1, however, added significant distortion that can be seen in the waveform above. In some cases, samples deviated from the reference values by as much as 5 percent (e.g. 1811 instead of 1719). You can also download the above waveform as a WAV file to hear the “enhancement” added by iTunes: it is essentially a very high-pitched tone (above 15 kHz) of uneven volume. Whether you can hear it will depend on your age: younger listeners will find it more prominent. (Of course, during normal music listening this sound would be very hard to pick out.)
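Turning a difference stream back into an audible file is straightforward. Here is a minimal sketch using only Python’s standard wave and struct modules; the sample lists and output path are hypothetical, and 16-bit / 44.1 kHz stereo matches the source material used in this test.

```python
import struct
import wave

def write_diff_wav(ref, decoded, path):
    """Write the per-sample difference of two aligned 16-bit stereo
    streams (lists of (left, right) int pairs) as a WAV file, so the
    residual distortion can be auditioned."""
    n = min(len(ref), len(decoded))
    frames = bytearray()
    for i in range(n):
        # Clamp to the 16-bit range in case the difference overflows.
        dl = max(-32768, min(32767, ref[i][0] - decoded[i][0]))
        dr = max(-32768, min(32767, ref[i][1] - decoded[i][1]))
        frames += struct.pack("<hh", dl, dr)  # little-endian 16-bit pair
    with wave.open(path, "wb") as w:
        w.setnchannels(2)
        w.setsampwidth(2)       # 2 bytes = 16-bit samples
        w.setframerate(44100)
        w.writeframes(bytes(frames))
```

A perfectly conforming decoder would produce a (near-)silent file here; the iTunes residual, by contrast, is the high-pitched tone described above.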

The output generated by iTunes 10.2.1 did not depend on the output setting in QuickTime (which iTunes uses to play audio). DirectSound, WaveOut and Windows Audio Sessions all produced the same output.

Time – Windows Media Player 12

[waveform image: time-wmp]

Time – Winamp 5.56

[waveform image: time-winamp]

Time – Foobar2000 1.1.5

[waveform image: time-foobar]

Time – iTunes 10.2.1

[waveform image: time-itunes10]

Again, Windows Media Player, Winamp and Foobar2000 match the reference stream, while iTunes engages in creative decoding. In this sample, the distortion is smaller: personally, I cannot hear anything when I play the above waveform.

Conclusions

Cool Edit Pro 2.0, Windows Media Player 12, Winamp 5.56 and Foobar2000 1.1.5 all decoded the MP3 clips in virtually the same way. iTunes 10.2.1, on the other hand, produced a distorted output stream. While the distortion is probably inaudible in normal listening situations, it suggests that the latest version of iTunes fails to conform to the MP3 standard and is probably best avoided by users who care about audio fidelity.

Notes

In further tests using the same samples, I found that iTunes 9.2.1 matched the reference stream as well as WMP, Winamp and Foobar2000 – it would therefore seem that it decodes MP3 files properly. I also evaluated MediaMonkey 3 and detected very significant distortion (much larger than iTunes 10.2.1), even after disabling as many postprocessing options as I could find (volume leveling, clipping protection, crossfade, smooth stop/seek/pause, remove silence – did I miss anything?).

Check out the thread at Hydrogenaudio for an interesting discussion and independent measurements which confirm my findings.

Added November 2012: I made some quick measurements with the latest version of iTunes (10.7.0.21 for Windows) and it seems that it decodes MP3 files properly.

Added November 2014: I tested iTunes 11.4 for Windows (with DirectSound playback enabled) on one of the files and it is close enough to the reference waveform.
