Things I’ve learned, published for the public benefit

How to clean eyeglasses

I’ve worn eyeglasses since I was 3 years old. A few years ago, I started getting annoyed with the dust and grease that keep building up on my glasses. Maybe it’s old-age grumpiness kicking in, or maybe it’s because I started to use LCD displays whose immaculate picture quality sensitized me to any blurriness between the LCD matrix and my retina.

Anyway, I started cleaning my glasses regularly. The problem was that I couldn’t figure out a good cleaning technique. First, I tried washing my glasses under running water and then drying them off with towels. That didn’t work so well: the grease stayed on, and the towels (either cloth or paper) would leave tons of lint on my glasses. So I bought a professional microfiber cloth, the same kind that I use for cleaning photographic lenses, and some isopropyl alcohol (isopropanol), the stuff they put in those overpriced “lens cleaning kits” you’ll find in the photography section of your electronics store. That was a lot better than my previous technique, but the alcohol would not dissolve all the grease, which was impossible to remove completely with the cloth alone.

Well, I’ve finally figured it out. (Actually, I wish I could take the credit: I learned about this technique from my optician.) The answer is dishwashing liquid (AKA dish soap).

  1. Rinse your glasses under running water.
  2. Put a bit of dishwashing liquid on one of the lenses, then use your fingers to gently rub the liquid on both sides of both lenses.
  3. Rinse glasses again to remove the dish soap. You don’t need to use your fingers to get the dish soap off – just use running water. You should be looking at perfectly clean lenses with a few drops of water on them. If there’s any grease or other spots, repeat steps 2 and 3.
  4. Use a microfiber cloth to gently clean off remaining water drops. Use light touches – there might be small pieces of dirt on the cloth and if you rub it too hard, they might scratch the lenses. The microfiber cloth leaves no fluff, so your glasses should be perfectly clean.

It’s really a perfect combination. The dish soap dissolves all the grease, so you don’t get any smudges when you use the microfiber cloth. The microfiber cloth removes the remaining water drops and (non-greasy) stains made by evaporating water, and leaves no lint. The result: pristine-looking glasses in one minute.

What’s more, this technique is fairly convenient to use. Many online how-tos recommend special eyeglass-cleaning sprays or vinegar, which may be expensive or unavailable. On the other hand, most people have dish soap in their kitchen, so the only special accessory you need is a microfiber cloth, which costs $7 (for a top-quality one) and can be re-used for years. And even that isn’t really necessary, as paper towels or tissues work almost as well.


Google Toolbar shows incorrect PageRank for HTTPS pages

If you are a webmaster looking to improve your Google ranking by arranging links from high-PageRank sites, you should be wary of pages whose URL starts with “https” (secure pages). As I have found, the Google Toolbar reports incorrect (usually inflated) PageRank for such pages, so that a secure page which appears to have a PageRank of 8 may in fact have a much lower PR or may not even be indexed at all.

When you visit a secure (https) page, the PageRank you see in the Google Toolbar is not the PageRank of the page — instead, the toolbar shows you the PR of the root page of the domain.

  • https://addons.mozilla.org/en-US/firefox/user/1754094 is the user info page for Jose Bolanos, one of the many developers at the Mozilla add-on community website. In both IE and Firefox, it appears to have a PageRank of 8. Now, if that were true, Jose would truly have a reason to smile. Unfortunately for him, what the toolbar is actually showing is the PageRank of https://addons.mozilla.org, the homepage of the add-ons site. If you try any other page on the site (e.g. this one or this one), you will see that they all have a PR of 8.
  • The Google Adsense login page (https://www.google.com/adsense/login/en_US/) appears to have a PR of 10 (higher than adobe.com) not because it has so many incoming links or because Google has given it an artificial boost — it’s just that the Google Toolbar is reporting the PR of google.com.
  • This obscure page at the Michigan State University and all its subpages appear to have a PR of 8. Of course, by now we know that what we’re really seeing is the PR of the MSU homepage (www.msu.edu) and the actual PR of the page is unknown.
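No one outside Google knows how the toolbar actually computes this. The Python sketch below (with made-up PageRank values) merely illustrates the observed fallback behavior, where any https URL gets reported with the PR of its domain’s root page:

```python
# Illustrative only: the PageRank values are made up and the real
# toolbar logic is unknown. This mimics the behavior described above.
from urllib.parse import urlparse

PAGERANK = {
    "https://addons.mozilla.org/": 8,
    "https://www.google.com/": 10,
}

def toolbar_pagerank(url):
    # For https pages, report the PR of the domain's root page
    # instead of the PR of the page itself.
    parts = urlparse(url)
    if parts.scheme == "https":
        return PAGERANK.get(f"https://{parts.netloc}/")
    return PAGERANK.get(url)

print(toolbar_pagerank("https://addons.mozilla.org/en-US/firefox/user/1754094"))  # 8
```

Any deep link on the secure add-ons site comes back as 8 in this sketch, just as the toolbar reports it, regardless of the page’s real (unknown) PageRank.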

The bug seems to affect the latest versions of the Google Toolbar for IE and Firefox. However, I have seen an earlier version of the toolbar that did not suffer from this problem, so I believe the issue is version-dependent.


Should we care about ABX test results?

Policy no. 8 in the Terms of Service of the respected audiophile community Hydrogenaudio states:

8. All members that put forth a statement concerning subjective sound quality, must — to the best of their ability — provide objective support for their claims. Acceptable means of support are double blind listening tests (ABX or ABC/HR) demonstrating that the member can discern a difference perceptually, together with a test sample to allow others to reproduce their findings.

What a breath of fresh air. Other audio forums are full of snake-oil-peddling and Kool-Aid-drinking evangelists who go on and on about how replacing $200 speaker wires with $400 speaker wires “really opened up the soundstage and made the upper-midrange come alive”. The people at Hydrogenaudio know that such claims demand proper scientific evidence. How nice to see that they dismiss subjective nonsense and rely instead on the ultimate authority of ABX tests, which really tell us what makes a difference and what doesn’t.

Except that ABX tests don’t measure what really matters to us. ABX tests tell us whether we can hear a difference between A and B. What we really want to know, however, is whether A is as good as B.

1.

“Wait a second!”, I hear you exclaim. “Surely if I cannot tell A from B, then for all intents and purposes, A is as good as B and vice versa. If you can’t see the difference, why pay more?”

Actually, there could be tons of reasons. To take a somewhat contrived example, suppose I magically replaced the body of your car with one that was less resistant to corrosion, leaving all the other features of your vehicle intact. Looking at the car and driving it, you would not notice any difference. Even if I gave you a chance to choose between your original car and the doctored one, they would seem identical to you and you could choose either of them. However, if you were to choose the one I tampered with, five years later your vehicle’s body would be covered in spots of rust.

The obvious lesson here is that “not seeing a difference” does not guarantee that A is as good as B. Choosing one thing over another can have consequences that are hard to detect in a test because they are delayed, subtle, or so odd-ball that no one even thinks to record them during the test.

But how is this relevant to listening tests? Assuming that music affects us through our hearing, how could we be affected by differences that we cannot hear?

In his fascinating book The Burning House: Unlocking the Mysteries of the Brain, Jay Ingram describes the case of a 49-year-old woman suffering from a condition called hemispatial neglect (the case was researched by neuropsychologists John Marshall and Peter Halligan). Patients with hemispatial neglect are unable to perceive one (usually the left) side of the objects they see. When asked to copy drawings, they draw only one side; when reading out words, they read only half of them (e.g. they read simile as mile).

In Marshall and Halligan’s experiment, the woman was given two simple drawings showing two houses. In one of the drawings, the left side of the house was covered in flames and smoke; the houses looked the same otherwise. Since the flames were located on the left side, the patient was unable to see them and claimed to see no difference between the drawings. When Marshall and Halligan asked her which of the houses she would rather live in, she replied — rather unsurprisingly — that it was a silly question, given that the houses were identical.

However, when the experimenters persuaded her to make a choice anyway, she picked the flameless house 14 out of 17 times, all the while insisting that both houses looked the same.

Marshall and Halligan’s experiment shows (as do other well-known psychological experiments, including those pertaining to subliminal messages) that it is possible for information to be in a part of the brain where it is inaccessible to conscious processes. This information can influence one’s state of mind and even take part in decision-making processes without one realizing it.

If people can be affected by information that they don’t even know is there, then who says they cannot be affected by inaudible differences between an MP3 and a CD? Failing an ABX test tells you that you are unable to consciously tell the difference between two music samples. It does not mean that the information isn’t in your brain somewhere — it just means that your conscious processes cannot access it.

So the fact that you cannot tell the difference between an MP3 and a CD in an ABX test does not mean that an MP3 is as good as a CD. Who knows? Maybe listening to MP3s causes more fatigue in the long run. Maybe it makes you get bored with your music more quickly. Or maybe the opposite is true and MP3s are actually better. We can formulate and test all sorts of plausible hypotheses — the point is, an ABX test which shows no audible difference is not the end of the discussion.

2.

I have shown that the lack of audible differences between A and B in an ABX test does not imply that A is as good as B. Before you read this post as an apology for lossless audio formats, here is a statement that will surely upset hard-core audiophiles:

The fact that you can tell the difference between an MP3 and a CD in an ABX test does not mean that the MP3 is worse than a CD.

First of all, the differences between MP3s encoded at mainstream bitrates (128 kbps and 192 kbps) and original recordings are really subtle and can be detected only under special conditions (quiet environment, good equipment, full listener concentration, direct comparisons of short samples). Because the differences are so tiny, we cannot automatically assume that it is the uncompressed version that sounds better. Subtle compression artifacts such as slightly reduced sharpness of attacks on short, loud sounds may in fact be preferred by some listeners in a direct comparison.

Secondly, even if we found that the uncompressed version is preferred by listeners, that wouldn’t necessarily mean that it is better. People prefer sitting in front of the TV to exercising, but the latter might make them feel much better overall. If it were discovered, for example, that compressed music is less tiring to listen to (this is of course pure speculation), then that fact might outweigh any preference for uncompressed sound in blind tests.

Summary

The relevance of ABX tests to the lives of music lovers is questionable. The absence of audible differences does not imply equal quality, nor does the presence of audible differences imply that the compressed version is inferior. Rather than being the argument to end all debate, the results of ABX tests are just one data point, and the relative strengths of various audio formats may well be put in a new light by further research.


Blind-testing MP3 compression

Among music listeners, the use of lossy audio compression technologies such as MP3 is a controversial topic. On one side, we have the masses who are glad to listen to their favorite tunes on $20 speakers connected to their PC’s onboard audio device and couldn’t care less what bitrate MP3s they get as long as the sound quality is better than FM radio. On another side, we have the quasi-audiophiles (not true audiophiles, of course, as those would never touch anything other than a high-quality CD or LP player properly matched to the amplifier) who stick to lossless formats like FLAC due to MP3’s alleged imperfections.

If I considered myself part of either group, my life would be easy, as I would know exactly what to do. Unfortunately, I fall somewhere in between. I appreciate music played through good equipment and I own what could be described as a budget audiophile system. On the other hand, I am not prepared to follow the lead of the hard-core lossless format advocates, who keep repeating how bad MP3s sound, yet do not offer anything in the way of objective evidence.

So, me being me, I had to come to my own conclusions about MP3 compression. Is it okay for me to listen to MP3s and if so, what bitrate is best? To answer these questions, I spent many hours doing so-called ABX listening tests.

What is an ABX test?

An ABX test works like this: You get four samples of the same musical passage: A, B, X and Y. A is the original (uncompressed) version. B is the compressed version. With X and Y, one is the original version (same as A), the other is the compressed version (same as B), and you don’t know which is which. You can listen to each version (A, B, X or Y) as many times as you like. You can select a short section of the passage and listen to it in each version. Your objective is to decide whether X = A (and Y = B) or X = B (and Y = A). If you can get a sufficient number of right answers (e.g. 7 times out of 7 or 9 times out of 10), you can conclude that there is an audible difference between the compressed sample and the original sample.
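Why is 7 out of 7 or 9 out of 10 “sufficient”? Simple binomial arithmetic (my addition, not part of the ABX procedure itself) shows that a listener who is purely guessing rarely does that well:

```python
# Probability that someone guessing at random (50/50 per trial)
# scores k or more correct answers out of n ABX trials.
from math import comb

def p_at_least(k, n):
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

print(f"{p_at_least(7, 7):.4f}")   # 0.0078
print(f"{p_at_least(9, 10):.4f}")  # 0.0107
```

Both thresholds put the guessing probability at around 1%, which is why scores like these are conventionally accepted as evidence of an audible difference.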

What I found

  1. The first thing I found was that telling the difference between a well-encoded 128 kbps MP3 and a WAV file is pretty damn hard. Since 128 kbps is really the lowest of the popular MP3 bitrates and it gets such a bad rap on forums like Head-Fi, I expected that it would fail miserably when confronted with the exquisite work of artists like Pink Floyd or Frank Sinatra. Not so. Amazingly, the Lame encoder set at 128 kbps (ABR, high quality encoding) held its own against pretty much anything I threw at it. The warm, deeply human quality of Gianna Nannini’s voice in Meravigliosa Creatura, the measured aggression of Metallica’s Blitzkrieg, the spacious guitar landscapes of Pink Floyd’s Pulse concert — it all sounded exactly the same after compression. There were no changes to the ambiance of the recording, the quality of the vocals, the sound of vowels and consonants, the spatial relationships between the instruments on the soundstage, or the ease with which individual instruments could be picked out.
  2. That said, MP3s at 128 kbps are not truly transparent. With some training, it is possible to distinguish them from original recordings in blind listening tests. My trick was to look for brief, sharp, loud sounds like beats or certain types of guitar sounds — I found that compression takes some of the edge off them. Typically, the difference is so subtle that successful identification is only possible with very short (a few seconds long) samples, a lot of concentration and a lot of going back and forth between the samples. Even then, the choice was rarely obvious for me; more often, making the decision felt like guessing. Which of the identical bass riffs I just heard seemed to carry more energy? A few times I was genuinely surprised that I was able to get such high ABX scores after being so unsure of my answers.
  3. With some effort, it is possible to find passages that make the difference between 128 kbps MP3 and uncompressed audio quite obvious. For me, it was just a matter of finding a sound that was sharp enough and short enough. In David Bowie’s Rock ‘n Roll Suicide, I used a passage where Bowie sings the word “song” in a particular, Dylanesque way (WAV file). Another example is a 1.2-second-long sample from Thom Yorke’s Harrowdown Hill (WAV file). The second beat in the sample is accompanied by a static-like click (clipping) that is considerably quieter in the compressed version. More samples that are “difficult” for the MP3 format can be found on the Lame project page (I found the “Castanets” sample especially revealing).
  4. What about higher bitrates? As I increased the bitrate, the differences that were barely audible at 128 kbps became inaudible and the differences that were obvious became less obvious.
    • At 192 kbps, the Bowie and Yorke samples were still too much of a challenge for the encoder, and I was able to reliably tell the MP3 from the original, though with much less confidence and with more going back and forth between the two versions.
    • At 256 kbps (the highest bitrate I tested), I was not able to identify the MP3 version reliably — my ABX results were 7/10, 6/10 and 6/7, which can be put down to chance.
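The “put down to chance” remark checks out with simple binomial arithmetic: a listener guessing at random matches or beats each of those 256 kbps scores far too often for them to count as evidence. A quick check in Python (my own back-of-the-envelope calculation):

```python
# Chance that random guessing (50/50 per trial) yields k or more
# correct answers out of n, applied to the 256 kbps ABX scores.
from math import comb

def p_at_least(k, n):
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

for k, n in [(7, 10), (6, 10), (6, 7)]:
    print(f"{k}/{n}: p = {p_at_least(k, n):.4f}")
# 7/10: p = 0.1719
# 6/10: p = 0.3770
# 6/7: p = 0.0625
```

All three probabilities are well above the conventional 5% cutoff, so the results are consistent with guessing.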

Caveats

Obviously, the results I got apply to my particular situation. If you have better equipment or better hearing, it is perfectly possible that you will be able to identify 256 kbps MP3s in a blind test. Conversely, if your equipment and/or hearing is worse, 192 kbps or even 128 kbps MP3s may sound transparent to you, even on “difficult” samples.

Test setup

  • Lame MP3 encoder version 3.98.2. I used Joint Stereo, High Quality, and average bitrate encoding (ABR).
  • Foobar2000 player with ABX plugin. I used ReplayGain to equalize the volume between the MP3 and the original file — otherwise I found it too easy to tell the difference in ABX tests, since MP3 encoding seems to change the volume of the track somewhat.
  • Auzentech X-Meridian 7.1 — a well-respected audiophile-quality sound card with upgraded LM4562 op-amps.
  • RealCable copper jack-RCA interconnect.
  • Denon PMA-350SE — an entry-level audiophile integrated amplifier designed in England.
  • Sennheiser HD 25-1 II, top-of-the-line closed headphones with stock steel cable.

When I write that there was an audible difference in an ABX test, I mean that I got 7/7 or 9/10 correct answers without repeating the test.

Conclusions

If my goal was to use an MP3 bitrate that is indistinguishable from the original in a blind listening test, I would use 256 kbps, since that is the bitrate which I was unable to identify in a reliable way, despite repeated attempts on a variety of samples (including the “difficult” samples posted on the Lame website).

Whether I will actually standardize on 256 kbps, I’m not sure. The fact that a 192 kbps MP3 can be distinguished from the original in a contrived test (good equipment, quiet environment, high listener concentration, specially selected samples) does not mean it is unsuitable for real-world scenarios. Sure, at 192 kbps the music is not always identical to the original, but judging by my experiments, the difference affects less than 1% of my music (in a 100-second sample, more than 99 seconds would probably be transparent). Even if all I did was listen to this tiny proportion of my music, I would be in a position to perceive the difference less than 1% of the time (what percent of the time do I listen to music in a quiet environment? what percent of the time am I really focused on the music as opposed to other things I’m doing?). Besides, there is the rarely-posed question of whether “different” necessarily means “inferior” — it is quite possible that subtle compression artifacts might actually improve the perceived quality of music in some cases.


Heartburn remedy: Ground flaxseed

About a month ago, I had a bad case of heartburn (AKA “acid indigestion”). After two days of having my stomach “burp” hydrochloric acid up my digestive tract, my esophagus (the part of the digestive tract that’s directly above the stomach) got inflamed and the pain would persist even when there was no acid.

I had to do something about it, and, of course, I turned to the Internet for advice. Here’s what I tried based on my research:

  • First, I started taking Maalox (mixture of magnesium hydroxide and aluminum hydroxide) chewable tablets. I also considered Manti, but it is 50% more expensive (in Poland at least) and the only difference is that it also contains simethicone, which helps relieve excess gas. Maalox helped neutralize the acid, providing instant relief, but I found I had to take it at least once every 2 hours. That didn’t look like a good remedy to me.
  • Some obscure sites recommended eating Jonagold apples for heartburn and GERD (gastroesophageal reflux disease), which is a kind of chronic heartburn. The idea is that apples contain pectin, which neutralizes stomach acid, and Jonagolds have the most pectin. I could find no reputable sources recommending apples as a remedy, but I thought it was worth a try. What I found was that the apples worked well — they actually provided longer-lasting relief than Maalox.

Even though I had found some ways to temporarily relieve my symptoms, my esophagus kept hurting. My mother suggested that I drink ground flaxseed, which she considers a good remedy for indigestion. Obviously, I scoffed at this advice. After all, I had spent many hours researching heartburn on the Web and not one source listed flaxseed as a possible remedy. Just to be sure, I ran a quick Google search, and — sure enough — absolutely no link between heartburn and flaxseed. Was I going to trust the whole of the world’s medical knowledge or my mother’s uneducated guesses?

Later that day, my mom made me some ground flaxseed with hot water, which I drank reluctantly, and — you guessed it — now the Internet shall have a page linking flaxseed and heartburn, because the flaxseed worked like a charm. Not only did it instantly kill the burning pain in my esophagus, it also seemed to neutralize the acid (or somehow shield my digestive tract from it). After days of popping Maalox, eating apples and watching my diet, I finally felt I was getting to the underlying cause of my condition. As an added bonus, I find the taste much better than Maalox. [Update: For a few more weeks, I still had the burping and an acidic taste in my mouth, especially after eating a considerable meal on a relatively empty stomach. But I had no burning pain, and I credit flaxseed with this change. Also, I did not have to take flaxseed every couple of hours, like antacids. In my case, the effects persisted for a really long time.]

Certainly I could be an isolated case, or I guess it could be some weird coincidence that I got better immediately after drinking the flaxseed. I don’t know if flaxseed will relieve your heartburn. But the results I experienced were too dramatic for me to keep this to myself.

The specific product that I used was Len mielony (podwójnie odolejony) made by Herbapol Lublin, a Polish manufacturer of herbal products. The name translates as Ground flaxseed (doubly de-oiled) and it is a sort of “diet” version of regular ground flaxseed with less flaxseed oil and fewer calories. It is a pretty fine powder with a faint smell reminiscent of pumpkin seeds. You prepare it by simply pouring a glass of hot water over a teaspoon of the flaxseed powder.

In Poland, ground flaxseed is readily available in most drugstores. I’m not sure how easily it can be purchased in other countries. You can probably get it in health food stores and the like. If you suffer from heartburn and you can find it, I suggest you give it a shot.

About heartburn causes: I don’t know what caused my episode. The Internet tells me that heartburn is commonly caused by an incompetence of the lower esophageal sphincter (LES), which is a kind of valve that connects the stomach with the esophagus. There is a long list of possible causes of this “incompetence” (eating too much, eating wrong, smoking, hiatus hernia, etc.). About the only thing that I was able to identify with was eating shortly before bedtime, and I have adjusted my habits accordingly. So far this seems to have helped, although I sometimes feel some acidity, which seems to occur when I leave long gaps between meals.
