(This post is overly subjective, more opinionated than my usual efforts, and contains some cussing. Consider yourself warned.)
I am sick and tired of people who shun technology and progress under the guise of “I’m an old tech veteran, I’ve been working with technology for 30 years, and the new stuff is crap compared to the old stuff.” People who defend this viewpoint are idiots. I’m not talking about audiophiles or other self-delusional “prosumers”; I’m talking about people who have worked a tech trade or had hands-on access to technology for many years and think that their perceptions trump reality. It’s a perverse combination of technology and anti-intellectualism — a form of hipsterism for the over-40 set.
I was prompted to cover this by a recent post on why widescreen monitors are a rip-off (which I will not link to because I truly enjoy the other 99% of this person’s blog, and linking to it would imply that I don’t like him or his site), but the underlying irritation of the entire mindset has been percolating for many years. Viewpoints that drive me crazy include:
Widescreen monitors don’t make any sense
People think that widescreen monitors are stupid on laptops because most people use laptops for text work, and since text is more comfortable to read in narrow columns, a wide screen makes for harder reading. This mindset has had the doubly idiotic result of convincing people that websites need to be column-limited. I just love going to a website and having the text squished into a 640-pixel-wide column with 75% of the screen unused. Don’t like how narrow columns look on a widescreen monitor? Use the extra space however you want — put up two web pages side by side, or simply don’t look at the unused space. These are the same people who complain that 4:3 video has black bars on either side of it when viewed on a widescreen TV. It’s called pillarboxing, you idiot, and it’s there to prevent your movie from looking like a funhouse mirror.
Widescreen monitors have made modern laptops better. A widescreen display allows the keyboard to be wider without the laptop getting too deep (as it would have to be to accommodate the height of a 4:3 screen). Having a decent keyboard on a laptop used to be impossible without wacky engineering tricks; now it’s the norm. Widescreen monitors made ultra-small netbooks possible, so if you’re reading this on a netbook but somehow still disagree with me, you’re a hypocrite.
Analog audio is better than digital
There are entire websites (and Wikipedia pages) dedicated to this, usually under the guise of “vinyl is better than CD”. Most opinions on this subject were formed when analog audio had several decades of mature mastering and production processes behind it and digital was brand-new (vinyl vs. CD in 1983, for example). Early efforts to put things on CD resulted in some less-than-stellar A/D conversion, which created a type of distortion that most people weren’t used to hearing. Opinions formed then have persisted for more than 25 years, even though the technology has gotten much better and all of the early mastering problems have long since been corrected.
People who think vinyl sounds better than CD have nostalgia blinders on. They bought an album in their youth, played it endlessly, loved it. Then they buy the same album on CD decades later and condemn the entire format as inferior because it sounds different. Want to know why it sounds different? It has a wider frequency range, lacks rumble, lacks hiss, sounds exactly the same after 10+ playbacks, and was remastered with better technology and mixing conditions under the guidance and approval of the original artist when he wasn’t coked or drunk or stoned out of his mind. People like Pete Townshend, Neil Young and Geddy Lee not only approve of the latest digital technology but are actively utilizing it and taking great pains to remaster their classic albums with it. People are missing the point that it is the mastering and digital compression that causes issues, not the technology itself. Neil Young recently spoke at a conference where he damned digital music, but not because it is digital — rather, because it is delivered differently than the artists intended. Neil Young would like nothing better than for everyone to be able to listen to his music at 24/192. Can’t do that on vinyl, bitches.
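For reference, “24/192” means 24-bit samples at 192 kHz, versus CD’s 16-bit/44.1 kHz. A quick back-of-the-envelope sketch of what those numbers buy you (standard textbook figures, nothing specific to any particular release):

```python
# Rough comparison of CD audio (16-bit / 44.1 kHz) against the 24/192
# high-resolution format mentioned above.

def dynamic_range_db(bits):
    # Each bit of sample depth adds roughly 6.02 dB of dynamic range.
    return 6.02 * bits

def nyquist_hz(sample_rate):
    # Highest frequency a given sample rate can represent.
    return sample_rate / 2

print(f"CD:     {dynamic_range_db(16):.1f} dB range, "
      f"{nyquist_hz(44_100) / 1000:.2f} kHz bandwidth")
print(f"24/192: {dynamic_range_db(24):.1f} dB range, "
      f"{nyquist_hz(192_000) / 1000:.1f} kHz bandwidth")
```

Whether anyone can actually hear the difference is a separate argument; the point here is simply that the delivery format is not the bottleneck.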
Even people who write about the loudness war get it wrong, even though it’s an easy concept to understand. Massive dynamic compression drowns out subtle details and can add distortion, which is horrible — but it is not exclusive to digital audio, nor caused by it. One author correctly notes that massive dynamic compression butchers mixes, but then subtly implies that all CDs that “clip” have distorted audio. Digital audio “clips” only if you drive the signal beyond its digital limits. If you take an audio waveform and normalize it so that the highest peak sits exactly at the maximum value, it is “positioned at maximum volume”, not clipped. Nothing is lost (to be fair, nothing is gained either).
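The distinction drawn above can be sketched in a few lines. This is a minimal illustration, with plain lists standing in for real audio samples: peak normalization scales the waveform so the loudest sample just reaches full scale (nothing lost), while driving it past full scale flattens samples at the limit (information destroyed).

```python
FULL_SCALE = 1.0  # the digital maximum, e.g. +/-1.0 in float audio

def normalize(samples):
    # Scale so the highest peak sits exactly at full scale; the
    # shape of the waveform is preserved.
    peak = max(abs(s) for s in samples)
    return [s * FULL_SCALE / peak for s in samples]

def drive(samples, gain):
    # Apply gain, then hard-clip anything beyond the digital limits --
    # this is what actually destroys information.
    return [max(-FULL_SCALE, min(FULL_SCALE, s * gain)) for s in samples]

wave = [0.1, 0.5, -0.8, 0.3]

print(normalize(wave))    # peak now exactly -1.0; nothing clipped
print(drive(wave, 2.0))   # the -0.8 sample is flattened to -1.0
```

The first operation is the “positioned at maximum volume” case; only the second one clips.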
The problem is the mastering and production process, not the technology. Which segues nicely into:
“I will never buy Blu-ray”
The only valid argument against Blu-ray is that it is harder to make a backup copy of the content. It is indeed harder than it is for DVD, or laserdisc, or videotape. That is it. All other arguments are beyond moronic. Even the cheapest possible 1080p HDTV viewing setup has five times the resolution of DVD and lacks signal degradation in the output path. If you view a Blu-ray and can’t tell the difference between it and DVD, you have either a shitty viewing setup, a shitty Blu-ray, or a shitty visual cortex.
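The resolution claim above is simple arithmetic. Here is the pixel math, using the standard frame sizes for the formats (the “five times” figure matches PAL DVD; NTSC DVD fares even worse):

```python
# Back-of-the-envelope pixel counts behind the resolution comparison.
# Standard frame sizes: NTSC DVD 720x480, PAL DVD 720x576, Blu-ray 1920x1080.

def pixels(width, height):
    return width * height

bluray   = pixels(1920, 1080)
dvd_pal  = pixels(720, 576)
dvd_ntsc = pixels(720, 480)

print(bluray / dvd_pal)    # 5.0 -- "five times the resolution"
print(bluray / dvd_ntsc)   # 6.0 -- six times, for NTSC discs
```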
Someone recently tried to argue with me that DVDs have the same or better picture than Blu-ray and used Robocop as an example. The comparison was stacked, as they were comparing the $9 Blu-ray that MGM belched out when Blu-ray was only a year old to the Criterion DVD treatment. I own both, so I checked them out and I agree that the DVD has better color tonality throughout the film. However, the Blu-ray thoroughly stomped the DVD in every single other area, most obviously resolution. So much picture detail is added by the increase in resolution that I actually prefer it despite the lack of Criterion oversight.
The real problem, as previously stated, is how the mastering and production process was handled. Even on new 2012 DVD releases, you can still see the video equivalent of the “loudness war”: digital ringing, which used to be an accident but was later introduced on purpose as part of a misguided “sharpening” step. Listen up: a sharpening filter added to any signal doesn’t make it sharper; it makes it appear sharper by overlaying a high-frequency perturbation signal on the original content, which increases the acutance. Quality is actually lost when you do this, because the added high-frequency information obscures actual picture detail.
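You can see the mechanism in a toy example. A standard sharpening technique (unsharp masking, used here as a stand-in for whatever filter a mastering house applies) subtracts a blurred copy from the signal and adds the difference back; at an edge, this produces undershoot and overshoot — the ringing described above — without adding any real detail. A 1-D step edge stands in for a row of pixels:

```python
def blur3(signal):
    # Simple 3-tap moving average; endpoints are left untouched.
    out = signal[:]
    for i in range(1, len(signal) - 1):
        out[i] = (signal[i - 1] + signal[i] + signal[i + 1]) / 3
    return out

def unsharp_mask(signal, amount=1.0):
    # Add back the difference between the signal and its blurred copy,
    # boosting high-frequency content around edges.
    blurred = blur3(signal)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0, 0, 0, 1, 1, 1]          # a clean step edge
sharpened = unsharp_mask(edge)
print(sharpened)  # undershoot before the edge, overshoot after it
```

The dip below 0 and the spike above 1 are pure artifact: the transition looks “crisper”, but the values no longer reflect the original content.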
This is another example of perception vs. reality, which not coincidentally also segues into:
“Computing was better in the old days”
I love retrocomputing as a hobby. I think about it nearly every day; this blog was partially created to talk about vintage computing. But even I wouldn’t say that things were better in the old days. People who say this don’t realize they are really trying to say something else. For example, people who say that “BBSes were better than web forums are today” are actually referring to the sociological fact that, when you communicated with people on a BBS, you were communicating with people who met a minimum level of technical competence — anyone who couldn’t clear that bar couldn’t even get onto a BBS, let alone become proficient with a computer. The overall technological quality of everyone you met on a BBS in the 1980s was higher than in other places, like a laundromat or a bar. What such people fail to consider is that modern web boards, while having a higher quotient of trolls and B1FFs, are open to the entire world. The mass of humanity you can encounter on even a tiny niche topic is orders of magnitude larger than it used to be. The sheer scale of information and interaction you can now achieve is staggering, and completely outweighs the minor niggle of having to deal with 3 or 4 more asshats per day.
Here’s another example: “Computer games were better back in the old days.” This is wrong. The proper thing to say is that “Some computer game genres were better back in the old days.” I can get behind that. For example, graphics were so terrible (or non-existent!) at the birth of computer gaming that entire industries sprang up focusing on narrative. In such genres (mainly adventure games), several times more effort was put into the story than in other genres. As technology and audiences changed over time, those genres morphed and combined until they no longer resembled their origins. That doesn’t mean modern games are terrible; it just means that you need to shop around to get what you’re looking for in your entertainment. Don’t play Uncharted 2 expecting a fantastic story with engaging narrative. (Dialog, maybe, but not narrative.) Heck, some genres are genuinely awesome today compared to 30 years ago. For example, Portal and Portal 2 are technically puzzle games, but the storytelling in them — despite your never directly interacting with another human — is among the very best I’ve ever encountered.
About the only argument that does work involves the complexity of older computers — they were simpler, and you could study them intensely until you could very nearly understand every single circuit of the board, nuance of the video hardware, and opcode of the CPU. Today, a complete understanding of a computer is no longer possible, which probably explains why Arduino kits and the Raspberry Pi are getting so much attention.
I have no conclusion. Stop being an old-fogey anti-intellectual technophobe, you ignorant hipster fuck.