Oldskooler Ramblings

the unlikely child born of the home computer wars

Archive for the ‘Technology’ Category

How to transcode UHD 4K HDR rips for lower bandwidth to an LG C8 OLED via Plex

Posted by Trixter on April 29, 2022

(This post has been edited with new information)

The LG Cx series of OLED TVs has a terrible network chipset: the TVs can do more than 100 Mbit/s over 5 GHz wi-fi, but their wired port is only 100 Mbit/s ethernet (my experiments with a USB-to-ethernet adapter were mixed). So what happens if you want to stream to your TV over ethernet because you can’t use wi-fi?

Included below is an ffmpeg script I’ve used to transcode UHD 4K Blu-ray rips down to a bandwidth that my LG C8’s ethernet connection can handle without trouble, which I relied on during a period when I was unable to use the TV over 5 GHz wi-fi (since corrected, thankfully). It leverages a modern NVIDIA card to do the transcode with essentially no CPU usage, and preserves the HDR10 information. The end result is worse than the source if you pixel-peep, but if you’re sitting 10 feet away from the TV, it’s perfectly fine — and it’s certainly better than a 1080p Blu-ray rip of the same material. Here’s the script:

REM This creates a Plex Versions proxy that preserves as much quality
REM as possible without exceeding an LG C8's ethernet 100mbit/s capabilities.
REM Call this batch file from the plex directory containing your main movie.
REM
REM The 120M bufsize represents a 3-second window at the 40M average bitrate,
REM within which bursts up to the 60M maxrate can be sustained.
REM

mkdir "Plex Versions\LG C8"

for %%a in ("*.*") do ffmpeg -find_stream_info -hwaccel auto -i "%%a" -map 0 -c copy -c:v hevc_nvenc -pix_fmt p010le -tune:v hq -preset:v p7 -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc -spatial_aq:v 1 -temporal_aq:v 1 -b_ref_mode middle -profile:v main10 -tier:v high -b:v 40M -maxrate:v 60M -bufsize:v 120M "Plex Versions\LG C8\%%~na.mkv"
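
If you want to confirm that the HDR10 color metadata survived the transcode, ffprobe can print it back out. This is just a sketch (the output filename is a placeholder); a good result shows bt2020 primaries, smpte2084 transfer, and a 10-bit pixel format:

ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt,color_space,color_transfer,color_primaries -of default=noprint_wrappers=1 "Plex Versions\LG C8\movie.mkv"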

Posted in Digital Video, Entertainment, Technology | Leave a Comment »

My IBM PCjr Print Media Archival Project

Posted by Trixter on April 25, 2021

While I’m not the #1 PCjr fan in the world — that honor goes to Mike Brutman — I consider myself in the top five. I’ve owned, used, and programmed for the PCjr for decades. A flawed problem child, the PCjr was an underdog that never fully met IBM’s expectations, but it succeeded at something much greater: With its 16-color graphics, 3-voice sound, and early support from Sierra, it showed the world that PCs could be treated seriously as viable gaming machines. Because of this, I’ve evangelized the PCjr, given extended PCjr history presentations, and even set up comprehensive PCjr exhibits. So you could say I’m a PCjr superfan.

Along these lines, I’m happy to announce the results of a years-long scanning project: a gigantic cache of IBM PCjr resources, including books, magazines, newsletters, catalogs, adverts, and technical and repair information. So what does that mean?

Books

Let’s start with over 20 PCjr-specific books, covering topics from introductions to personal computing all the way down to technical details of how the PCjr’s enhanced features work. You can pick these up here: http://ftp.oldskool.org/pub/drivers/IBM/PCjr/Books/

Newsletters

There’s also a complete run of the Eugene PCjr Club newsletter (over 135 issues), as well as a complete run of jr Newsletter out of Connecticut (75% of which are new 600 DPI scans). There were at least 32 (!) different PCjr clubs during the PCjr’s lifetime, but only a few had newsletters as long-running and comprehensive as these two. The Eugene PCjr Club was the longest-active PCjr club in the world, running from 1984 until disbanding in 2002, and from 1985 onward it published its own newsletter.

Reading these is not only a nostalgic trip back in time; the newsletters are also chock-full of information that is surprisingly relevant to vintage computer hobbyists today. They continued coverage where the magazines left off, reporting on which new hardware add-ons and modifications you could perform on a jr, including a potential 286 upgrade, VGA upgrade, hard drives, and more; they also had many tips on getting software to run on the not-quite-compatible PCjr. You can pick up the entire Eugene PCjr Club and jr Newsletter runs, as well as other PCjr newsletters (check out The Orphan Peanut, prepared completely on a 768K PCjr!), here: http://ftp.oldskool.org/pub/drivers/IBM/PCjr/Newsletters/

Heck, there are even 21 issues of The Junior Report, a newsletter from “The PCjr Club” that I never knew about during its heyday, which surprised me since the club met in Schaumburg, Illinois, practically in my back yard at the time.

Magazines

I’ve also managed to archive complete runs of most magazines that were dedicated only to the IBM PCjr, such as Peanut, PCjr Magazine, and even PCjr World, a special insert included in PC World magazine for a few issues. (These jr-specific magazines are rare, and I acquired them at considerable expense, so please give a moment of silence to thank them for their sacrifice.)

Additionally, I’ve scanned a great many excerpts from other magazines that covered the PCjr. Some of these excerpts are quite good and comprehensive, ranging from using the PCjr as a cheap scientific data-acquisition platform to detailed accounts of what was happening with the PCjr during its original time period. You can grab the magazines and excerpts completed thus far here: http://ftp.oldskool.org/pub/drivers/IBM/PCjr/Magazines/
(I’m still working on the complete run of “COMPUTE! For the PC and PCjr” as well as “jr”; if you can lend or donate issues for scanning, please let me know.)

Catalogs

Finally, I’ve archived some catalogs, which can serve as a collector’s checklist of all the PCjr-specific hardware and software that was available for the machine. The PC Enterprises catalogs list some esoteric stuff that is nearly impossible to find, and IBM’s The Guide has some gorgeous product shots of the PCjr and other hardware. There are also catalogs from Computer Reset, Paul Rau Consulting, and others. Pick up all the catalogs here: http://ftp.oldskool.org/pub/drivers/IBM/PCjr/Catalogs%20and%20Price%20Lists/

What’s in it for me?

All of these are high-quality scans, fully text-searchable. Advanced techniques were used to ensure the highest quality possible at reasonable sizes. You will not find any JPEG “mosquito noise” compression artifacts, screened-printing moiré patterns, or unreadable text in these scans.

While I originally did this for new PCjr owners so that they could get up to speed quickly, there is plenty of nostalgia and trivia for grizzled old collectors too. For example, there are references to third-party hardware and modifications that I never knew existed until I started this project (a reset button, an SVGA sidecar, a 286 upgrade, a quad-cartridge-port adapter, an EMS sidecar modification, etc.), esoteric program patches to get equally esoteric software working on the PCjr, and even trivia like what the “L” port was originally meant to be used for.

Thanks

I’d like to thank Louie Levy for donating most of the Eugene PCjr Club newsletters to me for this project, and L. Teague for many jr Newsletters, PCE catalogs, and other materials.

FAQs:

Q: Can I ______ these files?
A: I don’t care what you do with these files, as long as you’re sharing and enjoying them and don’t utterly destroy my bandwidth. Please leech responsibly, preferably at 512KB/s or less.

If you want to upload these to The Internet Archive, go for it; just let me know what the collection links are so that I can edit this post and link to them. Someone has already done some of these files piecemeal, without acknowledging my efforts, but that goes with the territory; we (archivists) are used to it. If you want fame and fortune, being an archivist is a pretty terrible way to go about it.

Q: What is your scanning process?
A: Funny you should ask.

Q: The PDFs are great, but–
A: Don’t worry, OCD friends: The original physical pages are being stored off-site, and I also kept a copy of the raw unprocessed 600 DPI scans in case newer and better technology becomes available.

Posted in Technology, Vintage Computing | 2 Comments »

How to reasonably archive color magazines to PDF

Posted by Trixter on July 14, 2020

During a conversation with one of my archival collectives, the topic of archiving color magazines came up. Our goal was to distribute scans of the material as PDF, primarily because of the ubiquity of PDF viewing software, but also because OCR’d text can travel with the images, making the magazine searchable without requiring the user to perform OCR. However, most of us haven’t started archiving our magazines, because it’s an extremely daunting task. Color magazines are notoriously annoying and difficult to scan to digital form because:

  • Most were printed using screened printing, whose tiny high-contrast dots hurt compression ratios, and produce moiré patterns when scanning at, or resizing to, lower resolutions
  • The high number of pages in color magazines (300, 400, or even 500 pages per publication) makes using a flatbed scanner a tedious process, as well as resulting in a very large set of data per magazine (if preserving quality is a concern)
  • Some magazines print almost all the way into the binding, leaving only a few millimeters of margin at the gutter, which prevents traditional book scanners, both flatbed and camera-based, from capturing the inner centimeter of printed material

However, we’re in possession of several magazines that the original publisher hasn’t archived and aren’t available in the wild, so we decided to experiment with various scanners, software, and methods to see what was possible, while staying within the limits of what is practical.

While everyone has their own views on what’s important (size vs. quality, speed vs. accuracy, effort vs. volume, etc.), I came up with a set of rules and processes for myself that I’ll be following, and would like to share them. I held myself to the following goals:

  • PDF file sizes should not exceed 1MB per page on average. In 2020, and for the next five years at current broadband capacities and growth, a file size of 500MB for giant magazines, or 100MB for modest ones, is appropriate. This isn’t because of total size — storage is cheap — but rather because of transfer rates. I could easily scan a 500-page magazine to 30GB of TIFF files (which I’ve done many times), but it’s not practical to share 30GB per magazine with online repositories. And besides, I’m not made of money, and some online repos may balk at an attempted upload of 7TB (approx. 20 years of a large magazine’s print run).
  • Pages should be scanned at 600 DPI. This preserves the screening, which can be dealt with later if necessary. It also ensures that very fine print will not only be legible, but able to be OCR’d. (Even if 300 DPI material is eventually needed for extremely large publications to stay under 1GB, the 300 DPI material can be obtained by resizing the 600 DPI masters instead of re-scanning the entire document; see the sketch after this list.)
  • No matter the amount of processing, text should never dip below 600 DPI. This is less of a preference and more of a way to ensure that very fine print, such as a magazine’s masthead/impressum, is legible.
  • All screened material should be de-screened. If the scanning system has a proper de-screening option (a real one that asks for the LPI of the source material, not just a dumb blur filter), it will be turned on during scanning (and the results checked afterwards). If no such option exists, all 600 DPI (and better) scans will be run through a proper de-screening process. I have had excellent results with the Sattva Descreen plugin and endorse it for this. Descreening screened material not only improves the quality of screened images by removing the screening pattern, but results in smaller files (no matter the compression method) due to what is effectively noise reduction.
  • Mild degradation of images is appropriate as long as the text legibility itself is preserved. (Acrobat and DjVu can both do this, although some repositories aren’t accepting DjVu any more.)
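
As a concrete example of that 600-to-300 DPI resize, here’s a minimal sketch using ImageMagick (the filenames are placeholders, and the TIFF’s embedded density must be correct for -resample to do the right thing):

magick input_600dpi.tif -units PixelsPerInch -filter Lanczos -resample 300 output_300dpi.tif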

To achieve these goals at the highest legibility but the smallest file size, I follow these practices:

  • Destroy the magazines. If you cut the binding off, you have flat sheets that you can run through an ADF or sheet-fed scanner. You can cut very close to the binder glue, giving the inner printing a chance to be scanned. It’s a sacrifice, but I feel preserving the information printed on paper is more important than preserving the paper. I bought a guillotine paper cutter for $120 specifically for this purpose.
  • Use a high-quality sheet-fed duplex scanner with a configurable TWAIN driver. Usually people think of the Fujitsu ScanSnap series for this, and that was what I first purchased, but the ScanSnap series’ software is not configurable, and its scan path is only 9 inches wide, which prevents scanning some material. I was lucky enough to acquire a Fujitsu fi-series scanner second-hand. This line of professional office scanners has an extremely configurable TWAIN driver that allows groups of settings to be saved into profiles appropriate for various kinds of material. And while it’s not a photo scanner, it does a more than acceptable job of scanning color magazines (better than the ScanSnap, which always produced washed-out colors). Would I use it for scanning photos or artwork? No, but it’s my first choice for scanning entire books or magazines. This can be a case of spending some real money, but you do get what you pay for.
  • Pay for Acrobat. Real, commercial Acrobat supports JPEG2000 compression, which outperforms JPEG in both size and quality. But more importantly, it has a feature called Adaptive Compression that can drastically shrink large PDFs. It works by separating text and line drawings on a page into their own monochrome layer that is compressed losslessly. Then, the image that remains after the text has been lifted is downsampled and recompressed. This results in much smaller files without compromising the legibility of text or the sharpness of line drawings. (This feature may have been inspired by DjVu, whose early claim to fame was doing exactly this.) Finally, commercial Acrobat can perform OCR without requiring additional software.

With those rules and methods set, I performed many tests with a lot of material, and came up with a set of best practices that met my criteria. I compiled those practices into a handy flowchart.

I’ve continued to put this flowchart into practice with a lot of material, including mixed-content manuals (color, grayscale, and B&W material in the same manual), 500-page color screened magazines, 8.5×11″ photocopied material, dot-matrix printouts, and printed books. In all cases, I follow the flowchart until the size is reasonable for the material, and I’ve never been disappointed or felt like I was giving up too much quality for the file size. (What is “reasonable” is different for everyone according to personal preference, goals, and motivation, so it’s up to you to determine what that size eventually is.)

I hope this information will help you finally tackle your own stacks of magazines that, like mine, have been leering at you ominously for years from various corners of your abode.

Posted in Lifehacks, Technology | Tagged: , , , | 14 Comments »

You cannot violate the laws of physics

Posted by Trixter on May 4, 2018

It’s technology refresh time at casa del Trixter.  I’m dabbling in 4K videography, and upgrading my 9-year-old i7-980X system to an i7-8700K to keep up.  Another activity to support this is upgrading the drives in my home-built ZFS-based NAS, where I back up my data before it is additionally backed up to cloud storage.  The NAS’ 4x2TB drives were replaced with 2x8TB and 2x3TB (cost reasons) in a RAID-10 config, and it mostly went well until I started to see disconnection errors during periods of heavy activity (i.e. a zpool scrub):

Apr 30 19:32:07 FORTKNOX kernel: sd 0:0:2:0: [sdc] Device not ready
Apr 30 19:32:07 FORTKNOX kernel: sd 0:0:2:0: [sdc] 
Apr 30 19:32:07 FORTKNOX kernel: Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE
Apr 30 19:32:07 FORTKNOX kernel: sd 0:0:2:0: [sdc] 
Apr 30 19:32:07 FORTKNOX kernel: Sense Key : Not Ready [current] 
Apr 30 19:32:07 FORTKNOX kernel: sd 0:0:2:0: [sdc] 
Apr 30 19:32:07 FORTKNOX kernel: Add. Sense: Logical unit not ready, cause not reportable
Apr 30 19:32:07 FORTKNOX kernel: sd 0:0:2:0: [sdc] CDB: 
Apr 30 19:32:07 FORTKNOX kernel: Read(16): 88 00 00 00 00 00 08 32 11 70 00 00 01 00 00 00
Apr 30 19:32:07 FORTKNOX kernel: end_request: I/O error, dev sdc, sector 137498992
Apr 30 19:32:07 FORTKNOX kernel: sd 0:0:2:0: [sdc] Device not ready
Apr 30 19:32:07 FORTKNOX kernel: sd 0:0:2:0: [sdc]

At first I thought the drive was bad, so I replaced it.  I then saw exactly the same types of errors on the replacement drive, so to make sure I wasn’t sent a bad replacement, I tested the drive in another system and it passed with flying colors.  So now the troubleshooting began:  Switch SATA ports on the motherboard:  No change.  Switch SATA cables: No change.  Switch SATA power cables: No change.  Switch SATA cables and ports with one of the drives that was working:  No change; that specific drive kept reporting “Device not ready”.  I even moved the drive to a different bay to see if the case was crimping the cables to the drive when I put the lid back on:  No change.

It was really starting to confuse me as to why this drive wouldn’t work installed as the 4th drive in my NAS.  I started to doubt the aging Xeon NAS motherboard, so I bought a SAS controller and a SAS-to-SATA forward breakout cable so that the card could handle all of the traffic.  This seemed to work at first, but eventually the errors came back.  I then started swapping SATA breakout ports, then entire SAS cables, then eventually a replacement SAS controller.  In all instances, the errors eventually came back on just that single drive, a drive that worked perfectly in any other system!

The solution didn’t present itself until I started building my replacement desktop system based on the i7-8700k.  In that system, I opted for a modular power supply to keep the cable mess at a minimum (highly recommended; I’ll never go back to non-modular PSUs).  When I was putting my video editing RAID5 drives into the new desktop, I noticed with irritation that each of the modular SATA power cables only had three headers on them instead of four.  This sucked because I was hoping to use one SATA power breakout cable for all four drives, and now I’d have to use two cables which added to the cable clutter inside the case.  This power supply was Gold rated, high wattage — why only put three SATA power headers on a breakout cable?  In thinking about the problem, I came to the conclusion that the makers of the power supply were likely being conservative, to avoid exceeding the limits of what that rail was designed to provide.

And that’s when I remembered that I was putting four drives on a single rail back on the NAS, and not three like the new power supply was enforcing.  When I moved the misbehaving NAS drive to a SATA power header on another rail, all of the drive disconnection problems went away.  Whoops.

How did this work before?  The power draw of the 2x8TB + 2x3TB drives was just high enough to be dodgy, whereas the previous configuration of older 4x2TB drives was not.  The newer drives draw more power than the older drives did.
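
Some rough numbers make the failure mode plausible (these are generic assumptions, not measurements of my drives): a 3.5″ drive can pull around 1.5-2 A on the 12 V line during spin-up or heavy seeking, so four drives on one daisy-chained cable can peak near 8 A (roughly 96 W), while three drives stay closer to 6 A. Somewhere in that gap is the margin the power supply designers were protecting with the three-connector limit.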

Lesson learned, and now I have spare controllers and cables in case there’s a real failure.

Posted in Technology | 7 Comments »

Comedy gold in the Handy specification

Posted by Trixter on August 12, 2015

The Atari Lynx is my favorite handheld console.  Planning and design started in 1986, long before the Nintendo Game Boy, but development took long enough that it was actually released half a year after the Game Boy.  Whether or not that makes it the first or second handheld console is up for discussion, but it was definitely one of the first two.

History shows that the Lynx had an unsuccessful run compared to other handheld consoles of the time, which included the Game Boy, Game Gear, and TurboExpress.  The Lynx’s failure in the market was split fairly evenly among three culprits:

  1. Price: $179 vs. Game Boy’s $100
  2. Battery life: It took 6 AAs to power the unit for only 4 hours
  3. A lack of compelling licensed or original titles

The first hardware revision was also somewhat large, but as a guy with large hands, that never bothered me.

There were some killer arcade ports to the Lynx that really showcased what the system was capable of, such as Road Blasters, Stun Runner, and Klax.  But these were ports of Atari games; Lynx never got the “big” licensees such as Mortal Kombat or Street Fighter (a double shame considering the Lynx supported true hardware multiplayer linkups).

I recently sought out the Lynx hardware specification so that I could reminisce, sadly, about all of the untapped power Lynx had that was never realized.  The Lynx was so powerful that it shamed other handhelds for nearly a decade after it was discontinued:

  • 4096-color palette
  • Unlimited sprites with scaling, clipping, and collision (multiple modes!) all performed in hardware
  • 4-channel stereo sound supporting both algorithmic and sampled waveforms (in other words, it could play both pure-hardware chiptunes and Amiga mods)
  • Multiplayer cable connections of up to 8 simultaneous players/units
  • Supported left-handed play via the ability to flip the screen

There were other cool bits for the time it came out too, like a limited 32-bit math-coprocessor (used for some true 3-D polygon games like Hard Drivin’ and Steel Talons).  It wasn’t perfect; the system would slow down if you pushed it too hard (too many sprites, too many sampled sounds playing simultaneously), but it was creative and ambitious.

The Lynx started life as a hardware project by Epyx, initially called “Handy” (because it was to be the first handheld console).  Ex-Amigans RJ Mical and Dave Needle were involved in the design, which is why Handy feels like a tiny Amiga.  The initial specification was written in June of 1987 by Dave Needle.  Reading through it, two things are immediately evident:

  1. The designers of Handy had a passion for creating a fun device with as many programmer assists as possible
  2. Dave had a wry sense of humor

I will now reproduce for you some of the comedy gold hiding in the Handy specification.  I’ve truncated some sequences for brevity, but the words are all Dave’s:

The human input controls consist of a 4 switch (8 position) joy stick, two sets of 2 independent fire buttons, game pause button, 2 flablode buttons, power on, and power off….Flablode is a Jovian word meaning a device or function that we know is required or desired but for which we don’t have an actual definition (noun: flabloden, verb: to flablode).

3. Software Related Hardware Perniciousness
(or why some software people hate some hardware people)

There are certain things that the software ought not to do to the hardware. While these things are not physically destructive on their own, they will cause the hardware to act unpredictably strange, which may cause the user to be physically destructive. While we certainly could have protected the system from many of the following problems, I doubt that we would have found them all. In addition, the act of software doing one of these things means that there is a problem in the software anyway and the intended function would probably not happen. Additionally, this is only a toy. If this unit were a bio-medical device I would have found a safer solution.

3.1 Don’t Do These Things.

If you do any of the things that I tell you not to do, the hardware will act unpredictably and will not be expected to recover without a complete system initialization. So don’t do them.

3.1.5 Palettes and Page Breaks

This one is an actual hardware bug of mine…(much technical info removed)…Pen index numbers C,D,E, and F will be incorrectly unchanged. Sometimes I am such a jerk.

3.2 Please Don’t Do These Things.

There are also things that software people do that merely annoy the hardware people rather than annoy the end user. This includes seemingly harmless activities like sub-decoding bit values from byte descriptions (sprite types, for instance), or assuming address continuity between areas of hardware…Please don’t do them. It would be difficult to list all of the possibilities, so I will list the general considerations and ask you to be sensible about them. In addition, please feel free to submit a request for an exemption on a specific case. If it is granted, we will change the specification so as to make the special case forever legal. The price will be small.

3.3 Do It This Way

Some of the hardware functions, as designed by us mindless brutes, require special handling. As we discover these requirements, I will list them here.

4.3 CPU Sleep

BIG NOTE: Sleep is broken in Mikey. The CPU will NOT remain asleep unless Suzy is using the bus. There is no point in putting the CPU to sleep unless you expect Suzy to take the bus. We will figure out how to save power some other way.

All hardware collision detection is done with the data in the collision buffer, not the data in the video buffer. This has obvious advantages and will be explained at seminars and in newsletters forever….In addition, the software can either do its own collision detection, or use the contents of the collision buffer for some weird collision detection algorithm. In either event, I will be mortally offended.

In addition, this means that when some of the bytes are not reloaded, the length of the SCB will be smaller by the number of bytes not used. If I have said this in a confusing manner, then I have.

Well, I finally found the bug that required a pad byte of 0 at the end of each scan line of data. But, it is actually 2 bugs. I have fixed one of them, but the other requires an extensive change. Too bad, I am out of time. Therefore: There is a bug in the hardware that requires that the last meaningful bit of the data packet at the end of a scan line does not occur in the last bit of a byte (bit 0). This means that the data packet creation process must check for this case, and if found, must pad this data packet with a byte of all 0s. Don’t forget to adjust the offset to include this pad byte. Since this will only happen in 1/8 of the scan lines, it is not enough overhead to force me to try to fix the bug. Sorry.

This bit can be used to pause the sprite engine. The ‘unsafe’ condition of the internal registers is not directly affected by this bit.

I need to think about how to use it.

Receive parity error can not be disabled. If you don’t want it, don’t read it….We have just discovered that the calculation for parity includes the parity bit itself. Most of us don’t like that, but it is too late to change it.

11.7 Unusual Interrupt Condition

Well, we did screw something up after all. Both the transmit and receive interrupts are ‘level’ sensitive, rather than ‘edge’ sensitive. This means that an interrupt will be continuously generated as long as it is enabled and its UART buffer is ready. As a result, the software must disable the interrupt prior to clearing it. Sorry.

11.8 left out

I left something out. I know what it is but by the next time I revise this spec, I may have forgotten.

I have forgotten.

12.1.4 Bugs in Mathland

BIG NOTE: There are several bugs in the multiply and divide hardware. Lets hope that we do not get a chance to fix them.
1. In signed multiply, the hardware thinks that 8000 is a positive number.
2. In divide, the remainder will have 2 possible errors, depending on its actual value. No point in explaining the errors here, just don’t use it. Thank You VTI.

12.4 Qbert Root

As a compromise between the square root and qube root desires of the software people, and the schedule desires of the management, we have decided to incorporate the function of QbertRoot. The required steps are:
1. Start a 16 by 16 multiply.
2. Immediately write to ‘E’ which will try to start a divide.
3. Read the result from ‘D,C,B,A’.

(editor’s note:  I can’t tell if QbertRoot is an actual function, or trolling.)

16. Known Variances From Anticipated Optimums
This is a list of the known bugs in the hardware.

…It will be OK to write to the twentyfour 16 bit registers of the Suzy SCB PROVIDING that you only do it via the MACRO PROVIDED TO YOU BY RJ MICAL. In addition, you must understand that the contents of this macro may change if future revisions of the hardware so require. In addition, you must understand that future hardware may make the process not work and therefore the macro will be changed to be a warning that you can’t use it anymore.

Don’t cheat.

My personal favorite, saved for last:

7.3 Stereo

The 4 audio channels can be individually enabled into either of the two output drivers. This is not now in the hardware. It may never get into the hardware. After all, I hate music. However, since I acknowledge that I am wrong about some kinds of music (in the right circumstances, with me not present) I have graciously allocated one entire byte for the possible implementation of another useless and annoying sound feature.

(For the record, that feature did go into the hardware, in the second hardware revision of the Lynx.  If you have the later hardware, some audio is in true stereo.)

If all hardware/technical documentation were written this way, I’d be an embedded systems programmer by now.

I’ve scanned and OCR’d the full Handy specification for anyone wanting to learn more about the Lynx.  There’s a compo-winning demo hidden in the hardware, just waiting to be found.

Posted in Gaming, Programming, Technology, Vintage Computing | Tagged: | 1 Comment »

Sony Xperia Z3v impressions and workarounds

Posted by Trixter on January 10, 2015

The Xperia Z3v is a very odd hybrid of a phone that is being marketed as a flagship for the current generation of smartphones; it was released in October 2014 and is a Verizon exclusive (other carriers have the older Z2 or Z3).  There is a nearly criminal lack of coverage in the media for this phone, so I thought I’d rectify that with my thoughts on the phone after two months of use.  First, some background:

We switched the entire family over from Sprint to Verizon (more expensive, but you get what you pay for) and part of the terms of the switch was that we all get new phones.  As I was a long-time Samsung customer (Epic 4G, Galaxy S4) I was planning on going with the S5, but wanted a few things the S5 couldn’t give me, like stereo front-facing speakers.  After reviewing everything Verizon offered that met my requirements, I decided to try the Xperia Z3v under the agreement that I could return it after 14 days if I wanted to switch to another phone.  Because the phone is best-in-class in a few areas, I’ve decided to keep it, accepting that a few aspects of the phone need workarounds.

The Z3v is a combination of the Z2 (larger, thicker body; slower CPU) and the Z3 (camera, screen).  It’s a frankenphone that only Verizon offers.  Let’s start by describing the basic features of the phone that drew me to it:

  • 20 megapixel camera sensor
  • 1080@60p and 4K@30 video recording
  • IP65/68 rating (dustproof, waterproof up to 5 feet for up to 30 minutes)
  • Front-facing stereo speakers
  • Dedicated physical camera shutter button
  • Wireless charging

(It has more features than these obviously, like remote PS4 playing, but these are the only ones that interested me.)  Sounds awesome, right?  Well, it mostly is.  Based on my experience, here’s what “mostly awesome” means:

Pros

The camera.  As a point’n’shoot, the Z3v is one of the best I’ve ever had.  The 20MP sensor, coupled with firmware borrowed from the Sony Cyber-shot line of cameras, means that it shoots great automatic pictures that look like EyeEm stock photos.  In default “auto” mode, which is what you get if you press the camera shutter button to wake up the phone and go straight to the camera app, it uses the 20MP sensor to oversample the scene and provide both free HDR shots and stabilization.  It is smart enough to turn off stabilization if it notices the camera is on a tripod, and tells you it is doing so with a small status indicator.  Actually, it’s smart enough to do all sorts of things that I won’t bother mentioning here — just know that the Z3v is good enough that I don’t carry a dedicated camera any more.  Is it a replacement for a DSLR?  No, of course not.  But it is definitely a replacement for a sub-$300 point’n’shoot.  The shutter button even performs a press-half-to-focus-lock, press-full-to-shoot function.

4k video.  Being able to shoot this is not terribly practical, but it does work, and you do see some additional fine detail that you don’t see in 1080p mode.  4K mode is useful if you can’t decide on the framing of a 1080p shot and you want the ability to crop without losing detail.  It works best outdoors; there’s no point in using it in low light.

It’s waterproof.  Will I be shooting underwater?  No.  Will I be secretly grateful that my accidental fumble of the phone into the toilet won’t completely destroy it?  Absolutely.

Active noise-canceling for free.  This feature isn’t advertised heavily, but if you purchase the custom “Sony Digital Noise Cancelling Headset (MDR-NC31EM)” earbuds for $45 and stick them in the phone, the Z3v will 1. recognize they are in, and 2. enable active noise-cancelling.  This works because the earbuds have little microphones on them that sample the environment; the phone then generates the inverse waveform for certain bands and mixes it into the output.  While the earbuds aren’t the most comfortable things to have in for more than an hour, the feature does work well — better than the noise-isolating earbuds I’ve used for a decade — and I’m thankful to have them on my commute.  I haven’t noticed any distortion, but I listen to mostly spoken material on my commute.

Wireless charging.  With a cheap Qi charger, this simply works, which is great because the USB/charging port is normally behind a waterproof door you have to keep opening and closing when connecting via that port.

Battery life.  The battery life on this phone is simply amazing given what the phone is capable of.  I can go two days on a single charge, and that includes 3-4 hours of screen-on time per day.  If that’s not good enough for you, there are two classes of power-saving modes with multiple levels of customization, the latter of which shuts down everything except calling and texting and can stretch a single charge up to seven days.  Geezus.

Sounds too good to be true?  Well…

Cons

The 20MP mode is disappointing.  The camera normally shoots everything at 8MP.  If you want the true resolution of the sensor, you can enable 20MP in “manual mode”.  It works, and you get some control over the shot, but it’s disappointing because the sensor and lens are small enough that there is no appreciable extra detail captured in the 20MP shot.  I’ve done comparisons with the phone on a tripod in a lot of light and there was just no advantage: I scaled the 20MP shot down to 8MP in Photoshop and it didn’t look any better, and a 100% crop of a few locations in both images showed that the 20MP shot had no more detail, mostly just fuzzier larger sections.  So it’s sort-of useless, and I don’t use it.

The phone is slippery.  The front and back are glass, and the edges are a rougher plastic material.  The edges aren’t enough for me to keep a good grip on the phone at all times.

The native camera apps offer little customization.  If you want to shoot video under odd circumstances, or use the camera on a tripod to take nice stills, the native camera app — even in manual mode — lacks a few things, which makes that difficult.  There’s no true manual focus or manual white balance.  You can pick focus “types” and white balance “types”, but focus and exposure are always auto-adjusting.  And the 4K video mode offers no customization whatsoever; it’s 100% auto-adjust.

60p isn’t really 60p.  For some inexplicable reason, the camera shoots video slightly slower than 59.94 or 60 fps, which are the only framerates considered 60p broadcast standard.  Out of several tests, the resulting videos had variable framerates, all nonstandard, like 59.32 and 58.86.  This leads to slightly jerky video over longer lengths of time, and can cause issues when editing in post.  One video I shot and uploaded directly to YouTube without editing shows up as “1080p59”.  (The 30p video modes were all 29.97 like they’re supposed to be, so that’s good at least.)

4k video mode overheats the phone.  Seriously.  The first time you use it, you’ll get a pop-up saying that if the camera overheats, your video will be stopped, saved, and then the camera will shut down to cool off.  Sure enough, it does all that after about 5-7 minutes of 4K video shooting.  This, coupled with the 60p framerate issue noted above, seems very bubblegum-and-shoestring to me.  But, good luck getting those fixed, because:

Frankenphone = orphan.  The Z3v was the result of a partnership between Verizon and Sony; it is a hybrid of the Z2 and Z3.  As a result, neither company will fully support the phone.  I’ve tried to report the firmware bugs noted above to both companies, and both companies tell me to contact the other company.  Sony tells me that Verizon supports the phone exclusively, and Verizon tells me that any firmware bugs in the camera are the responsibility of the manufacturer.  Which really sucks, because:

Playing with the alarm volume can lock up the phone.  If you adjust the volume of an individual alarm a few times, then this hilarious thing happens: When the alarm time comes, it does not make noise but instead locks up the phone.  You have to mash and/or hold down the power button to get out of this state until the phone eventually reboots.  I was late to work one day because of this.  It would be nice to be able to report this bug to someone, but oh well.

The front-facing speakers aren’t as loud or clear as they could be.  My son used to have an HTC One M7, and its audio was louder and clearer than the Z3v’s despite the hardware being almost 2 years older.  It’s not bad; just don’t assume it’s a replacement for good headphones.

The stock YouTube app doesn’t allow pre-downloading.  This feature was removed by YouTube at some point, angering hundreds of thousands of commuters, myself included.  I used the stock YouTube app on my Galaxy S4 for a full year for exactly this reason, predownloading videos in my “Watch Later” list to view on the train; the Z3v’s app is fully updated and doesn’t allow caching of videos.

These were initially very big disappointments and I almost returned the phone because of them.  After some research, here’s how I mitigated them:

Workarounds

Slippery: The Verizon store had a cheap $20 flexible plastic case that I put on it just to make it less slippery until I found something else.  I haven’t found anything else, so it’s still on there.  I tried carbon fiber stickers; while they looked nice, all they did was make the phone more slippery.  Searching Amazon or other stores for “Xperia Z3v case” doesn’t work well because you keep getting results for the Z2 or Z3, both of which have different dimensions than the Z3v.

Lack of manual camera options:  I found that OpenCamera works with my phone and supports locking focus, white balance, and exposure.  This allows me to shoot videos in very odd conditions, such as a reflective glass computer CRT that emits colored light.  It doesn’t support the 60p or 4k modes of the phone because those are manufacturer-specific and have no exposed API.

Odd 60p videos:  moveyourmoneyproject.org created this script to “fix” 60p videos so that they can be edited in post-production without causing problems:

# Requires the FFMS2 source plugin for AviSynth
A = FFAudioSource("MOV_0001.mp4")
V = FFVideoSource("MOV_0001.mp4")
AudioDub(V, A)

# Force a compliant 59.94 framerate (will adjust the audio rate to match)
AssumeFPS("ntsc_double", true)

# Resample the adjusted audio rate back to 48KHz
SSRC(48000) # if SSRC crashes, use ResampleAudio() instead
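
Feed the resulting script to your editor or encoder like any other video source. For example, if your ffmpeg build includes AviSynth support (an assumption; not all builds do), something like this re-encodes the fixed file:

ffmpeg -i fixed60p.avs -c:v libx264 -crf 18 -c:a aac fixed60p.mp4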

Inability to pre-download YouTube videos:  TubeMate now provides that function.  It’s clunky and buggy, but it works well enough to keep my commutes from becoming too boring.

Alarm volume adjustments lock up phone:  Adjust the alarm volume using the Settings->Alarm path instead.  All new alarms will inherit whatever you set there, and you can adjust that value all you like without consequences.

Conclusion

I think it’s a great phone if the above Cons don’t affect you and you’re looking to join Verizon and get a new phone before April 2015.  (After April, I believe the new Samsung is coming out, and it remains to be seen how that compares.)

Most people will use the phone on full auto, and it is very, very good at that.  Just don’t expect manual fine-tuning of a few things.

Posted in Digital Video, Technology | 1 Comment »

Cyberpunx

Posted by Trixter on October 5, 2014

October is “National Cyber Security Awareness Month”, whatever the hell that means.  In recognition of this dubious designation, I’ve made an HD remaster of the 1990 documentary Cyberpunk available.  Consisting of interviews with William Gibson, Jaron Lanier, Timothy Leary, Vernon Reid (of Living Colour), and Michael Synergy, and briefly featuring a few industrial bands such as Front 242, Manufacture, and Front Line Assembly, the documentary provides a look at what the cyberpunk movement was circa 1990.  Subjects such as cyber-terrorism, cybernetic implants/enhancement, virtual reality/telepresence, and general counterculture rebellion against “The System” are touched upon.  Inevitable comparisons with Akira are made.

Here Be Dragons

While the producer and director did an admirable job making the source material interesting and presentable to the public, there are a lot of flaws with the documentary.  Some are minor and can be overlooked, such as the 1990s trend of inserting faux computer graphic overlays (to try to make the material more similar to the world Gibson painted in Neuromancer).  Many of the problems are with pacing; there are entire sections that focus on a particular subject for too long, sometimes without impact.  One section in particular goes so long that different digital effects start to fade in and out after a few minutes, almost as if the editor was bored and resorted to doing something with the image to keep the viewer’s interest.

There are also some very misrepresented facts and predictions, but it’s not really fair to criticize a documentary for failing to predict the future correctly.  That being said, there are some real howlers in here, from the supposed power hackers wield(ed) against governments, to the silly, amateur computer graphics that obscure hackers’ identities, to the heavily hinted-at concept that Neuromancer itself was responsible for shaping technology and history.  The most egregious is equating hacker with cracker (although, to be fair, that’s happened multiple times before and since).

A special mention must be given to Michael Synergy, who perfectly embodies the huckster who started believing his own bullshit.  Some of his claims in the documentary are so utterly, patently ridiculous, so incredibly pretentious, that it takes a great deal of willpower not to scream at him when he’s talking (especially when he mispronounces the word “genre”).  Were I him, I would have wanted this stage in my life to disappear, and it seems as if that wish has come true: His moniker disappeared with the 1990s.  My personal wild speculation is that once the real, actual revolution of the web occurred and it was able to finally call him out, he quietly exited stage left.  (Last I heard, he worked for Autodesk in the mid-1990s, was going by his birth name again, living in Hawaii, working in IT; if anyone has a real update, I would love to know what actually happened to him.)

Most depressingly, there is a real missed opportunity with how Jaron Lanier’s involvement was portrayed.  In the documentary, he comes across as a stoner who only mentions VR, which is a shame because — then and now — he’s the most relevant and accurate representation of a hacker that the documentary includes.  Of everybody interviewed, Jaron is the only person who is still exploring these concepts and ideas, and more importantly their unintended fallout, which you can read about in his most recent book Who Owns The Future?.  (Even if you don’t buy the book, follow that link and read the Q&A to get a feeling for his concerns.)

Worth watching?

While it may be hard to sit through, the documentary retains glimpses of the innocent, wildly-optimistic, techno-hippie idealism that grew with the rise of personal computing and networking.  For that nostalgia factor alone — the time when the Internet existed but the World-Wide Web did not — it’s worth an hour of your time.  It’s also worth watching to catch which ideas were especially prescient, such as:

  • Whoever holds the most information holds the most power
  • Every device will be interconnected
  • Physical boundaries will not impede meaningful communication
  • People will be individual, mobile, uncensored “broadcast stations” (considering I can post to YouTube from my phone, I’d call this a reality)
  • The “matrix” as a concept and/or allegory for God (later realized almost to the letter in The Matrix movie trilogy)

…and so on.  You could make an interesting drinking game out of catching which ideas succeeded (although you’d get more drunk, quickly, by catching all of the stupid and inaccurate comments).

Cyberpunk: The Documentary is now available at archive.org.  Grab the MPEG-TS file if able; it’s 60p, Blu-ray compliant, and won’t take up too much space in your memory implant.

Posted in Digital Video, Entertainment, Technology | Tagged: , | 1 Comment »

Hardware for an OpenIndiana ZFS file server

Posted by Trixter on October 6, 2013

It’s hard to be an Illumos user.  The set of hardware that works correctly with OpenIndiana (my favorite OS right now) is not very well defined, and relies on user-community confirmation of whether something works or not.  There are other ways to build a ZFS NAS, such as FreeNAS, but I’ve been using ZFS since it appeared in Solaris (x86) 10u3 and have followed the path of the devout:  Solaris 10 x86, to OpenSolaris, to OpenIndiana.

This blog post is not about how to build a ZFS fileserver; there are enough of those guides out there.  Rather, this post is about the hardware I chose to build mine.  I wanted to spend less than $1000, build something with future storage-upgrade potential, and use a motherboard + CPU that someone else had already verified as working with OpenIndiana.  I wanted ECC memory to further protect against what ZFS already protects against, and finally I wanted to boot off of a USB flash drive.

Here’s what I came up with.  Please forgive the NewEgg formatting, but at least the item numbers are intact in case you want to go buy something:

Qty. Product Description
1 × COOLER MASTER HAF series RC-912-KKN1 Black SECC/ABS Plastic ATX Mid Tower Computer Case (Item #: N82E16811119233)
1 × SUPERMICRO MBD-X9SCM-F-O LGA 1155 Intel C204 Micro ATX Intel Xeon E3 Server Motherboard (Item #: N82E16813182253)
1 × Rosewill CAPSTONE-550 550W Continuous @ 50°C, Intel Haswell Ready, 80 PLUS GOLD, ATX12V v2.31 & EPS12V v2.92, SLI/CrossFire Ready, Active PFC Power Supply (Item #: N82E16817182068)
1 × Intel Xeon E3-1230 Sandy Bridge 3.2GHz LGA 1155 80W Quad-Core Server Processor BX80623E31230 (Item #: N82E16819115083)
1 × Kingston 8GB 240-Pin DDR3 1333 ECC Unbuffered Server Memory KVR13E9/8I (Item #: N82E16820239116)
1 × LG Black 14X BD-R 2X BD-RE 16X DVD+R 5X DVD-RAM 12X BD-ROM 4MB Cache SATA BDXL Blu-ray Burner WH14NS40, OEM (Item #: N82E16827136250)
1 × Kingston DataTraveler SE9 64GB USB 2.0 Flash Drive DTSE9H/64GB, Silver (Item #: 9SIA12K0X40410)
This came out to roughly $730.

Normally I try to be fair and list pros and cons, but I only have good things to say about this arrangement.  Everything (hardware and software) worked correctly on the first try.  The performance is ludicrous; I can easily saturate a gigabit pipe with a 4-drive raidz.  There were no hitches installing OpenIndiana at all, even installing to the flash drive (which is connected to an internal USB header so it’s out of the way).  The drives are mounted sideways so the power and SATA cables can be routed behind the motherboard.  In fact, everything is routed behind the motherboard except for the main motherboard power cable.  The drives are on rails; while they aren’t hot-swap, it is very easy to swap them without using any tools.
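
For reference, building the pool itself is a one-liner. This is only a sketch; the pool name and device names here are hypothetical and will differ on your system (the format utility will list yours):

zpool create tank raidz c2t0d0 c2t1d0 c2t2d0 c2t3d0
zpool status tank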

Astute readers will wonder why I purchased a Sandy Bridge Xeon instead of something newer.  Ivy Bridge or Haswell would have given me more bang for my buck, but I wanted to play it safe with confirmed, tested hardware.  I was also unsure if my motherboard would support Ivy Bridge — it requires the latest BIOS to do so, but you need a Sandy Bridge CPU to apply the latest BIOS!  Horrible catch-22.  So I played it safe with Sandy Bridge.  You also might be wondering why a Blu-ray drive is there.  That is a “why not?” addition — if it’s possible to burn Blu-ray media directly from the fileserver, that’s an additional win.  Even if I can’t, it was only $50 more than a DVD-ROM drive, so hey, why not.

All of my previous builds have been from cast-off second-hand hardware; this is the first time I built it right the first time, and I wish I had done this years ago.  I have a feeling this hardware will last me a good 6-8 years as-is.

Posted in Technology | Leave a Comment »

You couldn’t be a total idiot

Posted by Trixter on July 9, 2012

One of the things I miss about the first decade of personal computing is that nearly every computer enthusiast you met  — on BBSes, in computer stores, etc. — was pretty good at using computers.  Early personal computing meant you couldn’t be a total idiot and still use a computer, unlike today.  So if you met someone who used computers often and liked doing so, chances are they were not an idiot.

I think it’s amazing how much of a commodity personal computers have become.  Last year, my then 2-yr-old nephew could navigate an iPad without any help, even though he wasn’t talking yet.  My 12-yr-old son has a typical smartphone, which means he can send sound, images, and data to anyone in the entire world no matter where he is — that’s stuff I used to watch on Star Trek, and now it’s reality!  That’s both pretty damn scary and pretty awesome.  But I think what’s missing is the element of discovery, of natural intellectual curiosity: trying to figure out what the machine can do, why and how it does what it does, and how to push it farther.  That’s what I miss about the early days, and it’s probably why I have 27 old computers stuffed into my crawlspace, with 1 or 2 in regular circulation.

I feel like that intellectual-curiosity-for-tech has been lost from the general public in the last 10-15 years.  Maybe I’m wrong and it never existed at all, and I was just lucky enough to always be surrounded by people who were interested in computers in my youth.

A month ago, the newly-unearthed M.U.L.E. for the PC (more on that in a later post) got hours of use as my 12-yr-old and his friend played several games.  Because we didn’t have the manual at the time, there was much experimentation and probing on what keys to press and how the game mechanics worked.  A few months ago, they did the same thing playing 2-player simultaneous Zyll, where they poked and prodded every square inch of the game to try to see what made it tick (and when I surprised them with a dot-matrix-printed Zyll FAQ after they’d played for a few hours, they just about lost their shit).

My point is that they were really into it, and I can’t help but wonder why they aren’t as into the much neater tech they own, which has vastly more power and flexibility.  I never see them as enthusiastic around their Xbox or iPhone as they were playing these old games and trying to figure out how to drive them.  They might be enthusiastic about the game they’re playing or the people they’re playing with, but never the machine itself.

Why is that?  Does a device lose all interest once it has been through commoditization?

Posted in Sociology, Technology, Vintage Computing | 6 Comments »

I grow tired of the technologically ignorant

Posted by Trixter on February 29, 2012

(This post is overly subjective, more opinionated than my usual efforts, and contains some cussing.  Consider yourself warned.)

I am sick and tired of people who shun technology and progress under the guise of “I’m an old tech veteran, I’ve been working with technology for 30 years, and the new stuff is crap compared to the old stuff.”  People who defend this viewpoint are idiots.  I’m not talking about audiophiles or other self-delusional “prosumers”; I’m talking about people who have worked a tech trade or had hands-on access to technology for many years and think that their perceptions trump reality.  It’s a perverse combination of technology and anti-intellectualism — a form of hipsterism for the over-40 set.

I was prompted to cover this by a recent post on why widescreen monitors are a rip-off (which I will not link to because I truly enjoy the other 99% of this person’s blog, and linking to it would imply that I don’t like him or his site), but the underlying irritation of the entire mindset has been percolating for many years.  Viewpoints that drive me crazy include:

Widescreen monitors don’t make any sense

People think that widescreen monitors are stupid on laptops because most people use laptops for text work, and since text is most comfortable to read in narrow columns, the extra width supposedly goes to waste.  This mindset has had the doubly idiotic result of making people think that websites need to be column-limited.  I just love going to a website and having the text squished into a 640-pixel-wide column with 75% of the screen unused.  Don’t like how narrow columns look on a widescreen monitor?  Use the extra space however you want — put up two web pages side by side, or simply don’t look at the unused space.  It’s people like these that also complain that 4:3 video has black bars on either side of it when viewed on a widescreen TV.  It’s called pillarboxing, you idiot, and it’s there to prevent your movie from looking like a funhouse mirror.

Widescreen monitors have made modern laptops better.  A widescreen display allows the keyboard to be wider without the laptop getting too deep (as it would have to be to accommodate the height of a 4:3 monitor).  Having a decent keyboard on a laptop used to be impossible without wacky engineering tricks; now it’s routine.  Widescreen monitors also made ultra-small netbooks possible, so if you’re reading this on a netbook but somehow still disagree with me, you’re a hypocrite.

Analog audio is better than digital

There are entire websites (and Wikipedia pages) dedicated to this, usually under the guise of “vinyl is better than CD”.  Most opinions on this subject were formed when analog audio had several decades of mature mastering and production processes behind it and digital was brand-new (for example, vinyl vs. CD in 1983).  Early efforts to put things on CD resulted in some less-than-stellar A/D conversion, which created a type of distortion that most people weren’t used to hearing.  People formed opinions then that have persisted more than 25 years later, even though the technology has gotten much better and all of the early mastering problems have long since been corrected.

People who think vinyl sounds better than CD have nostalgia blinders on.  They bought an album in their youth, played it endlessly, loved it.  Then they buy the same album on CD decades later and condemn the entire format as inferior because it sounds different.  Want to know why it sounds different?  It has a wider frequency range, lacks rumble, lacks hiss, sounds exactly the same after 10+ playbacks, and was remastered with better technology and mixing conditions under the guidance and approval of the original artist when he wasn’t coked or drunk or stoned out of his mind.  People like Pete Townshend, Neil Young, and Geddy Lee not only approve of the latest digital technology but are actively utilizing it and going to great pains to remaster their classic albums with it.  People miss the point that it is the mastering and digital compression that cause issues, not the technology itself.  Neil Young recently spoke at a conference where he damned digital music, but not because it is digital — rather, because it is delivered differently than the artists intended.  Neil Young would like nothing better than for everyone to be able to listen to his music at 24-bit/192 kHz.  Can’t do that on vinyl, bitches.

Even people who write about the loudness war get it wrong, even though it’s an easy concept to understand.  Massive dynamic range compression drowns out subtle details and can add distortion, which is horrible — but it is not exclusive to digital audio, nor caused by it.  One author correctly notes that massive dynamic range compression butchers mixes, but then subtly implies that all CDs that “clip” have distorted audio.  Digital audio “clips” only if you drive the signal beyond its digital limits.  If you take an audio waveform and normalize it so that the highest peak reaches exactly the highest representable value, it is “positioned at maximum volume”, not clipped.  Nothing is lost (to be fair, nothing is gained either).
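
If the distinction isn’t obvious, here’s a quick sketch in Python (my own toy example, not code from any actual mastering tool) showing the difference between peak normalization and clipping:

import numpy as np

# Toy example: a 440 Hz tone at half volume, sampled at 48 kHz.
t = np.linspace(0, 1, 48000, endpoint=False)
wave = 0.5 * np.sin(2 * np.pi * 440 * t)

# Normalization: scale so the loudest peak sits exactly at full scale.
# The waveform's shape is untouched; nothing is lost.
normalized = wave / np.max(np.abs(wave))

# Clipping: apply too much gain and hard-limit at full scale.
# The tops of the peaks are sheared off, which adds distortion.
clipped = np.clip(wave * 4.0, -1.0, 1.0)

The normalized signal is the same music, just louder; the clipped one has had its waveform permanently mangled.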

The problem is the mastering and production process, not the technology.  Which segues nicely into:

“I will never buy Blu-ray”

The only valid argument against Blu-ray is that it is harder to make a backup copy of the content.  It is indeed harder than it is for DVD, or laserdisc, or videotape.  That is it.  All other arguments are beyond moronic.  Even the cheapest possible 1080p HDTV viewing setup has six times the resolution of DVD (1920×1080 vs. 720×480) and lacks signal degradation in the output path.  If you view a Blu-ray and can’t tell the difference between it and DVD, you have either a shitty viewing setup, a shitty Blu-ray, or a shitty visual cortex.

Someone recently tried to argue with me that DVDs have the same or better picture than Blu-ray, and used Robocop as an example.  The comparison was weighted, as they were comparing the $9 Blu-ray that MGM belched out when Blu-ray was only a year old to the Criterion DVD treatment.  I own both, so I checked them out, and I agree that the DVD has better color tonality throughout the film.  However, the Blu-ray thoroughly stomped the DVD in every single other area, most obviously resolution.  So much picture detail is added by the increase in resolution that I actually prefer the Blu-ray despite the lack of Criterion oversight.

The real problem, as previously stated, is how the mastering and production process was handled.  Even with new 2012 DVD releases, you can still see the video equivalent of the loudness war: digital ringing, which used to be an accident but was later introduced on purpose as part of a misguided “sharpening” step.  Listen up:  A sharpening filter added to a signal doesn’t make it sharper; it makes it appear sharper by overlaying a high-frequency perturbation signal on top of the original content, which increases the acutance.  Quality is actually lost when you do this, as the added high-frequency info obscures actual picture detail.
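
To see how shallow the trick is, here’s the classic “unsharp mask” boiled down to a few lines of Python (my own illustrative sketch, not code from any mastering suite):

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=2.0, amount=1.5):
    # Assumes a float image with values in [0, 1].
    # Blur the image, then take the difference: this isolates the
    # high-frequency content (edges and fine detail).
    blurred = gaussian_filter(image, sigma=radius)
    high_freq = image - blurred
    # "Sharpening" just adds an exaggerated copy of that high-frequency
    # signal back on top.  No detail is recovered; edges get halos.
    return np.clip(image + amount * high_freq, 0.0, 1.0)

Nothing in there recovers a single pixel of real detail; it just cranks up edge contrast until your eye reads it as “sharp”.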

This is another example of perception vs. reality, which not coincidentally also segues into:

“Computing was better in the old days”

I love retrocomputing as a hobby.  I think about it nearly every day; this blog was partially created to talk about vintage computing.  But even I wouldn’t say that things were better in the old days.  People who say this don’t realize they are really trying to say something else.  For example, people who say that “BBSes were better than web forums are today” are actually referring to the sociological fact that, when you communicated with people on a BBS, you were communicating with people who met a minimum level of technical competence; anyone who didn’t would have been too stupid to access a BBS in the first place, let alone be proficient with a computer.  The overall technological quality of the people you met on a BBS in the 1980s was higher than in other places, like a laundromat or a bar.  What such people fail to consider is that modern web boards, while having a higher quotient of trolls and B1FFs, are open to the entire world.  The massive scale of humanity you can encounter on even a tiny niche topic is orders of magnitude higher than it used to be.  The sheer amount of information and interaction you can now achieve is staggering, and completely outweighs the minor niggle of having to deal with 3 or 4 more asshats per day.

Here’s another example:  “Computer games were better back in the old days.”  This is wrong.  The proper thing to say is “Some computer game genres were better back in the old days.”  I can get behind that.  For example, graphics were so terrible (or non-existent!) at the birth of computer gaming that entire industries sprang up focusing on narrative.  In such genres (mainly adventure games), several times more effort was put into the story than in other genres.  As technology and audiences changed over time, these genres morphed and combined until they no longer resembled their origins.  That doesn’t mean modern games are terrible; it just means that you need to shop around to find the kind of entertainment you’re looking for.  Don’t play Uncharted 2 expecting a fantastic story with engaging narrative.  (Dialog, maybe, but not narrative.)  Heck, some genres are genuinely awesome today compared to 30 years ago.  For example, Portal and Portal 2 are technically puzzle games, but the storytelling in them — despite the player never directly interacting with another human — is among the very best I’ve ever encountered.

About the only argument that does work involves the complexity of older computers — they were simpler, and you could study them intensely until you very nearly understood every single circuit on the board, every nuance of the video hardware, and every opcode of the CPU.  Today, a complete understanding of a computer is no longer possible, which probably explains why Arduino kits and the Raspberry Pi are getting so much attention.

Conclusion

I have no conclusion.  Stop being an old-fogey anti-intellectual technophobe, you ignorant hipster fuck.

Posted in Digital Video, Entertainment, Sociology, Technology, Vintage Computing | 10 Comments »