While I get my video editing gear up to HDR standards and prep for Vintage Computer Festival Midwest 2025, I’ve found time to appear on some podcasts the past few months:
More Fun Talking Retro Episode 7: The Oldskool PC Jim Leonard – Piracy Is Not The Theme:
Atari Podcast Episode 55 – The Past, Present, and Future of MobyGames:
I’ve enjoyed all of these, and I think I’m getting slightly better at it each time. If you think I can contribute to your podcast on anything from vintage computing history, PC retrogaming, programming old systems, the early demoscene, and 1980s gaming history to who knows what else, drop me a line at mobygamer@gmail.com.
(Edit 2025: I have been informed that “Letterboxd” is a better outlet for this sort of thing, so I joined Letterboxd and will put future reviews there.)
I use every October as an excuse to binge horror movies. I’ve tried reviewing them at length on this blog in the past, but that turned out to be a colossal waste of time (I spent at least 15 minutes writing each review, about 4 hours total, and each post got single-digit views). But I still feel sharing the experience is useful in some way, so I’ll write the full list up with a small snippet of whether or not I recommend it and why.
I decided to try something different this year: Sort all of the unwatched horror movies I have on my local media server from WORST to BEST, according to audience rating. The idea was that, as I move from movie to movie, they would keep getting better and better :-D I have to say, the experiment worked! They started out utterly ridiculous, and slowly got better and better, with the occasional surprise (very bad, or very good) along the way. A few times, I couldn’t take it any more and watched a much better movie instead of following the rules I had set up for myself, but for the most part I stuck to the experiment and was pleased by it.
Here’s the list, in watched order:
The Phantom Eye (1999): Roger Corman cheese. Recommended.
The Prophecy (1995): Angels fighting angels. Christopher Walken as an angel! Recommended.
Dracula (1931): A classic. First half much better than second half. Recommended.
The Prophecy II (1998): A sequel, worth seeing for more Walken. Barely recommended.
The Prophecy 3: The Ascent (2000): Weak and not enough Walken. Skip.
The Prophecy: Uprising (2005): Low budget side-quel. Hallmark of low budgets is way too much dialog and not enough action or locations. Skip.
The Prophecy: Forsaken (2005): Sequel to low budget side-quel. Forgettable. Skip.
The Clonus Horror (1979): First half better than second half but worth sticking it out. Recommended.
The Hand (1981): This was WAY better than the ratings suggested. Michael Caine’s severed hand starts murdering people who anger him. Highly recommended.
Devil Doll (1964): The Twilight Zone did this way better (twice). Skip.
New Terminal Hotel (2010): The return of Stephen Geoffreys to mainstream acting. He’s good, everyone else is not. Skip.
Halloween III: Season of the Witch (1982): Can’t believe I hadn’t seen this before 2024! This is totally not a Halloween movie, and it’s super bananas at times, but I was never bored. Commentary track with Tom Atkins is just as entertaining as the film. Highly recommended.
Uncle Sam (1996): Desert Storm zombie comes back as Uncle Sam to wreak patriotism on a small town. Recommended (barely).
How to Make a Monster (2001): Computer program comes to life and kills people. Obligatory Julie Strain appearance. Computer-speak is difficult to cringe through, but there were some cute and clever moments. Recommended if you like “OMG computers world wide web first person shooter” cheese from the early 2000s.
Contamination (1980): Italian rip-off of Alien that was so terrible I was yelling at the screen. Avoid at all costs.
He Knows You’re Alone (1980): Serial killer hunts down brides-to-be. Tom Hanks’ first movie where he has no impact on the story at all. Forgettable.
Frankenstein (1931): A classic. Worth seeing.
The Initiation (1984): Daphne Zuniga’s first film. One of two horror films I watched this month set in a mall. Skip.
The Video Dead (1987): An offbeat zombie horror film. One of two horror films I watched this month where zombies invade our world through a television. It’s not really worth seeing, despite two funny uses of an iron.
Frankenstein Unbound (1990): Roger Corman’s reimagining of Frankenstein. Fantastic cast, surprisingly good props for the budget (a Corman tradition), but ultimately not worth seeing. It’s just not enjoyable to watch.
Hellbound (1994): Chuck Norris swears and kicks demon butt. Movie, however, does not kick butt. Skip.
The Puppet Masters (1994): The “28 Weeks Later” of Invasion of the Body Snatchers. Recommended.
Prom Night (1980): Rumor has it that Jamie Lee Curtis agreed to star, at a low salary, if she were permitted to dance. And disco dance she does. Not enough to save the film though. Skip.
Strange Invaders (1983): A nearly perfect homage to 1950’s communist-scare alien movies. If you haven’t seen any 1950’s sci-fi then you won’t be in on the joke and probably won’t like it. Recommended (barely).
The Boogey Man (1980): Ineptly handled horror slasher. Skip.
The Final Destination (2009): Not as good as the first Final Destination film, but still very enjoyable and has fun with the kills. Recommended.
The Wolf Man (1941): Another classic, and worth seeing — but there are a lot of 1940s-isms that don’t resonate very well today, like discrimination and sexism. Lon Chaney Jr. is better in makeup than not. Tough to recommend.
Christmas Evil (1980): A fascinating film that is more drama, exploitation, and thriller than horror. A statement against apathy and commercialism. Recommended, but only barely.
Evilspeak (1981): Clint Howard conjures up the demon form of Richard Moll using an Apple II computer! It’s a terrible film, but come on, conjuring demons with an Apple II. Reluctantly recommended.
Mutant (1984): Inept zombie movie with inept shambling zombies that are created by inept chemicals from an inept evil company. Did I mention it was inept? Skip (unless you like seeing Wings Hauser ad-lib half his lines).
Friday the 13th: The New Blood (1988): Jason against a girl with telekinesis. Does not live up to the promise of Jason-against-telekinesis, unfortunately. Skip.
Mimic (1997): Guillermo del Toro’s worst film is still worth watching. Director’s Cut is supposed to be better than the theatrical cut I watched. Recommended.
ABCs of Death 2 (2014): I loved this! 26 different short vignettes from 26 different directors. Recommended.
Ben (1972): A lonely boy befriends Ben, the smart leader of a rat swarm. Sequel to Willard. Super slow with no real payoffs. Hated them both. Skip.
Chopping Mall (1986): Robots kill teens hiding in a mall after dark. Second of two horror films I watched this month set in a mall. Unlike the other one, this one is very silly, but it knows it and has a ton of fun with the premise. Recommended.
Demons 2 (1986): Italian film that tries to rip off so many movies simultaneously I lost count of how many. Second of two horror films I watched this month where zombies invade our world through a television. Dreadful film that even the good makeup/prosthetics can’t rescue. Skip.
The Children (1980): Deadly children! Low budget but manages to stay interesting throughout. Recommended.
A Cold Night’s Death (1973): Two-person movie where Robert Culp and Eli Wallach try to figure out what killed everybody at a remote arctic research station. Originally a made-for-TV movie, it rises above that and was one of the hidden gems of this month’s horror movie experience. Highly recommended.
Mom and Dad (2017): Mass hysteria causes parents to murder their children. Nicolas Cage and Selma Blair do their best with the material, but the movie loses confidence in its own premise. Can’t really recommend it.
Return of the Living Dead III (1993): One of my favorite cult classics — I have seen this many times before but try to watch it once a decade anyway. Has a few unique and cynical twists on the living dead zombie premise. If you liked the first RotLD from 1985, you’ll like this one. (Skip the second RotLD.)
The Night Flier (1997): Miguel Ferrer investigates a serial killer who travels by plane. Takes a while to get going but ultimately worth it. Recommended.
Popcorn (1991): Horror comedy mystery. Resisting urge to call it cheesy. “Black Comedy For Dummies”. Barely recommended.
Dead Hooker in a Trunk (2009): Student project made for $2500. Great student project, disappointing movie. Skip.
The Eye (2008): I see dead people, but not as good as Sixth Sense. What could have been a good premise is wasted on Jessica Alba’s limited acting ability. Skip.
Poltergeist (1982): The last time I saw this was in August 1982, when my brother and I snuck into the theater to watch it after having watched Tron. Came across the 4k version and decided to revisit it. Fantastic movie where many aspects of it still hold up. Highly recommended.
House of Wax (2005): Starts slow but methodically gets better and better as the movie goes along. Paris Hilton, surprisingly, can actually act. Some very unexpected practical effects. Recommended.
Sixteen Tongues (1999): Cinéma vérité shot on analog video DV cams. True cyberpunk, which is dystopian and depressing. Extremely low budget (the “future” is littered with 16-bit ISA cards and classic Macs), takes place in a single location. Script is so good that it exceeds the acting capability of the actors, sadly. Not Rated (which means practically X-rated). I’m glad I watched it, as I’m a fan of performance art, but I can’t really recommend it as it will disgust most people.
The Keep (1983): Michael Mann’s only non-crime-thriller movie. Was clearly a much longer film with better narrative pacing before it was cut in half by producers. Cool practical effects. Recommended.
Hell Night (1981): College hazing takes a wrong turn after locking people in a house that is already inhabited by a psycho. Takes forever to get going and almost never gets there. Skip.
The Sentinel (1977): A woman has psychic flashbacks after moving into a strange apartment complex. Despite many recognizable stars, has a weak payoff. Used people with actual physical deformities as scary people, which is not really acceptable today. Skip.
The Thing (2011): This is the prequel to The Thing (1982), which itself is a remake of The Thing From Another World (1951). All of them are recommended and each bring something to the table, but Carpenter’s The Thing (1982) is clearly the best and should be watched BEFORE watching this prequel, as the prequel will permanently spoil the 1982 film. Recommended.
51 movies — a new record for me. There are still days left in October, but I think I’ll stop there.
With a dummy battery, the screen flipped out, no active cooling, and a normal ambient room temperature, a 4k60 4:2:0 10-bit recording overheated the camera sometime after 3 hours.
With the same setup plus Ulanzi active cooling, the same 4k60 4:2:0 10-bit recording ran indefinitely with no heat warning.
TL;DR
I spent a week performing a lot of G9 II overheating tests, and am presenting the results here. The quick summary: There are some things you can do to prevent the G9 II from overheating, but they have their own pros and cons. Ultimately, if you want to record tons of video without any overheating, you should not buy a G9 II, and should instead spend the extra cash on a GH7.
Full test parameters and results are below. Before continuing, a disclaimer: A few links in this blog post are affiliate links to the exact products I actually purchased and tested with. Clicking on these links doesn’t cost you any extra money, but may earn me a small commission.
Background
I shoot primarily long-form video: Product photography, talking heads, and events. I am invested in the micro four thirds (M43) lens system, and bought a Panasonic Lumix G9 Mark II as soon as I was able to afford one, because I really wanted the first Panasonic M43 camera with phase-detect autofocus. It works great, and the autofocus is finally fantastic — however, it overheats during longer video shoots, shutting down video recording functions for 13 minutes to recover.
What’s worse, I bought it at the worst possible time: 35 days before they announced the Panasonic Lumix GH7, which doesn’t overheat. That put me 5 days past the return window, so it was too late to return my G9 II for a refund. If I try to sell it on ebay, I can expect a $600 loss. So I guess I’m stuck with it.
Trying to make the best of a bad situation, I decided to see what the overheating limits are, so I let the video recording run as long as it could in different resolutions, framerates, colorspaces, and codecs. The results are below, and I hope they are helpful to someone.
Test Setup
Almost all my shots are locked down on a tripod, so that’s what I used for these tests. I have been shooting 4k exclusively since 2016, so I didn’t bother with smaller resolutions. I didn’t test any ALL-I modes because I haven’t used them since Panasonic added h.265 to their cameras (and ProRes to the G9 II). I also didn’t test true DCI 4K (4096×2160) modes, as I felt UHD 4K (3840×2160) was the more common use case.
Here are the test parameters I adhered to:
Indoors at room temperature (roughly 70F/21C)
On tripod
Screen was flipped OUT. (Very important!! This exposes the back of the camera to release more heat!)
Used “dummy” battery connected to mains/wall power
Heat management was set to “high”
Recorded to 2 x 128G v90 SD cards mirrored (128G total storage). Cards were reformatted between each test.
1/60 shutter (to match room lighting frequency) for all 60p/30p/24p modes
Waited until back of camera was cool to the touch before starting a new test
All recording formats were h.265 (LongGOP). Colorspace was 4:2:0, as I didn’t notice any significant heat differences between 4:2:0 and 4:2:2 during initial testing.
Times reported are the lengths of the resulting footage files, truncated to the nearest minute
All recording modes were the full readout of the sensor, with the exception of the 4.4k modes which are a 1:1 pixel crop
All tests were performed three times, and the median value chosen as the result
128GB Mirrored v90 SD Card Results
| Resolution | Framerate | Behavior | Length | Notes |
|---|---|---|---|---|
| HD (1080p) | 240p | OK | 1h27m | 240p is not a typo; this high-framerate mode is used to shoot slow-motion footage |
| UHD 4K | 24p | OK | 1h54m | |
| UHD 4K | 30p | OK | 1h54m | |
| UHD 4K | 60p | OK | 1h27m | Flashed heat warning after 50 minutes, but did not stop recording |
| UHD 4K | 120p | overheated | 23m | |
| 4.4K (4:3) | 60p | overheated | 41m | This is a 1:1 pixel readout/crop |
| 5.7K (17:9) | 24p | OK | 1h27m | |
| 5.7K (17:9) | 30p | overheated | 45m | |
| 5.7K (17:9) | 60p | overheated | 33m | |
| 5.8K (4:3) | 24p | OK | 1h27m | Full sensor readout / “open gate” |
| 5.8K (4:3) | 30p | OK | 1h27m | Full sensor readout / “open gate” |
I was very surprised the 5.7k 17:9 (5728×3024) 30p mode overheated much faster than the 5.8k 4:3 (5760×4320) 30p open gate mode, as the full-res open gate mode processes more information per frame. I have neither explanation nor theories for this behavior.
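To put some numbers behind that surprise, here is a quick back-of-the-envelope comparison of how many pixels each 30p mode pushes per second. The resolutions are the ones listed above; this is just arithmetic, not anything measured from the camera:

```python
# Pixels-per-second comparison of the two 30p modes discussed above.
modes = {
    "5.7K 17:9 30p":          (5728, 3024, 30),
    "5.8K 4:3 open gate 30p": (5760, 4320, 30),
}

for name, (w, h, fps) in modes.items():
    megapixels_per_second = w * h * fps / 1e6
    print(f"{name}: ~{megapixels_per_second:.0f} MP/s")

# 5.7K 17:9 30p:          ~520 MP/s
# 5.8K 4:3 open gate 30p: ~746 MP/s  (about 44% more data per second,
#                                     yet it stayed cooler in my testing)
```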
Do larger/slower SD cards help with overheating?
To see if shooting long-form events was possible, or if slower cards caused more (or fewer) issues, I used larger+slower SD cards to see how far I could push the camera. I saw no significant differences in overheating performance doing this. Modes that overheated still overheated at roughly the same length. Modes that previously worked continued to work.
For example, recording UHD 4K 60p filled a 512G V30 SD card with 5+ hours of video. As in the previous test, this also flashed a heat warning around the 50 minute mark, but it kept going and filled up the card.
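That 5+ hour figure passes a quick sanity check. One caveat: the 200 Mbps figure below is my assumption for this mode’s nominal bitrate (verify against Panasonic’s spec sheet for your exact recording format); the rest is napkin math.

```python
# Napkin math: how much footage fits on a 512 GB card at ~200 Mbps?
# (200 Mbps is my assumed nominal bitrate for 4K60 4:2:0 10-bit LongGOP;
# check the official spec sheet before relying on it.)
card_bytes = 512e9        # card capacity as marketed (decimal gigabytes)
bitrate_bps = 200e6       # assumed ~200 megabits per second

seconds = card_bytes * 8 / bitrate_bps
print(f"~{seconds / 3600:.1f} hours of footage")  # prints "~5.7 hours of footage"
```

Which lines up nicely with the 5+ hours that actually fit on the card.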
Does an ULANZI Camera Cooling Fan help with the overheating?
Yes. Mini-review:
The ULANZI Camera Cooling Fan is a $40 battery-powered add-on fan with two speeds and a temperature display that attaches to most mirrorless cameras designed with a flip-out screen. I bought one and connected it to continuous power (only fair, since my camera was on dummy battery continuous power as well), set it to high speed, and re-tested the modes that previously shut off due to overheating. Results:
| Resolution | Framerate | Behavior with fan | Old limit | New limit with fan |
|---|---|---|---|---|
| UHD 4K | 120p | overheated | 23m | 23m |
| 4.4K (4:3) | 60p | OK | 41m | 58m |
| 5.7K (17:9) | 30p | OK | 45m | 1h27m |
| 5.7K (17:9) | 60p | OK | 33m | 58m |
So, the Ulanzi fan definitely helped extend the recording times in most modes that had overheat shutoff issues.
That said, there are some caveats to using it:
Mine did not stay connected to the G9 II very well; it kept sliding down, then flying off at great velocity because of the spring-loaded nature of how it attaches to the back of the camera. I added four small adhesive rubber “bumps” to all four corners to permanently solve this issue.
The fan is not silent, and raises the noise floor picked up by the camera’s microphones to -30dB. (That said, nobody uses the built-in camera audio for anything serious.)
Does writing ProRes to an SSD help with overheating?
Not really. The only video mode that overheated that also had a ProRes equivalent was 5.7K (17:9) 30p, so “ProRes as a way to generate less heat” is only useful in one recording scenario. Not only is that scenario very narrow, it still overheated. Writing to an SSD led to a max recording time of 64m (ProRes) vs. 51m (h.265).
Does using an external recording monitor help with overheating?
Yes. I used an Atomos Ninja (newer version, not the Ninja V) as the monitor + recorder for the G9 II, as I had purchased it earlier in the year for an unrelated project. In all recording scenarios, I was surprised to see the G9 II heat up, even though it wasn’t recording! But despite that, it never fully overheated and stopped, even when displaying the flashing heat warning.
The highest RAW resolution and framerate I could achieve with my Ninja (the Ninja Ultra can go higher) was 5.7K 30p. Where the native camera overheats and shuts off at 45m using that resolution and framerate, using the Ninja resulted in no overheating and filled a 2TB SSD with over 3.5 hours of ProRes RAW footage.
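For a sense of how much data that is, here is the average data rate implied by the numbers above (2 TB in roughly 3.5 hours; since it was “over” 3.5 hours, treat this as a ballpark figure):

```python
# Average data rate implied by filling a 2 TB SSD in about 3.5 hours.
ssd_bytes = 2e12          # 2 TB, decimal
hours = 3.5               # "over 3.5 hours", so this is a ballpark figure

gbps = ssd_bytes * 8 / (hours * 3600) / 1e9
mb_per_sec = ssd_bytes / (hours * 3600) / 1e6
print(f"~{gbps:.2f} Gbps average (~{mb_per_sec:.0f} MB/s sustained to the SSD)")
# prints "~1.27 Gbps average (~159 MB/s sustained to the SSD)"
```

So plan your SSD capacity (and its sustained write rating) accordingly if you go the ProRes RAW route.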
Recommendations and Conclusions
While I’m stuck with my G9 II, I was happy to see that some demanding modes such as UHD 4K 60p and 5.8K 30p can record nearly indefinitely at normal room temperature. For other demanding modes that overheat, at least I now know what they are and can plan accordingly.
Please keep in mind that these numbers were the result of optimal indoor shooting conditions. If you’re shooting outdoors with direct sunlight hitting the camera, or you shoot with the flip-out screen in the closed position against the back of the camera, you can probably expect 1/3rd (or less!) of these runtimes, and should buy a GH7 instead.
Vintage Computer Festival Midwest is an annual celebration of all things vintage computing. It is organized and paid for by Chicago Classic Computing, a 501c3 non-profit entity. At VCFMW, you can expect hundreds of tables with a mixture of exhibits and vendors, a large inclusive community to nerd out with, and a series of presentations from community members. Admission to every VCFMW has always been free. I believe in this mission, and I am proud to be a part of the organization that puts on the show every year.
For the past decade, my primary role has been organizing and executing our presentation track (ie. “talks”). This includes, but is not limited to:
Choosing talks from submitted proposals, based on diversity and fitness of subject matter
Organizing the talks schedule, based on each presenter’s availability and requirements
Setting up and tearing down the talks room (speakers, screen, projector, mics, AV, etc.)
Recording video and audio of the talks, then editing and uploading them after the show
I perform nearly all of these activities myself. It is exhausting, and I don’t get to see the actual show very much because I’m running the presentations, but I believe in the cause and I’m happy to do it. The presenters work hard on their projects and presentations, so I work hard producing a quality final result that the community can learn from.
Interest in our show exploded
The interest in VCFMW has skyrocketed in the past two years, to the point where we were forced to move to a venue 4x as large as our previous one to ensure that every exhibitor who wanted a table could get one. Even with the new space, table sign-ups effectively “sold out” in a matter of hours.
This success also translated to presentation submissions, where we received nearly triple the number of talk proposals we’ve gotten in previous years. Unfortunately, we were not equipped (mostly financially) to provide additional space, equipment, and personnel for running two simultaneous presentation tracks to accommodate everyone who submitted a proposal. With only one track, roughly half of the proposals had to be declined.
Why a transparency report?
It’s come to my attention that some people are unhappy that their talks were not selected. It has even been suggested that I “play favorites”, ie. choosing presentations based on personal relationships or selfish motivations, rather than what is best for the show. While I am under no obligation to explain our criteria for selecting talks, I thought it would be beneficial to provide insight into our selection process. It’s my hope that this will give closure to those who may have felt wronged, and possibly help inform any other vintage computer shows that struggle with selecting presentations.
How we select proposals
Proposals are considered based on multiple criteria. Some of these are obvious:
Is it on-topic for a vintage computer show? This is hopefully self-explanatory: The closer your talk is to the genre of vintage computing, the more likely we will select it.
Is it “on-brand” for our specific midwest show? Most vintage computer shows are on the east coast or west coast, and tend to present east/west coast history. Midwest computing history is not as well represented, so we lean towards midwest-centric history topics.
Is it new or “deep” research? The primary purpose of presentations is to inform, teach, and educate. We are more likely to select a proposal that has new revelations about a subject, or extremely deep dives that are not already well-represented online.
How wide is the audience? If your proposal targets a tiny niche group of people, we are less likely to select it than a proposal that appeals to a wider audience. (We have unfortunately learned this through experience: Talks that attract an audience of less than 5 people take up a valuable slot that could have been allocated to a talk that informed hundreds or thousands of people once online.)
Is it multi-generational? To serve our mission of inclusion and education, we more closely consider proposals that appeal to multiple generations. One core tenet of our show is that of outreach: Without younger generations becoming interested in vintage computing, the field will die.
Has it been given before? We have precious few slots for talks, and tend to pass on talks that have already been given in the past 12 months and/or are already online.
There are also non-obvious, behind-the-scenes reasons talks may be selected or rejected:
Individual presenter availability. If we get two fantastic proposals where both presenters are only available for the exact same time slot, then we obviously can’t select both proposals, as we can only support one presentation track at this time.
Equipment or venue restrictions. If a proposal relies on exotic video hookups we cannot accommodate, or 3-phase 220V power, or something that the venue restricts, we cannot technically meet those requirements and cannot select the proposal.
Organizer privilege. Executing VCF Midwest takes months of hard work every year, and we ask for nothing in return, other than people attend and have a good time. On rare occasions, given two differing proposals with equal merit, we sometimes reserve the privilege of preferring one proposal over another as compensation for putting on the show.
Concerns about favoritism
“Organizer privilege” is not necessarily the same thing as favoritism. Favoritism implies unfair preferential treatment, but in accordance with our selection criteria, we try to be as fair and accommodating as possible given our available resources. Despite that, some people have expressed concern that we are “playing favorites” based on individual desires and personal relationships. To that, I will only point out the obvious: Given that our community is “small” (relatively speaking), it is a statistical inevitability that some submitted proposals will be from people I know personally. I stand by what I’ve previously written: Proposals are primarily selected due to their merit and their relevancy to the show.
A very specific example was brought to my attention that is worth going over in detail: This year, we received three proposals for the same topic family: Modern fantasy/”new retro” 6502-themed personal computers. Had I selected all three to not “play favorites”, over 1/4th of our entire presentation schedule would have been taken up by 6502-based “new retro” personal computers. Had I rejected all three (also not to play favorites), it might have led to accusations of our show somehow being biased against 6502-based projects. So, I decided to select only one of them. Of the three, one talk had already been given in June and the video of it was already online, so it was removed from consideration. The remaining two were considered based on the criteria listed above, and a single one was selected that satisfied the most criteria.
That’s all that can be said on the matter. If people still have a problem with our selection process, they are free to take their proposals to other festivals (although that is no guarantee another festival will have as open and considered a selection process as we do).
Improvement plan for the future
Despite having a lot of experience putting on our show, we are not professional event organizers. We make mistakes every year, and we try to learn from them. As mentioned earlier, the growth of the show caught us off guard, and we could have handled talk submissions a little more gracefully.
For 2025, we plan on making the following changes to the talk proposal submission and selection process:
All talk submissions will go into a webform (rather than emailed), which will be open for a clearly communicated window of time. Proposals will only be accepted during that window. Using a webform will also ensure we have the full information we need (title, abstract, speaker bio, etc.) rather than just an email saying “Hey, me and some friends want to have some stage time to discuss XXX”, etc.
We will not “pre-consider” any submissions. Submissions will not even be looked at until the submission window closes.
A roundtable discussion amongst multiple CCC members will be held for pre-selection and initial voting, to address concerns that any single person is “playing favorites”.
What would it take to add a second presentation track?
VCFMW is free to attend, and although we are asking for a donation for tables and power this year, that amount does not actually cover the full cost of tables and power. We have also experienced a 4x growth in one year due to a surge of interest in the show, which had to be dealt with by getting a larger and more expensive venue (the alternative would have been to limit the number of people who could attend our previous venue).
As such, our 501c3 non-profit cannot always fund the show. We rely on donations and the show auction to generate enough money to put on the show for next year, but sometimes it isn’t enough. Every year, the organizers contribute personal funds as necessary to ensure a nice show for everyone, such as buying equipment that we need. (For example, every piece of equipment you see in the talks room was purchased for use at the show, as that is cheaper than renting in the long run.) We do this because we believe in the community, we want to advocate and promote vintage computing history and outreach, and want everyone to have a good time at our show.
If you are unhappy with various aspects of VCFMW, please try to keep the above points in mind. Better yet, why not donate to help cover the costs of putting on the show? With enough funding, we might be able to support simultaneous presentation tracks in future shows, ensuring that more talk proposals have a chance to be selected and presented.
“How do I write a DOS program?” is a question I get several times a month. I have usually given targeted advice for each person’s situation, but my time has become quite limited so I can’t do that any more. Luckily, I stumbled across Stephan Sokolow’s Blog, which has two great entries with most of the information I usually impart:
On November 1st 2023, I released the first part of a 4-part series on the IBM PCjr on YouTube. The opening shot is a slow dolly push into the frame, showing an IBM PCjr’s startup screen. It is some of the best product footage I’ve ever shot, perfectly calibrated and exposed, showing the CRT’s 16 colors as accurately as I’ve ever seen in 4k footage. You can check out the opening shot (and hopefully the entire video!) here:
Someone mentioned to me that at first they thought it was rendered CGI, which is one of the best compliments I’ve ever received. :-)
While I’m not a professional photographer, I’ve been studying videography for about a decade and have gotten some tips from experts, including an Emmy Award-winning cinematographer, so I feel somewhat qualified to advise on the process of shooting vintage computer product photography/video. I’d love to see better vintage computer videography on YouTube, so here’s my process, and I hope it helps someone.
Misconceptions
There’s a lot of bad advice out there, so let’s start by tackling some misconceptions:
Color grading is a stylistic step, not a correction step. Color grading is the process of adjusting and enhancing the colors of a video to achieve a desired visual aesthetic or mood, but in order for grading to work consistently, the input footage needs to be perfectly exposed and neutral. The procedures in this blog post will help you get your footage to be as close to neutral as possible out of the camera, which you can then color grade if you want to.
Gear Doesn’t Matter — but whatever you use, it must have manual controls for aperture, ISO, and shutter speed if you want to control the process. If any of these are outside of your control, then you probably need to revisit your choice of camera or cell phone. (Most modern iPhones and Android phones either come with a “pro camera” app, or you can buy one, or try an Android open-source solution such as Open Camera.)
Achieving neutral footage via best practices
There is a fairly consistent workflow you can use to achieve neutral and balanced video footage every time. As mentioned above, it requires manual control over several camera settings, so if you’re using a full-automatic camera, or a cell phone without a dedicated “pro camera” app, not all of these steps will apply to your gear, but following as many of them as you can will certainly help.
Also, it is best to follow these basic photography steps in order. ISO, aperture, and shutter speed are all points on the exposure triangle, but two of them affect video much more than photography, so it’s important to set them in the correct order to prevent unintended artifacts in video.
Let’s go! Here’s the basic process I follow after I set up my scene:
1. Adjust lighting in the real world to get the look you want. Vintage computing enthusiasts understand “garbage in, garbage out”, and the same applies to videography. You can’t fix everything in post, so make sure you have enough light. Not sure if you have enough light? Add more light! The more light you add, the more photons hit the camera sensor, which translates to less noise in the final video. (As a bonus, additional light sometimes fills in harsh shadows.)
2. Set camera to full manual. We need individual control over ISO, aperture, and shutter speed. Preferably, also manual white balance.
3. Set ISO to the base ISO of your camera. This reduces noise in the video. Think of ISO as amplification, or gain, added to the signal registered by the camera sensor: Higher settings produce more noise, just like turning up the volume on a vintage stereo system adds more noise and hiss. Base ISO settings differ from camera to camera, and the base setting is usually NOT the lowest ISO setting of 50 or 100, so you need to check your camera’s manual (or, sadly, the internet) to determine the correct setting. For example, on my Panasonic GH5s, the base ISO is 800 for its LOG picture profile, and ISO 400 for all other picture profiles. (On my Panasonic G9 II, the camera limits the low end of ISO adjustments to its base of 400, and won’t let you go below that. I find that very helpful, but I think that default behavior can be toggled.)
4. Set shutter speed to match the frequency of your lighting. This eliminates subtle “banding” in the picture that results from a mismatch between shutter speed and light flicker. There are really only two settings, 1/50 or 1/60, and you should pick whichever matches the AC frequency of your mains voltage. For Europe, this is 1/50; for North America, this is 1/60. (European users might have some trouble getting exactly 1/50, as that is not a traditionally common shutter speed; pick 1/48 instead as a compromise.)
If you’re using professional flicker-less light rigs, then you don’t need to worry about mains/electrical flicker, and can set the shutter angle to something that looks more natural. Set your shutter angle to 180 degrees if your camera has a “shutter angle” setting. If it doesn’t, set it to double your shooting framerate; for example, if shooting video at 24 frames per second, set shutter to 1/48. (There’s a small worked example of this arithmetic after these steps.)
5. Adjust exposure using the aperture. With our ISO and shutter speeds set, the only way we can adjust exposure is by adjusting how much light entering the lens hits the camera sensor. If the picture is too bright, stop the aperture down (ie. go from f/4, to f/5.6, to f/8, etc.) until nothing is clipping in the whites. If too dark, open the aperture up (ie. go from f/8, to f/5.6, to f/4, etc.) until nothing is crushed in the shadows. (Each full stop halves or doubles the light; see the sketch after these steps.) If it’s still too dark after opening the aperture all the way, then add more light (see “Adjust lighting” above).
How to check exposure? If your camera supports it, turn on a “levels”, “histogram”, or “waveform monitor” display in your camera’s settings (or cell phone’s “pro camera” app). If your camera doesn’t have that, enable the “zebras” camera setting with the threshold set to 95% or 100% to ensure nothing is clipping; with this setting on, moving “zebra outlines” will appear on-screen where clipping is occurring. (And if your camera doesn’t have either of these monitoring features, consider getting a better camera.)
6. Set white balance in camera. This is to ensure nothing has a color cast, and that white is truly white. Use your camera’s function to set white balance, which usually involves putting a white balance target in front of your camera so it can calibrate itself. Ideally, use an “18% grey card” (they’re cheap on B&H or Amazon), but you can also get by with a pure white sheet of paper.
7. Focus on your subject. If you’ll be moving the camera or subject a lot, or you know your camera uses a superior focus system like phase detection, it’s ok to use autofocus. If you are not moving the camera or subject significantly, or your camera uses an inferior focus system for video like contrast detection, use manual focus instead.
You don’t have to be scared by manual focus: If your camera supports a feature called “focus peaking”, you can enable it to make in-focus areas easier to see. But you don’t have to trust your eyes using manual focus; your camera should support a “tap to focus” feature when in manual focus mode, where it will focus exactly where you tap on the camera’s flip-out screen. (Just remember that tap-to-focus only sets focus once — it will not track your subject and continually adjust focus.)
8. Put a color chart in front of your subject for a few seconds when you start shooting video. This provides a known good reference to further correct any color issues later during editing. The Calibrite (formerly X-rite) ColorChecker Passport Video is a great choice, as its “video” chart can be used directly by some editors, like DaVinci Resolve, to automatically color-correct any subtle differences from neutral in the footage. If using other video editors, both the MBR Color Corrector III plugin (Premiere, After Effects) as well as Cinema Grade (Premiere, Final Cut) can also use this specific color chart. Simply apply whichever process you have to a frame that contains the color chart, and you will notice the footage will subtly get more accurate; you can then cut out the piece with the color chart in it so it doesn’t show up in the final result.
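If you like seeing the arithmetic behind the 180-degree shutter rule (step 4) and aperture stops (step 5) spelled out, here is a minimal sketch. This is generic exposure math, not anything specific to my cameras:

```python
import math

def shutter_for_angle(fps, angle_degrees=180):
    """Shutter speed (in seconds) for a given framerate and shutter angle."""
    return (angle_degrees / 360.0) / fps

def stops_between(f_from, f_to):
    """Stops of light lost (positive) or gained (negative) going f_from -> f_to."""
    return 2 * math.log2(f_to / f_from)

print(f"24p at 180 degrees -> 1/{1 / shutter_for_angle(24):.0f}s")   # 1/48s
print(f"60p at 180 degrees -> 1/{1 / shutter_for_angle(60):.0f}s")   # 1/120s
print(f"f/4 -> f/5.6: {stops_between(4, 5.6):.1f} stop less light")  # ~1.0
print(f"f/4 -> f/8:   {stops_between(4, 8):.1f} stops less light")   # 2.0
```

The same stop arithmetic is why opening up from f/8 to f/4 buys you two full stops of light before you have to reach for the ISO dial.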
What about picture profiles?
As long as you’re not using a “special effects” profile, anything called “standard”, “natural”, “rec709”, or similar will be just fine. The actual picture profile doesn’t matter as much, because as demonstrated above, we use the color chart shot to correct the footage to a neutral color balance anyway. But if you’re not sure, take some time to test these procedures with all of your picture profile settings, to see which works best. For example, some picture profiles have different contrast levels than others, even if they don’t change the colors, so you may want to find a low-contrast profile to fill in the shadows if you have trouble lighting your scene.
You don’t have to color-grade
Now that you have color-corrected neutral footage, you may be artistically tempted to color-grade your footage. While it can be fun to make video look like your favorite movie or tv show, stop first and think about what your video is trying to do: If you’re trying to convey a certain narrative, color grading might make your footage more impactful and dramatic… for example, going for earth tones if simulating a 1970’s flashback. But when it comes to historical accuracy, the last thing I want to do is color my footage artistically. For example, CGA has 16 distinctive colors; I want everyone to know exactly what those colors looked like, in ideal conditions. So what I put on YouTube is the neutral, balanced footage out of the camera with very few adjustments, and certainly no color grading. I want people to see what the machines actually look(ed) like.
It is entirely possible I am a moron
I hope this helps you with your own product photography — but if you’ve had better results with a different process, I’d love to read about it; post your experiences in the comments.
Did I forget to mention that I co-hosted four episodes of the Floppy Days podcast 18 months ago? I’ve become somewhat of an expert on the IBM PC over the decades, so when Randy Kindig wanted to cover the 40th anniversary of the IBM PC, he reached out to me to help with tech research and modern-day hobbyist developments. Along the way, he asked if I could co-host those episodes, and having a face for radio, I agreed.
We didn’t want to leave anything out, so we covered everything and broke it up into four parts, which aired from December 27th 2021 to March 31st 2022:
I recorded all of my parts locally, but Randy was happy with the stream quality, so he used the stream as the audio. As for the content, over the series it goes from historical and technical, to modern topics such as how to get started, where to find systems, modern homebrew hardware, networking, etc.
When my crew and I wrote 8088 MPH and Area 5150, we had a secondary goal other than to amaze: It was to spur emulator authors into improving their emulators. Anyone who grew up with a 4.77 MHz system understands that most emulators just don’t play pre-1985 games very well, as most of them were written for that target speed and are unplayable on anything faster. DOSBox is woefully inaccurate for this era of PC gaming — which is why I think most people are trying to forget that era, but that would be a shame, as there are some wonderfully surprising action games for this class of system.
MartyPC rises to that challenge, and because the author is as tenacious as we are, he didn’t stop until it ran Area 5150 perfectly. You can even see it single-stepping through the demo’s bit-banged video mode here:
MartyPC single-stepping through a portion of Area 5150
It took 42 years, but we finally have a decent 4.77 MHz 8088 + CGA emulator. And if you’re into that sort of thing, you can even run it in your web browser.