Bloat is a choice (but not necessarily yours)

May 23rd, 2025

First of all, I’d like to remind you that the term optimisation comes from mathematics and means selecting one of the available options in a certain set so as to maximise a certain function. Of course the term was misappropriated by computer “scientists” in the same way physicists misappropriated the biological term plasma, and the first association people have with the word now is “something that modern games lack”.

And the crucial part of the original definition is that there’s a metric by which the choice is judged. So even bloated software may be optimal; it’s just that the goals set during its creation are different from what you expect (and were most likely set by the marketing team).

Hence you have the first kind of bloat: creeping featuritis, caused by the desire to earn more money by making the product more appealing to the wider masses. Often it does not matter if most of those features are unpolished or even non-existent; as long as the product is bought, you can keep lying that they’ll be fixed or improved in the next update or in an upgraded version of the product. I’m old enough to remember the times when this was called Korean cell-phone syndrome (after electronics manufacturers from a certain country notorious for such tactics) instead of being the normal product cycle.

Other kinds of bloat come from the trade-offs that everybody has to make. For instance, there’s sediment accumulating from previous features that are hard to remove or even refactor. So it stays there and may break things if you try to remove it (as other components rely on it in unpredictable ways). The most obvious example of that is the human genome; what humans create on the macroscopic level is usually much, much more efficient (there’s no need to list the exceptions).

And finally, there’s the conscious sacrifice of efficiency for some other goal. For example, car engines can be made more efficient in, say, kilometres per litre of fuel consumed, but then they won’t let you accelerate as fast or won’t pass ecological standards (which is a problem with diesel engines; even Volkswagen had to admit it eventually).

The same applies to developing programs: you may write a program without superfluous features, it will require minimal storage and RAM, and it will run extremely fast—but you’ll finish writing it sometime next century. And it’ll be easier to write it anew than to port it to another system. Let’s not even think about debugging.

That’s the story of low- to high-level programming languages. They offer more and more composability and convenience in exchange for being slower. You can write a program in machine code, but then you have to calculate all the jump and call offsets yourself. That’s why assembly language was created—and it also introduced macros, which let you avoid repeating common sequences at the cost of a bit of efficiency (since sometimes you could’ve omitted or improved an instruction or two not needed in that particular place). But even that is not good enough when you have to port your program to a different OS or even a different CPU, thus C was invented, earning the nickname “portable assembly” (which is no longer true thanks to the standardisation committee, but that’s another story). With C you can write a program that you often may compile and run on another system, and that is modular enough that you can comprehend what it does without going insane after looking at one huge unstructured file. Of course this also has its cost, both in compilation time and in efficiency losses that even optimisation passes can’t compensate for.

At the same time there were high-level programming languages, and they had a radically different approach: “we don’t care about the details, we want this calculation to run on the machine and produce results”. The main advantage of C++ over C is that it allows better encapsulation of different objects and types (so you can e.g. define your own matrix class and operations over it, writing the more natural mat_a + mat_b instead of mat_c = matrix_add(mat_a, mat_b) and worrying whether you passed the right types). Of course this does not come for free either (virtual tables for class methods take space, you know) but it’s often a small price to pay (especially if you’re using a language that has actually learned something from C++ mistakes).
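
As an illustration (in Rust, one of the languages that learned from C++ mistakes), operator overloading via a trait gives the same natural notation while the compiler still checks the types; the Mat2 type here is made up for the example:

```rust
use std::ops::Add;

// A made-up 2x2 matrix type, just to show the natural operator syntax.
#[derive(Clone, Copy, PartialEq, Debug)]
struct Mat2 {
    m: [[i32; 2]; 2],
}

impl Add for Mat2 {
    type Output = Mat2;
    fn add(self, rhs: Mat2) -> Mat2 {
        let mut r = [[0; 2]; 2];
        for i in 0..2 {
            for j in 0..2 {
                r[i][j] = self.m[i][j] + rhs.m[i][j];
            }
        }
        Mat2 { m: r }
    }
}
```

Now mat_a + mat_b just works, and passing anything that isn’t a matrix is a compile-time error.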

The same applies to the reusable components aka libraries. You save time on writing code (and sometimes on debugging it as well) while sacrificing speed (since those components are not tailored to your specific use case) and space (because they often bring in dependencies that you don’t really need; web development is full of egregious examples of this).

At least here it is your choice: you decide how you’re going to build your project and what you can sacrifice for which goals. With other apps forced onto you it’s not the case.


In conclusion I want to give an example from my field. My hobby multimedia project, written from scratch, does not strive to be the fastest (except occasionally, when it ends up having the fastest open-source encoder or decoder—but that’s for lack of competition), and I work on it from time to time mostly to make sure I understood some concept right; when I try to make something faster, it’s usually just to make it fast enough for my practical needs. And there’s Paul M., whose project I haven’t advertised enough (here’s the link just in case)—he took the existing codebase and added or improved some bits, striving to be better than the competition (and often successfully). You can give arguments why either approach resulted in bloated software, but I’d say that as long as we’re happy with the results it’s not bloated in either case. It’s optimal in the original meaning of that term; you just haven’t considered what metric was used for picking that solution.

na_game_tool: yet another bunch of formats

May 11th, 2025

So, what’s been added since the previous report?

  • Road Avenger BIN—as mentioned in an update to the previous post, done;
  • Reaper FMV (from the Rollcage game)—decoding works fine but the reconstruction does not, because of an imperfect IDCT implementation on my side (and the fact that they don’t use any kind of clipping to make sure the result stays in range, all while employing tricks like converting double to int by adding a large constant, storing the result as a double and reading the second half of it as an integer). Overall, I decided not to count it and threw in a couple of easily REd formats described in the next entry;
  • a pair of other BIN formats from Data East—they’re the same as Road Avenger BIN but not even compressed;
  • Ecco the Dolphin LMS—another raw-tile Sega CD format, except that the tile data is arranged more curiously there (a 128×128 frame is split into 128×32 strips consisting of 8×8 tiles arranged in columns). Also it seems to use an implicit grey-scale palette (unless there are some external palettes hidden elsewhere); at least watching those documentaries this way is fine;
  • Interplay M95—it’s been described on The Wiki since 2006 (not by me of course) but nobody bothered to implement a decoder for it until now;
  • Psygnosis SMV (from the WipEout game)—I looked at it back in the day, but of course it’s been on The Wiki since 2012. Codec 2 was not described entirely correctly there (partition codebooks are stored at the end, after block indices, not before; and it was not immediately clear that block palettes are stored in global frame order and not in partition coding order) but nothing that a bit of debugging can’t fix.
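
The double-to-int trick mentioned for Reaper FMV is the classic magic-constant conversion: adding 1.5·2⁵² pins the exponent of the sum so the rounded integer lands in the low mantissa bits. A minimal sketch of how it works:

```rust
// Classic fast double-to-int: adding 1.5 * 2^52 fixes the exponent of the
// sum, so the rounded integer ends up in the low 32 bits of the mantissa.
fn fast_double_to_int(x: f64) -> i32 {
    const MAGIC: f64 = 6755399441055744.0; // 1.5 * 2^52
    let bits = (x + MAGIC).to_bits(); // reinterpret the double as raw bits
    bits as u32 as i32 // "the second half": the low 32 bits, as an integer
}
```

Note that without explicit clipping afterwards, out-of-range IDCT output wraps around instead of saturating—which is exactly the reconstruction hazard described above.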

In either case I’ve filled my self-imposed quota of a dozen originally REd formats and started to pick other formats to support (three done, about nine to go). I have noted about three more formats worth supporting, and if I can’t find more good candidates—well, maybe there are some more low-hanging Sega CD formats…

And as usual, for more supported console formats, both simpler and much more complex ones, there’s always librempeg (and that’s not counting the other things it offers, like the countless audio and video filters).

P.S. Since I don’t want to do a separate rant, I’ll mention some tangentially related things here.

First, a search for more SMV samples on discmaster returned no unknown real SMV samples, but it did return some PMV samples with the .smv extension, including one that could not be decoded (because of a zero root chunk length—I’ve fixed that) and a couple that had nothing wrong with them but were not detected as Zork PMV at all for some reason.

Second, I actually used discmaster to check if there are some interesting ARMovie files. Sadly it still lacks an option for omitting supported files, so I searched for acornReplayAudio files and filtered out the sources containing actual ARMovie files without a video track (somehow I didn’t want to check all of the ~6000 files, so if one or two files coming from the same CD were audio-only I excluded it from the further search). After all this winnowing I found two samples with the Eidos Escape 102 codec. It’s simple and I REd it some time ago, so I’ve finally managed to write a decoder for it. At least I didn’t have to search for the samples by the magic string like I did back in the day… Sadly there’s no good way to detect the old TCA format, so I’m not even going to attempt looking for it.

Democracy and open source

May 7th, 2025

Since I have nothing better to do, I decided to finally write a rant about the parallels between democracy and the open-source movement.

I believe democracy is a flawed form of government—because people are flawed. There are two main problems here: first, people are mostly ignorant and don’t think about the decisions implemented by their representatives; second, people often don’t vote for what they actually want. Yes, there’s a difference here—in the former case people have no idea what their representatives do (nor care about the fine details of the legislation passed) and vote for somebody for such profound reasons as “I voted this way last time” (or even “we’ve always been voting for this party”), “at least it’s not X”, or “just for fun”; in the latter case people may know what they want from a candidate and yet consciously keep voting for the wrong person. The reasons in that case may be even sadder: “I want to vote for a winner” (i.e. voting for the candidate who’s most likely to win because that candidate is most likely to win), “that guy controls the main business in our town so we have to vote for him”, “that guy gives out freebies” or “all media claim that’s the best possible candidate ever”. Side note: I’ve heard all the excuses listed here, and not just in the recent news about USian elections. Essentially it boils down to two things: people either lack control (over information, or even over their own income; the mostly forgotten distributism movement had a lot to say about that) or don’t care about it at all (but still vote for some reason). And it seems to me that the open-source movement is a lot like this.

Initially Free Software (free not as in “free beer” but rather as in Stallman’s speeches) targeted an audience that understood it, namely the programmers and hackers who valued the software freedoms and could exercise all of them (yes, including modifying the source code and compiling the library or program). Nowadays, though, there are many users who actually do not care about the software they use as long as it solves their needs (and if not, they’ll either look for an alternative or start pestering the developer to fix it for them instead of doing it themselves). Like with democracy, people are so used to its presence that they don’t realise why it matters and don’t care if something happens to it.

The second aspect is the lack of control. I develop software the old way: I make it useful for me and provide the sources for the zero curious people who may do with it whatever the AGPL licence permits. But most developers have to play by the rules of infrastructure providers in order to get their software noticed. In an exaggerated form: if your project is not on GitHub and does not have at least a thousand stars, it does not exist.

And if your software gets popular and included in some kind of distribution or package repository, that means kowtowing to the distribution (or package repository) maintainers. As one of the core libav developers I could observe the interactions between that project and certain Linux distributions. All I can say is that with the recent USian Securing Open Source Software Act and the European Cyber Resilience Act, the maintainers essentially get the same treatment as the developers got from them (but at least it’s more formal).

If you wonder why either is important in the first place, the answer is freedom. Without democracy you have no way to affect what’s forced onto you; without open source you have no chance of getting software for your needs instead of the vendor’s needs (which are usually diametrically opposed to yours—like having full control over your hardware, your wallet and all the personal information they can siphon out of you and sell to the highest bidder).

If you wonder what can be done about it, there are two obvious solutions. People may actually start to think about what they’re doing (or choosing) and collect all the available information beforehand. The other, not so utterly improbable, solution involves global (thermo)nuclear war—no people means no problems (or at least the survivors will be more occupied with surviving than with competing over who has the latest iPhone model). The chances of the latter are rather good, so let’s do nothing and wait.

na_game_tool: another bunch of formats

April 30th, 2025

Short summary for Paul: nothing interesting. I write this as a report of work done and to provide some clues for somebody looking for ancient formats.

After I did EGG format support, I remembered that samples.mplayerhq.hu had a directory dedicated to that and some other game formats (from Smacker to Bink 2), so I decided to look at it and see if I could support some of those. Spoiler alert: there was low-hanging fruit there.

What got supported:

  • ICOM video—this one is used in their Sherlock Holmes Consulting Detective series. Initial frame is unpacked, the rest are coded as sequences of skip and masked updates. I’ve managed to RE it from the sample files only, it’s that simple!
  • Klondike Moon VID—also raw first frame plus simple RLE for the following frames, and I’ve also managed to RE it from the samples;
  • Maelstrom ANM—this one uses RLE for both intra and inter frames;
  • Manic Karts AMF—simple RLE with special large-value skip or run modes that encode the skip/run length in the way known as Xiph lacing;
  • The Amazing Spider-Man vs The Kingpin BIN (a Sega Saturn game)—again, I REd it from the two known sample files despite this being a console game. I had some vague ideas about how the console works, and indeed it proved to be raw data for tiles/palette/scroll planes (even if it was sent as a partial update in some cases) that I simply had to assemble correctly. Just reading some specifications of the console’s video system was enough.
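
Xiph lacing (known from Ogg) codes a length as a run of 255-valued bytes terminated by a smaller one; a minimal decoder sketch:

```rust
// Xiph lacing: sum up bytes until one below 255 terminates the value,
// e.g. [255, 255, 102] codes 612 and [7] codes just 7.
fn read_laced_len(src: &[u8], pos: &mut usize) -> usize {
    let mut len = 0;
    loop {
        let b = src[*pos];
        *pos += 1;
        len += b as usize;
        if b != 255 {
            return len;
        }
    }
}
```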

What I looked at but haven’t REd (yet?):

  • Knowledge Adventure Movie—I looked at it some time ago and it’s not a format I’m eager to revisit;
  • Road Avenger BIN—this format seems to compress tile data with some custom LZ77-based compressor with the signature “sarc20”. I’ve managed to locate the bit in the binary specification responsible for the unpacking, but no references to what it’s used for and how. Nevertheless I hoped to finish REing it with my new knowledge. Update: done. It’s simply one plane with coded tiles, the palette being transmitted before the compressed data;
  • Rollcage Ripper FMV—I REd it a long time ago, I just need to locate my notes and finish it. Update: it’s in progress; decoding works, but the reconstruction is far from good;
  • Rocketz VQM—I failed to find the code responsible for the decoding (and the Ghidra decompiler refuses to work on many functions there), and the DOSBox debugger reports that these files are read in 16kB chunks, so no luck guessing their internal structure. Maybe one day…

In either case that makes ten original formats added since version 0.3 and about ninety in total. Two more originally REd formats and another dozen formats REd and documented by others (those are easier to come by) and my goal for the next release is complete.

One game, two rare-letter formats

April 24th, 2025

As I’m still in search of formats to implement in na_game_tool for the 0.4.0 release, one game brought two formats at once—and both start with letters for which I previously had only one or two formats.

I’m talking about ID4, or Independence Day. Its Sega Saturn port uses the EGG format, which had not been REd before. But first I was able to locate the PC version of the game with its animations in YQS format, so I started with that.

It turned out to be a very curious format, with each YUV frame coded per plane using quadtrees and two separate streams per plane for pixel values and for control bits, eight streams in total (there’s IMA ADPCM audio as well, plus a fixed-size vector codebook for luma). Essentially, a plane is split into 8×8 (luma) or 4×4 (chroma) blocks, each of which may be skipped entirely (if the control bit is not set), filled with a single value obtained from the value stream (again, if the next control bit is not set), or split into four sub-blocks treated with the same approach. 2×2 chroma blocks may be skipped, filled or coded raw; 2×2 luma blocks may additionally use an input pixel value as a codebook index and use that 2×2 vector (flipped or not, depending on control bits).
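
The described scheme can be sketched roughly like this (a simplified toy version: the codebook and flip modes for 2×2 luma blocks are left out, and the sub-block order is my guess, not the actual bitstream):

```rust
// Toy quadtree block decoder: one control bit decides skip, the next one
// decides flat fill vs. splitting into four sub-blocks; 2x2 leaves are raw.
fn decode_block(
    ctrl: &mut impl Iterator<Item = bool>,
    vals: &mut impl Iterator<Item = u8>,
    plane: &mut [u8],
    stride: usize,
    x: usize,
    y: usize,
    size: usize,
) {
    if !ctrl.next().unwrap() {
        return; // skip: keep the data from the previous frame
    }
    if !ctrl.next().unwrap() {
        let v = vals.next().unwrap(); // flat: fill the block with one value
        for row in 0..size {
            for col in 0..size {
                plane[(y + row) * stride + x + col] = v;
            }
        }
    } else if size > 2 {
        let h = size / 2; // split into four sub-blocks, same approach
        decode_block(ctrl, vals, plane, stride, x, y, h);
        decode_block(ctrl, vals, plane, stride, x + h, y, h);
        decode_block(ctrl, vals, plane, stride, x, y + h, h);
        decode_block(ctrl, vals, plane, stride, x + h, y + h, h);
    } else {
        // 2x2 leaf coded raw (codebook/flip modes omitted in this sketch)
        for row in 0..2 {
            for col in 0..2 {
                plane[(y + row) * stride + x + col] = vals.next().unwrap();
            }
        }
    }
}
```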

After that I actually looked at EGG. It had a similar structure but with some changes: frames are always 20kB and do not contain audio, values are big-endian, and the bitstream format for 2×2 luma blocks differs. I still had to resort to the binary specification for it though—I loaded the CPE file in Ghidra as a raw SH-2 binary, located the animation names and started attempting to disassemble the data around them. Eventually I got to a function that looked very similar to the one in the PC version (except that it decoded just luma instead of handling all three components and converting the macroblock to RGB). Thus I got a working decoder for this format as well.

That’s how I got two more formats supported, which makes it almost 42% of my goal. The problem is still mostly to locate formats that can be supported (most of the ones I haven’t implemented need a lot of work to unpack the binary specification). In either case I’m in no hurry, and it was nice to learn more about old formats.

A tale of three formats

April 19th, 2025

Since I have nothing better to do, I keep looking at odd formats here and there and occasionally do something about them. Here are three formats I took a look at recently that have nothing in common besides being video codecs.

CGDI

This would be a “capture” codec (aka Camcorder Video) if not for the fact that it records events rather than actual image data. I had a suspicion right from the start that it’s organised in the same way as WMF/EMF—opcodes with parameters that invoke the GDI subsystem to draw the actual image—and I’ve finally decided to check that.

Of course it turned out to be true. There are about 64 opcodes in total: some are for drawing things, some for window management stuff, and some for GDI state management (e.g. creating or deleting a brush, pen, font and such).

Since implementing a decoder for it would mean replicating a good deal of the Windows graphics subsystem (even if you can borrow code from Wine), I consider it completely impractical and merely improved the codec documentation in The Wiki.

TCA

This is an animation format used on the Acorn platform. It actually has three layers of encapsulation: there’s the raw TCA format that contains only the header and video frames; then there’s TCA wrapped in an ACEF chunk with an optional SOUN chunk following it (no points for guessing what that contains); and finally there’s that format put inside ARMovie (as a single block).

I added support for it to NihAV just for completeness’ sake. Of course not all of the different flavours are supported (video is mostly just plain LZW, but it has an alternative coding mode and an uncompressed variant; audio is IMA ADPCM, except when it’s not, without any reliable way to distinguish which is which). And it looks like some animations may have a variable frame rate (with the DIR1 subchunk likely telling the frame durations). All the details are there, in raw ARM binaries and semi-compiled BBC BASIC code, but I’m satisfied that it works at least for the couple of random plane samples I tried, and I have no desire to try supporting every known sample in existence.

Savage Warriors ANM

This one is a curious format. I managed to locate the decoding functions in one of the overlay files, and they looked reasonable (LZ77 compression for intra frames and something a lot like FLI delta-frame compression for the rest), but the decoder did not work properly. Curiously, the demo version contains some of the same animations as the full game but in a slightly different format (the initial magic is missing); after comparing them I found out that the release version uses a weird format with a 32-bit value inserted after each kilobyte of data. I ended up implementing my own buffered reader that loads those kilobyte blocks and skips the additional words for the release version.
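
The buffered reader idea can be sketched like this (my own reimplementation sketch, not the game’s code; what the extra 32-bit words actually contain doesn’t matter for reading):

```rust
// Reader that hides the extra 32-bit word the release files insert
// after every kilobyte of payload data.
struct KBlockReader<'a> {
    src: &'a [u8],
    pos: usize, // position counted in payload bytes
}

impl<'a> KBlockReader<'a> {
    fn new(src: &'a [u8]) -> Self {
        Self { src, pos: 0 }
    }
    fn read_byte(&mut self) -> Option<u8> {
        // every 1024 payload bytes are followed by 4 extra bytes in the raw stream,
        // so payload index i lives at raw offset i + 4 * (i / 1024)
        let raw = self.pos + 4 * (self.pos / 1024);
        self.pos += 1;
        self.src.get(raw).copied()
    }
}
```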

Another thing is that the LZ-compressed format had a 17-byte header which the decoder skipped. Of course that made me suspect it of being a third-party compression scheme, and after searching around it turned out to be Diet (you may remember it being used as an executable compressor, but apparently it had other uses). It somewhat reminded me of MidiVid Lossless, as this is yet another codec reusing a third-party general-purpose compressor (with special preprocessing for executables, which was a dead giveaway).

In either case, both flavours of this ANM format are now supported by na_game_tool (and will be part of the next release).

NihAV: even further encoder improvements

April 9th, 2025

Since the last time I wrote about it, somebody actually decided to test my transcoder on a bunch of fringe formats. The main issue there was wrong audio output (since I didn’t bother to check the actual input format, it often passed e.g. floating-point audio as 16-bit PCM, resulting in complete garbage). Additionally I’ve fixed a stupid bug in the Indeo IVF demuxer (so now both known samples are decoded fine) and improved support for some other corner cases.

And there was one fun failure only my transcoder could have: when a video stream starts with skip frame(s) and the ZMBV encoder is selected, it refused to encode. I’ve added a work-around, so now it simply presumes such a frame to be black and goes on normally (if you encode into raw video, such initial frames are skipped instead).

So whatever can be decoded seems to be decoded and re-encoded just fine (but there are probably more cases waiting for their turn to be discovered). And there are some more formats waiting for an open-source decoder (in the case of Motion Pixels the wait is likely to be very long).

na_game_tool: new season start

April 6th, 2025

It’s been some time since the last na_game_tool release and I’ve finally started working on adding new formats to it again (for v0.4 the goal is to have a hundred supported formats, including a dozen fresh ones; currently it has only about eighty). And since the newly added formats are somewhat remarkable, I decided to talk about them.

Spidy Ani from Imperium Galactica. The name sounded vaguely familiar, and indeed I’ve encountered it before. I did not bother to support the old sprite format, since those are just animations for which you need to know the background image and palette. Version 2 contains both a palette and (optionally) audio, making its support much more reasonable. As for the compression, it keeps the old RLE method and introduces a new one (where low values mean copy length instead of a literal), plus it adds optional LZSS compression. The best part is that it’s not only been reverse engineered by somebody else but also documented (in the open-ig wiki).

Shadowcaster cutscenes. This is a format used by a game published by Origin; it turned out to be RLE, optionally further compressed by LZSS (I shan’t blame you if you see some similarities here). The main difference is that the format contains commands for e.g. playing an external XMI track (or stopping it). And instead of one palette it can store several and switch between them arbitrarily.

Castles 2 (CD version) greyscale video. This one is very special: the game has colour cutscenes in Interplay MVE format (usually featuring castles), but beside that it apparently has some greyscale animations lifted from public-domain films. The files have the .m extension and start with the “BYON PIZZA” string, definitely not hinting at Byon Garrabrant, who was the lead programmer of that game.

It was extremely hard to read the binary specification for it, as the executable stored that code and data in an executable overlay. That means it was in a special part of the executable that gets loaded on demand, thus lacking proper references to the data and even to functions. Luckily it turned out that I didn’t need it at all.

One of the sequences turned out to be just one black (or white) frame. Seeing that the frame was exactly a quarter of the raw image size and filled with 0x40, I decided to calculate a histogram of some other (larger) frame. It turned out that indeed, if you output values in the 0x00–0x3F range as is and repeat values in the 0x40–0x7F range four times, you get exactly a full image. Then, by looking at the reconstructed image, I understood that it codes data in 2×2 blocks (and in inverse mode, so 0x00 means white and 0x3F means black). After that, figuring out that codes 0x80–0xFF in subsequent frames are used to signal skips was trivial.
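
My reading of that histogram insight, as a sketch (the exact value mapping for the 0x40–0x7F range—taking the low six bits—and stopping at the skip codes are my assumptions):

```rust
// Sketch: 0x00-0x3F codes one 2x2 block value, 0x40-0x7F codes a value
// (assumed: low six bits) repeated for four blocks; 0x80-0xFF signal
// skips in inter frames and are not handled here.
fn expand_block_values(src: &[u8]) -> Vec<u8> {
    let mut out = Vec::new();
    for &v in src {
        match v {
            0x00..=0x3F => out.push(v),
            0x40..=0x7F => out.extend(std::iter::repeat(v & 0x3F).take(4)),
            _ => break, // skip codes: count semantics not covered by this sketch
        }
    }
    out
}
```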

That was fun. Hopefully I’ll encounter more interesting formats in the future.

P.S. I’ve also added PI-Video decoding support to NihAV. The codec was used in at least one multimedia encyclopaedia and did some non-conventional things with LZW coding. I’m still not sure it was a good idea, because the codec developer may be russian, but since I could not find any information about him (or about the companies involved, besides them being Western European), I’m giving it the benefit of the doubt.

My Rust experience after eight years

March 29th, 2025

Soon it will be eight years since I (re)started NihAV development in Rust. And for this round date I’d like to present my impressions and thoughts on the language—from the perspective of its applicability in my experimental multimedia framework.


A look on Perfect Clarity Audio

March 28th, 2025

Since Paul asked me to look at it (and, what’s equally important, provided the binary to look at), I did exactly that.

It turned out to be a rather simple codec, originally known as Sonic Foundry Lossless. The only really interesting detail is that it has intra- and inter-frames. Intra frames start with 0xFFFFFFFF and the last four samples for each channel (for filtering purposes). Each frame stores the number of samples (I suppose), the coded data size, the Rice code parameter (per channel), the filter method/order (ditto), LPC coefficients (same, though only if the global flags enable them) and probably a CRC.

Data is coded as fixed-parameter Rice codes (the low bit is used for the sign)—unless it’s filter method 5, which means all zeroes. Then comes an optional LPC phase (the LPC has order four and 8-bit coefficients) and then, depending on the filter order, fixed prediction as in Shorten.
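
A sketch of how such fixed-parameter Rice codes with a sign bit decode (the bit order and the exact sign convention—low bit set meaning negative—are my assumptions, not taken from the codec):

```rust
// Minimal MSB-first bit reader plus Rice decoding with the low bit as sign.
struct BitReader<'a> {
    data: &'a [u8],
    pos: usize, // bit position
}

impl<'a> BitReader<'a> {
    fn bit(&mut self) -> u32 {
        let b = (self.data[self.pos >> 3] >> (7 - (self.pos & 7))) & 1;
        self.pos += 1;
        b as u32
    }
    fn bits(&mut self, n: u32) -> u32 {
        (0..n).fold(0, |v, _| (v << 1) | self.bit())
    }
}

fn rice_decode(br: &mut BitReader, k: u32) -> i32 {
    let mut q = 0;
    while br.bit() == 0 {
        q += 1; // unary-coded quotient
    }
    let u = (q << k) | br.bits(k); // remainder in k bits
    ((u >> 1) as i32) ^ -((u & 1) as i32) // low bit selects the sign
}
```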

Finally, there may be mid/side stereo reconstruction (if the global config signals it) and clipping for the 24-bit mode (but apparently not for 16-bit).

I don’t know if they’ve added some improvements in the newer versions, but this looks rather simple compared to other lossless codecs. There’s a final bit of weirdness: it’s usually stored in WAV, which means it’s one of those codecs with variable-size frames in WAV. Not something I’d like to support.

P.S. I’ll document it for The Wiki (as well as DVC from the previous post) a bit later.