In the comments to the previous post a certain Paul B. pointed out that the SVQ1 decoder (the one in libavcodec or mine) decodes certain files with visual artefacts. So I opened the old dreary QuickTime.qts with Ghidra to look at its contents once again (last time it was for QDesign Music details, but luckily I had marked the SVQ1 decoder functions back then as well).
The official binary specification turned out to have a slightly different design, with just one block decoding function that gets either intra or inter codebooks passed to it (so an intra block is essentially decoded by adding a residue to a zeroed block using the intra codebooks). And, more curiously, the codec keeps pixels as 16-bit values up to the very end of decoding.
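That unified design can be sketched roughly like this (a hypothetical sketch in Rust, not the actual binary's code; the function names, the 4x4 block size, and the codebook entry layout are my own assumptions for illustration):

```rust
// Sketch: one function adds codebook entries to a block of 16-bit pixels;
// an intra block simply starts from a zeroed block instead of a
// motion-compensated one, so the same code path serves both modes.
fn add_codebook_entries(block: &mut [i16; 16], entries: &[[i8; 16]]) {
    for entry in entries {
        for (pix, &delta) in block.iter_mut().zip(entry.iter()) {
            // pixels stay 16-bit throughout, matching the reference decoder
            *pix += i16::from(delta);
        }
    }
}

fn decode_intra_block(entries: &[[i8; 16]]) -> [i16; 16] {
    // intra = residue added to a zero block using the intra codebooks
    let mut block = [0i16; 16];
    add_codebook_entries(&mut block, entries);
    block
}

fn main() {
    let block = decode_intra_block(&[[1i8; 16], [2i8; 16]]);
    println!("{:?}", &block[..4]);
}
```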
As you can guess, the artefacts that look like white blocks are caused by pixel values going out of the 8-bit range. I hooked a GDB script to mplayer2 loading the QuickTime decoder (and presenting some garbage instead of a properly decoded frame) to see what happens in a block showing such an artefact. It turned out that a pixel with the original value 0xCF got increased to 0x14F during the codebook additions, and the reference decoder output it as 0x4F. So I changed clamping to discarding the top bits, and now it works much better.
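In code terms the difference looks roughly like this (a minimal sketch, not the actual decoder code; the helper names are mine):

```rust
// Old behaviour: clamp the 16-bit intermediate to the 8-bit range,
// turning an overflowed pixel into 0xFF (hence the white blocks).
fn finalize_clamped(pix: i16) -> u8 {
    pix.max(0).min(255) as u8
}

// New behaviour, matching the reference decoder: just discard the top
// bits, so 0x14F comes out as 0x4F.
fn finalize_truncated(pix: i16) -> u8 {
    (pix & 0xFF) as u8
}

fn main() {
    let pix: i16 = 0xCF + 0x80; // 0x14F after codebook additions
    assert_eq!(finalize_clamped(pix), 0xFF);   // white artefact
    assert_eq!(finalize_truncated(pix), 0x4F); // matches QuickTime output
    println!("clamped: {:#x}, truncated: {:#x}",
             finalize_clamped(pix), finalize_truncated(pix));
}
```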
Considering that the codebooks are stored as a single .dll resource and the block decoding function works (for performance reasons) as a chain of block-modifying functions with a stackless calling convention, I call the results good enough and leave further digging to those who want more.
Oh, why was I so stupid that I could not come up with such a simple solution? Instead I wasted countless hours on “fixing” integer overflows.
There’s a saying that any wise man can be baffled by some simple thing. I’m not very wise, but it has happened to me countless times.