That is not true; we are currently using it on a Linux-based Chinese handheld console, and it works great.
(Whoops... Long post. Summary - too hard, no benefit. There are much more effective ways to do this on a console like the Dreamcast, games don't really benefit from things like zRAM anyway, and it's much better suited for Live CDs.)
It needs a proper operating system to work. Linux, for example. KOS doesn't have the required bits and pieces (like MMU support, virtual memory, and swapping) to build something like this.
By themselves, MMU support and virtual memory are nearly useless on a system like the Dreamcast, which is why KOS doesn't use them. You have no writable disk, so you can't page unused memory out. You can't really even do things like mapping a file into a program's address space, because the read-only disc you have (a CD drive, basically) is slow and has very high seek latency. So, under memory pressure, the CD drive would start to go nuts, the program would slow to a crawl, and it would probably take a long time to recover. Hardly any better than just crashing because it ran out of memory.
Plus, you get extra runtime overhead from TLB misses and from having to manage a virtual address space, and it makes it that much more difficult to deal directly with the hardware (for example, you can't use DMA to transfer a buffer over to VRAM if you don't know where the buffer is in physical address space, and you have no guarantee that the buffer is contiguous).
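Without an MMU, that problem just doesn't exist: on the Dreamcast's SH-4, a pointer in the cached (P1) or uncached (P2) region differs from the physical address only in the top three bits, so getting a DMA-usable address is a single mask. A rough sketch (`phys_addr` is an illustrative name, not a KOS function):

```c
#include <stdint.h>

/* With the SH-4's MMU disabled, cached (P1, 0x8xxxxxxx) and uncached
   (P2, 0xAxxxxxxx) pointers map straight onto the 29-bit physical
   address space, so translation is just stripping the segment bits. */
static inline uint32_t phys_addr(const void *p) {
    return (uint32_t)(uintptr_t)p & 0x1FFFFFFF;  /* drop the segment bits */
}
```

With virtual memory turned on, you'd instead need the OS to translate (and pin) every page for you, and a buffer spanning multiple pages might not even be physically contiguous.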
Arguably, it might give you a way around memory fragmentation. In practice, it won't - you're more likely to get some degree of heap fragmentation, which will cause you to run out of physical memory anyway. Unless you very carefully manage memory yourself. Which you can do just as easily without an MMU.
So, even if you implemented MMU support in KOS (which is a lot of work, and most of KOS itself would need to be heavily modified), you wouldn't be able to use it for anything. Except, perhaps, something like zRAM.
The problem with zRAM is that it actually needs more memory. You have to have some free memory to compress unused pages into, which means you need some way to evict pages from memory. So your application has to be using memory-mapped files for nearly all of its read-only data and code. And once bits of your code have been evicted from memory, you'll end up with the drive thrashing like mad, and everything will slow down to a crawl. Again.
You really don't need any of this stuff on a console like the Dreamcast. All this memory management stuff can be done more efficiently from the game you're running on it. After all, the game knows a lot more about what data it needs than the operating system ever could.
And you can be far more clever with it.
Say, for example, you need to access a model file. And let's assume you have zRAM, and the model's data is sitting in the compressed swap area. Your game tries to access the model; it has no way of knowing the data isn't actually resident. The OS notices the invalid memory access, switches to kernel mode, finds the compressed data, allocates some memory for it (which will probably involve either compressing a different memory page, or evicting some of your read-only code or data), decompresses it, and returns control to your game, which carries on as if nothing happened. Except a huge chunk of CPU time has been used up, and all the caches have been wiped out. If you do this too often (I'm thinking once or twice per frame, maybe), you can kiss any hopes of getting 60FPS goodbye.
Worse - you try to access a model file, and it's actually on the CD. Now, your whole game stalls for ~300ms, while the CD drive grabs the data off the disc. You'd be lucky to get 3FPS like that.
A better approach is to have the game itself control this stuff. Your game needs the model file. It checks, and finds out that it hasn't been loaded yet (or maybe it's been evicted because you've not used it in a while). It sticks the file in a queue of files to load (and if the CD drive isn't busy, starts it reading), and skips over it. The game runs as normal, but the model won't show up for ~300ms or so. When the file's been loaded, it'll be immediately usable, and the game will be able to use it next frame. Better still - your game might know that, in a second or so, you're going to need that model, so you stick it into the queue before you need it. Then, everything just magically works - no loading times, no pauses, no sudden pop-ins.
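The skeleton of that approach is tiny: resources carry a state, requests that miss go onto a queue, and the renderer just skips anything that isn't resident yet. A hedged sketch (all names are hypothetical, and `stub_load` stands in for kicking off a real asynchronous CD read):

```c
#include <stddef.h>

enum res_state { RES_UNLOADED, RES_QUEUED, RES_LOADED };

struct resource {
    const char *path;
    enum res_state state;
    void *data;                          /* valid only once RES_LOADED */
};

#define QUEUE_LEN 32
static struct resource *load_queue[QUEUE_LEN];
static int q_head = 0, q_tail = 0;

static char stub_file[16];               /* stand-in for real file data */

/* Stand-in for starting an asynchronous CD read. */
static void *stub_load(const char *path) {
    (void)path;
    return stub_file;
}

/* Called from game code each frame: returns the data if it's resident,
   otherwise queues a load and returns NULL so the caller skips it. */
static void *res_request(struct resource *r) {
    if (r->state == RES_LOADED) return r->data;
    if (r->state == RES_UNLOADED) {
        r->state = RES_QUEUED;
        load_queue[q_tail] = r;
        q_tail = (q_tail + 1) % QUEUE_LEN;
    }
    return NULL;
}

/* Called whenever the drive is idle: services one queued request. */
static void loader_service(void) {
    if (q_head == q_tail) return;        /* nothing queued */
    struct resource *r = load_queue[q_head];
    q_head = (q_head + 1) % QUEUE_LEN;
    r->data = stub_load(r->path);
    r->state = RES_LOADED;
}
```

The prefetch trick falls out for free: call `res_request` a second or two before the model is needed, and by the time it's actually drawn, it's already resident.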
You can also apply this to textures and sounds, which reside in separate memory on the Dreamcast, and couldn't be managed by an OS even if you wanted to.
Granted, it's much more difficult to actually do this than it would be to let the OS handle everything for you. But it also works a hell of a lot better on a console.
If you have something like an SD card (a fast SD card, not one hooked up using a serial port), the whole equation changes completely. Virtual memory suddenly makes sense. The best way to load (most) resources and code is probably to just map them into a virtual address space, and let the operating system manage it. As an added bonus, initialization code or data that's only used at start-up will eventually get kicked out of main memory, and you won't notice.
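On a POSIX-ish system, "just map it and let the OS manage it" looks roughly like this (`map_resource` is an illustrative name; error handling trimmed):

```c
#include <stddef.h>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* Map a read-only resource file into the address space.  The OS pages
   it in on demand, and because the mapping is file-backed and read-only,
   unused pages (start-up data, cold code paths) can simply be dropped
   and re-read from flash later - no swap needed. */
static const void *map_resource(const char *path, size_t *len_out) {
    int fd = open(path, O_RDONLY);
    if (fd < 0) return NULL;

    struct stat st;
    if (fstat(fd, &st) < 0) { close(fd); return NULL; }

    void *p = mmap(NULL, (size_t)st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    close(fd);                           /* the mapping keeps the data alive */
    if (p == MAP_FAILED) return NULL;

    *len_out = (size_t)st.st_size;
    return p;
}
```

The key is `PROT_READ` + `MAP_PRIVATE` on a file-backed mapping: the kernel never has to write anything back, so eviction is free, and a page fault on fast flash is cheap enough to hide.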
You'd still really want to explicitly load things like textures or sounds, but the streaming approach I mentioned earlier is actually even better if you have an SD card, because it'll likely be much faster than a CD drive could ever be.
That's the approach you'd use on something like an iPhone. Or anything powerful enough to run a real operating system, with low-latency flash for storage.
However, streaming + mmap eliminates the need for zRAM (at least for games). The only stuff you'd ever want to compress is dynamic data that can't be re-loaded off the SD card by the operating system. In most games, most of the data is static, and can be re-loaded from storage as needed. The little dynamic data you do need is likely to be needed every single frame, so there's no advantage whatsoever in swapping it out or compressing it. Everything would slow to a crawl, instead of just crashing.
There are exceptions - this only really works for games. Applications (which we're unlikely to use on a Dreamcast) tend to consist almost entirely of dynamic data. zRAM would probably be useful there (especially if you're running an entire operating system like Linux at the same time). Emulators are almost entirely dynamic data (RAM, internal state, buffers), and ROMs are effectively opaque so there's no way to intelligently load parts of them - you need the whole thing in memory, or performance will tank. Neither MMUs nor zRAM would help.
By the way - the approach I described above for the Dreamcast is actually very common on all kinds of consoles. Systems with limited memory and only an optical disc drive (PS1, Dreamcast, PS2, GameCube, PSP, Xbox 360), and systems with limited memory and a slow cartridge (N64, DS). Even systems with a hard drive (Xbox, PS3, Xbox 360) ultimately have to run off an optical disc, and there's a limited amount of hard drive space (which might be optional, full up, or temporary) which you can really only use as a cache. None of these have much in the way of an operating system either, and where they have an MMU, it's purely for security reasons - you still have to manage memory yourself, and you have no virtual memory or paging to help you out. Streaming's the best bet. Careful memory allocation is needed as well.
Smartphones, tablets, and most of these ARM-based handhelds (real OS, limited memory, fast flash-based storage) are usually better off just using virtual memory to map static data and code, let the OS handle that, keep dynamic data as small as possible, and stream textures and sounds manually.
PCs are... weird. They're really the only place I can think of where zRAM would be useful: running an entire operating system off a read-only optical disc, with lots of dynamic data that isn't being used right now, no writable storage, and memory that would be better spent caching data from the disc. That's what zRAM was designed for - Linux Live CDs.
Chilly Willy wrote:
The old JDuke, which was the source of a lot of ports, had lots of problems that did eventually get sorted out. I'm not sure what the DC SW port was based on, but if it had troubles, it was probably an older version of JDuke.
From the readme, it appears to be based on the icculus.org Linux port. Which I seem to remember being very buggy, and just barely enough to get DN3D to run (assembly code replaced with C, platform-specific stuff isolated and replaced with SDL, made compatible with shareware, standard, and atomic edition).
Apparently, Bero's port had some kind of memory restriction, which prevented the full game from working. Then again, it was done using SDL (which is hardly efficient at the best of times), and I think it was abandoned as soon as it was released, so I doubt he put any effort at all into making it use less memory.