(Disclaimer: wall of text incoming, and you really don’t need to read any of it to follow the existing discussion. It is all just background.)
For what it is worth, I even think the idea is within the realm of reasonable engineering. Who knows what the technology of 2075 is based on (surely not silicon transistors!), but physics still says that working with information creates heat. Heat will disrupt most systems eventually, and will outright destroy others. Current technology has the neat twist that as a chip gets hotter, leakage current increases (on top of the working current), which generates even more heat – a positive feedback loop. Technology is carefully designed to stay clear of that point of thermal runaway, because we don’t want meltware – but that doesn’t mean you have to design things that way.
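If anyone wants to see how that feedback plays out, here is a toy back-of-the-envelope model. Every constant in it is invented for illustration (a real chip is far messier), but the shape of the behaviour is the point: with decent cooling the temperature settles at a warm equilibrium, while with deliberately undersized cooling the exponential leakage term eventually outruns the linear cooling term and the thing cooks itself.

```python
# Toy numerical sketch of the leakage feedback loop described above.
# All constants are made up for illustration; this is not a real device model.

def leakage(temp, ambient=25.0, leak_ref=1.0, doubling=20.0):
    """Assumed rule of thumb: leakage power roughly doubles every `doubling` degrees C."""
    return leak_ref * 2.0 ** ((temp - ambient) / doubling)

def simulate(cooling_per_degree, seconds=1200, p_work=10.0,
             ambient=25.0, heat_capacity=50.0, melt_at=150.0):
    """Step a lumped thermal model forward in 1-second increments; return (final_temp, melted)."""
    temp = ambient
    for _ in range(seconds):
        heat_in = p_work + leakage(temp, ambient)          # working power + leakage (W)
        heat_out = cooling_per_degree * (temp - ambient)   # cooling proportional to delta-T (W)
        temp += (heat_in - heat_out) / heat_capacity       # simple Euler step
        if temp >= melt_at:
            return temp, True
    return temp, False

# A conservatively designed module settles at a warm-but-safe equilibrium...
print("safe design:    ", simulate(cooling_per_degree=0.4))
# ...while a deliberately under-cooled 'meltware' module runs away and cooks itself.
print("meltware design:", simulate(cooling_per_degree=0.2))
```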
At the same time, you could build an active cooling system that sucks heat away from the working area and dumps it into some heat sink. The heat sink radiates some of that heat away, but the whole point here is that meltware creates heat faster than it can be dissipated. The active cooling keeps the working area at a non-destructive temperature for a while, but as the heat sink gets warmer the system can pull less and less heat away from the working area. If your technology has a thermal-runaway mechanism, things finish with a flourish. Even if you pull the plug early, the game can just say ‘the lingering heat doesn’t destroy everything instantly, but it does enough damage by the time everything has cooled down that the module is unusable anyway’.
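Same disclaimer for the heat sink version – this is a crude two-node sketch with made-up constants, just to show the arc the rules are describing: the working area sits happily at a safe temperature while the sink still has headroom, and once the sink saturates the working area climbs until the module is slag.

```python
# Rough two-node sketch of the 'active cooling into a finite heat sink' idea above.
# Again, every number is invented; the shape of the curve is the point, not the values.

AMBIENT = 25.0
P_CORE = 40.0          # heat generated in the working area (W), assumed constant here
PUMP_MAX = 60.0        # best-case heat the cooler can move out of the core (W)
SINK_LIMIT = 90.0      # sink temperature at which the cooler can no longer pump (C)
SINK_TO_AIR = 0.3      # passive dissipation from sink to ambient (W per degree C)
CORE_CAP = 20.0        # core heat capacity (J per degree C)
SINK_CAP = 200.0       # sink heat capacity (J per degree C)

core, sink = AMBIENT, AMBIENT
for t in range(0, 1201):
    # The cooler's capacity falls off as the sink approaches its limit.
    headroom = max(0.0, (SINK_LIMIT - sink) / (SINK_LIMIT - AMBIENT))
    pumped = min(P_CORE, PUMP_MAX * headroom)       # heat moved core -> sink (W)
    dumped = SINK_TO_AIR * (sink - AMBIENT)         # heat the sink sheds to the air (W)
    core += (P_CORE - pumped) / CORE_CAP            # 1-second Euler steps
    sink += (pumped - dumped) / SINK_CAP
    if core >= 150.0:
        print(f"t={t:4d}s  core={core:6.1f}C  sink={sink:5.1f}C  -- working area cooked")
        break
    if t % 200 == 0:
        print(f"t={t:4d}s  core={core:6.1f}C  sink={sink:5.1f}C")
```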
And finally, the test to make meltware could be a hardware one. The concept is that you take a piece of programmable hardware and convert your software program into it. That is, rather than running a software program on a generic processor, you turn it into a hardware program on a customizable module – that module now can’t do anything else, but it does its one thing very well. With current technology, a program loaded into something like an FPGA (field-programmable gate array) runs much faster than a ‘normal’ software program on a generic processor – but there are only so many applications where the cost is worth the performance boost. We don’t know what future technology looks like, but the same basic principle is likely to hold (customized hardware is faster than software running on generic hardware, but more painful and expensive to make). In theory you could manufacture a fully customized piece of hardware – what we would currently call an ASIC – rather than using a programmable one, but that would probably be vastly more expensive. It could be used, however, to offer that “availability 18, really expensive” option that players won’t usually buy, but is sometimes offered to them anyway.
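For the software-versus-dedicated-hardware point, here is a very loose analogy in code (an FPGA is obviously not an interpreter, and the function names below are just made up for illustration): a generic routine that can run any small ‘program’ handed to it as data, versus the same task baked into a fixed routine that can do nothing else but has no interpretation overhead left.

```python
# Loose software analogy for the generality-vs-speed tradeoff described above.

# 'Generic processor' flavour: one routine that can run any small program
# expressed as data, paying interpretation overhead on every step.
def run_generic(program, x):
    for op, arg in program:
        if op == "mul":
            x *= arg
        elif op == "add":
            x += arg
    return x

scale_and_offset = [("mul", 3), ("add", 7)]   # the 'software' version of the task

# 'Customized hardware' flavour: the same task baked into a fixed routine.
# It can no longer do anything else, but there is nothing left to interpret.
def run_specialized(x):
    return x * 3 + 7

assert run_generic(scale_and_offset, 5) == run_specialized(5) == 22
```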
Sorry for going on at such length – I just kind of like it when real-world concepts and useful game design can be made to line up to some extent, and this happens to be an area I know just enough about to wave my hands and offer an explanation.