Let's assume we have the requisite tapes and source code listings (a non-trivial assumption). Since there's no hardware, we are stranded at ebb tide if, for whatever reason, we decide to run the code. What layers of computing history are the most notable and the most vulnerable to oblivion?
Newer calculators, like the TI-Nspire CX series and the HP Prime.
Proprietary non-linear editing systems from the '80s and '90s, before the switchover to software-based editing.
A few legacy systems that haven't quite been emulated yet are BeOS and RISC OS; certain proprietary Unix implementations like A/UX, Amiga Unix (Amix), IRIX, NeXTSTEP, Apollo Domain/OS, AIX, HP-UX on PA-RISC, and Ultrix; and Japanese PCs like the PC-98, PC-88, Sharp X1, Sharp X68000, and FM Towns.
Minicomputer hardware, like VAX machines, DEC Alpha systems, and DECstations, is also lacking in good emulation, but the OSes (OpenVMS and Ultrix) are available to hobbyists.
Itanium has so far not been emulated, so the IA-64 versions of Windows will probably be lost (no more IA-64 processors are being made) should nothing be done.
For anyone outside IBM, developing an emulator might be difficult: neither the original low-level CISC instruction set used through the 1990s nor the PowerPC AS extension to the POWER architecture that replaced it has been fully described in public documentation.
Videogame and home computer emulation proved the opposite: that with a different organizational structure, preservation was possible, even easy. (Writing an emulator isn't hard at all if you're not picky about performance.)
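To make the "not hard if you're not picky about performance" point concrete, here is a minimal sketch of the core of any emulator: a fetch-decode-execute interpreter loop. The machine here (an 8-bit accumulator CPU with a handful of invented opcodes) is hypothetical, purely for illustration, not any real architecture; real emulators differ mainly in how many instructions and peripherals they must model, not in this basic shape.

```python
# Toy interpreter for a hypothetical 8-bit accumulator machine.
# The opcodes (LOAD, STORE, ADDI, JNZ, HALT) are invented for this
# example; a real CPU would have far more, but the loop is the same.

def run(program, memory=None):
    """Interpret a list of (opcode, operand) pairs until HALT."""
    mem = memory if memory is not None else [0] * 256
    acc = 0  # accumulator register
    pc = 0   # program counter
    while True:
        op, arg = program[pc]      # fetch and decode
        pc += 1
        if op == "LOAD":           # acc <- mem[arg]
            acc = mem[arg]
        elif op == "STORE":        # mem[arg] <- acc
            mem[arg] = acc
        elif op == "ADDI":         # acc <- acc + arg (8-bit wraparound)
            acc = (acc + arg) & 0xFF
        elif op == "JNZ":          # branch to arg if acc is nonzero
            if acc != 0:
                pc = arg
        elif op == "HALT":
            return acc, mem
        else:
            raise ValueError(f"unknown opcode {op!r}")

# Example: compute 3 + 4 and store the result at memory address 0.
program = [("ADDI", 3), ("ADDI", 4), ("STORE", 0), ("HALT", 0)]
acc, mem = run(program)            # acc == 7, mem[0] == 7
```

Interpreting each instruction in Python like this is orders of magnitude slower than the original hardware, which is exactly the trade-off the parenthetical alludes to: correctness and preservation are easy; speed is where the engineering effort goes.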
For a system to be unemulatable it has to be unloved. If it's unloved does it matter if it's emulatable?
I don't have numbers on this, but my impression is that the software is more often the main issue: nothing is known about it, while some hardware and hardware documentation survive in a state that can be reverse-engineered.