To be clear about Kaleidescape players: if you stop playback in one zone and then resume in another zone, the movie will pick up where you left off.
Regarding the OP's question, the media-over-IP systems that I'm familiar with all employ some degree of image compression. Keep in mind that a 4K/60 HDMI connection requires an 18Gbps link, so even on a system using a 10Gbps Ethernet connection, some compression must be applied. It might be possible to use mathematically lossless compression over a 10Gbps underlying link, but I don't know whether anybody does that. The systems that use a 1Gbps Ethernet link certainly must use lossy compression.
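To put rough numbers on that, here's a back-of-the-envelope sketch. It assumes 4K/60 at 8 bits per color (24 bpp) and the standard CTA-861 4K timing of 4400 x 2250 total pixels including blanking; the 10/8 factor is HDMI 2.0's TMDS 8b/10b encoding, which is how the familiar 18 Gbps figure arises.

```python
# Back-of-the-envelope bandwidth check: 4K/60 video vs. Ethernet links.
# Assumptions: 8-bit RGB / YCbCr 4:4:4 (24 bpp), CTA-861 4K timing.

H_ACTIVE, V_ACTIVE = 3840, 2160   # visible pixels
H_TOTAL, V_TOTAL = 4400, 2250     # total pixels, including blanking
FPS = 60
BITS_PER_PIXEL = 24

# Raw active video payload -- roughly what a network stream must carry.
active_bps = H_ACTIVE * V_ACTIVE * FPS * BITS_PER_PIXEL
print(f"Active video:       {active_bps / 1e9:.2f} Gbps")   # ~11.94 Gbps

# HDMI 2.0 link rate: total pixels x 24 bpp, times 10/8 for TMDS
# 8b/10b line coding -- this is where the ~18 Gbps figure comes from.
tmds_bps = H_TOTAL * V_TOTAL * FPS * BITS_PER_PIXEL * 10 / 8
print(f"HDMI 2.0 TMDS rate: {tmds_bps / 1e9:.2f} Gbps")     # ~17.82 Gbps

# Minimum compression ratio needed to fit the active video on each link.
for link_gbps in (10, 1):
    ratio = active_bps / (link_gbps * 1e9)
    print(f"{link_gbps:>2} GbE needs at least {ratio:.1f}:1 compression")
```

So a 10Gbps system only needs about 1.2:1, which is plausibly achievable losslessly, while a 1Gbps system needs roughly 12:1, which is firmly lossy territory.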
The other factor to consider is audio. Does every zone in your house support lossless Dolby Atmos and DTS:X? Or does the media-over-IP system you're looking at include decode and down-mix functionality to provide a suitable audio signal to your other zones? If not, you could potentially find your audio "dumbed down" to the level of your lowest-quality zone.
For my own home use, I have a Strato player connected to my media-over-IP system (a Crestron NVX in my case, which uses 1Gbps Ethernet) that feeds every display in my house except the theater, which has its own dedicated Strato. Although I've never seen a noticeable artifact on the NVX-connected displays, I wasn't willing to accept anything other than a direct connection for my prime movie zone. Plus, all of my other zones are stereo, while my theater of course is not.
This is just my thinking on the subject. Others are certainly welcome to disagree.