Geez, I’m out. Good luck.
We probably need a dedicated thread to... they would release the fix they acknowledged is needed in a more reasonable timeframe. ...
> It used to be both, but the latest encoding quality has improved significantly since I raised these issues.

And hopefully they also stop adding the fine-grain overlay.
The upcoming firmware update will address the unintended processing that causes block artifacts and grain smoothing (DNR) in brighter areas of the image.
It will also fix the minor over-cropping issue, which is partly related to store metadata. In practice, all 2.38:1 and 2.39:1 content is currently being treated as if it were 2.40:1.
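As a rough illustration of what that mis-tagging costs (a back-of-the-envelope sketch; the actual cropping pipeline is not public), here is the picture area lost on a 3840-pixel-wide UHD frame when 2.38:1 or 2.39:1 content is cropped as 2.40:1, assuming the crop is applied vertically:

```python
# Estimate the rows of picture lost when 2.38:1 / 2.39:1 content is
# cropped as if it were 2.40:1 (illustrative only, not the actual pipeline).

FRAME_WIDTH = 3840  # UHD frame width in pixels

def active_rows(aspect_ratio: float, width: int = FRAME_WIDTH) -> int:
    """Rows of picture a given aspect ratio occupies in a frame this wide."""
    return round(width / aspect_ratio)

rows_kept = active_rows(2.40)  # rows kept when treated as 2.40:1
for true_ratio in (2.38, 2.39):
    rows_actual = active_rows(true_ratio)
    print(f"{true_ratio}:1 source: {rows_actual} rows of picture, "
          f"cropped to {rows_kept} -> {rows_actual - rows_kept} rows lost")
```

For a 2.39:1 title that works out to roughly 7 of about 1607 picture rows (about 13 for 2.38:1), under one percent, which is consistent with "minor" above, though still a real loss.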
> For those who think that this is not the ultimate home movie platform, please tell me what is. I need to know.

It's all subjective, of course, but to me "ultimate" would mean the one that has everything I am interested in watching, at the absolute best video and audio quality possible. I do not think K delivers on that. Given that K is not limited by bandwidth or storage, we should not be seeing the type of artifacts that we see on streaming.
If you don’t see the issue, don’t understand it, or simply aren’t bothered by it, that’s fine; there can be any number of reasons (display type, bright vs. dark viewing environment, calibration, accurate picture mode vs. vivid garbage, knowledge level, eyesight, etc.). What’s annoying is the people who come here and downplay these issues as if they don’t exist, even though KS has confirmed pretty much all the claims I've made so far.
Personally, I didn’t spend thousands of dollars on a TV (and calibration), AVR, and speakers just to use a player that introduces quality loss, no matter how small that loss may be.
It’s exactly the same on TV forums: you constantly see people denying issues simply because they don’t notice or understand them. For example, I’ve lost count of how many times someone with an LG OLED has claimed their TV doesn’t show posterization or banding in HDR10. Every LG OLED has this issue.
> You have a right to complain about the anomalies, and we have a right to say they don't bother us; it's just that simple.

I was mostly referring to people who come here claiming our device is defective while theirs isn’t. That kind of statement is frankly insulting.
> I was mostly referring to people who come here claiming our device is defective while theirs isn’t. That kind of statement is frankly insulting.

On the other hand, it would be nice if people stating they don't see an issue weren't laughed at, or if dealers weren't accused of "downplaying" the issue for business reasons or even rated as untrustworthy.
> Why is it only 1080p? That's the thing I find the most shocking, and all we get told is that it's a hardware issue.

This has been discussed ad nauseam on here. It is a limitation of the available chipset.
> On the other hand, it would be nice if people stating they don't see an issue weren't laughed at, or if dealers weren't accused of "downplaying" the issue for business reasons or even rated as untrustworthy.

Excuse me?
> This has been discussed ad nauseam on here. It is a limitation of the available chipset.

How can it be a limitation when the device can play 4K? Are we honestly to believe that another chip is used at 1080p for the video wall? I don't buy it, and I am sure there is a workaround, but nobody wants to do it, perhaps because of the C, an older player. Even cheaper players can do 4K for their video wall. It's beyond comprehension that you buy the latest players and servers and they can't do 4K on the video wall.
> How can it be a limitation when the device can play 4K? Are we honestly to believe that another chip is used at 1080p for the video wall? I don't buy it, and I am sure there is a workaround, but nobody wants to do it, perhaps because of the C, an older player. Even cheaper players can do 4K for their video wall. It's beyond comprehension that you buy the latest players and servers and they can't do 4K on the video wall.

There's a difference between video playback and the graphics rendering engine.
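To make that distinction concrete, here is a minimal sketch of how many media SoCs split the display pipeline: a dedicated hardware plane carries decoded video (which can be 4K), while the UI is rendered on a separate graphics plane that the chip may cap at 1080p and upscale before compositing. The plane names and resolutions below are illustrative assumptions, not Kaleidescape's documented architecture.

```python
# A sketch of a common media-SoC display pipeline (an illustrative model,
# not Kaleidescape's actual architecture): the video decoder feeds a
# dedicated hardware video plane, while the UI/graphics engine renders to
# a separate plane that may be capped at 1080p and upscaled at composite time.

from dataclasses import dataclass

@dataclass
class Plane:
    name: str
    width: int
    height: int

def composite(planes: list[Plane], out_w: int = 3840, out_h: int = 2160) -> None:
    """Report how each plane reaches the 4K output: native or upscaled."""
    for p in planes:
        scale = out_w / p.width
        note = "native" if scale == 1.0 else f"upscaled {scale:g}x"
        print(f"{p.name}: {p.width}x{p.height} -> {out_w}x{out_h} ({note})")

# Hypothetical planes: 4K decoded movie vs. 1080p-limited UI (video wall,
# cover art, menus) that gets scaled up to fill the 4K output.
composite([
    Plane("video plane (decoder output)", 3840, 2160),
    Plane("graphics plane (UI render)", 1920, 1080),
])
```

On such a chip, "plays 4K" and "1080p video wall" can both be true at once: the decode path and the UI path are different hardware blocks with different ceilings.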
> There's a difference between video playback and the graphics rendering engine.

Given how expensive the equipment is, they decided to cut costs there. I can't fault the system overall, but this is one big cock-up and a stupid cost-saving exercise, in my opinion. The latest players can't do a 4K video wall. Am I the only one who thinks this is so poorly done? On a projector screen it just looks crap.
> Given how expensive the equipment is, they decided to cut costs there. I can't fault the system overall, but this is one big cock-up and a stupid cost-saving exercise, in my opinion. The latest players can't do a 4K video wall. Am I the only one who thinks this is so poorly done? On a projector screen it just looks crap.

Call it what you want, but it wasn't a cost-savings measure.
> Call it what you want, but it wasn't a cost-savings measure.

Sorry to disagree with you on this. If smaller outfits like Zidoo and R_volution, whose players cost considerably less, can do this, then there is no excuse for cost-cutting here. I would rather they had put in a separate chip that could do this and charged me an extra $200.
Your Apple TV is using custom silicon designed by a trillion-dollar company with billion-device economies of scale, exclusively for their own use. As mentioned in the post that I referenced above, when developing a hardware product, you have to choose from the set of components that are available to you. There are a very limited number of chips that can:
- Decode 2160p video at the bit-rates that we require for our quality level
- Process lossless audio, including lossless Dolby Atmos and DTS:X
- Decode and process Dolby Vision content
- Decrypt high-value content within a secure...
> Call it what you want, but it wasn't a cost-savings measure.

Unless you have some inside info that the rest of us are not privy to, it is hard to say that definitively.
> Excuse me?

That wasn't aimed specifically at you, but yes, that has happened in this thread. I agree, hard to believe.
Some quick googling/AI leads me to ask...
I think it is just as likely that their UI will not run in 4K on the new chip without substantial rework (i.e., cost), and thus the chip "does not support" the 4K UI.