Cable Artifacts
Jan 22, 2007
6 Comments

Some time ago, my family tried out digital cable. We were thoroughly unimpressed by the 'digital quality' and subsequently dropped the service. But strangely enough, the digital artifacts that annoyed us so greatly are now showing up in the analog signal. We routinely see compression issues and dropped areas in the picture, and it only seems to be getting worse. Is Time Warner digitally encoding the signal before they send it out on the analog line? It sure seems like they are, but I don't see the benefit in doing that. Maybe it's cheaper on their end? Does anyone else with cable see this problem?
kip
3:52 AM on Jan 22, 2007

Dustin
9:48 PM on Jan 22, 2007

d2
4:42 PM on Jan 25, 2007

Maybe the feeds they get from the networks have switched to digital?

On an unrelated note, I was watching a rerun of the first Star Trek movie when it hit me that they used this sort of artifacting to imply a weak or jammed signal. That seemed pretty cool: usually I'm quick to criticize movies that don't even try to mirror reality (CSI drives me *NUTS* with all its inconsistencies and implausible stuff). But Star Trek movies have used this sort of digital artifacting (i.e., block-level compression degradation instead of analog static) to convey weak or failing communications since a few years before I was even aware of digital artifacts. How cool is that!?