I love seeing this story… it reminds me of 30 years ago when I worked in the telephone industry. I heard about telephone companies rolling out service in very, very rural areas - running signals over barbed-wire fences because it was too expensive to run dedicated cables. That did degrade the signal, but it worked.
I know it’s a completely different thing, but it just gave me nostalgia, remembering hearing about that.
well obviously, all this proves is that copper wires are just as bad as wet mud. Every audiophile knows you need gold oxygen nitrogen purified wires blessed by a voodoo witch doctor.
The article isn’t clear on one thing: was it an analog or digital signal?
The results are entirely unsurprising if the signal was digital. I’d also like to see a similar test in an environment with more electrical interference; I think the unshielded materials would fare worse there.
This is a really old test. There are forum posts of the same concept and some news articles with the test that are so old that they 404 now. An unshielded coat hanger is the most common. And yes, this is done frequently with analog signals. No, you can’t tell the difference.
It’s a little confusing to me how it could be physically possible for there to be no difference.
I’m not saying I think I could tell which is which, but surely they can’t sound the same? Or if they do, why do we even use copper instead of just making cables out of the cheapest conductive substance available? Is copper just that substance?
I’m lightly active in the headphone enthusiast space. Even in the more light-hearted circles there is still an elevated amount of placebo bullshit and stubborn belief in things that verifiably make zero difference.
It’s rather fascinating in a way. I’ve been in and out of various hobbies over the course of my life but there is just something about audio that attracts an atmosphere of wilful ignorance and bad actors that prey on it.
I’ve been in the audio enthusiast community for about 17 years now. When I was fresh, internet commentators had me thinking there was some audio heaven in the high end compared to mid-range-priced gear. Now I know better, and the gear community isn’t full of high-end price evangelists like it used to be. I feel like there was a before and after the $30 Monoprice DJ headphones and the wave of headphones since, then especially IEMs. Once ChiFi really got rolling with IEMs, amplifiers, and DACs, the $1000+ snake oil salespeople had to deal with a way more competitive market.
Same with speakers. The internet changed everything. You’re no longer at the whim of whatever the specialty audio stores and Best Buys stock. Now you’ve got the whole world’s speaker brands at the click of a finger, plus craigslist/offerup. Also, again, ChiFi amplifiers and DACs, plus improvements in audio codecs, wireless or not. Bluetooth audio was awful until it stopped being awful as the standards improved.
These days I mostly see the placebo audio arguments among streaming service and FLAC/lossless encode fanboys. Headphone and speaker communities now seem a lot more self-aware and steeped in self-deprecating humor over the cost, diminishing returns, placebo, and snake oil they live in, compared to 17 years ago. I want my digital audio cables’ endpoints plated with the highest quality diamonds to preserve the zeros and ones. No lab diamonds. Must be natural, providing the warmth that only blood diamonds which excel in removing negative ions can. I treat my room with the finest pink Himalayan salt sound-absorbent wall panels to deal with the most problematic materials used by homebuilders. Authentic Himalayan salt has been shown to be some of the highest quality material for filtering unwanted noise and echoes while leaving clean, pure audio bliss.
I like lossless compression, but not because I’d be an audio nut. I prefer it from a data retention and archival viewpoint: I can cut and join lossless data as often as I like without losses accumulating.
Do you often cut and join audio that you did not record yourself?
You sound like the right person to ask, then: how much should I spend on a soundbar for a TV? Or at least, do you know a place to ask these questions that gives realistic answers with less fanboyism and fewer faux-intellectuals?
You can use this to connect your TV to bookshelf speakers through an optical cable. Just need some speaker wire or banana plug cables to go with it
This one has HDMI ARC which most sound bars use for connection along with optical
Then offerup/craigslist/marketplace for used bookshelf speakers. Practically anything will be far better than your TV. $50 used Polk, Klipsch, and Sony speakers are real common on the second-hand market. They may be old, but speakers last a really long time if you’re not blasting them at super high volumes. Go for speakers with 5.25"-6.5" woofers. You’ll appreciate them for music too.
There’s a bunch of brands and you really can’t go wrong compared to TV speakers. Edifier powered speakers don’t require a separate amplifier. Other major brands: ELAC, KEF, Wharfedale, Paradigm, …
I would never recommend a soundbar unless you’re absolutely stuck with that form factor for spatial reasons. Bookshelf speakers are still superior and don’t take up that much space. But I’m also not familiar with any soundbars; I just got tower speakers that sounded really good at a friend’s place and have been loving them.
Honestly I just want something that sounds better than tv speakers that won’t break the bank. It seems like everything everyone recommends is $400+, which isn’t crazy compared to the price of a tv but I just need the most basic thing possible that’s better than built-in for occasional movie nights with friends and family
I get that, but is a $400 soundbar really any good? Even the $1000 ones sound tinny and small to me, but maybe I’m just spoiled.
I bought a pair of Edifier powered bookshelf speakers (R1280T model, I think) for my living room setup and they work fine for casual TV and movie watching. Cost about $110 total. No subwoofer necessary, but I would add one if I had movie nights with more than just me and my partner (and didn’t have downstairs neighbors, lol).
I couldn’t agree more. I got interest in higher-end audio equipment when I was younger, so I went to a local audio shop to test out some Grado headphones. They had a display of different headphones all hooked up to the “same” audio source.
60x vs 80x sounded identical. 60x to 125x, the latter had a bit more bass. 125x to 325x, the latter had a lot more bass and the clarity was a bit better. Then I plugged the 60x into the same connection they had the 325x in. Suddenly the 60x sounded damn similar. Not quite as good, but the 60x was 1/3 the cost and the 325x sure as hell didn’t sound 3x better. They just had the EQ set better for it.
Picked up a Bose system test cassette once. It sounds amazing at first listen on anything because they overhype the high and low end, much like most bad modern music. And it’s actually fatiguing over time and stresses people out. A big reason I hate a lot of (popular) modern music is the overhyped, unnatural EQ.
Friends will show me songs and they grind on my ears with that unnatural 3k boost to make everything “radio sounding”. Gross. I don’t want modern radio polish (and the sampled kick drums, awful); I want good sound.
Commodores, Night Shift, 1985: one of the best-sounding albums of all time, because they knew what they were doing. And funnily enough, one of the first digital tape recordings, on a Mitsubishi! Also The Nightfly.
Yeah, and the loudness wars. It never ended, eh.
Yeah, sadly. Studies have suggested that modern music causes listening fatigue, and I think some people at least realize that now. Radio rock is always going to be a sausage waveform. Gotta go underground for the good stuff, usually.
These days I mostly see the placebo audio arguments in streaming service and FLAC/lossless encode fanboys.
The clamour for lossless/high-res streaming is the audiophile community in a nutshell. Literally paying more money so your brain can trick you into thinking it sounds better.
Like many hobbies, it’s mainly a way to rationalize spending ever increasing amounts on new equipment and source content. I was into the whole scene for a while, but once I had discovered what components in the audio chain actually improve sound quality and which don’t, I called it quits.
I’m a person with sensitive hearing, and MP3 always sounds muddy to me compared with a FLAC or WAV rip. My coworker pooh-poohed this notion, but I proved it to him. MP3 does alter the sound; most people won’t notice, but for somebody who does hear the differences it’s annoying. I would not spend 10k or anything. I paid $15 for an old 5.1 system, and at most $80 for a Pi 2 with a DAC hat. LOL
For me it’s like standing outside a person’s house and hearing them talking vs. their words coming over their TV. There is a noticeable signature that lets you hear whether it’s the TV or real people, and that’s what MP3 vs. WAV is like for me.
I can also hear my neighbour’s ceiling fan running in the connected townhome. That almost-inaudible drone of the motor running drives me nuts.
The push for lossless seems more like pushback on low bit rate and reduced dynamic range by avoiding compression altogether. Not really a snob thing as much as trying to avoid a common issue.
The video version of this is getting the Blu-ray, which is significantly better than streaming in specific scenes. For example, every scene that I have seen with confetti on any streaming service is an eldritch horror of artifacts, but fine on physical media, because the streaming compression just can’t handle that kind of fast-changing detail.
It does depend on the music or video though, the vast majority are fine with compression.
My roommate always corrects me when I make this same point, so I’ll pass it along. Blu-rays are compressed using H.264/H.265 too, just less aggressively than streaming services.
🤓☝️ Many older Blu-rays also used VC-1
Higher bitrate though, innit
Significantly higher: streaming is 8-16 Mbps for 4K, whereas 4K discs are >100 Mbps.
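To put those bitrates in perspective, here’s a back-of-the-envelope size comparison for a hypothetical 2-hour movie (the 16 and 100 Mbps figures come from the thread; the runtime is an assumption for illustration):

```python
# Rough file-size arithmetic for a 2-hour movie at the bitrates
# discussed above. Figures are illustrative, not measured.
def size_gb(mbps: float, hours: float) -> float:
    """Convert an average bitrate in Mbit/s to total size in GB."""
    bits = mbps * 1_000_000 * hours * 3600
    return bits / 8 / 1_000_000_000

print(f"streaming @  16 Mbps: {size_gb(16, 2):.1f} GB")   # 14.4 GB
print(f"4K disc   @ 100 Mbps: {size_gb(100, 2):.1f} GB")  # 90.0 GB
```

Roughly a 6x difference in bits spent on the same two hours of picture.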
People don’t like hearing this, but streaming services tune their codecs to properly calibrated TVs. Very few people have properly calibrated TVs. In particular, people really like to up the brightness and contrast.
A lot of scenes that look like mud are that way because you really aren’t supposed to be able to distinguish between those levels of blackness.
That said, streaming services should have seen the 1000 comments like the ones here and adjusted already. You don’t need Blu-ray levels of bits to make those dark scenes look better; you need to tune your encoder to allow it to throw more bits into the void.
Lmao, I promise streaming services and CDNs employ world-class experts in encoding, in both tuning and development. They have already pored over maximizing quality vs. cost. Tuning your encoder to allow for more bits in some scenes by definition ups the average bitrate of the file, unless you’re also taking bits away from other scenes. Streaming services have already found a balance of video quality vs. storage/bandwidth costs that they are willing to accept, which tends to be around 15 Mbps for 4K. That will unarguably provide a drastically worse experience on a high-enough-quality TV than a 40 Mbps+ Blu-ray. Like, day and night in most scenes and even more in others.
Calibrating your TV, while a great idea, can only do so much against low-bitrate encodings and the fake HDR that services build in solely to trigger the HDR popup on your TV and trick it into upping the brightness, rather than to actually improve the color accuracy/vibrancy.
They don’t really care about the quality, they care that subscribers will keep their subscriptions. They go as low quality as possible to cut costs while retaining subs.
Blu-rays don’t have this same issue because there are no storage or bandwidth costs to the provider, and people buying Blu-rays are typically more informed, have higher-quality equipment, and care more about image quality than your typical streaming subscriber.
I promise streaming services and CDNs employ world-class experts in encoding
They don’t really care about the quality
It’s funny that you are trying to make both these points at the same time.
You don’t hire world class experts if you don’t care about quality.
I have a hobby of re-encoding Blu-rays to lower bitrates. And one thing that’s pretty obvious is that the world-class experts who wrote the encoders in the first place have them tuned to omit data from dark areas of a scene to avoid wasting bits there. This is true of H.265, VP9, and AV1. You have to specifically tune those encoders to spend more of their bits on the dark areas, or you have to up the bitrate to absurd levels.
What these encoders spend the bitrate on in dark scenes is any areas of light within the scene. That works great if you are looking at something like a tree with a lot of dark patches, but it really messes with a single lit figure surrounded by darkness. It just so happens that it’s really easy to dump 2 Mbps on a torch in a hall and leave just 0.1 Mbps for the rest of the scene.
That will unarguably provide a drastically worse experience on a high-enough quality tv than a 40mbps+ bluray. Like, day and night in most scenes and even more in others.
I can tell you that this is simply false. It’s the same pseudo-scientific logic pushed by someone trying to sell gold-plated cables and FLAC encodings.
Look, beyond the darkness-tuning problem streaming services have, the other problem is QoS. The way content is encoded for streaming just isn’t ideal. When they say “we have to hit 14 Mbps,” they are forcing themselves to do 14 Mbps throughout the entire video. The reason is that they want to limit buffering as much as possible: it’s a much better experience to lower the resolution than to constantly buffer. But that choice makes it really hard to do good optimizations in the encoder. Every second of the video burns 14 Mbit whether it needs it or not. The way to deliver less data would be to only average 14 Mbps rather than forcing it throughout: allowing 40 Mbps bursts when needed but pushing everything else out at 1 Mbps saves on bandwidth. However, the end user doesn’t know that the reason they just started buffering is that a high-motion action scene is coming up (and Netflix doesn’t want to buffer for more than a few minutes).
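The constant-rate vs. average-rate tradeoff above can be sketched with toy numbers (the scene lengths and per-scene rates below are made up for illustration, not measured from any real service):

```python
# Toy comparison: constant-bitrate delivery vs. a variable-rate
# encode that bursts on hard scenes and coasts on easy ones.
scene_seconds = [300, 60, 300]      # quiet, action, quiet (hypothetical)
vbr_mbps      = [1.0, 40.0, 1.0]    # what each scene actually "needs"

# CBR spends 14 Mbps on every second, needed or not.
cbr_total = sum(scene_seconds) * 14.0
# VBR spends only what each scene needs, bursting to 40 Mbps briefly.
vbr_total = sum(s * r for s, r in zip(scene_seconds, vbr_mbps))

print(f"CBR total: {cbr_total:,.0f} Mbit")  # 9,240 Mbit
print(f"VBR total: {vbr_total:,.0f} Mbit")  # 3,000 Mbit
```

Same peak quality on the action scene, a third of the bits overall - but the client would have to pre-buffer ahead of the 40 Mbps burst, which is exactly the QoS problem described above.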
The other point I’d make is that streaming companies simply have a pipeline that they shove all video through. And because it’s so generalized, these sorts of tradeoffs that make stuff look like a blocky mess happen. Sometimes that blocky mess is present in the source material (the streaming services aren’t ripping the Blu-rays themselves; they get it from the content providers, who aren’t necessarily sending in raws).
I say all this because you can absolutely get 4K and 1080p looking good at sub-Blu-ray rates. I have a library full of these re-encodes that look great because of my experience here. A decent amount of HD media can be encoded at 1 or 2 Mbps and look great. But you have to make tradeoffs that streaming companies won’t make.
For the record, the way I do my encoding is a scene-by-scene encode using VMAF to adjust the quality rate, with some custom software I built to do just that. I target a VMAF score of 95, which ends up looking fantastic across media.
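The per-scene search described above can be sketched like this. Note `quality_of()` is a stand-in: in a real pipeline you’d encode the scene and run libvmaf on it (e.g. via ffmpeg), and the linear model here is purely illustrative, as is the CRF range:

```python
# Sketch: pick the highest CRF (i.e. smallest file) whose predicted
# VMAF still meets the target, via binary search over the CRF range.
def quality_of(crf: int) -> float:
    """Hypothetical VMAF estimate: lower CRF -> higher quality.
    Stand-in for encoding the scene and scoring it with libvmaf."""
    return max(0.0, 100.0 - 0.35 * crf)

def crf_for_target(target_vmaf: float, lo: int = 10, hi: int = 40) -> int:
    """Binary search for the highest CRF that still hits the target."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2
        if quality_of(mid) >= target_vmaf:
            best = mid           # good enough - try an even higher CRF
            lo = mid + 1
        else:
            hi = mid - 1         # too lossy - back off
    return best

print(crf_for_target(95.0))  # -> 14 under this toy quality model
```

Running this once per scene and concatenating the results gives each scene the lowest bitrate that still clears the quality bar, which is the tradeoff a one-size-fits-all streaming pipeline doesn’t make.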