Once again we see some incredible twaddle from audiophiles and once again people fall for it - so as a public service I am going to try and explain things as clearly as I can. I will try and be fair here, honest.
To start with, you need to understand a relatively simple concept, which is useful for anyone, not just those into music: the difference between digital and analogue. It is really not as hard as it sounds.
In the real world we encounter things like sound waves. These work with our ears to allow us to hear things - they are pressure waves, travelling patterns of more compressed and less compressed air (sound can also travel through other media). These waves trigger nerves in our ears to signal to our brain that we have heard something. We are very good at picking up the frequencies (tones) of sound within a range of frequencies. Some creatures can hear wider ranges, and the range we can hear narrows as we get older.
These compression waves in the air have a pattern to them. If you drew the pattern of higher and lower pressure as a height on a graph on paper, you would see something like this image below. This shows one frequency (i.e. one tone) at a time, but sound can be a complicated mix of different tones. Here you see how louder means higher changes in pressure, and higher pitch means faster changes in pressure.
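The two properties described above - louder means bigger pressure swings, higher pitch means faster changes - can be sketched in a few lines of code. This is a minimal illustration with made-up numbers, not anything from a real audio system:

```python
import math

# A pure tone as a pressure wave: amplitude sets loudness, frequency sets pitch.
# All numbers here are illustrative.
def tone(amplitude, frequency_hz, t_seconds):
    """Instantaneous pressure deviation of a single tone at time t."""
    return amplitude * math.sin(2 * math.pi * frequency_hz * t_seconds)

# A louder tone has bigger pressure swings...
quiet_peak = max(tone(0.2, 440, t / 10000) for t in range(10000))
loud_peak = max(tone(1.0, 440, t / 10000) for t in range(10000))
print(quiet_peak < loud_peak)  # True

# ...and a higher-pitched tone changes faster (more cycles per second).
# Count upward zero crossings over one second as a rough cycle count.
def crossings(frequency_hz, samples=100000):
    values = [tone(1.0, frequency_hz, t / samples) for t in range(samples)]
    return sum(1 for a, b in zip(values, values[1:]) if a < 0 <= b)

print(crossings(440) < crossings(880))  # True: 880 Hz wiggles twice as fast
```

A real sound is a mix of many such tones added together, but each behaves as above.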
One of the problems with sound waves travelling through the air is that the sound changes. We know this. We know a sound further away sounds quieter. We know that other background sounds can make a sound harder to hear. We know that some rooms change the sound we hear, with an echo being the most obvious of changes. This is why theatres are carefully designed so that people can hear the sound as it was intended.
But sometimes we want to record sound or make sound, and the simplest approach to this is using analogue signals. This means converting the sound from pressure waves to an electrical signal. Once upon a time the conversion was not even electrical - it was the mechanical movement of a needle on a cylinder. We can use something simple like a different voltage for different sound pressures. We get voltage waveforms just like those shown above. A microphone does this conversion. We can later convert back from electrical signals to sound using a speaker. The electrical signal is analogous to the sound pressure wave, and is called an analogue signal. There are several ways to do this conversion, some better than others. For example, AM and FM radios work in different ways.
The problem with these analogue signals is that they too are not perfect. A long cable will reduce the signal (like making it quieter). Electrical circuits to amplify the signal can also add interference (noise). Some cables can pick up background noise (interference) from other electrical sources.
The quality of the cables, connectors and electronics make a difference to the analogue signal. So, if you have a system to make sound (e.g. music), AKA "HiFi system", and it uses analogue signals, then the quality of the HiFi system and its components matter. Better (more expensive) systems produce noticeably better sound. So people made and sold more expensive better systems, and people bought them. No problem here.
But the world has changed. Even the link from a sound system to a speaker can be digital now, and in practice most people are listening to music that is streamed or downloaded and played on a device that makes the sound directly (via headphones, etc).
Digital signals work in a different way. Instead of carrying an analogue of the sound pressure waves, the waves are measured, and those measurements are conveyed as numbers. Later, those numbers are converted back to sound pressure waves.
The reason for this is that numbers are easier to transport and store reliably than analogue signals. You can reliably communicate a number. "42" is still "42" if it is quiet. Ultimately these signals may be carried as 1s and 0s on a wire, and it could simply be one voltage for a 1 and another for a 0. In practice it is way more complex now, but there are systems that just use these simple voltages.
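The whole chain - measure the wave as numbers, send the numbers as bits, get exactly the same numbers back - can be shown in miniature. This is a toy sketch with illustrative values (8 kHz sampling, 16-bit samples), not any real audio format:

```python
import math

# Digital audio in miniature: measure the wave, send bits, decode the bits.
SAMPLE_RATE = 8000   # measurements per second (illustrative)
BITS = 16            # each measurement stored as a 16-bit signed integer

# Measure a 440 Hz tone: each sample is just a number.
samples = [round(32767 * math.sin(2 * math.pi * 440 * n / SAMPLE_RATE))
           for n in range(16)]

# Encode each number as 1s and 0s, as a wire would carry them...
bitstream = "".join(format(s & 0xFFFF, "016b") for s in samples)

# ...and decode them at the far end.
def decode(bits):
    values = []
    for i in range(0, len(bits), BITS):
        v = int(bits[i:i + BITS], 2)
        values.append(v - 0x10000 if v >= 0x8000 else v)  # undo two's complement
    return values

print(decode(bitstream) == samples)  # True: "42" is still "42"
```

However mangled the voltages get on the way, as long as each bit is still read as the right 1 or 0, the decoded numbers are identical to the ones sent.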
The result is that you have digital cables, carrying one or more digital signals (these 1s and 0s) between equipment. If the cable works, then every single 1 and 0 that is meant to go down the cable gets there. It does not matter if the cable is not very good and the voltage for a 0 is a bit off, as it still comes out as a 0.
Such cables can be used for some direct signaling of sound or video, such as an HDMI cable to your TV.
This presents a problem for those that sell high end cables and equipment to audiophiles. How can they sell expensive gold plated connectors and silver loaded conductors to people if any old cable will do?
You get some amusing cases of gold plated connectors for optical cables. Optical cables use light to carry the 1s and 0s rather than electricity as this avoids interference from electrical noise nearby. Even so, gold plated connectors, something that could help on an old analogue electrical cable, have been seen!
Well, with digital electrical cables, they were quite cunning. When you transmit a digital signal you need to do something called clock recovery, which means you extract the timing from the signal. If you have a really cheap cable, with distortion on the signal, you may not get the exact timing of the start or end edges of the digital signals. If you had really cheap and nasty electronics that did clock recovery on a per-bit basis, that could affect the timing of the sound generated. Timing is quite important and could, in theory, affect the sound you then hear (although these variations are way above any frequency anyone can actually hear). So, whilst the argument was incredibly tenuous, it starts with a grain of fact, which made it harder to just shoot down in flames. Saying "theoretically, maybe, but outside human hearing" is met with "well, I can hear the difference", and the whole thing becomes subjective.
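To put rough numbers on how tenuous the jitter argument is: at CD quality, samples are spaced about 23 microseconds apart, while clock-recovery wobble is typically on a nanosecond scale. The jitter figure below is an assumption for illustration, not a measurement:

```python
# Rough numbers on the jitter argument (the 1 ns figure is an assumption).
sample_period_s = 1 / 44_100    # ~22.7 microseconds between CD-quality samples
assumed_jitter_s = 1e-9         # nanosecond-scale timing wobble

# The wobble is a tiny fraction of the gap between samples...
print(assumed_jitter_s / sample_period_s)  # ~0.00004, i.e. 0.004%

# ...and it happens on a timescale whose frequency (1 GHz here) is
# vastly above the roughly 20 kHz upper limit of human hearing.
print(1 / assumed_jitter_s > 20_000)  # True
```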
But, once again, things have moved on. Now we have audio that is played from audio files like MP3s, which are downloaded or streamed.
It is worth pointing out that, in the digital era, there are things that make a difference. The quality of the original audio source (the recording studio) and the digitising equipment (microphones, ADCs, etc.) all make a difference. The final quality of the playback systems, DACs and headphones or speakers makes a difference. Both of these deal with the analogue, real-world side of sound, so that makes sense. Another big difference is the compromises made in making a digital signal, the measurement of those sound pressure waves. Ultimately you are throwing away some detail as part of the process. The different ways of coding an audio signal, and the data rates and sampling rates, all make a difference. This is why you can get different quality formats; even just within MP3s there are different bit rates which sound different.
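That "throwing away detail" is easy to demonstrate with quantisation: each measurement gets rounded to the nearest step the chosen bit depth can represent, so fewer bits mean bigger rounding errors. A minimal sketch (the `quantise` helper is hypothetical, just for illustration):

```python
import math

# Digitising rounds each measurement to the nearest representable level.
def quantise(value, bits):
    """Round a value in [-1, 1] to the nearest level at this bit depth."""
    levels = 2 ** (bits - 1) - 1
    return round(value * levels) / levels

# One cycle of a tone, as ideal (unrounded) measurements.
signal = [math.sin(2 * math.pi * n / 100) for n in range(100)]

def worst_error(bits):
    """Largest detail lost to rounding at this bit depth."""
    return max(abs(s - quantise(s, bits)) for s in signal)

# 16-bit audio rounds far less coarsely than 8-bit audio.
print(worst_error(8) > worst_error(16))  # True
```

Lossy codecs like MP3 throw away detail far more cleverly than plain rounding, but the principle is the same: lower bit rates discard more.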
But when we finally get to the article I mentioned, with expensive high quality Ethernet cables, used for Ethernet for streaming audio, things get a tad special.
Here we are talking about a system that is not just about transferring bits, but a system for sending packets of data - basically a system for sending a file of information which is the digital record of the music to which you wish to listen.
Ethernet cables do come in different grades and standards, and there are much higher standards for 10Gb/s cabling, which makes it expensive. But if you don't have a 10Gb/s network, then getting such cables does not help matters at all. For audio you do not need 10Gb/s, or 1Gb/s, or even 100Mb/s. A file transfer can keep up with real time at a few hundred kb/s. The only risk you suffer when streaming audio in real time is if the link is bad enough and slow enough (e.g. a really dodgy ADSL line) that the data transfer cannot keep up and the audio has to stop playing to buffer more data. This is a pretty clear audio failure which is not subjective, and not something that "better ears" can hear when others cannot.
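The arithmetic here is worth spelling out. Even uncompressed CD-quality audio (44,100 samples per second, 16 bits each, two channels) needs only a sliver of a basic 100Mb/s link:

```python
# The arithmetic behind "you do not need 10Gb/s for audio".
# CD-quality audio: 44,100 samples/s, 16 bits per sample, 2 channels.
cd_bits_per_second = 44_100 * 16 * 2
print(cd_bits_per_second)  # 1411200, i.e. about 1.4 Mb/s

# Fraction of a plain 100 Mb/s Ethernet link that this occupies.
print(cd_bits_per_second / 100_000_000)  # ~0.014, about 1.4% of the link
```

A typical 320kb/s MP3 stream needs less than a quarter of even that, which is why a few hundred kb/s is plenty.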
Ethernet cables transfer data very fast, and there is no issue with clock recovery, as that is all handled by the protocol and the switches and Ethernet controller chips. None of that feeds into the process of playing the sound in any way. Also, ironically, even a not fully working cable, i.e. one on which there are errors, will usually not cause an issue, as the data is resent if incorrect as part of the various IP based protocols. So you have a case where you don't even need an "up to spec" working cable - you could have a "really dodgy" cable and still get perfect audio. If there are no pauses for buffering, the final data getting to the audio playback is the same, it is the same 1s and 0s, so it sounds the same!
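That resend-on-error behaviour is easy to model. Below is a toy simulation, not real networking code: a "channel" corrupts 30% of packets, the receiver spots the damage (a stand-in for the real checksum), the sender resends, and the file that arrives is bit-for-bit identical to the one sent:

```python
import hashlib
import random

# A toy model of why a dodgy cable still delivers a perfect audio file:
# bad packets are detected and simply resent.
random.seed(1)

def lossy_send(packet):
    """Deliver the packet, but corrupt it 30% of the time (simulated fault)."""
    if random.random() < 0.3:
        return b"garbage!"
    return packet

def reliable_transfer(data, packet_size=8):
    received = b""
    for i in range(0, len(data), packet_size):
        packet = data[i:i + packet_size]
        while True:
            got = lossy_send(packet)
            if got == packet:      # stand-in for a checksum passing
                received += got
                break              # move on; otherwise loop and resend
    return received

audio_file = bytes(range(256)) * 4   # stand-in for a music file
same = (hashlib.sha256(reliable_transfer(audio_file)).hexdigest()
        == hashlib.sha256(audio_file).hexdigest())
print(same)  # True: identical 1s and 0s, so identical sound
```

Real TCP is more sophisticated (sequence numbers, windows, real checksums), but the outcome is the same: either the identical bytes arrive, or the stream stalls to buffer - there is no in-between "slightly worse sounding" state.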
There is no way whatsoever that a better Ethernet cable can ever affect the "quality" of a streamed audio playback, simple as that. What could have an impact is the quality of the sound card, and speakers or headphones, and the person's ears.
There is one other small point you see in the pictures in that article, which is direction arrows. This was one of the very special ideas that audiophiles had, and dates back to speaker cable and analogue signals. The idea was that somehow the manufacturer of the expensive cable could "condition" it to work better in one direction. You should therefore always connect the cables the right way around, with the arrows pointing from amp to speaker.
Well, this was always bullshit. Cables don't work like that, and neither do electrical signals. In practice the arrows were almost certainly there to avoid mistakenly connecting amp output to amp output and causing serious problems as a result. Having arrows and following them avoids that. But people started to believe they had some electrical significance.
What makes this extra special is the use on Ethernet cables. The way Ethernet works is that signals go both ways anyway. Even if the bullshit were true, and a cable could be conditioned to work a certain way, the fact that Ethernet sends data in both directions defeats that totally. The arrows are purely for show, as even connecting an Ethernet cable back to itself does not cause fires.
There are other factors when buying cables: the colour (is it one you like, or does it fit a colour scheme you are using for cables?); how robust the cable is (is it likely to break or wear out somehow - can the cat chew through it?); and nicer features like guards to stop the clip snapping off. These are all worth considering, but if it meets Cat5 or whatever standard you need for your network, then electrically, it does the job, end of story.
The one final thing I would mention is confirmation bias. This is a real thing - it means that when you expect a certain result, you inherently have a bias in your assessment of subjective matters towards supporting that outcome, and are dismissive of any counter evidence. What this means is that if you go out and spend £255 on a 12 metre Ethernet cable, and use it on your HiFi, you will be convinced that it sounds better.
So please, don't fall for bullshit.