Comcast says it refers to a 10 gigabit cable internet network they are building (it doesn’t exist). So they’re basically changing the meaning of the G from “generation” to “gig” to act like 10G is five generations better than 5G (or twice as fast)… or that they have a 10 gigabit network. Neither is accurate. It’s still just cable internet that people have to use because they have no other option.
Fuck Comcast.
I read online that they’re abandoning the “confusing” 10G branding, but I just saw a commercial for it. They think all of their customers are morons, and in a lot of cases they count on folks having no other choice.
Apologies to anyone outside the United States, this is just complaining about our poor internet options and deceptive advertising by greedy corporations.
Screen manufacturers just did a similar thing with the jump from 1080p to 4k
The 1080 in the original number referred to the number of pixels from top to bottom; the 4 in 4k refers to left to right. 4k is actually only 2160 from top to bottom (at the same aspect ratio).
So they quadrupled the number when it should have only doubled, and it was entirely a marketing thing.
Don’t even get me started on the bullshit that is calling 1440p 2k
1440p can be seen as 720p * 2.
There are 4x the pixels so…
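To make the pixel math in this thread concrete, here’s a quick sanity check (a minimal sketch, assuming the standard 16:9 resolutions): doubling both dimensions quadruples the pixel count, which is the sense in which 1440p is “4x” 720p and 2160p is “4x” 1080p.

```python
# Doubling both dimensions quadruples the pixel count.
resolutions = {
    "720p":  (1280, 720),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
    "2160p": (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")

print(2560 * 1440 / (1280 * 720))   # 4.0 -> 1440p has 4x the pixels of 720p
print(3840 * 2160 / (1920 * 1080))  # 4.0 -> 2160p has 4x the pixels of 1080p
```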
I don’t disagree with the change either. Having a large number makes it more difficult to compare: after 2160 it’s 4320. 2k, 4k, and 8k are far easier to remember, and easier to figure out the differences between.
Exactly. The uptick in resolution was slow: 360 to 480 to 720 to 1080, relatively small improvements. Then we jump to 2160/4k and the pixel count quadruples compared to 1080. 4k is four 1080p screens put together.
1440 was already a thing when 2160 released. 4k is a shit name because it implies you already know which resolution it’s four times the size of.
4K doesn’t mean 4x the size. It means there’s (nearly) 4000 pixels from side to side.
Totally agree, but then it’s internally inconsistent!
If 4k is four times the pixel count of 1080, then 2k means 1440 (roughly; strictly it should be ~1527) and that’s fine. But then 8k would have to be ~3055, and it is actually 4320!!!
So it cannot refer to the number of pixels (quadratic scaling). On the other hand, if we assume linear scaling, so that 8k is 4320 and 4k is 2160, then 2k is 1080, but 2k is never used in that context!
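To put numbers on that, here’s a small sketch (assuming 1080 vertical lines as the baseline) working out what “Nk” would have to mean under each reading:

```python
import math

BASE = 1080  # vertical lines of 1080p

# Reading 1: "Nk" = N times the pixel count of 1080p.
# Pixel count scales with the square of each dimension,
# so vertical resolution scales with sqrt(N).
for n in (2, 4, 8):
    print(f"{n}k (pixel-count reading): {BASE * math.sqrt(n):.0f} lines")
# 2k -> 1527 (close to 1440), 4k -> 2160, 8k -> 3055 (not 4320!)

# Reading 2: "Nk" = roughly N thousand horizontal pixels,
# i.e. linear scaling: vertical lines = 540 * N.
for n in (2, 4, 8):
    print(f"{n}k (linear reading): {540 * n} lines")
# 2k -> 1080 (not 1440!), 4k -> 2160, 8k -> 4320
```

Both readings agree at 4k/2160, which is exactly why the naming looks fine there and falls apart on either side.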
Edit: as you can see I’m very passionate about this XD
Eww, who refers to 1440p as 2k? 1080p is 2k. Not that anyone really says 2k to begin with.
That said, that particular instance is irrelevant as long as things are consistent going forward.
2048x1080 is DCI 2K.
The slight difference between the ratios is why home releases of films often have small black bars at the top and bottom, as the DCI flat ratio is slightly different than 16:9.
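Quick arithmetic on those ratios (a sketch; 1998x1080 as the DCI flat container is an assumption worth double-checking for any given release):

```python
# Aspect ratios in play:
print(f"16:9 home screens : {1920 / 1080:.3f}")  # 1.778
print(f"DCI 'flat'        : {1998 / 1080:.3f}")  # 1.850
print(f"Full DCI 2K frame : {2048 / 1080:.3f}")  # 1.896
# A 1.85:1 film on a 1.778:1 screen keeps its full width and
# leaves thin letterbox bars at the top and bottom.
```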
Why marketers are allowed to label the speed of a network is just beyond me as an engineer. Call it whatever you want. “Our Purple Speed.” Don’t care. But under that, it should be labeled with a standard figure like 1gbps/1gbps.
That would shut up Xfinity’s bullshit claims pretty quick. “Our new Plaid Speed fiber”: 200mbps/4mbps.
Seriously. I called them years ago asking about fiber, and they were real hyped, bragging they could give me 800! 800 what, I asked. Megabytes! Megabytes or megabits? 800 megabits. Okay, fine, symmetric right? “Well, no one uses upload anyway.” That was their literal response.
I talked about this in another thread recently, but my favorites are the connections so lopsided that you literally can’t send ACKs back fast enough to keep up with your own download speed when using TCP.
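Rough back-of-envelope for that, as a sketch (the exact numbers are assumptions: ~1500-byte downstream packets, one ~40-byte ACK per two segments via delayed ACKs, and zero other upstream traffic; real stacks vary):

```python
def ack_upload_mbps(download_mbps: float) -> float:
    """Approximate upstream bandwidth eaten by bare TCP ACKs."""
    segment_bytes = 1500      # assumed downstream packet size
    ack_bytes = 40            # IP + TCP headers, no options
    segments_per_ack = 2      # delayed ACKs: one ACK per two segments
    return download_mbps * ack_bytes / (segments_per_ack * segment_bytes)

for down in (100, 500, 1000):
    print(f"{down} Mbps down -> ~{ack_upload_mbps(down):.1f} Mbps up just for ACKs")
# 1000 Mbps down needs ~13.3 Mbps of upload for ACKs alone, so a
# gigabit-down/10-up plan can't even acknowledge its own downloads
# at full rate, let alone do anything else.
```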
I’ve had that! Upload so bad that I couldn’t even send out a request; even the DNS requests failed. “But you have download available.” Yes, Mr. Customer Service, but how does it know what to download?
Download-only internet.
Your water line is now connected! There is no way to turn on the taps in the house.
But at least 4k is indeed 4 times bigger than 1080p in terms of pixels, so it’s not all bullshit.
I like to call 1440p “4k” as well, because it’s 4 times bigger than 720p. Stupid fucking naming system.
4K refers to the horizontal resolution of the video, not how much larger than FullHD it is.
Also 1440p is sometimes called QHD (Quad HD) because it’s 4x 720p aka HD
The correct naming scheme btw, if you don’t subscribe to bad marketing:
640x480 = SD (NTSC)
768x576 = SD (PAL)
1280x720 = HD
1920x1080 = FullHD/FHD
2048x1080 = DCI 2K
2560x1440 = QuadHD/QHD
3840x2160 = UHD
4096x2160 = DCI 4K
7680x4320 = UHD2
K means 1000 by convention though, and 1080 is the closest to 1000
The K refers to horizontal resolution though. The resolutions used for cinema are 2048x1080, aka DCI 2K, and 4096x2160, aka DCI 4K. TV manufacturers thought it would be fun to market UHD, aka 3840x2160, as 4K, which it isn’t. It’d be 3.8K if you had to label it like that.
1440p in a 16:9 aspect ratio has a resolution of 2560x1440 though, not 2160x1440
Type the right numbers in and it’s 4x.
And 4K isn’t even correct in the horizontal direction. “4K” TVs have a horizontal resolution of 3840 pixels; that’s 3.8K. True 4K, as used in movie production (aka DCI 4K), is 4096x2160.
Technically it should be called 4.096K, I suppose.
The worst part is that it’s actually fewer than 4,000 pixels across the top.
2160p is not that uncommon though. Saying 4K is just an abbreviation that’s easier to say while still letting everyone know what you’re talking about. I don’t actually like the term 4K though, because it’s ambiguous: there are so many different flavors of 4K.
It’s actually even worse. They tried to pass off 2048x1080 as a big upgrade over 1920x1080 by marketing it as “2K”. It didn’t work, but it locked marketing into using the horizontal resolution.
That’s actually being used in the context of movie production. It’s called DCI 2K. Same with DCI 4K, which is 4096x2160 and thus actual 4K