Probably a very basic question, but it's confused the hell out of me. Say I have 100 Mbps Internet at home. Scenario 1: a router with 100 Mbps ports, and I connect two PCs to it, each with a 100 Mbps NIC. Is it true that, ignoring other factors, I should be able to get close to (if not exactly) a 100 Mbps connection on each PC? Scenario 2: if instead I connect the PCs to an (unmanaged) switch, would I only end up getting 50 Mbps on each PC (i.e., the switch essentially "halves" my Internet speed if I connect 2 PCs to it, gives 1/3 each if I connect 3 PCs, etc.)?
Scenario 2 is closer to the truth in BOTH cases, but not because you used a switch. Scenario 1 is false: two PCs cannot each pull 100 Mbps through a 100 Mbps Internet connection at the same time.
It's the ISP-provided speed that is the limited resource. You have 100 Mbps of Internet service, all of which can be consumed by a single device. If multiple devices want to use the Internet at once, that 100 Mbps gets split up among everything trying to use it simultaneously.
It does not have to be split into equal pieces. If one device only wants 20 Mbps, another device can use the remaining 80 Mbps.
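To make that splitting concrete, here's a rough sketch in Python (the function name `fair_share` and the demand numbers are just illustrative, not anything your router actually runs; in practice TCP congestion control works this out on the fly, but the end result is roughly this):

```python
def fair_share(link_mbps, demands_mbps):
    """Split a shared link across devices: nobody gets more than it asks for,
    and leftover capacity keeps going to the devices that still want more."""
    remaining = link_mbps
    allocation = {dev: 0.0 for dev in demands_mbps}
    unsatisfied = dict(demands_mbps)
    while unsatisfied and remaining > 1e-9:
        share = remaining / len(unsatisfied)
        for dev, want in list(unsatisfied.items()):
            give = min(share, want - allocation[dev])
            allocation[dev] += give
            remaining -= give
            if allocation[dev] >= want - 1e-9:
                del unsatisfied[dev]  # this device got all it asked for
    return allocation

# Two PCs on a 100 Mbps line: one streaming (~20 Mbps), one downloading as fast as it can.
print(fair_share(100, {"pc1_stream": 20, "pc2_download": 1000}))
# -> pc1 gets ~20 Mbps, pc2 gets the remaining ~80 Mbps
```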
Using a switch does not change how much Internet the devices can consume unless the switch's port speed is lower than your maximum ISP speed, which isn't the case in your example.
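Put another way, each PC's ceiling for Internet traffic is simply the slowest link in its path. A tiny sketch (numbers assumed from your example; `internet_ceiling` is just a made-up helper for illustration):

```python
def internet_ceiling(nic_mbps, port_mbps, isp_mbps):
    # The bottleneck is whichever hop is slowest on the way out.
    return min(nic_mbps, port_mbps, isp_mbps)

print(internet_ceiling(nic_mbps=100, port_mbps=100, isp_mbps=100))    # 100: the switch isn't the limit
print(internet_ceiling(nic_mbps=1000, port_mbps=1000, isp_mbps=100))  # still 100: the ISP is the limit
```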