I sort of covered this in my other reply, but yes, switching losses are also due to electrical resistance in the semiconducting transistor, and yes, I’m assuming that semiconductors are replaced with superconductors throughout the system. Electrical resistance is pretty much the only reason any component generates heat, so replacing semiconductors with superconductors to eliminate the resistance would also eliminate the heat generation. I’m not sure why you think superconductors can’t be used for transistors, though? Resistance isn’t required for semiconductors to work; it’s an unfortunate byproduct of the material physics rather than something we build in, and I’m not aware of any reason a superconductor couldn’t work where a semiconductor does in existing hardware designs.
Then again I’m also not an IC designer or electrical engineer, so there may be specific design circumstances that I’m not aware of where resistance is desired or even required, and in those situations of course you’d still have some waste heat to remove. I’m speaking generally; the majority of applications, GPUs included, will benefit from this technology.
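To make the resistive-heating claim above concrete, here’s a minimal sketch of ohmic (Joule) loss, P = I²R, which is the conduction loss being discussed. The current and resistance values are illustrative assumptions only, not measured device data:

```python
# Sketch: ohmic (conduction) loss P = I^2 * R for a current-carrying channel.
# Values below are illustrative assumptions, not real device parameters.

def conduction_loss_w(current_a: float, resistance_ohm: float) -> float:
    """Joule heating in watts: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

# A channel with some on-resistance dissipates heat as it conducts...
semi_loss = conduction_loss_w(current_a=0.5, resistance_ohm=2.0)    # 0.5 W
# ...while an ideal zero-resistance (superconducting) path dissipates none.
super_loss = conduction_loss_w(current_a=0.5, resistance_ohm=0.0)   # 0.0 W

print(semi_loss, super_loss)
```

This only models the conduction component; it says nothing about whether a superconducting device can actually perform the switching function, which is the point the next reply takes up.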
Semiconductors are used for transistors because they give us the ability to electrically control whether they conduct or resist electrical current. I don’t know what mechanism you’d use to do that with superconductors. I agree you don’t ‘have’ to have resistance in order to achieve this functionality, but at this time semiconductors and mechanical relays are the only ways we have to do it. My focus is not in semiconductor / IC design either, so I may be way off base, but I don’t know of a mechanism that would allow superconductors to function as transistors (or “electrically controlled electrical connections”). I really hope I’m wrong, though!
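The “electrically controlled electrical connection” described above can be sketched as a toy threshold switch: a control (gate) voltage above some threshold closes the switch, below it the switch stays open. The threshold value is an illustrative assumption of a typical-order silicon MOSFET gate threshold, not a spec for any real device:

```python
# Toy model of a transistor as an electrically controlled switch:
# gate voltage above a threshold lets current flow; below it, the channel blocks.
# V_THRESHOLD is an illustrative, typical-order value, not from a datasheet.

V_THRESHOLD = 0.7  # volts

def conducts(gate_voltage: float, threshold: float = V_THRESHOLD) -> bool:
    """True if the channel conducts (switch closed), False if it blocks (open)."""
    return gate_voltage > threshold

print(conducts(1.2))  # gate driven high -> switch closed
print(conducts(0.0))  # gate low -> switch open
```

The open question in the thread is exactly this: what physical mechanism would implement `conducts()` in a superconductor, where there is no resistive “off” state to exploit the way a semiconductor channel provides one.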