The fact that this has been replicated is amazing!
If you want to see all the conflicting findings and by-the-minute updates, this post is great: https://forums.spacebattles.com/threads/claims-of-room-temperature-and-ambient-pressure-superconductor.1106083/page-11?post=94266395
I’m very much not an expert, but my read of this is: most replication efforts have so far failed to produce a working room temp superconductor (meaning one that conducts electricity with no resistance). However, groups are increasingly seeing some of the other characteristics expected of superconductors, and the failures might just be caused by an unrefined synthesis technique.
So time will tell; this is probably a big advance, but not itself a world-changer just yet.
It’s at least reassuring it wasn’t just a hoax this time.
Wikipedia has a nice little table on all the ongoing research as well.
Although it is worth noting that one of these attempts is just in computer simulation.
The Berkeley lab hasn’t confirmed their results against a physical sample.
Here’s my analysis of what has actually happened, from a similar post with this article yesterday.
From reading through the article and its sources, here’s what seems to be the case:
- a simulation at Berkeley National Lab, using their supercomputing capabilities, has found that LK-99 theoretically has superconducting properties
- Argonne National Lab also seems to be involved and doing work, but nothing official from them yet, besides maybe helping with the simulations
- a Russian scientist is working on improving the synthesis process and has made some low-purity samples that show the Meissner effect (higher purity than the original, I think). It’s all from Twitter (X) threads and a little hard to follow. Her handle is @iris_IGB
- China National Lab (Shenyang) first principles analysis suggests gold and silver doping LK-99 will make superconductors as well. [Directly copied from article]
- Under the guidance of Professor Chang Haixin, postdoctoral researcher Wu Hao and doctoral student Yang Li of the School of Materials Science and Technology of Huazhong University of Science and Technology have successfully verified and synthesized the LK-99 crystal. It can be magnetically levitated for the first time, and this is shown in a bilibili video. They expect to realize true non-contact superconducting magnetic levitation. [Also directly copied from article]
Direct source for the last two points, and more info in general:
https://www.nextbigfuture.com/2023/07/tracking-lk-99-superconductor-replication-efforts.html
I would give LK-99 a 95 percent chance of either being a true room temperature superconductor or directly leading to the discovery of a true room temp superconductor in the next few years.
A few caveats, however: according to the simulation, the conductive pathway only forms when the copper bonds to a specific higher-energy site in the crystal, so reaching higher purities will likely need a fair amount of innovation in the production process. There are some other complications with the synthesis too, so even if it is fully and properly confirmed with more papers, it will still likely be a while before it can be used effectively.
yea, even if LK-99 is a room temp superconductor, i don’t expect it to be THE room temp superconductor… but it will prove it’s possible, and provide pathways to improve it (either by advancing LK-99, or by showing how altering a material to introduce internal strain can cause superconductivity in other compounds)
The proof of concept alone will attract so much money that i’m betting we’ll have either 3, 7, or 358 promising compounds or materials in a year or 3! I’m getting cautiously optimistic here :-)
The fact that it was a Chinese team that made the discovery may be the kickstart an ‘arms race’ needs in order for the US to put significant resources behind the development of RTSCs. It’s hard to pour millions of dollars into a hole if you can’t see the bottom of it. If there’s a viable path and it looks like China may beat the US to the finish line, the US will throw more resources at the problem.
What Argonne ends up saying will carry huge weight for me. Hope to hear them chime in.
Even more interesting is Iris Alexander’s claim that she was able to produce the material using relatively simple tools. Making superconductor materials could become a cottage industry.
I swear I saw the opposite headline less than 12 hours ago.
I’m very skeptical, we have seen so many claims of room temperature superconductivity that have turned out to be fake… but considering that Berkeley National Laboratory replicated it, this makes me far more hopeful.
LBNL did not replicate; they simulated the material and found it promising. The lattice of the material needs some sort of substitution to happen in a less likely way; someone with more knowledge will have to summarize it better.
This is how it starts though. Smaller labs do simulations and get promising results which gets the attention of bigger labs with the capacity for actual experimentation.
We’re talking about Lawrence Berkeley National Laboratory doing the simulation here. That is not a small research facility.
Seems to be exactly the opposite of what you describe. Actual experiment shows promise, then large lab runs simulation.
There are lots of reasons why a replication attempt might fail even if the stuff is a superconductor.
The process for producing the material isn’t reliable, so that doesn’t tell us much. They might just have been unlucky.
This is unexpected.
But welcome, indeed.
A practical superconductor is a huge deal; it would drastically change the way we deal with electrical power distribution and electromechanical applications. So any development is going to be big news. Though we’re not talking about an actual working conductor, it’s just excitement over a research advancement, yeah? I’ve seen this kind of “big news” before in other tech sectors, and time often proves it unworthy. If it does represent a big step toward a practical superconductor that’s great, but I wouldn’t count my chickens yet.
I would say this is likely not a practical superconductor… But it may well be the first ever room temperature superconductor.
The first semiconductors were not practical either, but we can all see where that led!
but we can all see where that led!
It led to the LED!
I would say this is likely not a practical superconductor… But it may well be the first ever room temperature superconductor.
Yes of course it would be a big deal if they create one to begin with. However if it’s difficult and expensive to produce, that’s not much help. It has to be mass producible and inexpensive to have industrial significance. I mean we already have expensive solutions. Don’t need any more of those.
The first semiconductors were not practical either, but we can all see where that led!
I don’t know that semiconductors are a good parallel. Growing the crystals dates back to the early 1900s and was never an expensive or technologically difficult process. Doping silicon to create devices like diodes and transistors was something new, but was not exceedingly expensive or a great technological challenge. The migration to chips which require lithographic doping was more of a challenge.
In any case semiconductor devices were practical shortly after development. One of the first consumer products that used them was the “transistor radio” which was inexpensive and came out shortly after invention of the technology.
The benefits of a room temp superconductor mean it would be produced at scale even at price points well above standard conductors. It’s that big a deal in performance improvement.
The paper’s method is fairly messy, low-yield, and brand new… But it’s also not that complex metallurgically, afaik. I would not be surprised to see iterations on the method that scale well.
Regardless: if it’s true then it proves it’s possible… Which wasn’t guaranteed until now.
Where were you, the day that everything changed? This is likely it, folks. If this pans out, it’ll be jetpacks and mimosas on the Moon, Jetsons’ style. Holy shit. We thought the computer age was something, this is going to be Something Else
Where was I? On the toilet.
Sat in my chair, waiting for Baldur’s Gate 3 to drop
Same
I have been doing some thinking, and this is game changing, but not as much as people assume. We won’t get hoverboards or flying cars, to my knowledge. We will get much cheaper maglev trains, but in America we refuse to build public infrastructure that isn’t for cars, so that isn’t gonna fix much.
We won’t get faster traditional computers, because those need semiconductors. There are some patents and theories about superconducting transistors, so we may get a “cool running” CPU eventually, but it won’t be faster; it just won’t heat up.
Quantum computers will get cheaper and maybe more available, but they are still a research topic, so we are probably decades away from them having practical use (or maybe forever, in terms of everyday use; they will break encryption as we know it, though).
We will “instantly” save the single-digit percentage of generated power that is lost as heat in transmission and distribution (often quoted around 5%), but again that is going to require a massive infrastructure project to replace all the high voltage power lines, so that is never going to happen in America.
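For a sense of where those line losses come from: resistive loss scales as I²R, which is why grids transmit at very high voltage in the first place. Here’s a rough sketch with made-up but plausible numbers (none of these figures are from the thread):

```python
# Illustrative I^2 * R transmission-loss estimate. All values are assumptions
# chosen for the example, not real grid data.
P_delivered = 500e6   # watts delivered (assumed 500 MW)
V_line = 345e3        # line voltage (assumed 345 kV)
R_line = 5.0          # total line resistance in ohms (assumed)

I = P_delivered / V_line        # current drawn, in amps
P_loss = I**2 * R_line          # power dissipated as heat, in watts

print(f"current: {I:.0f} A")
print(f"resistive loss: {P_loss / 1e6:.1f} MW "
      f"({100 * P_loss / P_delivered:.1f}% of delivered power)")
# With a superconducting line, R_line -> 0 and this loss term vanishes.
```

With these assumed numbers the loss comes out around 2%; real grids see more because of transformers, distribution lines, and longer runs.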
Brushless motors will be able to be smaller and/or take in more energy, so they will be more efficient, but we are still beholden to our energy storage density.
There is a theoretical idea of using superconducting rings that let current flow around them indefinitely as an energy storage medium, but I have no idea how close that could be, or how dense it would be compared to lithium-ion batteries or fossil fuels, which are the real competition.
We will get smaller and cheaper MRIs, so medical imaging should get cheaper and more available to the global south.
Am I missing anything?
More heat efficient processors and more energy efficient processors are one and the same. Which is huge. Energy usage is a large portion of the cost of computational infrastructure, and things like training neural networks. I suspect a thermally more efficient processor would also potentially last much longer too, with less intense thermal cycling.
A lot of data centers are limited by the energy infrastructure where they are constructed.
Superconductors can be used as very fast charging energy storage devices. Think a capacitor but with better energy storage than a battery. We could have electric cars that charge as fast as it takes to fill a gas tank and instantly charging electronic devices.
Looks like as of 2016 a proposed superconducting coil battery stores about 2 MJ/m³, while gasoline is 30 GJ/m³ [1]
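For anyone who wants the gap in those figures spelled out, a quick check of the ratio (using the two densities quoted above):

```python
# Energy density comparison, using the figures from the comment above.
smes = 2e6        # J/m^3 for the superconducting coil battery (2 MJ/m^3)
gasoline = 30e9   # J/m^3 for gasoline (30 GJ/m^3)

ratio = gasoline / smes
print(f"gasoline stores ~{ratio:,.0f}x more energy per cubic meter")
# -> gasoline stores ~15,000x more energy per cubic meter
```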
:(
Eager to see what other novel ways we can use this material though. If it is cheap and easy to make surely new ideas will be flowing fast.
I thought heat is the main thing limiting computer performance? Like, if we had superconducting transistors that take little energy to change state, highly parallel tasks that are power-limited today would get a whole lot faster. Think native 4k path tracing-level graphics in games on our phones. And better/faster/cheaper AI systems, though they are limited more by memory than by compute, so they’d likely still be run in the cloud mostly.
Heat is a big issue, but we are close to the physical limits of transistor size; they are nearly the size of atoms, AFAIK. So this will let us pack more of them closer together, I guess, with no heat limits. There is also a lot of stuff that goes over my head about quantum tunneling when transistors get that small. But transistors use semiconductors (silicon), not conductors, so this isn’t a drop-in replacement. It will require a new type of transistor that uses a superconductor, I suppose.
Maybe, but there are still important questions. Will it scale? Preliminary tests only transferred milliamps before the superconductivity broke down. So can you layer the material to get higher amps? Will cables have to be made in one continuous piece, or will the superconductivity work across joined cables?
deleted by creator
This material likely isn’t it but it demonstrates that room temp superconductors exist
I feel like I’ve seen enough to say that this is likely real and you are probably correct. I’m struggling to figure out how to prepare for this though. Are there companies or industries that we should be investing in or something?
So a supercomputer simulation and a video from a team in China…
I’m no more skeptical but I’m certainly not sold yet.
And we’re still waiting for a definite announcement that yes, humanity has finally produced room-temperature, ambient-pressure superconductors.
Exciting news for sure, but as usual it’s not quite there yet.
This seems promising, I can’t wait for my guy to eat his shorts.
Hitch your tits and pucker up. We’re entering a new age of industry. Much like the original Industrial Revolution, technology is going to advance at an extremely rapid pace. Fusion, quantum computing supremacy. Just… wow. How far off is general AI with this new room temperature superconductor?
Fusion is no closer than before, and AGI is hilariously overhyped. Also no closer than before.
And fusion is pretty close to begin with. Commonwealth Fusion is well within their proposed timetable so far. They don’t need any new superconductors for their project.
Fantasy
Stupid question probably - is computing power what is holding back general AI? I’ve not heard that.
What’s holding back AGI is a complete lack of progress toward anything like intelligence. What we have now isn’t intelligent, it’s multi-variable probability.
It’s not that it’s not intelligent; it’s that predictive language models are obviously just one piece of the puzzle, and we’re going to need all the pieces to get to AGI. It’s looking incredibly doable, given we’ve already figured out how to make something that’s dumb but sounds smarter than most of us. We just need to connect it to other models that handle other things better.
The biggest hurdle is that we don’t actually know what intelligence really is yet, computationally. Most of the history of science has been repeatedly learning “but things were actually more complicated than originally expected,” so claiming we’ll soon be able to replicate something we don’t actually properly understand yet may be a bit premature. The desire to replicate human intelligence with a machine has been around since at least the brazen heads of the 1200s, and for everything we’ve discovered since, we’re still just beating our heads against a wall trying to sleuth out what it really is that makes us ‘think.’
You don’t speak predictively. It’s not one of the pieces, it’s a parlor trick.
Oh god yes. This is going to be pretty simplified, but: The sheer compute required to run something like ChatGPT is mindboggling, we’re talking thousands of A100 GPUs ($10k a piece, each one has 80GB of VRAM) networked together, and probably petabytes of SSD storage for the DB. Most neural networks require a bunch of GPUs working in parallel because they need a lot of very fast memory to hold all the data they sift through, and a lot of parallel compute to sift through that data as quickly as possible. That’s why GPUs are good for this - you can think of a CPU like a human, very versatile but there’s only so much one person can do at a time. Meanwhile GPUs are like bug swarms, a bunch of much simpler brains, but specialized, and they “make it up on volume”. It’s only because of advances in computing power, specifically in the amount of compute cores and VRAM on GPU dies, that the current level of AI became possible. Try downloading GPT4All and compare free models that run on your machine to the performance of ChatGPT - you’ll certainly see the speed difference, and if you ask the free ones for code or logic you’ll see the performance difference too.
This is all to say that superconducting traces and transistors mean no heat is generated by dumping power through them, so you can fit them closer together - even right next to and on top of each other, doesn’t matter, because they don’t need to be cooled. And, because you lose no power to heat, it can all go to compute instead, so it’ll be perfectly efficient. It’ll bring down the cost of everything, but specifically computer components, and thus OpenAI will be able to bring more servers online to improve the capabilities of their models.
There is still heat generated by the act of computation itself, unless you use something like reversible computing, but I don’t believe there’s any practical way to do that yet.
And even then, superconducting semiconductors are still going to be some ways off. We could have superconductors in power transmission for the next decade and still have virtually no changes to processors. I don’t doubt that we will eventually do something close to what you describe, but I’d say it’s easily a long way off still. We’ll probably only be seeing cheaper versions of things that already use superconductors, like MRI machines.
Edit: my first draft was harsher than it needed to be, sorry, long day.
First of all, nobody’s saying this is going to happen overnight. Secondly, traditional computing systems generate heat due to electrical resistance and inefficiencies in semiconducting transistors; the process of computation does not inherently require the generation of heat, nor cause it through some other means than electrical resistance. It’s not magic.
Superconduction and semiconduction are mutually exclusive - it’s in the name. A semiconductor has resistance properties midway between a conductor and an insulator. A superconductor exhibits no electrical resistance at all. A material can be a superconductor in one “direction” and a semiconductor in another, or a semiconductor can be “warped” into being a superconductor, but you can’t have electrons flowing in the same direction with some resistance and no resistance at the same time. There’s either resistance, or there’s not.
Finally, there is absolutely no reason that a transistor has to be made of a semiconducting material. They can be made of superconducting materials, and if they are then there’s no reason they’d generate heat beyond manufacturing defects.
Yes, I’m talking about a perfectly superconducting system and I’m not allowing for inefficiencies where components interface or component imperfections resulting in some small amount of resistance that generates heat; that would be a manufacturing defect and isn’t relevant. And of course this is all theoretical right now anyway; we don’t even know for sure if this is actually a breakthrough yet (even if it’s really beginning to look like it). We need to better understand the material and what applications it’s suited to before we can make concrete predictions on what impacts it will have. But everything I suggest is grounded in the way computer hardware actually works.
I appreciate you revising your reply to be less harsh, I wasn’t aiming to correct you on anything I was just offering some thoughts, I find this stuff interesting and like to chat about it. I’m sorry if I made your day worse, I hope things improve.
I said “superconducting semiconductors” just as a hand-wavy way to refer to logic gates/transistors in general. I’m aware that those terms are mutually exclusive, but that’s on me, I should have quoted it to indicate it was a loose analogy or something.
The only thing I disagree with is your assessment that computation doesn’t create heat; it does, albeit an entirely negligible amount. Traditional computation involves deleting information, which necessarily causes an increase in entropy, so heat is created. It’s called Landauer’s principle. It’s an extremely small proportion compared to resistive loss and the like, but it’s there nonetheless. You could pretty much deal with it by just absorbing the heat into a housing or something. We can of course design architectures that don’t delete information, but I’m reasonably confident we don’t have anything ready to go.
All I really meant to say is that while we can theoretically create superconducting classical computers, a room temperature superconductor would mostly still be used to replace current superconductors, removing the need for liquid helium or nitrogen cooling. Computing will take a long time to sort out, there’s a fair bit of ground to make up yet.
Okay, you’re kind of reaching with that one 😋 I didn’t mention Landauer’s Principle because it’s so negligible as to be irrelevant (seriously, the heat generated by writing or erasing a bit is about equivalent to the energy levels of a single electron in a hydrogen atom, in the range of ~0.018 eV at room temperature), and superconductors will reduce even that. I kind of wish we had another word, for when “negligible” doesn’t do the insignificance justice.
I do appreciate the clarification on the point of superconducting semiconductors - and the concern for my day haha! It really wasn’t anything to do with you, hence the edit. And, your point here is absolutely correct - LK-99 isn’t some magical material that can be all things to all people. Its other properties may make it unsuitable for use with existing hardware manufacturing techniques or in existing designs, and we may not find superconductors that can fill every role that semiconductors currently occupy.
Edit: lol, looks like its “other properties” include not being a fucking superconductor. Savage.
I think “rounding error” is probably the closest term I can think of. A quick back-of-the-envelope estimate says erasing 1 byte per cycle at 1 GHz would warm an average silicon wafer by 1 K in ~10 years; that’s hilariously lower than I’m used to these things turning out to be, but I’m normally doing relativistic stuff, so it’s not really fair to assume they’ll be even remotely similar.
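The Landauer figure quoted above is easy to reproduce. A quick sketch, assuming T = 300 K (the final wafer-temperature step depends on what mass and heat capacity you assume, so only the energy totals are computed here):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

E_bit = k_B * T * math.log(2)          # joules per erased bit
E_bit_eV = E_bit / 1.602176634e-19     # same energy in electron-volts
print(f"per bit: {E_bit:.2e} J ({E_bit_eV:.3f} eV)")  # ~0.018 eV, as quoted above

# Erasing one byte every cycle at 1 GHz, continuously for ten years:
seconds = 10 * 365.25 * 24 * 3600
E_total = E_bit * 8 * 1e9 * seconds
print(f"ten years of 1 byte/ns erasure: {E_total:.1e} J")  # a few millijoules
```

A few millijoules over a decade is why “rounding error” undersells it; resistive losses in today’s chips are more than ten orders of magnitude larger.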
Really appreciate the write up! I didn’t know the computing power required!
Another stupid question (if you don’t mind): adding superconductors to GPUs doesn’t really seem like it would make a huge difference to heat generation. Sure, some of the heat generated is through trace resistance, but the overwhelming majority is the switching losses of the transistors, which will not be affected by superconductor technology. Are we assuming these superconductors will be able to replace semiconductors too? Where are these CPU/GPU efficiencies coming from?
I sort of covered this in my other reply, but yes, switching losses are also due to electrical resistance in the semiconducting transistor, and yes I’m assuming that semiconductors are replaced with superconductors throughout the system. Electrical resistance is pretty much the only reason any component generates heat, so replacing semiconductors with superconductors to eliminate the resistance will also eliminate heat generation. I’m not sure why you think superconductors can’t be used for transistors though? Resistance isn’t required for semiconductors to work, it’s an unfortunate byproduct of the material physics rather than something we build in, and I’m not aware of any reason a superconductor couldn’t work where a semiconductor does in existing hardware designs.
Then again I’m also not an IC designer or electrical engineer, so there may be specific design circumstances that I’m not aware of where resistance is desired or even required, and in those situations of course you’d still have some waste heat to remove. I’m speaking generally; the majority of applications, GPUs included, will benefit from this technology.
Semiconductors are used for transistors because they give us the ability to electrically control whether they conduct or resist electrical current. I don’t know what mechanism you’d use to do that with superconductors. I agree you don’t ‘have’ to have resistance in order to achieve this functionality, but at this time semiconductors or mechanical relays are the only ways we have to do that. My focus is not in semiconductor/IC design either, so I may be way off base, but I don’t know of a mechanism that would allow superconductors to function as transistors (or “electrically controlled electrical connections”). I really hope I’m wrong, though!
Simply throwing computing power at the existing models won’t get us general AI. It will let us develop bigger and more complex models, but there’s no guarantee that’ll get us closer to the real thing.
It’s wild 😵💫
0_0