
A paper in Nature reports the discovery of a superconductor that operates at room temperatures and near-room pressures. The claim has divided the research community.
In a packed talk on Tuesday afternoon at the American Physical Society’s annual March meeting in Las Vegas, Ranga Dias, a physicist at the University of Rochester, announced that he and his team had achieved a century-old dream of the field: a superconductor that works at room temperature and near-room pressure. Interest was so intense in the presentation that security personnel stopped entry to the overflowing room more than fifteen minutes before the talk. They could be overheard shooing curious onlookers away shortly before Dias began speaking.
The results, published today in Nature, appear to show that a conventional conductor — a solid composed of hydrogen, nitrogen and the rare-earth metal lutetium — was transformed into a flawless material capable of conducting electricity with perfect efficiency.
While the announcement has been greeted with enthusiasm by some scientists, others are far more cautious, pointing to the research group’s controversial history of alleged research malfeasance. (Dias strongly denies the accusations.) Reactions by 10 independent experts contacted by Quanta ranged from unbridled excitement to outright dismissal, with many of the experts expressing some version of cautious optimism.
Previously, superconductivity has been observed only at frigid temperatures or crushing pressures — conditions that make those materials impractical for long-desired applications such as lossless power lines, levitating high-speed trains and affordable medical imaging devices. The newly forged compound conducts current with no resistance at 21 degrees Celsius (69.8 degrees Fahrenheit) and at a pressure of around 1 gigapascal. That’s still a lot of pressure — roughly 10 times the pressure at the deepest point in the Mariana Trench — but it’s more than 100 times less intense than the pressure required in previous experiments with similar materials.
“If it turns out to be correct, it’s possibly the biggest breakthrough in the history of superconductivity,” said James Hamlin, a physicist at the University of Florida who was not involved in the work but has collaborated with members of the group in the past. If it’s true, he said, “it’s an earth-shattering, groundbreaking, very exciting discovery.” But incidents involving the team’s previous work — including but not limited to a near-room-temperature superconductivity claim published in Nature in 2020 and retracted late last year — have cast a shadow across today’s announcement. “It’s hard to not wonder if some of the same problems that have gone unaddressed in previous work also exist in the new work,” Hamlin said.
For more than a century, scientists have known that cooling most metals to temperatures within a few degrees of absolute zero brings about a dramatic metamorphosis. Around this “critical temperature,” which varies from one material to another, electrons pair up and form a type of quantum fluid. Once this happens, electrons no longer bounce into atoms in the material — interactions that generate resistance — which allows them to flow with no energy loss.
The overarching goal of superconductivity research since then has been to raise the critical temperature.
For decades, physicists have made incremental progress, steadily raising the critical temperature by testing different combinations of elements. One promising class of materials, known as hydrides, emerged in recent years. Hydrides are compounds that combine the featherweight hydrogen with heavier atoms like sulfur or metals. The more hydrogen, the better for superconductivity, physicists believe. Researchers sometimes add in a dusting of other atoms, such as carbon or nitrogen, to further tweak the material’s properties. The first superconducting hydride, reported in 2015, hit its transition at around minus 70 degrees Celsius and 155 gigapascals of pressure (approaching half that of Earth’s core). Within three years, the same group and another both whipped up even more hydrogen-rich “superhydride” materials that could superconduct as high as minus 13 degrees Celsius and at 190 gigapascals.
The new study demolishes all past records. For the past few years, Dias’ team has worked on a superhydride based on lutetium. To produce a sample, the team would bathe a thin film of lutetium in a perfume of 99% hydrogen and 1% nitrogen while baking it for a few days at 200 degrees Celsius. A diamond anvil cell would then compress the sample to 2 gigapascals of pressure. The team would then progressively loosen the anvil while testing the sample for superconducting properties. Dias said that out of hundreds of samples produced, they were able to observe superconductivity in dozens of samples even after the pressure was lowered to about 1 gigapascal.
To demonstrate superconductivity, the team hit three textbook benchmarks. At the critical temperature, they showed a drop in resistance and a peak in a property related to how readily a material warms up. The team also managed to directly measure the expulsion of a magnetic field from the samples — an unambiguous signature for superconductivity called the Meissner effect that has never before been convincingly demonstrated in a superhydride. Curiously, the sample also shifted in color from blue to pink to red in sync with its phase changes.
The paper’s plots are exactly what researchers look for when they test for superconductivity. The strong evidence thrills many scientists who have spent decades searching for materials that can bring the phenomenon closer to everyday conditions.
“I am really excited to see the result. And I don’t in any way doubt that what they’re observing is what it is,” said Siddharth Saxena, a physicist at the University of Cambridge who was not involved in the new work. Eva Zurek, a theoretical chemist at the University at Buffalo who often communicates with the Rochester group but who was also not involved in the research, said that a material that superconducts under these conditions “would impact every aspect of our life in ways we cannot imagine.” Hamlin agrees that the demonstration “is a tour de force of every kind of measurement you would want to see on this material, producing exactly the type of data you would hope to see.”
Yet Hamlin and other researchers insist that the group’s past requires that today’s historic claims be met with historic levels of scrutiny.
“There is a lot of evidence for superconductivity here if you take it at face value,” said Jorge Hirsch, a physicist at the University of California, San Diego. “But I do not believe any of what these authors say. I am not sold at all.”
Hirsch said his mistrust stems from a long history of allegations of research malfeasance made against previous and current members of the group, many of which he has pressed. Most recently, in 2020 Dias and his co-authors published a study of a carbonaceous sulfur hydride (CSH) that hit its critical transition at around 14 degrees Celsius (57.2 degrees Fahrenheit) and 267 gigapascals. Almost immediately, a handful of experts spotted unusual patterns in the data used to verify the material’s response to magnetic fields. When Dias and his frequent collaborator, Ashkan Salamat, a physicist at the University of Nevada, Las Vegas, released their raw data a year later in the form of a 149-page document, they detailed an unusual and complicated method for eliminating background magnetic interference — one they said was necessary for them to detect the tiny magnetic field rejected by the small sample. This method was inconsistent with how they’d described the procedure in the original paper, which led Nature to issue a retraction last September.
Hirsch and other physicists allege that the misconduct goes beyond a misleading mix-up regarding the magnetic background. In September, Hirsch and Dirk van der Marel, a professor emeritus at the University of Geneva, published a claim that what Dias and Salamat had released as raw CSH data was actually derived from the published data. “[We] proved basically mathematically that the raw data are not measured in the laboratory; they are fabricated,” Hirsch said. Hamlin independently released a preprint last October claiming that the electrical resistivity data also appeared to have been processed in an undisclosed manner — a new allegation atop the issue that led to the 2022 retraction.
This is the same group that already had their publication retracted by Nature the last time they made such a claim. Maybe Nature's reviewers got better and this time it is legit, but I wouldn't count on that. So there's nothing to see here until this gets independently replicated. They totally deserve all the scrutiny and resistance they seem to be getting.
> a handful of experts spotted unusual patterns in the data
> a year later ... they detailed an unusual and complicated method [for processing the data]
regarding which a team claims:
> [We] proved basically mathematically that the raw data are not measured in the laboratory; they are fabricated
So multiple different peers have pointed out multiple different times that this team is publishing "exactly the results you'd want to see to confirm superconductivity", and then a year later publishing fake-looking raw data or overly-complex "data processing" that magically gives them the exact result they needed to get published. These guys don't even seem particularly good at their fraud:
"We just made the most important discovery in the history of superconductivity!"
[Immediately]: Okay but what about all this bullshit here?
"Oh that? (nervous laughter) ha ha, well, see, the thing about that is ... [a year later] ... there, see, here's the raw data; we were just processing the data in a way we never mentioned before. We didn't tell you about it because, um, well, um, we ... forgot?"
[Immediately]: Okay but this raw data is obviously fake "Oh, well, um, psych! We were just kidding about that result! Ha ha ha, take-backsies. Um yeah, we take it back."
[A year later]
"Hey we did the result again! This time it's for real serious realsies. We have the result here. We did the result thing again. It's the exact same result but better this time. Guys?"
Why is anyone taking them seriously?
Lying in a professional context is basically never OK. If someone is shown to be lying, they should never work in that field again, nor any field that bestows upon them any sort of trust. Lying is a big deal; it should absolutely be career-killing. It's not something people just do once and then get over. A few weeks ago people on Hacker News were talking about how to help their children cheat at school, like it's just a normal thing. It absolutely blows my mind that this would be seen as OK.
> A few weeks ago people on Hacker News were talking about how to help their children cheat at school, like it's just a normal thing.
Link?
https://news.ycombinator.com/item?id=34869283
> I was helping my son with his homework. He had to write an essay about why the gender of the protagonist might be female (although this is never mentioned in the short story). Fortunately for him, ChatGPT knew the story and was happy to write an essay with arguments.
It goes on like that. The parent helps their son use ChatGPT to cheat at school, and the responses are all positive, like:
> I am delighted to hear that children are already adopting the new tech
And:
> This gives your son a competitive advantage over the kids who will simply turn up what the AI wrote (and get a 0) or spend too long writing their own essays.
They're celebrating it and talking about how learning to cheat well will give them a "competitive advantage" (zero-sum bullshit) over the kids who cheat badly, and then making a mockery of the children who actually "spend too long" doing the work themselves.
It's insane. It's bizarro world. It's infuriating. They're teaching their children to grow up to be liars and cheaters. They're knowingly punishing the children who choose not to cheat. They sound like toxic psychopaths, which I got downvoted (obviously) for calling them.
> It's insane.
Sorry, but I don't agree. The form of this argument is indistinguishable to me from that of people who argued back in the 20th century that using calculators to do arithmetic was cheating.
Writing essays is on the same road as doing arithmetic by hand, calligraphy, blacksmithing, and buggy-whip manufacturing. All of these used to be useful skills but technology has displaced them. That's the reality, and no amount of railing at the wind will change that.
Yeah - like does it work? Is it costly? Do I need to sneak an iPhone in? What if the kids rat on me!
;-)
(sarcasm, of course).
There isn't enough evidence to say they are lying as opposed to making a mistake. It is remarkably hard for many scientists to ensure that the presented "raw data" isn't actually the output of a synthetic analysis. Also, if the new work repros and is valid, it means that the original work may well have been right.
If it doesn't repro, this guy's career is probably over.
> there isn't enough evidence to say they are lying compared to making a mistake
You're right, of course. I would not be jumping to conclusions if I had actual authority over the consideration of their work; I would be much more measured. But here in my armchair, I'm free to express my strong gut feeling that these guys are obvious liars -- but that's just, like, my opinion, man. This is one of those cases where I'd be very happy to be proven wrong.
I tried to go through the "Room-temperature superconductivity — or not?" paper related to the "[We] proved basically mathematically that the raw data are not measured in the laboratory; they are fabricated" comment. I'll admit, the paper is well over my head. Long time no college science classes.
I'm curious about the methods used to determine that the data was fabricated though. Would someone mathier than me help answer this: Was the fabrication of data determined through a mathematical analysis of the actual data numbers, à la Benford's law? Or was this determined more through a scientific analysis of the experiment that determined that the numbers weren't possible?
Checks like Benford's law or the central limit theorem are trivial observations that anyone with some basic statistics knowledge could have called out. But that also means anyone who has some education in the field could easily falsify plots to avoid them, so you will not catch real physicists with those methods. The problems in the paper that was eventually retracted were only visible to experts who actually work on this very topic. Here they talk about some of the abnormalities: https://www.nature.com/articles/s41586-021-03595-z
Looking forward to giving the Nature paper a read. I appreciate your time and wisdom!
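For anyone curious what such a first-digit check actually looks like, here's a minimal Python sketch (function names are mine, purely for illustration, and not taken from any paper discussed here). As noted above, the check is trivial, which is exactly why it only catches careless fabrication:

```python
from collections import Counter
from math import log10

def benford_expected(d: int) -> float:
    """Expected frequency of leading digit d (1-9) under Benford's law."""
    return log10(1 + 1 / d)

def leading_digit(x: float) -> int:
    """First nonzero digit of |x| (x must be nonzero)."""
    # Scientific notation puts the leading digit first, e.g. '3.27e-05'.
    return int(f"{abs(x):e}"[0])

def benford_check(values):
    """Map each digit 1-9 to (observed frequency, Benford expectation)."""
    digits = [leading_digit(v) for v in values if v != 0]
    counts = Counter(digits)
    n = len(digits)
    return {d: (counts.get(d, 0) / n, benford_expected(d)) for d in range(1, 10)}
```

A dataset whose observed frequencies deviate wildly from the expectations is suspicious, but as the parent comment says, anyone who knows the law can generate fake data that passes it.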
Sounds a lot like what Jan Hendrik Schön did: https://www.youtube.com/watch?v=nfDoml-Db64
This write up of it [0] tries to address that a little but I agree that all the scrutiny is deserved and should continue.
"The previous paper has been resubmitted to Nature with new data that validates the earlier work, Dias says. The new data was collected outside the lab, at the Argonne and Brookhaven National Laboratories in front of an audience of scientists who saw the superconducting transition live. A similar approach has been taken with the new paper. "
[0] https://phys.org/news/2023-03-viable-superconducting-materia...
The good news is that 1 GPa (10 kbar) is a lot easier to achieve than hundreds of GPa, which means that replication will be much easier. The previous "room temperature" claim involved around 2 Mbar or 200 GPa. For comparison, the detonation pressure of a high explosive is around 50 GPa. So, hopefully, this time, we won't have to wait as long.
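As a sanity check on the unit juggling above, here's a throwaway Python sketch (the constants are standard conversion factors; the function names are mine):

```python
GPA_PER_KBAR = 0.1      # 1 kbar = 0.1 GPa
GPA_PER_MBAR = 100.0    # 1 Mbar = 100 GPa
PA_PER_ATM = 101_325    # one standard atmosphere, in pascals

def kbar_to_gpa(p: float) -> float:
    return p * GPA_PER_KBAR

def mbar_to_gpa(p: float) -> float:
    return p * GPA_PER_MBAR

def gpa_to_atm(p: float) -> float:
    return p * 1e9 / PA_PER_ATM

# Figures from the comment above:
assert kbar_to_gpa(10) == 1.0    # 10 kbar is the new claim's ~1 GPa
assert mbar_to_gpa(2) == 200.0   # 2 Mbar is the previous claim's ~200 GPa
```

Even the "easy" 1 GPa works out to roughly 10,000 atmospheres, so replication still needs a pressure cell, just a far more common one.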
Only if one can get ahold of the material.
> However, outside access may fall short of the community’s hopes. Dias and Salamat have founded a startup, Unearthly Materials, which, Dias said, has already raised over $20 million in funding from investors including the CEOs of Spotify and OpenAI. They’ve also recently applied for a patent on the lutetium hydride material, which would deter them from mailing out samples. “We have clear, detailed instructions on how to make our samples,” Dias said. “We are not going to distribute this material, considering the proprietary nature of our processes and the intellectual property rights that exist.” He suggested that “certain methodologies and processes” are also off the table.
This is exactly the kind of thing the patent system was intended to prevent. You have to share enough details of your secret sauce so others can replicate it, even if you intend to not license it or charge outrageous licensing fees. That way the knowledge is out there today, can be expanded on by other researchers/inventors, and at a specific date in the future anyone can use it.
It also helps reduce fraudulent claims of new discoveries.
I don't understand how such pressures are achieved and maintained, even for the shortest time, without destroying the apparatus that holds them. If you could illuminate that, it would be very helpful.
Wikipedia provides a nice overview:
http://en.wikipedia.org/wiki/Diamond_anvil_cell
Contra the diagram, I don't believe the diamonds in a real anvil are faceted.
According to the wikipedia article you link, they are usually faceted with 16 faces (under "Force Generating Device")
These are generally done with diamond anvils which concentrate force on an extremely small area.
I understand that, but it sort of dodges the issue that such enormous pressures must exceed the strength of any material, however small the region they're confined to.
The compressive strength of typical diamond is about 470 GPa, so no this is very well within the strength of the material. The current record for highest compression achieved in a lab is 770 GPa using a special form of diamond.
Oh, I want to believe. But I will do so only when those other groups publish their own results.
> Reactions by 10 independent experts contacted by Quanta ranged from unbridled excitement to outright dismissal, with many of the experts expressing some version of cautious optimism.
This is exactly how an arbitrary group of scientists should react to extraordinary claims like this. Not 100% dismissal because of the history or provenance of the researchers, not 100% acceptance without independent replication.
I think this is working as intended. We often joke that if something is published in Nature it's almost certainly wrong.
But realistically, at any time in any field there are only a small number of people who are truly pushing the state of the art (I'm referring to discovery science, not reference science, i.e., a focus on adding 1% additional knowledge to our already copious understanding). And I think those people shouldn't have to follow the normal rules about scientific publishing: I want them to push the limits as much as possible. That means occasionally publishing something that contains a mistake (not a falsification), and then being willing to have it retracted (without consequence to future publication).
These sorts of fields tend to self-correct because something that's wrong isn't reproducible, and all these scientists are working to reproduce each other's results (note: even the people who think Dias faked his results were going to take the new protocol home and attempt to repro it in their hands ASAP).
In a sense, it's being willing to accept a higher false positive/false negative rate so you don't filter out some true positives.
Thread:
https://news.ycombinator.com/item?id=32993556 ("Retraction of Room-temperature superconductivity in a carbonaceous sulfur hydride (nature.com)", 46 comments)
> until this gets independently replicated.
Hmm, very interesting. I don't think it works like that anymore. The next step has got to be commercial availability; else investors would lose money, or worse, some Chinese company could "steal" the future profits from this valuable novel technology painstakingly developed by the publishing group.
Even if it turns out to be true, this result is still very far from commercial applications. It would be a fantastic thread for academic research groups at universities to continue, but pretty much no one else will care about it.
“We are not going to distribute this material, considering the proprietary nature of our processes and the intellectual property rights that exist.”
The fact that they are going to make independent verification impossible makes me highly, highly suspicious. Most scientists I know in their position would be sharing the material widely (even ahead of publication) to get replication from other parties and bolster the claim. Until there is replication, I trust NOTHING from this group, which has outright fabricated data previously.
Extraordinary claims, from those who have previously faked data, require independent third-party verification.
My thoughts exactly, and you can understand their motivation when they’ve “already received 20m in funding from …”
Apparently they're going to let the market validate their findings.
The article ends with a warning that full reproducibility may be difficult due to IP issues.
Considering the academic (?) origin and allegedly shady history of the research this drawing of a commercial veil might encourage scepticism as to how revolutionary and successful it really is.
Definitely sets off my BS detector. They're under no obligation to publicize this so if they want it to be a secret they could keep it a secret. This means they do want publicity, probably so they can raise money. But there's an official way to protect your IP and get publicity: file a patent that details your process.
Yeah, this sounds like they are using Nature as an advertising platform to hype up their work.
I didn't realise it was possible to publish a paper with the premise that details required to reproduce it are omitted in some sort of weird guessing game... It makes sense that the allegations focus on analysing data manipulation. Maybe it's real, maybe it's fake, but the extreme scepticism seems deserved if they aren't going to be completely open and have a history of doing the same thing over and over.