For many years, editions here of This Week’s Hype were mainly devoted to bogus claims that someone had found a way to get a testable prediction out of string theory, or other “evidence for string theory”. Recently there have been many fewer such claims, with consensus in the string theory community that there is now no hope of getting a prediction from string theory about observable physics at accessible energies. One can watch the recent talks here on Steven Weinberg and his legacy to get a good idea of what this current consensus looks like: you can’t test string theory, since string effects occur at much too high an energy scale, and Weinberg showed that such things will just look like the Standard Model sort of QFT at observable energies. In addition, Weinberg is also credited with the anthropic CC argument, taken as evidence for the otherwise unobservable string theory landscape. Taken together, the consensus of leading particle theorists has become that there’s no point in trying to do any better than the Standard Model, with the only answer available to anyone who asks questions about higher energies being “string theory, whatever that is”.
With particle physics abandoned, theorists have focused on quantum gravity as the only legitimate issue to study. For many decades the hope was that a consistent answer to the open question of what string theory really is (often called “M-theory”) would be found, and that would provide a final end to the subject of fundamental physics. This final theory would be untestable, but it would self-consistently explain why one could not hope to test it. In recent years though, after decades of no progress towards a consistent M-theory, string theorists have essentially given up on this hope.
This situation has led to a recent trend in string theory research: instead of looking for positive evidence for string theory, try to find an argument that resistance is hopeless, that string theory is the only possible theory. The arguments of this kind I’ve seen make no sense to me, but they are gaining in influence. One place I noticed this is in this recent white paper about the interesting topic of celestial holography, which has little to do with string theory. There the authors write:
A crowning achievement for the celestial holography program would be for it to determine concretely whether string theories are the only consistent theories of (asymptotically flat) quantum gravity.
Today Quanta magazine has more of this sort of thing, with an article whose title shows up on the web as A Correction to Einstein Hints At Evidence for String Theory. The sub-headline tells us that
In a quest to map out a quantum theory of gravity, researchers have used logical rules to calculate how much Einstein’s theory must change. The result matches string theory perfectly.
which sounds pretty impressive. The article starts off with quotes such as:
“The hope is that you could prove the inevitability of string theory using these [bootstrap] methods,” said David Simmons-Duffin, a theoretical physicist at the California Institute of Technology. “And I think this is a great first step towards that.”
and
Irene Valenzuela, a theoretical physicist at the Institute for Theoretical Physics at the Autonomous University of Madrid, agreed. “One of the questions is if string theory is the unique theory of quantum gravity or not,” she said. “This goes along the lines that string theory is unique.”
The paper at issue is this one, which appeared on the arXiv nearly a year ago. It’s not about string theory or about conventional quantum gravity in four space-time dimensions. The topic is graviton scattering in maximally supersymmetric theories in ten flat space-time dimensions, and the argument is that the basic principles of supersymmetry, Lorentz invariance, analyticity and unitarity imply a bound on the coefficient of the lowest-order correction term. The only relation to string theory is that a string theory calculation of this correction coefficient satisfies the bound (as expected, since string theory is supposed to satisfy the assumed basic principles). Much is made of the fact that in string theory one can get any value of the coefficient consistent with the bound. This is taken as evidence for the “inevitability” of string theory, but I don’t see this at all. It’s more accurately evidence for the usual problem with string theory: it’s consistent with anything. If the authors of this paper had found that the string theory bound was different from their bound, they could have written a paper arguing that they had finally found a way to falsify string theory (measure the coefficient; if it were found to be in the region allowed by general principles but not by string theory, string theory would be falsified).
The article does get right the motivations behind these claims:
Some physicists hope to see string theory win hearts and minds by default, by being the only microscopic description of gravity that’s logically consistent. If researchers can prove “string universality,” as this is sometimes called — a monopoly of string theories among viable fundamental theories of nature — we’ll have no choice but to believe in hidden dimensions and an inaudible orchestra of strings.
To string theory sympathizers, the new bootstrap calculation opens a route to eventually proving string universality, and it gets the journey off to a rip-roaring start.
and it gives a little space to skeptics:
Other researchers disagree with those implications. Astrid Eichhorn, a theoretical physicist at the University of Southern Denmark and the University of Heidelberg who specializes in a non-stringy approach called asymptotically safe quantum gravity, told me, “I would consider the relevant setting to collect evidence for or against a given quantum theory of gravity to be four-dimensional and non-supersymmetric” universes, since this “best describes our world, at least so far.”
Eichhorn pointed out that there might be unitary, Lorentz-invariant descriptions of gravitons in 4D that don’t make any sense in 10D. “Simply by this choice of setting one might have ruled out alternative quantum gravity approaches” that are viable, she said.
Another critique, though, is that even if string theory saturates the range of allowed α values in the 10-dimensional setting the researchers probed, that doesn’t stop other theories from lying in the permitted range. “I don’t see any practical way we’re going to conclude that string theory is the only answer,” said Andrew Tolley of Imperial College London.
I don’t at all understand why Quanta chose to cover this. All it does is help to spread hype and further the cause of the “resistance is futile” campaign from proponents of a failed research program.
Various items that may be of interest:
Hype
https://twitter.com/i/web/status/1478179487921565699
https://magazine.caltech.edu/post/quantum-gravity
A few days ago I heard news from Paris of the death of Grichka Bogdanoff on Dec. 28, and this morning heard of the death yesterday of his twin brother Igor. There are many news stories online (e.g. here), and Lubos Motl has written about them here.
There’s a chapter in my book Not Even Wrong about “The Bogdanov Affair”, and quite a few blog postings here referred to the twins and their activities related to theoretical physics. The motivations for writing about them were always two-fold. That they had managed to get more or less nonsensical papers published in reputable physics journals in 2001-2 (Annals of Physics and Classical and Quantum Gravity) raised important questions about how one evaluates speculative theoretical physics research. But also, the whole story had many comic aspects (see for instance here). I always supposed that to some extent the brothers were in on the joke, and I hope that was true. At one point they invited me to come see them when I was in Paris, but I decided not to take them up on the offer, since it seemed best to keep one’s distance from whatever they were doing. In recent years I hadn’t been following their activities at all.
There’s a darkly comedic aspect to this and other examples of prominent people opposed to COVID vaccinations succumbing themselves to the disease. I’m sorry that this happened to the brothers, putting a final all too avoidable tragedy at the end of their remarkable life stories.
Multiverse mania started seriously among string theorists around 2003, with a defining event being Susskind’s February 2003 paper The Anthropic Landscape of String Theory. At the time I was finishing up writing what became the book “Not Even Wrong”, and my reaction to Susskind’s paper was pretty much “This is great! Susskind’s argument implies that string theory can’t ever be used to predict anything. If people accept that, they’ll have to give up on string theory since it has come to the end of the line.” Over the next year or two it became clear that devotion to multiverse mania wasn’t just localized at Stanford (where Andrei Linde had always been pushing this, even before the string theorists climbed aboard). Other proponents of the string theory landscape were up and down the California coast, including Raphael Bousso at Berkeley and Joe Polchinski at UCSB. One West Coast holdout was David Gross, who that summer at Strings 2003 quoted Churchill’s words to his country during the Nazi bombardment of London: “Never, never, never, never, never give up”. On the East Coast, the center of the resistance was at the IAS in Princeton, where several people told me that Witten was privately strongly making the case that this was not physics.
I ended up adding an additional chapter to the book about this, and covering developments closely here on the blog. For many years I found it impossible to believe that this pseudo-scientific point of view would get any traction among most leaders of the particle theory community. How could some of the smartest scientists in the world decide that this was anything other than an obviously empty idea? After a while though, it became clear that this was getting traction and that there was a very real danger that particle theory would come to an end as a science, with most influential theorists giving up, justifying doing so by claiming they now had a solid argument for why there was no point in trying to go further. String theory was the answer, but the answer is inherently unpredictive and untestable.
It has become clear recently that we’ve now reached that end-point. From the new video of his discussion with Rovelli, it’s clear that David Gross has given up. No more complaints about the multiverse from him, and his vision of the future has string theory solving QCD 80 years from now, nothing about it ever telling us anything about where the Standard Model comes from. Today brought an extremely depressing piece of news in the form of a CERN Courier interview with Witten. Witten has also given up, dropping his complaints about the string theory landscape:
Reluctantly, I think we have to take seriously the anthropic alternative, according to which we live in a universe that has a “landscape” of possibilities, which are realised in different regions of space or maybe in different portions of the quantum mechanical wavefunction, and we inevitably live where we can. I have no idea if this interpretation is correct, but it provides a yardstick against which to measure other proposals. Twenty years ago, I used to find the anthropic interpretation of the universe upsetting, in part because of the difficulty it might present in understanding physics. Over the years I have mellowed. I suppose I reluctantly came to accept that the universe was not created for our convenience in understanding it.
I’ve never really understood the kind of argument he is making here, that the problem with the string theory multiverse is that it’s upsetting, but we just have to get control of our feelings. Feelings have nothing to do with it: the problem is not that the idea is upsetting, but that it’s vacuous.
The rest of the interview is also pretty depressing. At the high energy physics experimental frontier, he points to an idea (“split supersymmetry”) for trying to keep a complicated failed idea alive on life support:
There is also an intermediate possibility that I find fascinating. This is that the electroweak scale is not natural in the customary sense, but additional particles and forces that would help us understand what is going on exist at an energy not too much above LHC energies. A fascinating theory of this type is the “split supersymmetry” that has been proposed by Nima Arkani-Hamed and others.
On string theory, he follows Gross in referring to not “string theory” but “the string theory framework” and describes the situation as
We do not understand today in detail how to unify the forces and obtain the particles and interactions that we see in the real world. But we certainly do have a general idea of how it can work, and this is quite a change from where we were in 1973.
The situation with string theory unification is that it’s a failed idea, not that it’s a successful general idea just missing some details.
Finally, Merry Christmas and best wishes for the New Year. Fundamental physical theory may now be over, replaced with a pseudo-science, but at least that means that things in this subject can’t get any worse.
Yet more math items:
What is the Arithmetical-Algebraic-Geometric interpretation of the Jones polynomial?
or of Chern-Simons theory?
or of TQFT?
You see that gauge theories and gravity appears in various interactions is it’s in nothing else in a sense, and geometric limits of various string theories or quantum field theories and what I claim that it’s in fact it’s something generally about complex systems and mathematics. You do some combinatorial problem, whatever it is you get some counting or something, and then maybe you look on asymptotic growth of the number of solutions. It could be something very simple but your arranged parameters became something more complicated and if you see something more complicated it’s kind of I think it’s unavoidable you see some physics in a very wide sense: some string theory, some membranes, whatever. Okay, thank you.
I can’t really make much sense of this, but he seems to have some sort of vision of fundamental physics being linked with complexity, a point of view I very much don’t share.
Moving to purely physics topics:
First some math news:
While I’ve always had some sympathy for the general idea that there’s much that could be changed and improved about the US K-12 math curriculum, there’s a huge problem with all proposed changes based on the “algebra/pre-calculus/calculus sequence is too hard and not relevant to everyday life” argument. Students leaving high school without algebra and some pre-calculus are unequipped to go on to study calculus, and calculus is fundamental to learning physics. Without being able to learn physics, a huge range of possible fields of study and careers will be closed to them, from much of engineering to medical school. Whatever change one makes to K-12 math education, it shouldn’t leave students entering college with a severely limited choice of fields they are prepared to study.
For some math items:
In a quick follow-up discussion with me in July 2021, Mermin confessed that he now regrets his choice of words. Already by 2004 he had ‘come to hold a milder and more nuanced opinion of the Copenhagen view’. He had accepted that ‘Shut up and calculate’ was ‘not very clever. It’s snide and mindlessly dismissive.’ But he also felt that he had nothing to be ashamed of ‘other than having characterised the Copenhagen interpretation in such foolish terms’.
A couple of months ago I recorded a podcast with Lex Fridman; it’s now available here.
A lot of Fridman’s other interviews are well worth watching or listening to, and I thought we had an interesting conversation. I can’t stand listening to or watching myself, so not sure how it turned out. But happy to answer here any questions about what we were discussing.
During recent travels I attended two conferences (in Paris and Berkeley) and met up with quite a few people. At the Paris conference I gave an intentionally provocative talk to the philosophers of physics there, slides are here. The argument I was trying to make is essentially that more attention should be paid to evidence for a deep unity in much of modern mathematics, which at the same time is connected to our best unified theory of physics (the Standard Model and GR). Edward Frenkel has made some similar points, referring to the Langlands program and its connections to physics as a “Grand Unified Theory of Mathematics”. The specific structures underlying this unification seem to me to deserve attention as providing an important way of thinking about what’s at the “foundations” of both math and physics.
Another motivation for this talk was to make an argument against what I see as having become a widespread and standard ideology about the search for a unified theory in physics. Talking to many physicists and mathematicians interested in physics, I noticed that the conventional wisdom, shared by the establishment and contrarians alike, is that the SM and GR are likely low energy emergent theories, that some completely different sort of theory is needed to describe very short distances such as the Planck scale. Physics establishment figures tend to believe that following the path started with string theory, then AdS/CFT, lately quantum error correction or whatever, will someday lead to a dramatically different sort of theory, replacing space, time and maybe quantum mechanics. Contrarians often have their own favorite idea for a radically different starting point. For an example of this, take a look at Figures 2 and 3 of Mike Freedman’s The Universe from a Single Particle (he spoke about this in Berkeley). Figure 2 is the “establishment” picture, with AdS/CFT the fundamental theory, well-decoupled from the emergent SM + GR (since no one has any idea how to relate them). His Figure 3 shows his own proposal, even better decoupled from any connection to the SM + GR.
Given the extreme level of experimental success of the SM + GR, the obvious conjecture is that these are close to a unified theory valid at all distances. That the mathematical framework they are built on is closely connected to unifying structures in mathematics provides yet more evidence that what one is looking for is not something completely different. The odd thing about the present moment is that arguing that our well-established successful theories can provide a solid basis for further unification makes one a contrarian, with the “establishment” position that a revolution sweeping such theories aside is needed.
I hope to find time in the next few weeks to write up what’s outlined in the slides as a more detailed article of some sort. More immediately, I plan to write a blog entry and perhaps some more detailed notes about the “twistor $P^1$” mentioned at the end of the talk, explaining how it shows up in Euclidean twistor theory as well as in recent work on the Langlands program.
I haven’t been posting here for a while, partly due to a lot of traveling, partly due to some personal time-consuming commitments, and largely due to a lack of news that seemed worth much attention. For some examples of such news that might be of interest:
This year the process has involved a highly peculiar situation with the budget for US LHC contributions (prospects for large cuts, assumed to get fixed mysteriously at the last minute). For the details of what is going on, there’s a news story here, and discussion at an HEPAP meeting here. For the first time I’m aware of, the HEPAP meeting videos are on YouTube (see links here), so one can follow the actual discussion between physicists and government officials there.
Unfortunately, it has come to my attention that certain misunderstandings concerning IUT continue to persist in certain parts of the world. Perhaps the most famous misunderstanding concerns an asserted identification of “redundant copies”. This misunderstanding involves well-known, essentially elementary mathematics at the beginning graduate level concerning the general nonsense surrounding “gluings”. For instance, if one “applies” this misunderstanding to the well-known gluing construction of the projective line, then one concludes that the two copies of the affine line that appear in this gluing are “redundant”, hence may be identified. This identification leads immediately to a contradiction, i.e., to a “proof” that the projective line cannot exist! More details may be found in the Introduction to [EssLgc] and the references given there.
In case anyone thinks it’s plausible that what’s going on here is that Peter Scholze is making errors in elementary mathematics at the beginning graduate level, David Roberts has an explanation of what’s going on here.
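For readers who want to see the construction being alluded to, here is the standard textbook gluing of the projective line (a generic sketch, not either party’s specific formulation):

```latex
% Standard gluing construction of the projective line over a field k:
% two copies of the affine line,
\[
U_1 = \operatorname{Spec} k[t], \qquad U_2 = \operatorname{Spec} k[s],
\]
% glued along the loci where the coordinates are invertible,
\[
\mathbb{P}^1_k \;=\; U_1 \sqcup U_2 \,\big/\!\sim, \qquad
\operatorname{Spec} k[t,t^{-1}]
\;\xrightarrow{\;t \,\mapsto\, s^{-1}\;}\;
\operatorname{Spec} k[s,s^{-1}].
\]
% The two charts U_1 and U_2 are isomorphic as schemes, but the gluing
% data is the map t -> s^{-1}, not the identity t -> s. Identifying the
% two "redundant" copies outright would force t = t^{-1} on the overlap,
% which fails in k[t,t^{-1}]: this is the contradiction (the "proof"
% that the projective line cannot exist) referred to in the quote.
```

The point of the example is that isomorphic glued pieces are not thereby interchangeable: the gluing isomorphism, not the abstract isomorphism class of the pieces, carries the content of the construction.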
On the string theory front, it’s become impossible to figure out how to have any sort of scientific debate about most of the public defenses of “string theory”. For two recent examples:
There’s faith that one way or another we should be able to test these ideas… It might be very indirect—but that’s not something that’s a pressing issue.
First some personally relevant items:
In math and physics news, there’s:
The latest from the BBC:
String theory – a simple way to understand the universe
Not worth more comment than it’s another reminder that this nonsense continues to be heavily promoted in our most prominent and respected mass media. I’m beginning to doubt we’re going to be rid of it in my (or anyone’s) lifetime.
I just noticed that Gordon Kane has recently published a second edition of his 2017 String Theory and the Real World. Columbia doesn’t seem to yet have full online access to the second edition, but one can already compare the two editions in a few places. For instance, on pages 1-5 of the 2017 edition one reads
The LHC is now working in a region of energy and intensity where well-motivated theories imply superpartners could be seen by late 2018.
and
There is good reason, based on theory, to think discovery of the superpartners of Standard Model particles should occur at the CERN LHC in the next few years.
The corresponding first chapter of the latest edition has:
The LHC has so far just entered the region of superpartner masses predicted by compactified theories, which ranges from about 1.5 to ∼5 TeV (we’ll discuss that range later). Those values are the only physics predictions, rather than just speculations. The LHC will run with higher luminosity after an upgrade, beginning in late 2021 if pandemic work stoppages do not delay it. That increases the possibility of discovery, though not very much. A higher energy collider is needed. From what we know now, a collider with twice the LHC energy range would probably suffice, and cover the region of gluino masses to about 5 TeV.
The concluding chapter of the 2017 edition tells us:
The compactified M-theory implies that three superpartners (and only three) will be observed at the LHC in the current three-year run (assuming the full integrated luminosity is achieved). These are the gluino and the charged and neutral winos.
Presumably he’s talking about the LHC Run II (2015-18) which did meet its luminosity goals, without any hint of the three superpartners. I don’t yet have access to the later parts of the 2021 edition to see what they say about this.
This isn’t the first time Kane has published multiple editions of ever-changing “predictions” about supersymmetry. At one point I compared the 2000 and 2013 editions of “Supersymmetry and Beyond”; you can see the results here.
First some items on the mathematics side:
On the physics side:
These are interesting times for particle physics: times of great uncertainty, in which our physics perspective is changing, and in which we are laying the foundations for the future of our field. As a community, we must rise to the challenge.
What worries me is that much of the rest of the article contains a lot of
The multiverse describes a physical reality that challenges the presumption that there must be a single unified theory in the deep UV. In a sense, it is the ultimate Copernican revolution since not even the patch of the universe we live in is special. It implies a revision of the cosmological principle because the universe is approximately homogeneous and isotropic only within our horizon, but may be globally highly non-homogeneous. The multiverse is not an abstract idea, but it is a generic consequence of a large class of inflationary theories, where unavoidable quantum fluctuations of the inflaton spark a chain process with eternal creation of regions that expand faster than the surrounding space.
The multiverse is actually a familiar instrument of our everyday physics toolkit.
There are also theoretical indications for questioning the concept of symmetry. It is now believed (and to a certain extent proven) that any global symmetry is violated at the level of quantum gravity. This means that any global symmetry that we observe in nature is only an accidental effect of looking at a system without sufficient short-distance resolution. The case of gauge symmetries is more subtle. Gauge symmetries are not real physical symmetries, in the sense that they don’t correspond to an invariance under a physical transformation, but only to a redundancy of the coordinate parametrisation. We often confuse our students on this point by showing them the Mexican-hat potential and leading them to believe that there is a degeneracy of vacua, when in reality there is only one single vacuum state that breaks EW symmetry, as it is clear from the fact that the physical spectrum doesn’t contain any Goldstone boson corresponding to zero-energy excitations. Gauge symmetries may not be as fundamental as we thought, but only an emergent phenomenon. They could be a mirage of a different reality that takes place at a more fundamental level.
It’s looking depressingly possible that leaders of the field will push through as new “foundations for the future of our field” the argument that “the multiverse did it and symmetry is a mirage.” Instead of moving forward, the field will take a huge step backwards.