Speaking truth to parallelism

Today The Economist put out this paragon of hard-hitting D-Wave journalism. At first I merely got angry — but then I asked myself what Winston Churchill, Carl Sagan, or Chip ‘n Dale’s Rescue Rangers would do. So let’s see if The Economist prints the following.

SIR —

In a remarkably uncritical article about D-Wave’s announcement of the “world’s first practical quantum computer” (Feb. 15), you gush that “[i]n principle, by putting a set of entangled qubits into a suitably tuned magnetic field, the optimal solution to a given NP-complete problem can be found in one shot.” This is simply incorrect. Today it is accepted that quantum computers could not solve NP-complete problems in a reasonable amount of time. Indeed, the view of quantum computers as able to “try all possible solutions in parallel,” and then instantly choose the correct one, is fundamentally mistaken. Since measurement outcomes in quantum mechanics are random, one can only achieve a computational speedup by carefully exploiting the phenomenon known as quantum interference. And while it is known how to use interference to achieve dramatic speedups for a few problems — such as factorising large integers, and thereby breaking certain cryptographic codes — those problems are much more specialised than the NP-complete problems.

Over the past few days, many news outlets have shown a depressing willingness to reprint D-Wave’s marketing hype, without even attempting to learn why most quantum computing researchers are sceptical. I expected better from The Economist.

Scott Aaronson
Institute for Quantum Computing
University of Waterloo

I thought ‘factorising,’ ‘specialised,’ and ‘sceptical’ were a nice touch.
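
For anyone who wants to see the interference point concretely: if you just prepare an equal superposition over all N candidate answers and measure, you get the correct one with probability 1/N, no better than guessing. Grover’s algorithm uses interference to boost that probability, but it still needs on the order of √N steps rather than one shot. Here is a toy simulation, written purely for illustration (the problem size and the marked item are arbitrary):

```python
import numpy as np

n = 10                        # number of qubits
N = 2 ** n                    # number of candidate answers
marked = 3                    # index of the unique correct answer (arbitrary choice)

# "Trying everything in parallel": the uniform superposition over all N candidates.
state = np.full(N, 1 / np.sqrt(N))
print("P(correct) if you measure right away:", state[marked] ** 2)   # = 1/N

# Grover's algorithm: interference boosts the marked amplitude over ~sqrt(N) iterations.
for _ in range(int(np.pi / 4 * np.sqrt(N))):
    state[marked] *= -1                   # oracle: flip the sign of the correct answer
    state = 2 * state.mean() - state      # diffusion: reflect every amplitude about the mean
print("P(correct) after ~sqrt(N) Grover iterations:", state[marked] ** 2)
```

And even that square-root speedup is provably the best possible for unstructured search: that is the Bennett-Bernstein-Brassard-Vazirani theorem.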

A Final Thought (2/16): As depressing as it is to see a lazy magazine writer, with a single harebrained sentence, cancel out your entire life’s work twenty times over, some good may yet come from this travesty. Where before I was reticent in the war against ignorant misunderstandings of quantum computing, now I have been radicalized — much as 9/11 radicalized Richard Dawkins in his war against religion. We, the quantum complexity theorists, are far from helpless in this fight. We, too, can launch a public relations campaign. We can speak truth to parallelism. So doofuses of the world: next time you excrete the words “NP-complete,” “solve,” and “instantaneous” anywhere near one another, brace yourselves for a Bennett-Bernstein-Brassard-Blitzirani the likes of which the multiverse has never seen.

Update (2/16): If you read the comments, Geordie Rose responds to me, and I respond to his response. Also see the comments on my earlier D-Wave post for more entertaining and ongoing debate.

56 Responses to “Speaking truth to parallelism”

  1. Neratin Says:

    I think D-wave really should insert ‘quadratic speedup’ disclaimer into every single press release they write, just for the sake of journalism.

  2. Paul Beame Says:

    At least Scientific American managed a reasonable article on it that does not mention NP-complete problems, and quotes Seth Lloyd’s skeptical comments at least as much as D-Wave’s PR.

  3. Scott Says:

    Thanks, Paul! But your link doesn’t work. Here it is.

  4. Altman Says:

    Deutsch weighs in on Wired.

  5. Will Hunting Says:

    Shouldn’t the letter be addressed SIR/MADAM or better .. MADAM/SIR ? After all a woman now heads the “most powerful” university in the world. 🙂

  6. Scott Says:

    The article was written by one Stephen Jeffrey, who I assume is male.

  7. Devin Says:

    It wouldn’t surprise me for your letter—or one like it—to get published in the next issue of The Economist. They are generally good about such things, assuming you actually posted/e-mailed it to them. However, you forgot ‘Dr.’ before your name, so you might get bumped by someone else. Plus, you’re probably over the word limit.

    Also, I’m surprised you could find the byline on the article: the paper has a policy of not attaching names to work.

    Good on you for standing up for our field.

  8. Scott Says:

    However, you forgot ‘Dr.’ before your name

    I thought that was simply assumed. 😉

    Plus, you’re probably over the word limit.

    Yeah, I figured they can edit out what they don’t want. Still, I probably should’ve left out the stuff about interference.

    Also, I’m surprised you could find the byline on the article: the paper has a policy of not attaching names to work.

    Oops — now that I look at it again, could Stephen Jeffrey be the illustrator rather than the author? That’s confusing!

  9. David Moles Says:

    For a minute I thought you’d written “We, the quantum complacency theorists…”

  10. Robin Blume-Kohout Says:

    Scott, much as I applaud your stirring declaration of War on Doofosity, I have to take issue with your statement that “before I was reticent… [w]e, too, can launch a public relations campaign.” I mean, you are a walking, blogging public relations campaign! 🙂 Okay, that’s a bit over the top. Nonetheless, I think Shtetl-Optimized is already known as the place to go for hard-hitting, up to date quantum complexity debunkification.

    Deutsch’s interview with Wired (see Altman’s link) has some very sensible and thoughtful bits — I particularly liked the discussion around “I have to stress that the term qubit hasn’t got a very precise definition at the moment,” which is (a) true, (b) practically embarrassing, and (c) somewhat philosophically interesting.

    However, I did cringe a bit at the “parallel universes” comments. Yeah, I know Deutsch does believe that literally, and he’s got the right to express his opinion… but Mr. Random Wired Reader has no way of knowing that the beliefs of the Father of Quantum Computing are not ubiquitous in the scientific community.

    The end result is that statements like “Quantum computers get their power from parallel universes” [paraphrase of Deutsch] appear on an equal footing with “Quantum computers can’t efficiently solve NP-complete problems” [paraphrase of Aaronson].

  11. Greg Kuperberg Says:

    Deutschism is a common way to explain quantum computing to lay audiences. My response is: Quantum computers get their power from parallel universes only in the same sense that randomized algorithms do. It is easy to understand that randomized computation is a kind of parallel computation, albeit a very limited kind that also has non-many-worlds descriptions. That is the right frame of mind for quantum computation.

  12. RubeRad Says:

    Isn’t this wave of publicity a good thing in the end? Assuming that D-Wave is fundamentally wrong (I’m not informed about quantum computing, so I’ll take your word on it), reality will confound their attempts to live up to the hype. In the end, Truth cannot lose.

    I guess the question is whether all of these gushing journalists will have enough integrity to follow through and publish the aftermath as thoroughly as the buildup.

  13. Greg Kuperberg Says:

    Isn’t this wave of publicity a good thing in the end?

    It’s the Anna Nicole Smith philosophy: There is no such thing as bad publicity. But look what happened to her.

  14. Scott Says:

    In the end, Truth cannot lose.

    Truth can lose, truth does lose, and truth has been on the losing side for most of human history. Ten times as many astrologers as astronomers, etc. etc. To me, the more interesting point is that occasionally truth can win.

    I guess the question is whether all of these gushing journalists will have enough integrity to follow through and publish the aftermath as thoroughly as the buildup.

    That’s certainly one question. Another question is: if they do follow through, will their new line be that quantum computing research as a whole must’ve been bunk?

  15. H Haller Says:

    I think the best move for us ‘truth-sayers’ is to get someone big – someone whose essay on the issue could, let’s say, be published as an op-ed in the NY Times – and have that person speak the truth as our representative.

  16. Robin Blume-Kohout Says:

    Greg: That’s a good way to think about it. At least, I think so. But it seems clear that Deutsch himself doesn’t fully agree with your analysis, since it implies (or states) that there are satisfactory non-many-worlds ways to think about quantum computing — whereas Deutsch seems to be saying that a quantum computer can be used as a proof of many-worlds.

    It’s this approach that makes me cringe. I’m open to the idea that many-worlds is strictly more useful for interpreting quantum computing than for interpreting classical computing (in contradiction to “Quantum computers get their power from parallel universes only in the same sense that randomized algorithms do”), but I’m not a literal many-worldsist, and I think it’s an idea about which a little knowledge (which is what is conveyed by, e.g. the Wired interview) is a dangerous thing.

    And now I’m going to stop cluttering up the Shtetl with interpretation.

  17. John Says:

    The Economist’s audience is not a technical one. That’s the reason their article is not scientifically correct, and is the same reason why your letter will probably not get published. Maybe it will, but its audience will not understand it–although, like D-Wave, you did your best to try to explain it to a non-technical audience.

    As for D-Wave marketing hype, you also have to understand who the audience is. It is potential investors, business people, potential customers, and other “suits.” It is not the Scott Aaronsons of the world, although if you attended the Mountain View demonstration you would have found that Geordie Rose’s talk was targeted more towards a technical and non-technical audience and was well done and honest.

    If you want to see something really bad, look at this from the CBC. It makes me cringe. It is 100% incorrect. But you have to ask yourself who the audience is, then shrug it off.

  18. Joe Fitzsimons Says:

    Actually, the disturbing thing about the CBC clip for me is that they’re interviewing a guy who clearly doesn’t know what he’s talking about.

    How on earth do news shows pick up these ‘experts’?

  19. EntropyFails Says:

    I agree that the CBC clip is one of the worst pieces of “journalism” I’ve ever encountered. The management must have said “just get a programmer to explain it” and left it at that. The “expert” should be ashamed to have gone on television without any understanding of the technology involved.

    I think if the journalists wanted to actually inform their audience, they could find several good metaphors to help a layperson understand things better. But as the current crop of articles/interviews show, they only want to hype up something to make it seem like they “got the scoop on the superawesome machine.” It seems like just another reflection of the sad state of journalism in the mass media world.

    Thankfully, we have the web. Let us take a moment to feel sad for those who choose not to use it to further their understanding.

  20. Scott Says:

    John: I love your form of argument!

    “Sure, my claims about my job and marital status might have been technically ‘lies,’ but you have to understand who my audience was — women I wanted to sleep with!”

    “Granted, my claims about the budget deficit were less than truthful, but put it in context: I wasn’t addressing a bunch of PhD economists, I was addressing the voters!”

  21. John Sidles Says:

    A common strategy of startup companies is that the principals in the company hold stock options that acquire value if and only if the price of the publicly traded shares is above a preset “strike price” for a preset period of time.

    The resulting incentive to keep the share price high sometimes leads to egregiously optimistic claims for the company’s technology. This business strategy is known as “pump and dump”. It is illegal, but very hard to prove — because who is to say that a founder’s enthusiasm is not genuine?

    A similar incentive acts for privately-held companies, whenever a new round of funding is urgently needed.

    The lesson (if there is one) is that peer review may not be perfect, but it does act to rein in the above incentives.

  22. Andy D Says:

    “Today it is accepted that quantum computers could not solve NP-complete problems in a reasonable amount of time.”

    Clearly it’s not accepted by some people, like the article writer, and even if you’re implicitly restricting to the complexity community, there’s no need here to invoke a consensus view (which, I’d say, doesn’t exist–not quite) that BQP != NP. Why not just say “there is no theoretical evidence that quantum computers could…”

  23. Geordie Says:

    Scott:

    You’re a smart guy and I love reading your stuff. But I think there’s some kind of disconnect in the way you’re approaching the whole media issue.

    First of all, we agree on every important technical point related to QC/NP. I am actually one of your biggest fans, and point your articles & blog out to anyone who asks for a competent & clear review of any of this stuff.

    I always attempt to be clear about what we’re building and what it does. As you know if you’ve read my blog or seen any of my presentations (a few of which, including the demo, are available on youtube etc.), I am always very clear about the fact that the machines we’re building have only two objectives: (1) better accuracy in the same time as conventional approaches, (2) same accuracy in less time than conventional approaches. We are always careful to state that we are aiming at a quadratic scaling advantage for a particular type of problem.

    You take issue with the use of the word “solve” in the same sentence as “NP-complete” in these articles. Nowhere have I ever claimed that these machines are exponentially faster for NP-C problems. In fact I make it a point to tell people that I believe that your argument about this being some kind of fundamental law is a very cool idea and quite possibly correct. However, you seem to be missing the very basic point that “solve” means something different to the 99.99999999999999% of people who aren’t complexity theorists. For most people this word means “solve the problem to the satisfaction of the user”, which for these types of problems certainly does not mean solving them exactly.

    As an aside, you don’t seem to be attacking for example Ilog or Dash with the same verve, although their software products are designed to “solve” NP-hard problems.

    Does Intel “solve” the hardware verification problem for their chips? Does Southwest “solve” the crew scheduling problem for their planes? Do hedge funds “solve” portfolio optimization problems? Of course they all do. Are any of these solutions exact? Of course not. Does it matter? Not really.

    All communication of science includes metaphor and definition issues. Here it’s the word “solve”, but I can think of some others, like “theory” for example.

    It makes a lot more sense to me for all of us QC people to work together to make this stuff comprehensible to a lay audience instead of somehow thinking we’re on different sides. The press loves controversy and knows exactly how to generate it even if it’s not there. This whole issue is a great example–we agree on every single point but somehow we seem to have been maneuvered into appearing to have different viewpoints.

  24. Scott Says:

    Geordie, I appreciate your taking the time to comment here, as well as the kind things you said. I also appreciate D-Wave’s having submitted its results for review, and I look forward to learning more details of what you’ve done as they become available.

    Four quick responses:

    (1) If you read (for example) the Economist piece, it does state unequivocally that a quantum computer could solve NP-complete problems “in one shot” — which given the surrounding context, any reasonable reader would interpret as “by trying all solutions in parallel.” To me, this is arguably the most profound misconception about QC that it’s possible to have, and it’s one that occurs over and over and over. Now, I was careful not to blame D-Wave for this misconception — my career as a media go-to guy has barely even started, and already I know well what it is to be mangled and misquoted. But I do think it would be wise to try and dispel such misconceptions before they arise.

    (2) Prior to your post, I’d never heard of Ilog or Dash. If popular articles get written about classical optimization software analogous to the ones that are now being written about QC, then rest assured: you will read about it on this blog.

    (3) The issue of exact vs. approximate solutions is a bit of a red herring. For many NP-complete problems, we know that finding an approximate solution is every bit as hard as finding an exact one. For other NP-complete problems, we know how to find a good approximate solution classically. There are almost no examples of NP-complete problems for which we know how to find a better approximate solution in quantum polynomial time than in classical polynomial time. This is not to say that such examples don’t exist — just that as yet the evidence isn’t there.

    (4) If you accept point (3), then the whole issue becomes one of polynomial speedups.  I’m certainly glad that you’re not claiming an exponential speedup. But at least based on the evidence I know about, whether you’re seeing a quadratic speedup — or indeed any asymptotic speedup — is very much open to question. Hence the question I asked you earlier: have you compared the empirical scaling for your adiabatic algorithm to the empirical scaling for the best classical algorithms like simulated annealing? If so, what were the results?
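
    For concreteness, the kind of classical baseline I have in mind is nothing fancy: plain simulated annealing on random Ising instances, run at increasing sizes so that one can read off an empirical scaling curve. Below is a minimal sketch; the instance sizes, couplings, and cooling schedule are all chosen arbitrarily for illustration:

    ```python
    import numpy as np

    def anneal(J, sweeps=20000, T0=3.0, T1=0.05, seed=0):
        """Plain simulated annealing on an Ising instance E(s) = s^T J s (J upper-triangular)."""
        rng = np.random.default_rng(seed)
        n = J.shape[0]
        s = rng.choice([-1, 1], size=n)
        E = s @ J @ s
        for t in range(sweeps):
            T = T0 * (T1 / T0) ** (t / sweeps)             # geometric cooling schedule
            i = rng.integers(n)
            dE = -2 * s[i] * (J[i] @ s + J[:, i] @ s)      # energy change from flipping spin i
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i], E = -s[i], E + dE
        return E

    # A few points on the scaling curve: random +/-1 couplings on n spins.
    rng = np.random.default_rng(1)
    for n in (16, 32, 64):
        J = np.triu(rng.choice([-1, 1], size=(n, n)), k=1)
        print(n, "spins -> best energy found:", anneal(J))
    ```

    The interesting quantity is then how the time needed to reach a given solution quality grows with n for a baseline like this, versus how it grows for the hardware.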

    Again, thanks for your willingness to engage in dialogue about these issues.

    –Scott

  25. Joseph Says:

    Geordie,

    I believe the problem is this: the informed public (those with some minimal understanding of what a complexity class is) knows that classical computers cannot “solve” NP-complete problems in a reasonable amount of time. Thus, it is understood that algorithms running on classical computers do something else when they “solve” such problems, e.g. produce approximate solutions. On the other hand, if someone claims that they have built a computer strictly more powerful than the normal kind, and that this machine CAN solve NP-complete problems, the natural assumption is not that it merely produces approximations with a quadratic speedup. The natural assumption is that it solves them exactly, and in a reasonable amount of time. I think it would be disingenuous to pretend otherwise.

    You are not a typical marketroid, that much is obvious, but let’s be honest and agree that, to some extent, there are different sides here. I don’t think you want to lie to people, and you clearly care about people understanding the technology, but you do have a profit motive which the rest of us don’t share. Thus it is only natural that you should be more inclined towards hype, and less inclined towards straightening out potential misunderstandings that, in the end, only give you excellent publicity.

  26. andy Says:

    Dear Dr. Aaronson,
    Can you point the interested person to a primer on the topic of quantum computing so that he or she might be able to pick out such malarkey as is found in the press article?
    -Andy Y

  27. Scott Says:

    Hi Andy D.,

    Yes, I was implicitly restricting to complexity theorists. 🙂 Within the relevant scientific communities, I would say there is a consensus that NP is not in BQP, just as there’s a consensus that P!=NP, that the Riemann Hypothesis holds, and that the universe is ~14 billion years old. Obviously any of these things could turn out to be false — the point is that they’re much more secure than countless other propositions that people base life-and-death decisions on.

    In my experience, most people have an epistemological double standard: when it comes to science, they will immediately seize on the tiniest sliver of doubt in a way they would never do in everyday life. (QM? Evolution? Global warming? “Just a theory!”) Because of this strange phenomenon, I think it’s much easier to overstate the uncertainties surrounding a scientific consensus than to understate them — although of course both kinds of error are possible.

  28. Joseph Says:

    Andy Y,

    You don’t need to wait for that question to be answered by the esteemed Dr. Aaronson, I can do it for him: look in the “Quantum Computing Primers” section. It’s right on the main page.

  29. Scott Says:

    Yes — inspired by Andy Y. (and the 10,000 other people who’ve asked me the same thing), I just added that section today. Happy Hadamards!

  30. Andy D Says:

    Is the case for P != NP at all comparable to that for global warming? The latter is a story of scientific success, the former is one of failure–an admirable effort with wisdom and perspective gained, maybe, but failure nonetheless. Global warming has enormous consensus papers filled with evidence (of something observable that is happening *now* on this planet); P != NP (whose truth/falsehood could have arbitrarily little bearing on empirical reality) has a 61-to-9 edge in Gasarch’s (unscientific) poll, with the correspondents admitting they have basically no idea how to prove it.

    “Obviously any of these things could turn out to be false — the point is that they’re much more secure than countless other propositions that people base life-and-death decisions on.”

    Is P != NP really secure at all? And how would I base any kind of decision simply on the belief that P != NP? I do act on the secure belief that there’s no very-simple proof lying around somewhere (else I’d be looking), and on the belief that a truly fast SAT solver won’t be found in the next 10 years (else I’d look for the industries most likely to benefit and invest in them)… but the practical impact (and explanatory power) of the P != NP hypothesis, taken alone, seems almost as nebulous as that of the existence/nonexistence of God.

    So why is it socially or scientifically imperative to develop/assert a consensus that P != NP? Whereas anti-evolution and anti-global warming science tends to involve a serious disconnect with mainstream findings and methodology, a certain amount of progressive algorithmic research seems to be driven by an attempt to eventually show P = NP. (Didn’t Valiant outline a program to show NC = #P at the last FOCS? Slim prospect or no, it wasn’t a joke or a scandal.) Maybe such efforts have tapered off overall, but so too have efforts to separate P and NP.

    In contrast to creationist geoscientists, P = NP believers can not only have their beliefs, but run with them in doing good science. So can complete agnostics. It’s a mistake to try and condense the lessons learned in complexity theory into the message that P != NP.

  31. Andy D Says:

    Amend that, P != NP does have explanatory power, but only for other phenomena with potentially no practical impact (e.g. nonexistence of sparse NP-complete sets).

  32. Scott Says:

    Andy, I was starting to worry that we agreed about everything! So I’m glad we have such a fundamental disagreement here.

    the practical impact (and explanatory power) of the P != NP hypothesis, taken alone, seems almost as nebulous as that of the existence/nonexistence of God.

    One of the basic points in philosophy of science is that no hypothesis is ever “taken alone.” Any hypothesis would be devoid of practical impact and explanatory power, if enough other things conspired to nullify its power (e.g. if we were being controlled by a Cartesian demon).

    So why do you criticize the explanatory power of P!=NP taken alone, but not evolution taken alone or the existence of staplers taken alone? It seems to me like another case of double standards.

    You might say the difference is that P!=NP is a purely mathematical statement, but that doesn’t seem important to me. Since the practical impact we’re talking about lies outside of mathematics, obviously that impact will depend on other hypotheses that also lie outside of mathematics. (For example, that we can build reasonably fast computers.)

    Incidentally, when you invent computational learning theory, you earn the right to speculate that NC = #P. 🙂 And even then, only toward the end of an otherwise reasonable talk. It’s kind of like how, when you discover Hawking radiation, you earn the right to speculate that time has an imaginary coordinate.

  33. jhl Says:

    Looking at the various media writeups on this subject over the past few days, it seems that a recurring problem is the combination of shallow knowledge on the part of any given author (with obvious exceptions) and an imperative to prime a presumably ignorant audience on a rather esoteric and subtle subject.

    Again and again we get articles that basically parrot a half-remembered paragraph on qc from the “Tomorrow” chapter of an Intro. to Computer Science textbook; despite the fact that what D-Wave is attempting doesn’t really fit into that context.

    Of course, this phenomenon is hardly limited to this field of study. It seems that whatever a person’s specialty is, he or she will be frustrated and annoyed by the inadequate representations of that field written by generalists for a lay audience.

    I can think of no solutions other than to discourage and point out lazily-researched writing while simultaneously working to gently raise the overall level of knowledge and critical thinking within society as a whole. Sadly, the latter is probably more difficult than providing an exact solution to an NP-complete problem.

  34. Scott Says:

    I can think of no solutions other than to discourage and point out lazily-researched writing while simultaneously working to gently raise the overall level of knowledge and critical thinking within society as a whole.

    Hence, this blog. 🙂

  35. Andy D Says:

    I acknowledge that a suitably beefed-up quantitative version of P != NP would have practical meaning, but

    i) the competing theory, that any fast algorithms that do exist are subtle and well-hidden, would remain empirically plausible, since for as long as we don’t have powerful algorithmics we can’t effectively explore much of algorithm-space (even compared to the space of counterexamples to the Riemann Hypothesis);

    ii) it would remain conceptually plausible, since we haven’t ‘explained’ the nonexistence of good algs so much as assumed it (or at best reduced it to the assumption that SAT is very hard, and that’s assuming we get a decent theory of average-case-complexity worked out), and as we have evidence that certain kinds of algorithms and proofs (Speed-up Theorem algs, undecidability of theoremhood) are necessarily well-hidden;

    iii) the beefed-up P != NP statement would come from nowhere, having little to do with (and distracting from) the actual current findings and intellectual authority of complexity theory. This objection is the source of what you see as my double standard, since whereas global warming is not and cannot be ‘taken alone’, P != NP is taken essentially alone by TCS, and I think we should neither apologise nor dissemble.

    If I could subdivide complexity theory in its current form, I would call most of it ‘reducibility theory’, and emphasize, a la Papadimitriou, that it is geared towards ‘decreasing the number of questions without increasing the number of answers’. And while I applaud your efforts to deflate hype and misunderstandings surrounding pseudosolutions to P (and/or BQP) vs NP, I wouldn’t overstate the consensus. The common knowledge that very smart people have tried and failed for decades to show P = NP is plenty efficacious in positively shaping research and behavior, without having to have epistemic status comparable to evolution or global warming.

  36. Greg Kuperberg Says:

    We agree on every single point but somehow we seem to have been maneuvered into appearing to have different viewpoints.

    This could be the most disingenuous statement that I have ever seen in quantum computing. I just spent some time reading D-Wave’s press releases and Geordie Rose’s blog, and I can say that both have been bad for my blood pressure. In particular, they sound nothing like Scott Aaronson.

    Did you know, Scott, that noise actually helps adiabatic quantum computing? It’s a reason to hate the gate model. Who else here hates the gate model?

    I confess that I don’t hate the gate model. I have nothing against adiabatic quantum computing either, but I am against the claim that noise improves it. As a corollary to that and other statements, “the world’s first commercially viable quantum computer” does start to sound like quantum computing’s answer to cold fusion, just as Andrew Steane suggested in one of the newspaper articles.

    I would ask whether the frustration with journalists is misdirected. You can only expect so much from journalists. What about D-Wave itself?

  37. Greg Kuperberg Says:

    The issue of exact vs. approximate solutions is a bit of a red herring.

    That’s right, it’s a red herring (not just a bit of one). However, it’s a red herring that is on the plates of a few QC theorists. Coincidentally, their papers are the ones that define D-wave’s technical plan. Are you surprised that these ideas that you find unconvincing have come to life in a real quantum computer?

  38. John Says:

    Greg, there was an arXiv paper on that blog post you mentioned: http://arxiv.org/pdf/cond-mat/0609332
    In the abstract it says “We describe a variant of adiabatic quantum computation, which takes advantage of thermal noise to realize improved performance. We show that this approach, which we call thermally assisted adiabatic quantum computation (TAQC), can provide O(√N) scaling for the unstructured search problem.”

    Is there something wrong with the paper? Or are you just going to denounce their claim without providing any evidence to back up your own?

  39. Ze Says:

    Andy D says: Is the case for P != NP at all comparable to that for global warming? The latter is a story of scientific success, the former is one of failure–an admirable effort with wisdom and perspective gained, maybe, but failure nonetheless.

    Is P != NP really secure at all?
    There are results that strengthen it, rather than just people voting. Although we have no idea how to prove it, we do have some theorems, such as: if FPT = W[1] then the complexity hierarchy collapses.

    I haven’t really gone deeply into this, but I heard it from Mike Fellows (it’s been a few years, so if I’m wrong I accept all responsibility for it), and I trust his opinion on parametrized complexity and computational complexity.

    One of these days I’ll have to find the papers and read them.

  40. Geordie Says:

    Greg:

    Did IBM steal your dolls when you were a child or something? Try breathing deeply and counting to ten.

    The role of environments in AQC is subtle, and yes there is evidence that noise can in fact help AQC. You seem to not be aware of the large body of work studying Landau-Zener transitions in the presence of environments. Since you seem to have the same problem that journalists are being accused of recently (lazy research, regurgitating half-chewed dogma), why don’t you first do your homework and at least read the relevant technical papers before dismissing the results? If you have trouble finding any of the material, I can help you find the rather exotic journals these are in…like PRL, PRB, Rev Mod Phys, oh and also there’s this great thing called arxiv.org which you should check out.

    And yes I hate the gate model. Comes from (1) actually trying to build one and (2) actually trying to build real gate sequences for real gate model QCs. Ever done either of those things?

  41. Greg Kuperberg Says:

    Is there something wrong with the paper?

    I don’t know that there is anything wrong with that paper as an arcane work in adiabatic quantum computing. The paper says, here is an adiabatic algorithm that does a special-case Grover search in time O(sqrt{N}) with a particular time constant. When you add a little noise, the time constant seems to improve. The paper could be true.

    However, there is a great deal wrong with going from that paper to the words “commercially viable”. Indeed, the comments section of one of the announcements of the Demo has the words “NOT COMPETITIVE” in capital letters. cond-mat/0609332 may be mathematically valid, but you would only expect noise to improve a Grover search that’s not competitive. You would only expect to hate the gate model if you built a system that’s not competitive (whether or not it uses the gate model itself). And you would only expect fault-tolerant error correction (either implicit or explicit) to be needless in a system that’s not competitive. If it’s not competitive, how is it commercially viable?
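
    To put crude numbers on it: with only a quadratic speedup, the machine pulls ahead only once its per-step overhead has been amortized. The constants below are invented solely for illustration:

    ```python
    # Invented numbers, purely to illustrate where a quadratic speedup breaks even.
    classical_step = 1e-9    # seconds per classical step
    quantum_step = 1e-6      # seconds per (hypothetical) quantum step

    for N in (10**3, 10**6, 10**9, 10**12):
        print(f"N = {N:>13,}: classical {N * classical_step:.3g} s,"
              f" quantum {N ** 0.5 * quantum_step:.3g} s")

    # sqrt(N) * quantum_step beats N * classical_step only once sqrt(N) > 1000,
    # i.e. N > 10**6 with these numbers, and that is before any error-correction overhead.
    ```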

  42. Greg Kuperberg Says:

    And yes I hate the gate model. Comes from (1) actually trying to build one and (2) actually trying to build real gate sequences for real gate model QCs. Ever done either of those things?

    I have attended talks on experimental QC. I fully understand that the genuine version is supremely difficult and therefore easy to hate.

  43. Gil Says:

    I like Scott’s skeptical spirit but I am not sure he asks the right question.

    Here is an analogy:

    Suppose you want to assess the performance of a 19th century company that builds automatic chess playing machines. In their brochures they write:

    ” Our machines find the optimal strategy for chess and hence win against every human player. ”

    A 19th century bright blogger objects based on theoretic reasons:

    Blogger: “The number of possibilities in chess is so large that it is unfeasible to find an optimum strategy!”

    To this the manager replies:

    Manager: “I admire your blog and papers, young brilliant blogger, but our brochures are meant for a general audience. When we talk about ‘optimal strategy’ it is a non-technical term. (Also, an advertisement saying ‘Be all you can be’ does not refer to unfeasible optimization.)”

    The manager’s answer is completely reasonable. The problem is that the blogger did not ask the right question.

    Blogger: “Are you sure your machine really plays chess and not a little person hiding inside?”

    So the point is this: if D-Wave really built an adiabatic 16-qubit quantum computer that can really solve (by itself) a Sudoku puzzle or a small instance of an NP-complete problem (or even, as far as I am concerned, just an assignment problem or matrix multiplication), this is extremely impressive and promising.

  44. Greg Kuperberg Says:

    So the point is this: if D-Wave really built an adiabatic 16-qubit quantum computer that can really solve (by itself) a Sudoku puzzle or a small instance of an NP-complete problem (or even, as far as I am concerned, just an assignment problem or matrix multiplication), this is extremely impressive and promising.

    That may be what they would like you to think, but it does not look like what they do. I do not see evidence that the “Orion” would be worse at Sudoku if you replaced their 16 qubits with 16 bits. They have an apparent acceleration of something — finding the ground state of an Ising model — but I do not see that the acceleration is fundamental or fundamentally quantum. Of course it’s hard to say what they are really doing behind the veil of corporate secrecy, but what has actually been announced resembles an electronic version of stone soup.
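
    A rough illustration of the sixteen-bits point: exhaustively checking every one of the 2^16 spin configurations of a random Ising instance takes well under a second on an ordinary laptop. The couplings below are made up purely for illustration:

    ```python
    import numpy as np

    n = 16                                                  # same number of spins as the demo machine
    rng = np.random.default_rng(0)
    J = np.triu(rng.choice([-1, 1], size=(n, n)), k=1)      # random +/-1 couplings (illustrative)

    # Every one of the 2^16 = 65,536 spin configurations, as rows of +/-1 entries.
    configs = 2 * np.array(list(np.ndindex(*(2,) * n))) - 1

    # E(s) = sum_{i<j} J_ij s_i s_j, evaluated for every configuration at once.
    energies = np.einsum('ki,ij,kj->k', configs, J, configs)
    print("ground-state energy:", energies.min())
    ```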

  45. Scott Says:

    Gil: I like the idea of a 19th-century blogger — blog-by-telegraph? In my previous post, I did make the point that D-Wave might have done something interesting with superconducting qubits, but that right now it’s impossible to tell — and that the burden of proof lies with them, not with everyone else. In particular, their “demo” obviously doesn’t prove anything.

  46. Greg Kuperberg Says:

    D-Wave might have done something interesting with superconducting qubits

    Sure, they might have; it’s hard to say that they haven’t. But if they have, why is their hype so awful? We should all agree that it’s a giant leap from “something interesting” to “commercially viable”. You do not need graduate training in quantum computing, or even to know what complex numbers are, to understand the difference.

  47. Greg Kuperberg Says:

    Well, bully for Wikipedia. Its article on D-Wave is duly skeptical. It also links to an astute AP article in which D-Wave itself admits that the world’s first commercially viable quantum computer could just be stone soup. Note that the journalist, Jordan Robertson, said nothing about the NP-in-one-shot nonsense. In my experience, science journalists are at least smarter when you tell them the truth.

  48. Steve Demuth Says:

    The phrase “the first commercially viable quantum computer” tells me everything I need to know about the basic intellectual integrity of D-Wave as a company. Qubit may not yet have a precise meaning, but “commercially viable” does: it means that you’ve built something, put it on the market, and managed to sell enough in an ongoing business to show an ongoing rate of return on investment that is greater than what one could get investing in low-risk, fixed interest securities. Anything short of this may be interesting, may be a breakthrough, may even be marginally profitable, but it isn’t “commercially viable.”

    In other words, you can’t claim a commercially viable product until after the fact, because it’s the buyers who vote on whether you’re viable, not your marketing department.

    Furthermore, D-Wave essentially claims to have solved the class of NP-complete problems, even though they clearly know that they have not in any meaningful sense done so. (A lawyer might argue that they don’t really make that claim, but that’s how it reads. I know this because a couple of my colleagues “impeached” a short tutorial I gave on Quantum Computing earlier this week by bringing me the press release with the remark “Didn’t you say that QC cannot solve problems in the class NP in P time? These guys think otherwise.”) Rose’s response in this column that their claim is comparable to ILOG’s (disclaimer, and claim to competence: I work for ILOG) about their optimization products is baloney. Here’s the headline claim for ILOG’s CPLEX product: “[ILOG’s] mathematical optimization technology enables better decision-making for efficient resource utilization.” That’s about as racy as it gets.

    There’s good reason for this. We’re a publicly held company, and we can be held to account by investors for claims we make that we can’t back up. If D-Wave were a public company, and made the claims in an investment prospectus that they make on their website, they’d all be at risk of jail time.

  49. Greg Kuperberg Says:

    Qubit may not yet have a precise meaning, but “commercially viable” does.

    I would say that the meaning of “qubit” is about as precise as that of “commercially viable”. What you really mean is that “qubit” has a much more arcane meaning. The rigorous meaning of “commercially viable” is of course important to any good businessman, so that they do not have the excuse that it’s okay to pantomime to the ignorant.

  50. GMHurley Says:

    Steve Demuth advances a contentious interpretation of the phrase “commercially viable.” To a corporate financier, it means more or less what he says it does. To a member of the marketing department, it means something like “we have a fighting chance of selling at least one of these things, and getting cash for it, with no claw-back when it doesn’t do what we said it would do.” The threshold for “a fighting chance” is particularly low if you’re trying to sell it to some department of government.

  51. TRRothwell Says:

    Interesting fact: three of D-Wave’s four theoretical physicists left in summer 2006. They go by the names P. Love, A. Maassen van der Brink, and A. Smirnov, all rather noted for some nice contributions in JJ and computational physics.

    D-Wave quickly took their science team link off their site in May and never replaced it (I’ve been following it rather rabidly, hoping for updates). The only reasons such an exodus would occur are:

    A) There was no interesting physics left and the hybrid superconducting QC was theoretically complete, implying only technical barriers remain. (Hopefully true; it would be so cool if this is true.)
    B) The machine was doomed to failure and the people who actually understand the physics bailed before it tanked. (Hopefully not true.)
    C) The company is just a VC strategic umbrella to hedge large sums of money bet against the possibility of success, versus the all-too-common practice of the principal stakeholders taking their low-strike-price options and exercising them with the inside knowledge that B) is true. (Very likely the truth.)

    Anyway, we all hope that D-Wave actually makes history and validates their system by submitting to peer review and kicking all of our skeptical butts. However, the excuse of ‘keeping our findings to ourselves’ because they are a for-profit company doesn’t really seem rational to me. Do they not protect everything they build with ironclad patents and a team of high-priced, viciously protective lawyers? If so, without any real competitors on the horizon, wouldn’t they want to submit their results and show us they have made history? As S. Lloyd said, the proof is in the eating of the pudding. The CTO of D-Wave is behaving irrationally by posting here and defending his machine. Why not just validate it instead, by demonstrating a comparative, observable speed-up (the only point, unless you just love science), or even that coherent quantum effects are being harnessed (which is enough for me!)?

    Apologies if I have made ignorant assumptions; I am not a true lifelong expert in the field. However, if I were an investor, I would steer very clear of investment in, or even association with, D-Wave until the machine is scaled properly and produces the aforementioned speed-up, with coherent quantum effects clearly proven at a feasibly useful scale for solving some currently intractable problems.

  52. Steve Demuth Says:

    A few words in my defense: I said “qubit MAY not have a precise meaning” as a reference to previous posts, which advance the claim that it does not (see the first post of Robin Blume-Kohout in this discussion). Now, by my dictionary, the meaning of the word “may” in this context is clearly to indicate that there is some room for disagreement on the matter. Q.E.D.

    As for my “contentious” definition of “commercially viable:” I work every day with the product marketing folks of both my own employer, and our actual and potential technology partners, which range from some of the largest and best established tech companies in the world, to some of the smallest and most precariously situated. In every instance I can document, those folks were working extremely hard to make sure that their products were commercially viable in the sense I gave. I can’t say that there are no scoundrels in the world who misuse the notion (The D-Wave announcement argues strongly otherwise), but I don’t think there is anything ambiguous about the phrase “commercially viable” in the technology or marketing community at large.

  53. Richard Says:

    The article was written by one Stephen Jeffrey, who I assume is male.

    No, that’s just the credit for the illustration in the article. The author’s name is not given.

  54. Scott Says:

    Yeah, I figured that out (see above).

  55. Greg Kuperberg Says:

    A few words in my defense: I said “qubit MAY not have a precise meaning” as a reference to previous posts, which advance the claim that it does not (see the first post of Robin Blume-Kohout in this discussion).

    Touché, I have met my match at hairsplitting. 🙂

  56. The Quantum Pontiff » Welcome to the Feast, Tums Provided Says:

    […] Over at Shtetl-Optimized, Scott goes a little ballistic on criticism he’s received over the wording of his and Umesh Vazirani’s letters to the Economist. (As a side note, it is kind of sad to see such a poorly written article in the Economist: I know for a fact that an early Economist article on Shor’s algorithm drew a lot of great people into the field of quantum computing.) Part of Scott’s issue is with “physicists” insisting on rigor when they themselves are “the headmasters of handwaving.” So when Scott says Today it is accepted that quantum computers could not solve NP-complete problems in a reasonable amount of time. […]