Recently standardized by the US National Institute of Standards and Technology (NIST), post-quantum cryptography (PQC) empowers organizations to secure their systems against the ‘harvest now, decrypt later’ threat.
This threat involves attackers collecting encrypted files and data today in order to decrypt them in the future, once quantum computers can break the underlying encryption algorithms.
Shahram Mossayebi, co-founder of Crypto Quantique, spoke at the ISACA Europe 2025 conference about some of the myths surrounding these threats. He provided recommendations for organizations on how to approach their transition to PQC – also known as quantum-safe cryptography.
Speaking exclusively to Infosecurity, Mossayebi explained how to evaluate if your organization should prioritize transitioning to PQC and the benefit of adopting a “crypto agile” approach when preparing for this transition.

Infosecurity Magazine: What are the biggest myths about quantum computing and post-quantum cryptography?
Shahram Mossayebi: The first one that comes to mind is the belief that quantum computers will be able to solve everything, which is fundamentally not true. They are bound by computational complexity theory, and there are classes of problems that even quantum computers cannot solve because we don't know the algorithms for them.
The second misconception is that quantum computers will come and replace the computers we have – some have the idea that, one day, they're going to write an email on a quantum computer. That is, again, not true.
Quantum computers are good at solving some specific classes of problems. They could give us a speedup – sometimes exponential, sometimes quadratic – over what we can do with classical computers, and that's essentially where we're going to use them once we have fully scalable, fault-tolerant quantum computers and the quantum algorithms that can harness that power on these specific problems.
However, even then, quantum computers are going to be in data centers and supercomputing centers rather than lying around in our offices or at home.
When it comes to security, again, because there is a quantum algorithm out there that breaks asymmetric cryptography, such as the widely adopted Rivest–Shamir–Adleman (RSA) cryptosystem [Shor’s algorithm], people think quantum computers will break all cryptography and that nothing is safe against them – which, again, is not true.
In reality, as far as we know, future quantum computers will only break a certain class of cryptographic algorithms. We already know of other classes of algorithms that, as far as we can tell, will withstand the power of quantum computers – until, one day, someone comes up with another clever quantum algorithm that could potentially break them too.
Hence, is there an apocalypse? It depends on what sort of algorithm you're using and how persistently you keep using it instead of moving to safer, more secure ones. If you follow the advice and guidelines that are now out there – the processes and timelines that have been given, such as moving everything to post-quantum cryptographic (PQC) algorithms between now and 2035 – you'll be fine.
IM: Why do you think some people believe in these misconceptions?
SM: First, I’d say that fearing an apocalypse is in the nature of all human beings.
Regarding quantum computing in particular, people are excited because over $25bn has been invested in the field by governments, by big players in the industry and by venture capitalists.
When that amount of investment goes to something, people who have invested tend to get overexcited about what they invested in – they want to show value.
Additionally, some big contenders in quantum computing development tend to make overhyped claims of so-called quantum breakthroughs, generating clickbait headlines, while the underlying paper is not always analyzed in detail.
"We don't have to wait for quantum computers to come to transition to post-quantum cryptography."
Today, we see that with other shiny new technologies as well, especially around AI and large language models (LLMs). With quantum computers, the mysterious nature of the technology adds to the myths, and people who don’t necessarily understand the technicalities just go along with the narrative promoted by quantum optimists.
That’s why it's important for other people in the industry to bring people down to Earth, create awareness and educate, rather than just be wishy-washy and overhype the benefits of quantum computing.
IM: How close are we to organizations benefitting from real use cases in which a quantum computer provides an advantage over classical computers?
SM: We know how to build quantum computers, at least theoretically. We know all the nuances and technical approaches needed to make a working quantum computer. Today, there are several physical media for building them, and I think this diversity will remain for a long time.
What is left is more of an engineering challenge to make it efficient. It’s a race of who can solve that engineering challenge faster.
Nevertheless, my gut feeling is that there is still a long way to go before we have fully scalable quantum computers with the right quantum algorithms to show us useful things for our day-to-day lives. I would say it could take even longer than 10 years.
As far as I know, there is no real software capable of proving any quantum advantage, except Shor's algorithm, which needs a scalable quantum computer. This is not to say that there won't be any, but that it's not trivial. And just as people now pour money into quantum computing, pretty much the same thing will happen for quantum algorithms.
However, that doesn't bother me when it comes to the security and cryptography aspects, because in those fields we need to look at risk. We don't really care whether anyone already has a quantum computer or not. The question is whether quantum computers will be around or not, and that becomes a matter of risk that I need to mitigate.
I cannot just rely on how secure my systems are today; I also need to think about whether my assets need to be kept secure for the next 50 years – and if that is the case, then I really need to do something about it now.
If someone harvests my data and keeps it long enough that they have a quantum computer [a strategy called ‘harvest now, decrypt later’], they could potentially break the encryption then, and I have failed in my security approach.
When it comes to PQC, we don't have to wait for quantum computers to come. I think we have enough evidence that the risk is considerable – that someone behind closed doors might have something, or soon might have something – that we need to do something about it today.
IM: Should moving to PQC be a top priority for all organizations, even when they are struggling with other, more pressing security problems?
SM: It depends on the size and type of the organization. This is all about risk. Each organization needs to assess the risk that quantum computing and ‘harvest now, decrypt later’ strategies pose to its security posture.
There is a simple formula that cryptographer Michele Mosca [professor at the University of Waterloo, Canada, and co-founder of the Institute for Quantum Computing] came up with a few years back.
The idea is the following: X is the number of years that you need to keep your assets secure – whether it's data in transit or something in storage – and Y is the number of years that it will take you to transition all your cryptography to PQC.
If the sum of X and Y is bigger than Z – your assumption of how many years until quantum computers are available – then you can consider yourself at risk and you need to do something about it today. However, if that sum is below Z, then you can wait.
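To make the arithmetic concrete, here is a minimal sketch of Mosca's inequality in Python. The function name and the sample figures are illustrative assumptions for this example, not values from the interview.

```python
def mosca_at_risk(x_shelf_life: float, y_migration: float, z_quantum_eta: float) -> bool:
    """Mosca's inequality: if X + Y > Z, you are at risk today.

    x_shelf_life  -- X: years your assets must remain secure
    y_migration   -- Y: years needed to move all your cryptography to PQC
    z_quantum_eta -- Z: your assumed years until quantum computers arrive
    """
    return x_shelf_life + y_migration > z_quantum_eta

# Illustrative figures: data must stay secure for 20 years (X), the PQC
# migration takes 5 years (Y), and quantum computers are assumed to arrive
# in 15 years (Z). Since 20 + 5 > 15, this organization is at risk today.
print(mosca_at_risk(20, 5, 15))  # True: start transitioning now
print(mosca_at_risk(2, 3, 15))   # False: the sum is below Z, so you can wait
```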

Additionally, even when undertaking the transition to PQC, the process can be iterative. You don't have to start with your authentication token generator, because whatever token is generated is only valid for 30 seconds to five minutes anyway. That's not your top priority.
However, if you are, say, a land registry based in the UK and you need to make sure that any digitally signed deed is kept secure for 50 years, then you probably should prioritize transitioning the encryption of your backup data to PQC algorithms.
"Crypto agility means that every time you move from the current PQC standard to a new one, you just need to update your security policy."
In my talk at ISACA Europe 2025, I mentioned that JP Morgan, which started preparing for transitioning to PQC a couple of years ago, recently said they should move faster and started to transition this year, expecting to finish by 2030.
If they achieve that, they will have completed the PQC transition sooner than any government recommends, and that’s because they did a risk assessment and decided they needed to move faster – all without going into panic mode. That’s a good example to follow.
IM: Now that NIST has started standardizing the first PQC algorithms, years before the emergence of scalable quantum computers, do we face the risk of seeing threat actors break these before organizations can implement them?
SM: The quick answer is yes, this risk exists, but it is not specific to PQC. This has been the case forever, because of the way cryptography works. We build our cryptographic algorithms on hard problems, but once someone comes up with a way to solve those problems, we need to move to another class of hard problems and build new algorithms.
If you look at the history, using RSA as an example: when we started using RSA back in the 1980s, the first key size was 768 bits, but over the years, as attack algorithms changed and compute power got stronger, we started to increase the RSA key size.
Today, I think you need to use 3000- or 4000-bit keys for RSA. You cannot use 768 bits anymore, or even 2048-bit keys. Cryptography has always been a race. What is different now is the pace of this race, which used to be much slower.
That’s why, when NIST started the PQC competition eight years ago and then announced the first set of selected PQC algorithms back in 2024, they said – for the first time – that they were not going to stop there.
As we speak, there are two other ongoing competitions for additional PQC algorithms, in case the current ones are broken. And once those are done, after a few years, they're going to run another one.
By choosing this model, NIST recognized the risk that you mentioned: they realized they cannot just sit and wait for someone to come up with a clever quantum algorithm again, then redo this whole process, which takes years – and then the transition itself takes organizations forever to implement.
So, right now there is a PQC standard using a set of algorithms and in the next five to six years, there will be another standard with another set of algorithms. It will not necessarily mean that the first standard will be deprecated, but it will allow organizations to diversify the PQC algorithms they implement, all based on different hard problems.
This is also why I would recommend that organizations consider new processes and methodologies, such as ‘crypto agility.’ Right now, starting a PQC transition means a huge pain: going through your whole infrastructure, all your lines of code. If, 10 years down the line, you suddenly have to do this again, you're going to kill yourself.
Instead of doing that, some organizations create an abstraction layer between the applications and the cryptography libraries. Instead of directly integrating the encryption algorithms into your code, you just send an ‘encrypt’ command, and the abstraction layer consults a security policy that says, “For any ‘encrypt’ command, use this standard with these key parameters.”
This means that every time you need to move on from the current standard to a new one, you just need to update your security policy, and the updated encryption gets propagated across all your infrastructure. This process, known as ‘crypto agility,’ is a key concept organizations should consider when preparing for their PQC transition.
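As an illustration, here is a minimal sketch in Python of what such an abstraction layer could look like. The policy structure, function names and placeholder algorithms are assumptions for this example, not details from the talk; a real deployment would wrap vetted cryptographic libraries behind the same interface.

```python
from typing import Callable, Dict

# Registry mapping algorithm identifiers to encryption implementations.
ALGORITHMS: Dict[str, Callable[[bytes, bytes], bytes]] = {}

def register(name: str):
    """Decorator that adds an implementation to the registry."""
    def wrap(fn: Callable[[bytes, bytes], bytes]) -> Callable[[bytes, bytes], bytes]:
        ALGORITHMS[name] = fn
        return fn
    return wrap

@register("classical-demo")
def _classical(key: bytes, plaintext: bytes) -> bytes:
    # Placeholder standing in for today's classical cipher; a real
    # system would call a vetted library here.
    return b"classical:" + plaintext

@register("pqc-demo")
def _pqc(key: bytes, plaintext: bytes) -> bytes:
    # Placeholder standing in for a PQC-standard scheme.
    return b"pqc:" + plaintext

# The security policy is the single point of change: updating this one
# entry re-routes every 'encrypt' call across the infrastructure.
POLICY = {"encrypt": {"algorithm": "classical-demo", "key_bits": 256}}

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """What applications call. They never name an algorithm directly."""
    algorithm = POLICY["encrypt"]["algorithm"]
    return ALGORITHMS[algorithm](key, plaintext)

# Transitioning to PQC then becomes a policy update, not a code change:
POLICY["encrypt"]["algorithm"] = "pqc-demo"
print(encrypt(b"\x00" * 32, b"hello"))  # now routed to the PQC implementation
```

The point of the design is that applications depend only on the policy, so swapping algorithms never touches application code.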
