I think in general the goal is not to stuff more information into fewer qubits, but to stabilize more qubits so you can hold more information. The problem is in the physics of stabilizing that many qubits for long enough to run a meaningful calculation.
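For a sense of scale on why stabilizing is the hard part: with error correction, every logical qubit gets spread across many physical ones. Here's a back-of-envelope sketch in Python, my own illustration using the standard surface-code scaling from the QEC literature (the constants are ballpark figures, not gospel):

```python
# Rough surface-code overhead, using the standard scaling
# p_logical ~ 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2) per round.
p_phys = 1e-3             # assumed physical error rate (optimistic hardware)
p_threshold = 1e-2        # surface-code threshold, roughly 1%
p_logical_target = 1e-12  # low enough to survive a long calculation

d = 3  # code distance (odd)
while 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2) > p_logical_target:
    d += 2

phys_per_logical = 2 * d ** 2 - 1  # data + ancilla qubits in a distance-d patch
print(f"distance {d}: ~{phys_per_logical} physical qubits per logical qubit")
```

That comes out around 900 physical qubits per logical qubit, which is why "a few thousand noisy qubits" and "a few thousand logical qubits" are very different claims.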
Argh, it's been a while. The question is whether an n-qubit system can actually hold a superposition of arbitrarily many (k <= 2^n^) n-bit basis states for arbitrary values of n and k: such a system might work up to a certain size, but then lose coherence once you try to exceed what the universe can actually compute. As far as I know we simply don't know, because no one has yet built a system that pushes those boundaries in earnest. The limiting factor is more n than k, I think, but then I'm not a quantum physicist.
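To make that concrete in the state-vector picture (a minimal numpy sketch of my own, not a claim about any particular hardware): an n-qubit register is described by 2^n^ complex amplitudes, and a superposition of k <= 2^n^ basis states just means k of those amplitudes are non-zero. The exponential cost of tracking them classically is exactly the part the universe would have to be "computing":

```python
import numpy as np

n = 4         # number of qubits
dim = 2 ** n  # 2^n complex amplitudes describe the register

# an equal superposition of k <= 2^n of the n-bit basis states
basis_states = [0b0000, 0b0011, 0b0101, 0b1001, 0b1110]  # arbitrary picks
k = len(basis_states)
state = np.zeros(dim, dtype=complex)
state[basis_states] = 1 / np.sqrt(k)  # normalised so probabilities sum to 1

assert np.isclose(np.vdot(state, state).real, 1.0)

# the classical cost of merely *storing* the state doubles per added qubit
# (assuming 16 bytes per complex128 amplitude)
for m in (10, 20, 30, 40):
    print(f"{m} qubits -> {2**m * 16:,} bytes of amplitudes")
```

Storage goes from 16 KiB at 10 qubits to 16 TiB at 40, which is the blow-up behind the question of whether nature keeps paying that bill indefinitely.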
It would still mean ludicrously miniaturised computing, in fact about as miniaturised as physics allows, but it would not give the asymptotic speedup cryptologists are having nightmares about.
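For the record, the nightmare scenario is mostly Shor's algorithm (superpolynomial speedup on factoring and discrete logs, i.e. RSA and ECC); Grover's search "only" gives a quadratic speedup against symmetric keys, but it's the one small enough to show in a toy simulation. A state-vector sketch of Grover on 3 qubits, again my own numpy illustration:

```python
import numpy as np

n = 3           # qubits
N = 2 ** n      # size of the search space
marked = 0b101  # the "needle" we want to find

# uniform superposition over all N basis states
state = np.full(N, 1 / np.sqrt(N), dtype=complex)

# oracle: flips the phase of the marked state only
oracle = np.eye(N)
oracle[marked, marked] = -1

# diffusion: reflects every amplitude about the mean amplitude
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

# ~ pi/4 * sqrt(N) iterations, i.e. O(sqrt(N)): the quadratic speedup
for _ in range(int(np.pi / 4 * np.sqrt(N))):
    state = diffusion @ (oracle @ state)

probs = np.abs(state) ** 2
print(f"P(|{marked:03b}>) = {probs[marked]:.3f}")  # ~0.945 after 2 iterations
```

Two iterations already push the marked state to ~95% measurement probability, versus checking up to 8 entries classically; but note the trick only works if all 8 amplitudes stay coherent for the whole run, which loops back to the stabilization problem above.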