Quantum Algorithm Time Complexity Analysis and Future QML/QiML: PDF + Discussion, 02/01/24.
Three major Quantum-inspired Machine Learning (QiML) methods are available today, and each will improve with continued software advances and with new GPUs offering greater bandwidth and RAM. Dequantized algorithms, which achieve time complexities similar to those of some quantum algorithms, will see growing use in QiML applications.
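Dequantized methods typically assume classical sample-and-query access to the data, where entries are drawn with probability proportional to their squared magnitude. Below is a minimal NumPy sketch of that primitive applied to inner-product estimation; the function name sampled_inner_product and its parameters are illustrative assumptions, not anything from the original discussion.

```python
import numpy as np

def sampled_inner_product(x, y, num_samples=10_000, seed=0):
    """Estimate <x, y> with length-squared (importance) sampling on x.

    Indices are drawn with probability x_i^2 / ||x||^2, the classical
    analogue of measuring a quantum state; y_i * ||x||^2 / x_i is then an
    unbiased estimator of the inner product.
    """
    rng = np.random.default_rng(seed)
    sq_norm = float(np.dot(x, x))
    probs = x**2 / sq_norm
    probs /= probs.sum()                      # guard against rounding drift
    idx = rng.choice(len(x), size=num_samples, p=probs)
    return float(np.mean(y[idx] * sq_norm / x[idx]))

# Quick check against the exact inner product.
rng = np.random.default_rng(1)
x, y = rng.normal(size=1_000), rng.normal(size=1_000)
print(sampled_inner_product(x, y), np.dot(x, y))
```

The estimate converges with a number of samples that is independent of the vector length, which is the kind of scaling these dequantized techniques trade on.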
For high-dimensional data that would otherwise call for large-qubit quantum circuits, Tensor Networks decompose one large tensor into a network of smaller tensors that can be processed efficiently. This QiML approximation method uses trainable parameters and is being used by leading organizations to tackle larger problems in a variety of fields.
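As one concrete instance, a state vector can be compressed into a matrix product state (tensor train) by a chain of truncated SVDs; capping the bond dimension is where the approximation and the memory savings come from. The sketch below is a minimal NumPy illustration under that assumption; to_mps and max_bond are assumed names, not anything specific from the discussion.

```python
import numpy as np

def to_mps(state, num_qubits, max_bond=16):
    """Split a 2**num_qubits state vector into a matrix product state.

    Sequential truncated SVDs peel off one qubit at a time, leaving a chain
    of small 3-index tensors; max_bond caps the bond dimension and controls
    the accuracy/memory trade-off.
    """
    tensors = []
    remainder = state.reshape(1, -1)                  # shape: (bond, rest)
    for _ in range(num_qubits - 1):
        bond = remainder.shape[0]
        u, s, vh = np.linalg.svd(remainder.reshape(bond * 2, -1),
                                 full_matrices=False)
        keep = min(max_bond, len(s))
        tensors.append(u[:, :keep].reshape(bond, 2, keep))
        remainder = s[:keep, None] * vh[:keep]        # carry the rest forward
    tensors.append(remainder.reshape(-1, 2, 1))
    return tensors

# Compress an 8-qubit random state, then contract the chain to check the error.
n = 8
psi = np.random.default_rng(0).normal(size=2**n)
psi /= np.linalg.norm(psi)
mps = to_mps(psi, n)
out = mps[0]
for t in mps[1:]:
    out = np.tensordot(out, t, axes=([-1], [0]))
print(np.linalg.norm(out.reshape(-1) - psi))
```

Storage drops from 2**n amplitudes to roughly n small tensors of size bond x 2 x bond, which is the efficiency the post refers to.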
Variational Quantum Algorithm (VQA) simulation on platforms such as IBM Qiskit and Xanadu PennyLane continues to be heavily researched; these simulators evolve the full quantum state vector, giving exact results. Several software techniques reduce the RAM requirements of higher-qubit circuits, including data compression, optimized circuit partitioning, and qubit re-use. Parallel quantum architectures leave more time for algorithm processing relative to the overall machine-learning workflow.
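For orientation, here is a minimal VQA sketch on PennyLane's built-in "default.qubit" state-vector simulator (one of the platforms named above); the two-qubit ansatz, cost function, and hyperparameters are illustrative choices, not anything prescribed by the discussion.

```python
import pennylane as qml
from pennylane import numpy as np

# "default.qubit" simulates the full state vector, so expectation values
# are exact rather than estimated from measurement shots.
dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(params):
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

# A classical optimizer tunes the circuit parameters to minimize <Z0 Z1>.
params = np.array([0.1, 0.2], requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.4)
for _ in range(60):
    params = opt.step(circuit, params)

print(circuit(params))   # approaches -1, the minimum of this cost
```

The same variational loop runs on simulators or hardware backends; the state-vector device simply makes every evaluation exact, at the cost of RAM that grows exponentially with qubit count.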
In summary: to experiment with algorithms conceived from quantum mechanics, try dequantized algorithms. If Variational Quantum Algorithm performance is poor, use parallel quantum algorithms. To approximate high-qubit quantum states efficiently, use Tensor Networks. The video discussion is available on the Startup Channel. #dequantized #tensornetworks #qvas