Large Language Models Can Help Mitigate Barren Plateaus in Quantum Neural Networks
ArXiv cs.AI
arXiv:2502.13166v3 Announce Type: replace-cross

Abstract: In the era of noisy intermediate-scale quantum (NISQ) computing, Quantum Neural Networks (QNNs) have emerged as a promising approach for various applications, yet their training is often hindered by barren plateaus (BPs), where gradient variance vanishes exponentially as the number of qubits increases. Most initialization-based mitigation strategies rely heavily on pre-designed static parameter distributions, thereby lacking adaptability to div
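The barren-plateau phenomenon the abstract refers to can be probed numerically: sample random parameters for a variational circuit and estimate the variance of a single partial derivative of the cost as the qubit count grows. The sketch below is a minimal illustration under assumed choices, not the paper's construction: a generic hardware-efficient ansatz (RY rotations plus nearest-neighbor CZ entanglers), a single-qubit Z expectation as the cost, and the parameter-shift rule for the gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    # Single-qubit RY rotation matrix.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n):
    # Contract a 2x2 gate onto one qubit of the statevector.
    state = state.reshape([2] * n)
    state = np.tensordot(gate, state, axes=([1], [qubit]))
    state = np.moveaxis(state, 0, qubit)
    return state.reshape(-1)

def apply_cz(state, q1, q2, n):
    # CZ flips the sign of the |...1...1...> amplitudes.
    state = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    state[tuple(idx)] *= -1
    return state.reshape(-1)

def cost(params, n, layers):
    # Expectation of Z on qubit 0 after a layered RY + CZ ansatz.
    state = np.zeros(2 ** n)
    state[0] = 1.0
    k = 0
    for _ in range(layers):
        for q in range(n):
            state = apply_single(state, ry(params[k]), q, n)
            k += 1
        for q in range(n - 1):
            state = apply_cz(state, q, q + 1, n)
    probs = np.abs(state.reshape([2] * n)) ** 2
    p0 = probs[0].sum()          # probability that qubit 0 is |0>
    return 2 * p0 - 1            # <Z_0> = p0 - p1

def grad_var(n, layers=4, samples=50):
    # Variance over random initializations of d(cost)/d(theta_0),
    # estimated with the parameter-shift rule for RY gates.
    shift = np.pi / 2
    grads = []
    for _ in range(samples):
        params = rng.uniform(0, 2 * np.pi, n * layers)
        plus, minus = params.copy(), params.copy()
        plus[0] += shift
        minus[0] -= shift
        grads.append(0.5 * (cost(plus, n, layers) - cost(minus, n, layers)))
    return np.var(grads)

for n in (2, 4, 6):
    print(f"{n} qubits: gradient variance ≈ {grad_var(n):.4f}")
```

Under the exponential-concentration regime described in the abstract, this variance shrinks rapidly with the number of qubits; for the shallow, small-width circuits that are feasible to simulate exactly here the trend is only suggestive, which is precisely why initialization strategies that shape the starting parameter distribution are of interest.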