Quantum Transfer Learning: The Impact of Classical Preprocessing
Michael Kölle, Jonas Maurer, Philipp Altmann, Leo Sünkel, Jonas Stein, Julian Hager, Sebastian Zielinski and Claudia Linnhoff-Popien
Abstract: Quantum computing promises performance advantages, especially for data-intensive and complex computations. However, the current limitations of quantum hardware impose significant constraints on input sizes. To mitigate this, hybrid transfer learning solutions have been developed, which combine pre-trained classical models capable of managing large inputs with variational quantum circuits. Despite these advancements, the individual contributions of the classical and quantum components to the model's overall performance remain unclear. We propose a novel hybrid architecture that, instead of using a pre-trained network for data compression, employs an autoencoder to generate a compressed version of the input data. This compressed data is then passed to the quantum component. To evaluate our model's classification performance, we compare it against two state-of-the-art hybrid transfer learning architectures, two purely classical architectures, and one quantum architecture. Additionally, we compare our autoencoder-based compression against Principal Component Analysis (PCA) and a Gaussian Mixture Model (GMM). Finally, we train our model with an alternative technique in which the autoencoder and the variational quantum circuit are trained in parallel. The accuracy of each model is tested on four datasets: Banknote Authentication, Breast Cancer Wisconsin, MNIST digits, and AudioMNIST. Our findings indicate that the classical components play a significant role in classification within hybrid transfer learning models, a contribution that is often incorrectly attributed to the quantum component. The performance of our proposed model is comparable to that of a variational quantum circuit utilizing amplitude embedding, as well as to variational quantum circuits that use data compressed via PCA or a GMM, suggesting that our approach is a viable alternative.
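The pipeline described above can be sketched in plain NumPy: a classical compression step reduces the input to a few latent features, which are then angle-embedded into a small simulated variational circuit. This is an illustrative sketch only, using PCA compression (one of the paper's baselines) instead of a trained autoencoder, and a toy two-qubit circuit; the layer layout, qubit count, and all variable names are assumptions, not the authors' exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8-dimensional inputs standing in for a real dataset.
X = rng.normal(size=(16, 8))

# --- Classical compression: PCA to 2 latent features (a baseline in the paper) ---
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
latent = Xc @ Vt[:2].T  # shape (16, 2): compressed inputs for the quantum part

# --- Minimal two-qubit variational circuit, simulated as a statevector ---
def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def circuit(features, weights):
    """Angle-embed two features, apply one variational layer, return <Z> on qubit 0."""
    state = np.zeros(4)
    state[0] = 1.0                                              # |00>
    state = np.kron(ry(features[0]), ry(features[1])) @ state   # angle embedding
    state = np.kron(ry(weights[0]), ry(weights[1])) @ state     # trainable rotations
    state = CNOT @ state                                        # entangling gate
    z0 = np.diag([1.0, 1.0, -1.0, -1.0])                        # Z on qubit 0
    return state @ z0 @ state                                   # expectation in [-1, 1]

weights = rng.normal(size=2)          # would be optimized during training
logits = np.array([circuit(f, weights) for f in latent])
print(logits.shape)                   # one expectation value per sample
```

In the paper's setup, the compression stage is an autoencoder trained either beforehand or jointly with the circuit, and the circuit output feeds a classification loss; here the weights are simply random to keep the sketch self-contained.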
Citation:
Michael Kölle, Jonas Maurer, Philipp Altmann, Leo Sünkel, Jonas Stein, Julian Hager, Sebastian Zielinski, Claudia Linnhoff-Popien. “Quantum Transfer Learning: The Impact of Classical Preprocessing”, pp. 368-403, 2025. DOI: 10.1007/978-3-031-87327-0_18 [Code]