Welcome to the resource topic for 2025/448
Title: Ciphertext-Ciphertext Matrix Multiplication: Fast for Large Matrices
Authors: Jai Hyun Park
Abstract: Matrix multiplication of two encrypted matrices (CC-MM) is a key challenge for privacy-preserving machine learning applications. As modern machine learning models focus on scalability, fast CC-MM on large datasets is increasingly in demand.
In this work, we present a CC-MM algorithm for large matrices. The algorithm consists of plaintext matrix multiplications (PP-MM) and ciphertext matrix transpose algorithms (C-MT). We propose a fast C-MT algorithm, which is computationally inexpensive compared to PP-MM. By leveraging high-performance BLAS libraries to optimize PP-MM, we implement large-scale CC-MM with substantial performance improvements. Furthermore, we propose lightweight algorithms, significantly reducing the key size from 1960 MB to 1.57 MB for CC-MM with comparable efficiency.
In a single-thread implementation, the C-MT algorithm takes 0.76 seconds to transpose a 2048 × 2048 encrypted matrix. The CC-MM algorithm requires 85.2 seconds to multiply two 4096 × 4096 encrypted matrices. For large matrices, our algorithm outperforms the state-of-the-art CC-MM method from Jiang-Kim-Lauter-Song [CCS'18] by a factor of over 800.
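As a quick illustration of the abstract's cost argument (this is not the paper's implementation), the sketch below times a BLAS-backed plaintext matrix multiplication against a plain matrix transpose in NumPy. It only mirrors the structural claim that CC-MM reduces to PP-MM, which can be delegated to a high-performance BLAS library, plus a comparatively cheap transpose step. All function names here are my own placeholders; the real C-MT and CC-MM operate homomorphically on encrypted matrices.

```python
# Hedged sketch: compare the cost of the two plaintext analogues of the
# building blocks named in the abstract (PP-MM via BLAS, and a transpose
# standing in for C-MT). NumPy arrays stand in for ciphertext matrices.
import time

import numpy as np


def time_it(label, fn):
    """Run fn() once and report the wall-clock time."""
    t0 = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - t0:.3f} s")


if __name__ == "__main__":
    n = 2048
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    # PP-MM analogue: np.matmul dispatches to the linked BLAS backend
    # (e.g. OpenBLAS or MKL), which is what the paper leverages for the
    # plaintext multiplications inside CC-MM.
    time_it(f"BLAS matmul ({n}x{n})", lambda: np.matmul(a, b))

    # C-MT analogue: a transpose-and-copy on plaintext data, standing in
    # for the encrypted transpose that the abstract reports at 0.76 s for
    # a 2048 x 2048 matrix in a single-threaded run.
    time_it(f"transpose   ({n}x{n})", lambda: np.ascontiguousarray(a.T))
```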
ePrint: https://eprint.iacr.org/2025/448
See all topics related to this paper.
Feel free to post resources that are related to this paper below.
Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.
For more information, see the rules for Resource Topics.