[Resource Topic] 2024/1851: Secure Transformer-Based Neural Network Inference for Protein Sequence Classification

Welcome to the resource topic for 2024/1851

Title:
Secure Transformer-Based Neural Network Inference for Protein Sequence Classification

Authors: Jingwei Chen, Linhan Yang, Chen Yang, Shuai Wang, Rui Li, Weijie Miao, Wenyuan Wu, Li Yang, Kang Wu, Lizhong Dai

Abstract:

Protein sequence classification is crucial in many research areas, such as predicting protein structures and discovering new protein functions. Leveraging large language models (LLMs) holds great promise for enhancing our ability to tackle protein sequence classification problems; however, the accompanying privacy issues are becoming increasingly prominent. In this paper, we present encrypted DASHformer, a privacy-preserving, non-interactive, efficient, and accurate protocol for evaluating DASHformer, a transformer-based neural network for protein sequence classification provided by the iDASH 2024 Track 1 competition. The protocol is based on our first-place solution to this competition. It is arguably the first secure transformer inference protocol capable of batch classification of multiple protein sequences in a single execution using only leveled homomorphic encryption (i.e., without bootstrapping). To achieve this, we propose a series of new techniques and algorithmic improvements, including data-driven non-polynomial function fitting, tensor packing, and a double baby-step giant-step algorithm for computing the product of multiple encrypted matrices. Together, these enable the protocol to classify 163 encrypted protein sequences in about 165 seconds at 128-bit security, an amortized time of roughly one second per sequence.
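
As background for the "data-driven non-polynomial function fitting" mentioned in the abstract, here is a minimal plain-Python sketch of the general idea: approximate a non-polynomial function (e.g., the exponential inside softmax) by a low-degree polynomial fitted over the input range actually observed in the model, so that inference uses only the additions and multiplications that leveled HE supports. The degree, range, and input distribution below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def fit_poly(f, samples, degree):
    """Least-squares polynomial fit of a non-polynomial function f,
    using samples drawn from the empirical input distribution so the
    approximation is accurate where the model actually evaluates f."""
    coeffs = np.polyfit(samples, f(samples), degree)
    return np.poly1d(coeffs)

# Hypothetical setup: pre-softmax scores concentrated in [-4, 4].
rng = np.random.default_rng(0)
samples = np.clip(rng.normal(0.0, 1.5, 10_000), -4.0, 4.0)

# HE-friendly surrogate for exp: additions and multiplications only.
exp_poly = fit_poly(np.exp, samples, degree=7)
x = np.linspace(-4.0, 4.0, 1_000)
print(np.max(np.abs(exp_poly(x) - np.exp(x))))  # max error on the fitted range
```

Weighting the fit by the empirical input distribution (rather than fitting uniformly over a worst-case interval) is what makes the approach "data-driven": accuracy is spent where inputs actually land.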
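The double baby-step giant-step algorithm for multi-matrix products is a contribution of the paper itself; the sketch below only mocks up the classical single baby-step giant-step diagonal method for one matrix-vector product (in the style of Halevi-Shoup), with numpy rotations standing in for homomorphic slot rotations, to give a feel for the rotation-saving decomposition such protocols build on.

```python
import numpy as np

def rot(v, s):
    """Cyclic left rotation: rot(v, s)[j] = v[(j + s) % n].
    Stands in for a homomorphic slot rotation on a ciphertext."""
    return np.roll(v, -s)

def diag(M, i):
    """i-th generalized diagonal: diag(M, i)[j] = M[j, (j + i) % n]."""
    n = M.shape[0]
    return M[np.arange(n), (np.arange(n) + i) % n]

def bsgs_matvec(M, v, g):
    """Baby-step/giant-step diagonal matrix-vector product:
    g baby-step rotations of v plus n//g giant-step rotations,
    instead of n rotations for the naive diagonal method."""
    n = M.shape[0]
    assert n % g == 0
    baby = [rot(v, k1) for k1 in range(g)]  # baby steps, computed once
    out = np.zeros_like(v)
    for k2 in range(n // g):
        inner = np.zeros_like(v)
        for k1 in range(g):
            # Pre-rotate the (plaintext) diagonal so a single giant-step
            # rotation fixes the alignment of all g terms at once.
            inner += rot(diag(M, g * k2 + k1), -g * k2) * baby[k1]
        out += rot(inner, g * k2)
    return out

# Sanity check against a plain matrix-vector product.
n, g = 8, 4
rng = np.random.default_rng(0)
M, v = rng.standard_normal((n, n)), rng.standard_normal(n)
assert np.allclose(bsgs_matvec(M, v, g), M @ v)
```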

ePrint: https://eprint.iacr.org/2024/1851

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.