[Resource Topic] 2019/1068: Not a Free Lunch but a Cheap Lunch: Experimental Results for Training Many Neural Nets Efficiently

Welcome to the resource topic for 2019/1068

Title:
Not a Free Lunch but a Cheap Lunch: Experimental Results for Training Many Neural Nets Efficiently

Authors: Joey Green, Tilo Burghardt, Elisabeth Oswald

Abstract:

Neural Networks have become a much studied approach in the recent literature on profiled side channel attacks: many articles examine their use and performance in profiled single-target DPA style attacks. In this setting a single neural net is tweaked and tuned based on a training data set. The effort for this is considerable, as there are many hyper-parameters that need to be adjusted. A straightforward, but impractical, extension of such an approach to multi-target DPA style attacks requires deriving and tuning a network architecture for each individual target. Our contribution is to provide the first practical and efficient strategy for training many neural nets in the context of a multi-target attack. We show how to configure a network with a set of hyper-parameters for a specific intermediate (SubBytes) that generalises well to capture the leakage of other intermediates as well. This is interesting because although we can't beat the no free lunch theorem (i.e. we find that different profiling methods excel on different intermediates), we can still get "good value for money" (i.e. good classification results across many intermediates with reasonable profiling effort).
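The core idea of the abstract — tune one set of hyper-parameters on a reference intermediate (SubBytes) and reuse it to train a separate net per intermediate — can be illustrated with a minimal sketch. This is not the paper's code: the hyper-parameter values, the toy synthetic "leakage" data, and the tiny NumPy MLP are all illustrative assumptions.

```python
# Hypothetical sketch (NOT the paper's implementation): reuse ONE
# hyper-parameter set, assumed to have been tuned on a reference
# intermediate (SubBytes), to train one small MLP per intermediate.
import numpy as np

HYPERPARAMS = {          # assumed values, for illustration only
    "hidden": 32,        # hidden-layer width
    "lr": 0.1,           # learning rate
    "epochs": 200,       # training passes over the profiling set
    "classes": 9,        # e.g. Hamming-weight classes of a byte
}

def train_mlp(traces, labels, hp=HYPERPARAMS, seed=0):
    """Train a one-hidden-layer softmax classifier on profiling traces."""
    rng = np.random.default_rng(seed)
    n, d = traces.shape
    W1 = rng.normal(0, 0.1, (d, hp["hidden"]))
    b1 = np.zeros(hp["hidden"])
    W2 = rng.normal(0, 0.1, (hp["hidden"], hp["classes"]))
    b2 = np.zeros(hp["classes"])
    Y = np.eye(hp["classes"])[labels]              # one-hot targets
    for _ in range(hp["epochs"]):
        H = np.maximum(traces @ W1 + b1, 0)        # ReLU hidden layer
        logits = H @ W2 + b2
        P = np.exp(logits - logits.max(1, keepdims=True))
        P /= P.sum(1, keepdims=True)               # softmax
        G = (P - Y) / n                            # cross-entropy gradient
        dW2, db2 = H.T @ G, G.sum(0)
        dH = (G @ W2.T) * (H > 0)
        dW1, db1 = traces.T @ dH, dH.sum(0)
        W1 -= hp["lr"] * dW1; b1 -= hp["lr"] * db1
        W2 -= hp["lr"] * dW2; b2 -= hp["lr"] * db2
    return W1, b1, W2, b2

def predict(model, traces):
    W1, b1, W2, b2 = model
    return (np.maximum(traces @ W1 + b1, 0) @ W2 + b2).argmax(1)

# One model per intermediate, all sharing the SAME hyper-parameters.
rng = np.random.default_rng(1)
models, accs = {}, {}
for intermediate in ["SubBytes", "AddRoundKey", "MixColumns"]:
    labels = rng.integers(0, HYPERPARAMS["classes"], 500)
    traces = np.eye(HYPERPARAMS["classes"])[labels]      # toy leakage model
    traces = traces + rng.normal(0, 0.1, traces.shape)   # plus noise
    models[intermediate] = train_mlp(traces, labels)
    accs[intermediate] = (predict(models[intermediate], traces)
                          == labels).mean()
```

The point of the sketch is that only `HYPERPARAMS` is tuned once; everything per-intermediate (weights, training) is cheap to repeat, which is the "cheap lunch" the title refers to.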

ePrint: https://eprint.iacr.org/2019/1068

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.