Recent developments in Machine Learning and Deep Learning depend heavily on
cloud computing and specialized hardware, such as GPUs and TPUs. This forces
users of such models to entrust private data to cloud servers. This scenario
has prompted great interest in Homomorphic Cryptography and Secure
Multi-Party Computation protocols, which allow the use of cloud computing power
in a privacy-preserving manner.
When comparing the efficiency of such protocols, most works in the literature
resort to complexity analysis, which gives asymptotic upper bounds on
computational cost as the input size tends to infinity. These bounds may differ
greatly from the actual cost or execution time when performing such
computations over small or average-sized datasets.
We argue that Monte Carlo methods can yield more accurate computational cost and
time estimates, fostering better design and implementation decisions for
complex systems, such as Privacy-Preserving Machine Learning frameworks.
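As a minimal illustration of the general idea (this sketch is ours, not the authors'; the function names, the sorting workload, and the input-size distribution are all hypothetical), a Monte Carlo cost estimate can be obtained by timing an operation over many randomly sampled inputs drawn from a realistic size distribution, rather than relying on a worst-case asymptotic bound:

```python
import random
import statistics
import time


def monte_carlo_cost_estimate(fn, input_sampler, trials=200):
    """Estimate the expected wall-clock cost of `fn` over random inputs.

    `input_sampler` draws one input from the distribution of interest.
    Returns (mean, stdev) of per-call execution time in seconds.
    """
    times = []
    for _ in range(trials):
        x = input_sampler()
        start = time.perf_counter()
        fn(x)
        times.append(time.perf_counter() - start)
    return statistics.mean(times), statistics.stdev(times)


# Illustrative workload: sorting lists whose sizes follow a uniform
# distribution over "average-sized" inputs, instead of the asymptotic
# worst case assumed by O(n log n) analysis.
sampler = lambda: [random.random() for _ in range(random.randint(10, 1000))]
mean_t, sd_t = monte_carlo_cost_estimate(sorted, sampler)
```

Because the estimate is driven by the actual input distribution, it reflects constant factors and small-input behavior that asymptotic analysis hides.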
Authors: <a href="http://arxiv.org/find/cs/1/au:+Souza_S/0/1/0/all/0/1">Stefano M P C Souza</a>, <a href="http://arxiv.org/find/cs/1/au:+Silva_D/0/1/0/all/0/1">Daniel G Silva</a>