Rectified Linear Unit

The Rectified Linear Unit (ReLU) function is widely used in machine learning and is defined as follows:

\[\begin{split}ReLU(x) = \left\{ \begin{array}{ll} 0 & \text{if } x \leq 0 \\ x & \text{if } x > 0 \end{array} \right.\end{split}\]
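Equivalently, ReLU can be written as a maximum with zero, which is how it is typically computed on plain (non-secret-shared) arrays:

\[ReLU(x) = \max(0, x)\]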

In this example we apply the rectified linear unit function to a small set of values chosen to illustrate its behavior on negative, zero, and positive inputs. As with other Cicada functions, ReLU operates element-wise on arrays of any shape. Note that only player 0 supplies the secret values; the other players pass None.

[1]:
import logging

import numpy

from cicada.additive import AdditiveProtocolSuite
from cicada.communicator import SocketCommunicator
from cicada.logging import Logger

logging.basicConfig(level=logging.INFO)

def main(communicator):
    # Per-player logger and additive secret-sharing protocol suite.
    log = Logger(logging.getLogger(), communicator)
    protocol = AdditiveProtocolSuite(communicator)

    # Only player 0 supplies the secret values; the other players pass None.
    values = numpy.array([-5, -1, 0, 1, 5]) if communicator.rank == 0 else None
    log.info(f"Player {communicator.rank} values: {values}")

    # Secret-share the values, apply ReLU element-wise, and reveal the result to every player.
    values_share = protocol.share(src=0, secret=values, shape=(5,))
    relu_share = protocol.relu(values_share)
    relu = protocol.reveal(relu_share)

    log.info(f"Player {communicator.rank} relu: {relu}")

SocketCommunicator.run(world_size=3, fn=main);
INFO:root:Player 0 values: [-5 -1  0  1  5]
INFO:root:Player 1 values: None
INFO:root:Player 2 values: None
INFO:root:Player 0 relu: [0. 0. 0. 1. 5.]
INFO:root:Player 1 relu: [0. 0. 0. 1. 5.]
INFO:root:Player 2 relu: [0. 0. 0. 1. 5.]
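
As a quick sanity check, here is a minimal plain-text sketch of the same element-wise computation using ordinary NumPy rather than secret sharing; it should agree with the values revealed above (the revealed values appear as floating point because of the protocol's default encoding):

import numpy

values = numpy.array([-5, -1, 0, 1, 5])
relu = numpy.maximum(values, 0)  # element-wise max(0, x), i.e. ReLU
print(relu)                      # [0 0 0 1 5]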