Function stacks

Function stacks are layers of functions that are executed one after another in a single forward, backward, or update pass. A function stack is created either when you construct a network in code or when you load a saved model from disk. Here are some examples of function stacks.

They can be small and simple:

FunctionStack nn = new FunctionStack(
    new Linear(2, 2, name: "l1 Linear"),
    new Sigmoid(name: "l1 Sigmoid"),
    new Linear(2, 2, name: "l2 Linear"));

They can be a bit bigger:

FunctionStack nn = new FunctionStack(
    new Convolution2D(1, 2, 3, name: "conv1", gpuEnable: true), // do not forget the gpuEnable flag if you want GPU execution
    new ReLU(),
    new MaxPooling(2, 2),
    new Convolution2D(2, 2, 2, name: "conv2", gpuEnable: true),
    new ReLU(),
    new MaxPooling(2, 2),
    new Linear(8, 2, name: "fl3"),
    new ReLU(),
    new Linear(2, 2, name: "fl4")
);

Or, they can be very large:

FunctionStack nn = new FunctionStack(
    new Linear(neuronCount * neuronCount, N, name: "l1 Linear"), // L1
    new BatchNormalization(N, name: "l1 BatchNorm"),
    new LeakyReLU(slope: 0.000001, name: "l1 LeakyReLU"),
    new Linear(N, N, name: "l2 Linear"), // L2
    new BatchNormalization(N, name: "l2 BatchNorm"),
    new LeakyReLU(slope: 0.000001, name: "l2 LeakyReLU"),
    new Linear(N, N, name: "l3 Linear"), // L3
    new BatchNormalization(N, name: "l3 BatchNorm"),
    new LeakyReLU(slope: 0.000001, name: "l3 LeakyReLU"),
    new Linear(N, N, name: "l4 Linear"), // L4
    new BatchNormalization(N, name: "l4 BatchNorm"),
    new LeakyReLU(slope: 0.000001, name: "l4 LeakyReLU"),
    new Linear(N, N, name: "l5 Linear"), // L5
    new BatchNormalization(N, name: "l5 BatchNorm"),
    new LeakyReLU(slope: 0.000001, name: "l5 LeakyReLU"),
    new Linear(N, N, name: "l6 Linear"), // L6
    new BatchNormalization(N, name: "l6 BatchNorm"),
    new LeakyReLU(slope: 0.000001, name: "l6 LeakyReLU"),
    new Linear(N, N, name: "l7 Linear"), // L7
    new BatchNormalization(N, name: "l7 BatchNorm"),
    new LeakyReLU(slope: 0.000001, name: "l7 LeakyReLU"),
    new Linear(N, N, name: "l8 Linear"), // L8
    new BatchNormalization(N, name: "l8 BatchNorm"),
    new LeakyReLU(slope: 0.000001, name: "l8 LeakyReLU"),
    new Linear(N, N, name: "l9 Linear"), // L9
    new BatchNormalization(N, name: "l9 BatchNorm"),
    new PolynomialApproximantSteep(slope: 0.000001, name: "l9 PolynomialApproximantSteep"),
    new Linear(N, N, name: "l10 Linear"), // L10
    new BatchNormalization(N, name: "l10 BatchNorm"),
    new PolynomialApproximantSteep(slope: 0.000001, name: "l10 PolynomialApproximantSteep"),
    new Linear(N, N, name: "l11 Linear"), // L11
    new BatchNormalization(N, name: "l11 BatchNorm"),
    new PolynomialApproximantSteep(slope: 0.000001, name: "l11 PolynomialApproximantSteep"),
    new Linear(N, N, name: "l12 Linear"), // L12
    new BatchNormalization(N, name: "l12 BatchNorm"),
    new PolynomialApproximantSteep(slope: 0.000001, name: "l12 PolynomialApproximantSteep"),
    new Linear(N, N, name: "l13 Linear"), // L13
    new BatchNormalization(N, name: "l13 BatchNorm"),
    new PolynomialApproximantSteep(slope: 0.000001, name: "l13 PolynomialApproximantSteep"),
    new Linear(N, N, name: "l14 Linear"), // L14
    new BatchNormalization(N, name: "l14 BatchNorm"),
    new PolynomialApproximantSteep(slope: 0.000001, name: "l14 PolynomialApproximantSteep"),
    new Linear(N, 10, name: "l15 Linear") // L15
);
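
Whatever its size, a function stack is driven through the same handful of calls. The sketch below shows a typical lifecycle for the small two-layer stack above: attach an optimizer, run a combined forward/backward/update pass, run a forward-only pass, then save and reload the model. This is a hedged sketch assuming KelpNet-style APIs (`SetOptimizer`, `Trainer.Train`, `Predict`, `ModelIO.Save`/`ModelIO.Load`); exact signatures and the placeholder data here may differ from your library version, so check its sample code.

```csharp
// Hypothetical usage sketch -- verify against your KelpNet version's samples.
FunctionStack nn = new FunctionStack(
    new Linear(2, 2, name: "l1 Linear"),
    new Sigmoid(name: "l1 Sigmoid"),
    new Linear(2, 2, name: "l2 Linear"));

// Placeholder training pair (assumed NdArray constructor).
NdArray input = new NdArray(new Real[] { 1, 0 });
NdArray teach = new NdArray(new Real[] { 0, 1 });

// Attach an optimizer, then run one forward + backward + update pass.
nn.SetOptimizer(new MomentumSGD());
Trainer.Train(nn, input, teach, new MeanSquaredError());

// Forward-only pass: each function in the stack runs in order.
NdArray result = nn.Predict(input)[0];

// Saving and loading reconstructs the same FunctionStack from disk.
ModelIO.Save(nn, "model.nn");
FunctionStack loaded = ModelIO.Load("model.nn");
```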