Swiftpack.co -  wvabrinskas/Neuron as Swift Package
wvabrinskas/Neuron
A neural network library for Swift
.package(url: "https://github.com/wvabrinskas/Neuron.git", from: "1.3.4")

Neuron

Introduction

Neuron is a Swift package I developed to help learn how to make neural networks. It is far from perfect, and I am still learning; there is A LOT to learn here and I've only scratched the surface. As of right now this package provides a way to get started with machine learning and neural networks.

Support

Feel free to send me suggestions on how to improve this package; I would be delighted to learn more! You can also file issues here. Running the unit tests is a good way to learn how the project works!

The Brain

It is fairly simple to set up the neural network Brain. This will be the only object you interface with.

Initialization

  private lazy var brain: Brain = {
    let bias = 0.01

    let nucleus = Nucleus(learningRate: 0.001,
                          bias: 0.001)
    
    let brain = Brain(nucleus: nucleus,
                      epochs: 10,
                      lossFunction: .crossEntropy,
                      lossThreshold: 0.001, 
                      initializer: .xavierNormal, 
                      gradient: .sgd)
    
    brain.add(.init(nodes: inputs, bias: bias)) //input layer

    for _ in 0..<numOfHiddenLayers {
      brain.add(.init(nodes: hidden, activation: .reLu, bias: bias)) 
    }
    
    brain.add(.init(nodes: outputs, bias: bias)) //output layer
    
    brain.add(modifier: .softmax)

    brain.add(optimizer: .adam())

    brain.logLevel = .high
    
    return brain
  }()

The Brain class is the main interface to the neural network. It also supports different log levels so you can see what's going on in the console.

brain.logLevel = .low

  //show no logs
  case none
  
  //show only success logs
  case low
  
  //show only success and loading logs
  case medium
  
  //show all logs
  case high

Nucleus

  • It first takes in a Nucleus object that defines the learning rate and bias for the network.
  • When defining a bias it is NOT applied to the input layer.
  • The Nucleus object takes two properties, learningRate and bias:
    • learningRate - how quickly each node adjusts the weights of its inputs to fit the training data
      • Usually between 0 and 1.
    • bias - a constant offset added in the weight-update calculation.
      • Usually between 0 and 1.

Epochs

  • The number of times to run through the training data.
  • The brain object may not hit the max number of epochs before training is finished if there is validation data passed and it reaches the defined loss threshold.

Loss Function

  • The loss function of the network. This will determine the loss of each epoch as well as the loss of the validation data set.
  case meanSquareError
  case crossEntropy
  • Currently the network only supports Mean Squared Error and Cross Entropy loss functions

Loss Threshold

  • The loss value the network should reach, averaged over 5 epochs, before training stops early.

Initializer

  • The initializer function the brain object should use to generate the weight values for each layer.
  ///Generates weights based on a normal Gaussian distribution (mean = 0, standard deviation = 1)
  case xavierNormal

  ///Generates weights based on a uniform distribution
  case xavierUniform
  • Currently the network supports Xavier normal distribution and Xavier uniform distribution

Optimizer

  • The brain object can add an optimizer to the network by calling: brain.add(optimizer:)
public enum Optimizer {
  case adam(b1: Float = 0.9,
            b2: Float = 0.999,
            eps: Float = 1e-8)
}
  • Currently only the Adam optimizer is supported. There will be more soon.
  • You can set the various hyperparameters of the optimizer through its initialization function
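For example, the default hyperparameters can be overridden when adding the optimizer (the values here are illustrative, not recommendations):

```swift
// Add Adam with custom hyperparameters instead of the defaults.
brain.add(optimizer: .adam(b1: 0.9,
                           b2: 0.99,   // illustrative; the default is 0.999
                           eps: 1e-8))
```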

Gradient Descent Optimizer

  • As part of the initializer of the brain object you can specify which type of gradient descent the brain performs.
  • By default it chooses Stochastic Gradient Descent.
public enum GradientDescent: Equatable {
  case sgd
  case mbgd(size: Int)
}
  • The network supports both Stochastic and Mini-Batch gradient descent.
  • When adding mbgd you can specify the batch size.
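As a sketch based on the Brain initializer shown earlier, mini-batch gradient descent with a batch size of 16 (an illustrative value) would be selected like this:

```swift
let brain = Brain(nucleus: Nucleus(learningRate: 0.001, bias: 0.001),
                  epochs: 10,
                  lossFunction: .crossEntropy,
                  lossThreshold: 0.001,
                  initializer: .xavierNormal,
                  gradient: .mbgd(size: 16)) // mini-batches of 16 samples
```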

Adding Layers

The brain object allows adding layers in a modular way through the add function.

  public func add(_ model: LobeModel)

The LobeModel struct can be created with a simple initializer.

  public init(nodes: Int,
              activation: Activation = .none,
              bias: Float = 0) {
    self.nodes = nodes
    self.activation = activation
    self.bias = bias
  }

Nodes

  • the number of nodes at the layer

Activation

  • the activation function to be used at the layer
  • NOTE: If the layer is of type .input the activation function will be ignored

Bias

  • the bias to be added at that layer
  • NOTE: If the layer is of type .input the bias will be ignored

Modifiers

The network also supports adding an output activation modifier such as softmax

  public func add(modifier mod: OutputModifier) {
    self.outputModifier = mod
  }
  • Calling add(modifier) on the brain object will add the specified output activation to the output layer.
  • Currently the network only supports Softmax
  case softmax

Compiling the network

After adding all the specified layers and modifiers do not forget to call compile() on the brain object. This will connect all the layers together using the proper initializer and get the network ready for training.
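A minimal end-to-end setup using the calls described above might look like this (the layer sizes are illustrative):

```swift
brain.add(.init(nodes: 4))                                 // input layer
brain.add(.init(nodes: 8, activation: .reLu, bias: 0.01))  // hidden layer
brain.add(.init(nodes: 3, bias: 0.01))                     // output layer
brain.add(modifier: .softmax)

brain.compile() // connects the layers and initializes the weights
```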

Training

You can train the Brain object by passing training data along with the expected outputs.

public func train(data: [TrainingData],
                  validation: [TrainingData] = [],
                  complete: ((_ complete: Bool) -> ())? = nil)
  • data: An array of TrainingData objects to be used as the training data set.
  • validation: An array of TrainingData objects to be used as the validation data set.
  • complete: A block called when the network has finished training.

Training Data

  • This is the object that contains the inputs and expected output for that input
public struct TrainingData {
  public var data: [Float]
  public var correct: [Float]
  
  public init(data dat: [Float], correct cor: [Float]) {
    self.data = dat
    self.correct = cor
  }
}

Data

  • An array of values that should match the number of inputs into the network

Correct

  • An array of values that the network should target; it should match the number of outputs of the network
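Putting the pieces above together, a sketch of a training call (the data values and layer sizes are illustrative) could look like:

```swift
// data must match the number of network inputs;
// correct must match the number of network outputs.
let sample = TrainingData(data: [0.2, 0.8, 0.5],
                          correct: [1.0, 0.0])

brain.train(data: [sample],
            validation: [sample]) { success in
  print("training finished: \(success)")
}
```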

Importing Pretrained Models

Brain can accept a pretrained model in the form of a .smodel file, which is essentially renamed JSON. This is also the format the Brain object exports.

  • This will create a fully trained Brain object that is ready to go based on the trained model passed in.
public init?(model: PretrainedModel,
             epochs: Int,
             lossFunction: LossFunction = .crossEntropy,
             lossThreshold: Float = 0.001,
             initializer: Initializers = .xavierNormal)

PretrainedModel

  • The object takes in a URL to the .smodel file.
public struct PretrainedModel: ModelBuilder {
  public var fileURL: URL
  
  public init(url file: URL) {
    self.fileURL = file
  }
  ...
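As a sketch, loading a previously exported model (the file path here is hypothetical) could look like:

```swift
let url = URL(fileURLWithPath: "/path/to/model.smodel") // hypothetical path
let model = PretrainedModel(url: url)

if let trainedBrain = Brain(model: model,
                            epochs: 10,
                            lossFunction: .crossEntropy,
                            lossThreshold: 0.001,
                            initializer: .xavierNormal) {
  // The brain is fully trained and ready to use immediately.
  let out = trainedBrain.feed(input: [0.2, 0.8, 0.5])
}
```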

Exporting Pretrained Models

The Brain object can export its current weights and setup as a .smodel file. This can be used to import later when creating a Brain object that you wish to be fully trained based on the data.

brain.exportModelURL()

Will export a URL that links to the .smodel file.

brain.exportModel()

Will export an ExportModel that describes the network

Retrieving Data

Pass in new data to feed through the network and get a result.

let out = self.brain.feed(input: data)
  • Returns [Float] computed from the new inputs and the current weights, i.e., a feed-forward pass.

Data Studying

Using the Brain object you can also get the result of the loss functions of each epoch as a CSV file using the exportLoss function on Brain.

  • exportLoss(_ filename: String? = nil) -> URL?

  • filename: Name of the file to save and export. Defaults to loss-{timeIntervalSince1970}

  • Returns the url of the exported file if successful.
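For example (the filename is illustrative):

```swift
if let csvURL = brain.exportLoss("my-training-run") {
  print("loss history written to \(csvURL)")
}
```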


TODOs

  • GPU Acceleration is still in the works.
  • Convolutional layer support
  • Much more...

Resources

Cross Entropy + Softmax

Videos:

  • Cost Function vs Loss Function
  • Gradient Clipping
  • Activation Functions
  • Backpropagation
  • Validation
  • Classification
  • Weight Initialization
  • Optimizers


Release Notes

v1.3.4
20 weeks ago

Added ability to replace weights
