Our nominee for Most Under-Discussed Shiny New Stuff in this year’s Apple crop has to be BNNS. What, not ringing a bell? Ch-ch-check it out:
The Accelerate framework’s new basic neural network subroutines (BNNS) is a collection of functions that you can use to construct neural networks. It is supported in macOS, iOS, tvOS, and watchOS, and is optimized for all CPUs supported on those platforms.
Of course, there’s a perfectly good reason it hasn’t made more of a splash, which is:
BNNS supports implementation and operation of neural networks for inference, using input data previously derived from training. BNNS does not do training, however. Its purpose is to provide very high performance inference on already trained neural networks.
That sounds like too high a bar to jump, right? Well, not as high as you’d think! See, if you even noticed Google open-sourcing TensorFlow at all, you probably figured it was a bit academic for any real-world application you were working on. Here’s the inestimable Aaron Hillegass to explain the synergy:
Thus, there are two ways to get deep learning into your Mac or iOS application:
Solution 1: Do all the neural net work on a server using TensorFlow. You must be certain that all your users always have a good internet connection and that the data you are sending/receiving is not too voluminous.
Solution 2: Train the neural net using TensorFlow and export the weights. Then, when you write the iOS or Mac application, recreate the neural net using BNNS and import the weights.
Google would love to talk to you about the first solution, so the rest of this posting will be about the second. I’ve used TensorFlow’s MNIST example: handwritten digit recognition with just an input and an output layer, fully connected. The input layer has 784 nodes (one for each pixel) and the output has 10 nodes (one for each digit). Each output node gets a bias added before it is run through the softmax algorithm. Here is the source, which you will get automatically when you install TensorFlow.
My sample code is posted on GitHub…
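That toy network is simple enough to sketch end to end. Here’s a minimal, plain-Python illustration of the inference pass described above (784 inputs, 10 outputs, fully connected, bias, softmax) — this is the computation you’d be recreating on the BNNS side once the TensorFlow-trained weights are imported. The function and parameter names here are purely illustrative, not part of either API:

```python
import math

def infer(pixels, weights, bias):
    """pixels: 784 floats; weights: 10 rows of 784 floats; bias: 10 floats.
    Returns softmax probabilities for the digits 0-9."""
    # Fully connected layer: one weighted sum per output node, plus its bias.
    logits = [sum(w * x for w, x in zip(row, pixels)) + b
              for row, b in zip(weights, bias)]
    # Softmax: exponentiate (shifted by the max for numerical stability),
    # then normalize so the ten outputs sum to 1.
    peak = max(logits)
    exps = [math.exp(z - peak) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]
```

The whole value proposition of BNNS is that it does exactly this kind of arithmetic for you, vectorized and tuned per CPU — you supply the layer shapes and the trained weights, it supplies the speed.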
So right now you can ship trainees that won’t learn any more (feel free to fill in a simile of your choice here for a laugh) if your app has a static problem space that could be enhanced with a bit of clever; or you could just get up to speed on the technology now and wait for the all-but-inevitable addition of backpropagation in future releases! Here’s some more light reading and viewing to help you out with that:
WWDC16 715 – Neural Networks and Accelerate