The neural network programming language

Here’s a JavaScript function, bothPos, that returns a positive value if both x and y are positive (along with a small helper it uses):

function isNeg(x) {
  // Roughly "x is negative": positive when x is negative, 0 otherwise.
  return Math.max(0, -x);
}
function bothPos(x, y) {
  // Roughly "x > 0 && y > 0": 1 when both are positive, 0 when either is sufficiently negative.
  return Math.max(0, 1-isNeg(x)-isNeg(y));
}
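
To see the approximation in action, here are a few sample calls (outputs worked out by hand from the definitions above):

bothPos(2, 3);     // 1
bothPos(2, -3);    // 0
bothPos(-0.5, 3);  // 0.5 (not quite a crisp true/false)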

Surprise: the functions here are also neural networks! Neural networks are usually drawn as labelled, weighted graphs, but you can also view them as programs in a restricted programming language.

The functions above are written in a subset of JavaScript that I call NN. They approximate the single function below, written in full JavaScript:

function bothPos(x, y) {
  return x > 0 && y > 0;
}

This is probably how you would more naturally implement bothPos. So why would we ever use the more restrictive neural network language? The language NN has two purposes that plain JavaScript doesn’t serve. Its first (philosophical) purpose is to model how real, physical neurons work. Its second (much more specific) purpose is to allow for training: you can take any function in NN, compute its “loss” against a training set, compute the gradient of that loss, then adjust the constant numbers in the function to (hopefully) improve it.
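
Here’s a minimal sketch of that training process. The loop itself is ordinary JavaScript operating on an NN-style function; the parameterised bothPosParam, the tiny training set, the learning rate, and the use of numerical gradients instead of backpropagation are all illustrative choices, not fixed parts of NN.

// bothPos with its constants pulled out as trainable parameters.
// With params = [-1, -1, 1] this is exactly the bothPos above.
function bothPosParam(params, x, y) {
  const [w1, w2, b] = params;
  return Math.max(0, b + w1*Math.max(0, -x) + w2*Math.max(0, -y));
}

// Squared-error loss against a tiny training set of [x, y, target] triples.
const trainingSet = [[1, 1, 1], [1, -1, 0], [-1, 1, 0], [-1, -1, 0]];
function loss(params) {
  return trainingSet.reduce((sum, [x, y, target]) => {
    const diff = bothPosParam(params, x, y) - target;
    return sum + diff*diff;
  }, 0);
}

// Gradient descent: estimate the gradient of the loss numerically,
// then nudge each constant a little way downhill.
let params = [0.5, 0.5, 0.5];  // arbitrary starting constants
const eps = 1e-4, learningRate = 0.05;
for (let step = 0; step < 5000; step++) {
  const grad = params.map((p, i) => {
    const bumped = params.slice();
    bumped[i] = p + eps;
    return (loss(bumped) - loss(params)) / eps;
  });
  params = params.map((p, i) => p - learningRate*grad[i]);
}
console.log(params, loss(params));  // the loss should end up close to 0

Real neural-network libraries compute that gradient with backpropagation rather than finite differences, but the idea is the same: nudge the constants to make the loss smaller.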

This language is quite restrictive. Its only base data type is the floating-point number. Its only expressions are (non-recursive) function calls and the odd expression

expr ::=
  "Math.max(0, " expr ")"

The argument to Math.max can itself be a constant-weighted sum of variables, constants, and other expressions, like the 1-isNeg(x)-isNeg(y) above.
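
You can still get surprisingly far within these rules. For instance, an exact absolute value can be written in NN (the helpers pos and neg below are new definitions, introduced just for this illustration):

function pos(x) {
  return Math.max(0, x);
}
function neg(x) {
  return Math.max(0, -x);
}
function abs(x) {
  // Exactly |x|: one of pos(x) and neg(x) equals |x| and the other is 0,
  // and their sum is never negative, so the outer Math.max changes nothing.
  return Math.max(0, pos(x)+neg(x));
}
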
Some things, though, can’t be done at all. For example, an exact max of two values:

function max(x, y) {
  // ???
}