
# SystemDS-NN Examples

## MNIST Softmax Classifier

* This example trains a softmax classifier, which is essentially a multi-class logistic regression model, on the MNIST data. The model will be trained on the training images, validated on the validation images, and tested for final performance metrics on the test images. A minimal sketch of a single training step is shown after this list.
* DML Functions: `mnist_softmax.dml`
* Training script: `mnist_softmax-train.dml`
* Prediction script: `mnist_softmax-predict.dml`
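
The following is a minimal sketch of one training step of such a classifier on dummy data, written against the layer and optimizer modules under `scripts/nn`. It is not the example script itself, and the module signatures (`init`/`forward`/`backward`/`update`) are assumed to follow the library's usual conventions.

```
# Minimal sketch of one softmax-classifier training step on dummy data.
# Assumes the usual signatures of the nn layer/optimizer modules.
source("nn/layers/affine.dml") as affine
source("nn/layers/softmax.dml") as softmax
source("nn/layers/cross_entropy_loss.dml") as cross_entropy_loss
source("nn/optim/sgd.dml") as sgd

N = 64      # examples
D = 784     # features (28x28 pixels)
K = 10      # classes
lr = 0.1    # learning rate

X = rand(rows=N, cols=D)                                               # fake images
y = table(seq(1, N), round(rand(rows=N, cols=1, min=1, max=K)), N, K)  # one-hot labels

[W, b] = affine::init(D, K)            # single affine layer: D inputs -> K classes
for (i in 1:10) {
  # Forward pass: scores -> class probabilities -> loss
  scores = affine::forward(X, W, b)
  probs = softmax::forward(scores)
  loss = cross_entropy_loss::forward(probs, y)

  # Backward pass: gradients w.r.t. scores and parameters
  dprobs = cross_entropy_loss::backward(probs, y)
  dscores = softmax::backward(dprobs, scores)
  [dX, dW, db] = affine::backward(dscores, X, W, b)

  # Vanilla SGD parameter update
  W = sgd::update(W, dW, lr)
  b = sgd::update(b, db, lr)
  print("iter " + i + ": loss = " + loss)
}
```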

## MNIST “LeNet” Neural Net

* This example trains a convolutional neural network on the MNIST data using a “LeNet” architecture. The model will be trained on the training images, validated on the validation images, and tested for final performance metrics on the test images. A condensed sketch of how the training and prediction scripts use `mnist_lenet.dml` is shown after this list.
* DML Functions: `mnist_lenet.dml`
* Training script: `mnist_lenet-train.dml`
* Prediction script: `mnist_lenet-predict.dml`
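
As a rough illustration of how the training and prediction scripts use `mnist_lenet.dml`, the sketch below sources the model file, builds dummy data in the expected shape (flattened 28x28 single-channel images scaled to [-1, 1], one-hot labels), and calls `train` and `predict`. The source path and the exact argument lists are assumptions, not copied from the scripts.

```
# Hedged sketch: train the LeNet model defined in mnist_lenet.dml on dummy data.
# The source path and the train/predict argument lists are assumptions.
source("nn/examples/mnist_lenet.dml") as mnist_lenet

N = 128    # number of (fake) images
C = 1      # channels
Hin = 28   # image height
Win = 28   # image width
K = 10     # classes

X = rand(rows=N, cols=C*Hin*Win, min=-1, max=1)                        # images scaled to [-1, 1]
Y = table(seq(1, N), round(rand(rows=N, cols=1, min=1, max=K)), N, K)  # one-hot labels

# Train for one epoch (reusing the training data as validation data here), then predict.
[W1, b1, W2, b2, W3, b3, W4, b4] = mnist_lenet::train(X, Y, X, Y, C, Hin, Win, 1)
probs = mnist_lenet::predict(X, C, Hin, Win, W1, b1, W2, b2, W3, b3, W4, b4)
print("predicted class of the first image: " + as.scalar(rowIndexMax(probs[1,])))
```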

## Neural Collaborative Filtering

* This example trains a neural network on the MovieLens data set using Neural Collaborative Filtering (NCF), an approach that tackles recommendation problems with deep neural networks rather than the more common matrix factorization methods.
* As in the original paper, the targets are binary and only indicate whether a user has rated a movie or not. This makes the recommendation problem harder than predicting the rating values themselves, but such interaction data is easier to collect in practice.
* MovieLens only provides positive interactions in the form of ratings, so negative interactions are randomly sampled, as suggested by the original paper.
* The implementation uses a fixed layer architecture: two embedding layers at the beginning for users and items, three dense layers with ReLU activations in the middle, and a sigmoid activation for the final classification. A sketch of this architecture is shown after this list.
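
The sketch below only illustrates the architecture described above (it is not the `ncf-*-data.dml` scripts): one-hot user and item ids fed through two embedding layers, three ReLU dense layers, and a sigmoid output. The layer sizes, module paths, and signatures are assumptions.

```
# Hedged sketch of the described NCF forward pass (not the example scripts).
source("nn/layers/affine.dml") as affine
source("nn/layers/relu.dml") as relu
source("nn/layers/sigmoid.dml") as sigmoid

N = 256     # batch of (user, item) interactions
U = 1000    # number of users (assumed)
I = 1700    # number of items (assumed)
E = 8       # embedding size (assumed)

# One-hot encoded user and item ids for the batch
u_ids = round(rand(rows=N, cols=1, min=1, max=U))
i_ids = round(rand(rows=N, cols=1, min=1, max=I))
U_onehot = table(seq(1, N), u_ids, N, U)
I_onehot = table(seq(1, N), i_ids, N, I)

# Embedding lookups expressed as linear layers on the one-hot inputs (bias kept for simplicity)
[W_u, b_u] = affine::init(U, E)
[W_i, b_i] = affine::init(I, E)
emb = cbind(affine::forward(U_onehot, W_u, b_u), affine::forward(I_onehot, W_i, b_i))

# Three dense layers with ReLU, then a sigmoid for the binary interaction target
[W1, b1] = affine::init(2*E, 64)
[W2, b2] = affine::init(64, 32)
[W3, b3] = affine::init(32, 1)
h1 = relu::forward(affine::forward(emb, W1, b1))
h2 = relu::forward(affine::forward(h1, W2, b2))
p = sigmoid::forward(affine::forward(h2, W3, b3))   # predicted interaction probability
```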