
Deep Lattice Networks and Partial Monotonic Functions

Seungil You, David Ding, Kevin Canini, Jan Pfeifer, and Maya Gupta (2017). arXiv:1709.06680. Comment: 9 pages, NIPS 2017.

Abstract

We propose learning deep models that are monotonic with respect to a user-specified set of inputs by alternating layers of linear embeddings, ensembles of lattices, and calibrators (piecewise linear functions), with appropriate constraints for monotonicity, and jointly training the resulting network. We implement the layers and projections with new computational graph nodes in TensorFlow and use the ADAM optimizer and batched stochastic gradients. Experiments on benchmark and real-world datasets show that six-layer monotonic deep lattice networks achieve state-of-the-art performance for classification and regression with monotonicity guarantees.
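
To make the two building blocks named in the abstract concrete, here is a minimal NumPy sketch of a monotonic piecewise-linear calibrator and a small lattice with multilinear interpolation. It is an illustration only, not the paper's TensorFlow implementation; the function names, keypoint layout, and 2x2 lattice size are assumptions for the example.

```python
# Minimal sketch of the deep lattice network building blocks described in the
# abstract: a piecewise-linear calibrator and a lattice layer, with
# monotonicity obtained by constraining the parameters (non-negative
# increments for the calibrator, ordered vertex values for the lattice).
import numpy as np

def pwl_calibrator(x, keypoints, deltas):
    """Piecewise-linear calibrator on fixed input keypoints.

    The output at the first keypoint is 0 and each subsequent keypoint adds
    deltas[i]. The calibrator is non-decreasing whenever all deltas are >= 0,
    which is the kind of constraint a monotonicity projection enforces.
    """
    outputs = np.concatenate([[0.0], np.cumsum(deltas)])  # outputs at keypoints
    return np.interp(x, keypoints, outputs)

def lattice_2d(x, params):
    """Multilinear interpolation on a 2x2 lattice over [0, 1]^2.

    params[i, j] is the value at lattice vertex (i, j). The function is
    monotonic in dimension 0 if params[1, j] >= params[0, j] for every j.
    """
    u, v = np.clip(x, 0.0, 1.0)
    return ((1 - u) * (1 - v) * params[0, 0] + (1 - u) * v * params[0, 1]
            + u * (1 - v) * params[1, 0] + u * v * params[1, 1])

if __name__ == "__main__":
    keypoints = np.array([0.0, 0.5, 1.0])
    deltas = np.array([0.3, 0.7])            # all >= 0 -> calibrator is monotonic
    lattice_params = np.array([[0.0, 0.2],   # second row >= first row element-wise,
                               [0.5, 1.0]])  # so the lattice is monotonic in dim 0
    raw = np.array([0.2, 0.8])               # two raw input features
    calibrated = pwl_calibrator(raw, keypoints, deltas)
    print(lattice_2d(calibrated, lattice_params))
```

In the paper's architecture these pieces are stacked (calibrators, linear embeddings, and ensembles of lattices) and trained jointly with projections that keep the monotonicity constraints satisfied; the sketch above only shows how each individual layer can be made monotone by construction.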
