We identify, test, and disseminate established and emerging machine-learning techniques in order to provide practitioners with the best tools for their applications.
What we do
Explore our work
Read our blog
What we are reading:
Tessellation-Filtering ReLU Networks
ReLU networks define piecewise linear functions on tessellations of the input space. The authors analyse the shape complexity of the decision surface of a ReLU network. The main contribution is a framework for decomposing a network along a given layer into a tessellation part and a shape-complexity part, the latter represented by a tessellation-filtering neural network, a special subclass of ReLU networks that is also identified in the paper.
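The piecewise linear structure mentioned above is easy to verify numerically. The following minimal numpy sketch (a toy one-hidden-layer network, not the paper's construction) shows that each activation pattern of the hidden units picks out one cell of the tessellation, and that the network restricted to that cell is a single affine map:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer ReLU network: f(x) = W2 @ relu(W1 @ x + b1) + b2.
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def f(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def pattern(x):
    # Which hidden units are active; this identifies the tessellation
    # cell of the input space that contains x.
    return tuple(W1 @ x + b1 > 0)

# Within one cell the network is affine, f(x) = A x + c, where the
# rows of W1 belonging to inactive units are zeroed out.
x = rng.normal(size=2)
mask = np.array(pattern(x), dtype=float)
A = W2 @ (W1 * mask[:, None])
c = W2 @ (b1 * mask) + b2

print(np.allclose(f(x), A @ x + c))  # the affine map reproduces f on this cell
```

Enumerating the distinct patterns over a grid of inputs gives a direct, if brute-force, picture of the tessellation the paper reasons about.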
Modeling Long Sequences with Structured State Spaces
Building on ideas from optimal control, where state spaces and linear ODEs are used for sequence modelling, a new layer for sequence processing is introduced and refined across a line of papers. The theory behind this layer overlaps with LSTMs and convolutional layers, in a sense containing them as special cases. The new layers are shown to handle very long-range dependencies in sequences of over 10,000 tokens, while being much faster than transformers.
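The claimed overlap with both recurrent and convolutional layers can be seen in a toy discretized linear state-space model. The numpy sketch below (illustrative only; it uses a random system, not the structured parameterization from the papers) computes the same output once as a stateful recurrence and once as a convolution with the kernel K = (CB, CAB, CA²B, ...):

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 4, 16  # state dimension and sequence length (toy scale)

# Discrete linear state-space model: x_k = A x_{k-1} + B u_k, y_k = C x_k.
A = 0.9 * np.eye(N) + 0.05 * rng.normal(size=(N, N))
B = rng.normal(size=(N, 1))
C = rng.normal(size=(1, N))

u = rng.normal(size=L)  # input sequence

# Recurrent view (RNN-like): step the hidden state through the sequence.
x = np.zeros((N, 1))
y_rec = []
for k in range(L):
    x = A @ x + B * u[k]
    y_rec.append((C @ x).item())

# Convolutional view: unrolling the recurrence gives
# y_k = sum_i (C A^i B) u_{k-i}, i.e. a convolution with kernel K.
K = np.array([(C @ np.linalg.matrix_power(A, i) @ B).item() for i in range(L)])
y_conv = [sum(K[i] * u[k - i] for i in range(k + 1)) for k in range(L)]

print(np.allclose(y_rec, y_conv))  # both views agree
```

The convolutional form is what allows parallel training over the whole sequence, while the recurrent form gives cheap step-by-step inference.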
Brax - A Differentiable Physics Engine for Large Scale Rigid Body Simulation
Simulation in Robotics: Opportunities, Challenges, Suggestions
Despite great advancements in physics-based simulators for robotics over the last few years, a big sim2real gap remains between the performance of systems tested in silico and in the real world. A recent paper, summed up in this pill, outlines where the key challenges lie and suggests how to tackle them.
Variational- and Metric-based Deep Latent Space for Out-of-Distribution Detection