Sebastian Flennerhag, PhD candidate, ML
http://flennerhag.github.io
Meta-learning beyond few-shot learning
an informal guide to Leap -
[code] - [paper] The current wave of machine learning is the epitome of division of labor, wherein a given model performs a narrowly defined task to perfection. While these tasks can be quite complex, each one requires a distinct model trained specifically for it...
Thu, 09 May 2019 00:00:00 +0100
http://flennerhag.github.io/2019-05-09-transferring-knowledge-across-learning-processes/
Breaking the Activation Function Bottleneck
through adaptive parameterization -
Deep neural networks are very powerful models, in theory able to approximate any function. In practice things are a little different. Oddly, a neural network tends to generalize better the larger it is, often to the point of having more parameters than there are data points in the training set....
Sat, 09 Jun 2018 00:00:00 +0100
http://flennerhag.github.io/2018-06-09-breaking-activation-function-bottleneck/
A minimalist Deep Learning library
Automatic differentiation in Numpy -
When you peek under the hood of a deep learning library, things will look pretty complex. That’s because they use a lot of abstractions to handle a wide variety of cases. But the core code of a Deep Learning library is actually not that complex at all. In fact, you’d...
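As a taste of what the post builds, here is a toy reverse-mode autodiff sketch in NumPy. It is my own illustrative stand-in, not the library from the post: a `Var` node records how it was produced, and `backward` replays the chain rule in reverse over the graph.

```python
import numpy as np

# Toy computation-graph node: just enough to differentiate z = x * y + x.
class Var:
    def __init__(self, value):
        self.value = np.asarray(value, dtype=float)
        self.grad = np.zeros_like(self.value)
        self._backward = lambda: None
        self._parents = []

    def __add__(self, other):
        out = Var(self.value + other.value)
        out._parents = [self, other]
        def _backward():
            self.grad = self.grad + out.grad   # d(a+b)/da = 1
            other.grad = other.grad + out.grad # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Var(self.value * other.value)
        out._parents = [self, other]
        def _backward():
            self.grad = self.grad + other.value * out.grad  # d(a*b)/da = b
            other.grad = other.grad + self.value * out.grad # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topological sort, then sweep the chain rule backwards.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = np.ones_like(self.value)
        for v in reversed(order):
            v._backward()

x, y = Var(2.0), Var(3.0)
z = x * y + x
z.backward()
print(x.grad)  # dz/dx = y + 1 = 4
print(y.grad)  # dz/dy = x = 2
```

Real libraries add broadcasting, many more ops, and memory management on top, but the backward sweep above is the core idea.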
Mon, 06 Nov 2017 00:00:00 +0000
http://flennerhag.github.io/2017-11-06-graph-intro/
Learning by Gradient Descent
Understanding gradient descent, momentum and learning -
Gradient descent is perhaps the simplest optimization algorithm there is. In Deep Learning, it's the foundational algorithm upon which modern optimizers are built. Yet introductory material on machine learning hardly mentions it. I quite frequently get asked about gradient descent, and despite its simplicity, there's a...
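The two ingredients the post covers can be sketched in a few lines. This is a generic illustration on a toy quadratic, not code from the post; the names `lr` and `momentum` are mine.

```python
# Gradient descent with momentum on f(w) = (w - 3)^2, whose minimum is w = 3.

def grad(w):
    # derivative of (w - 3)^2
    return 2.0 * (w - 3.0)

def descend(w=0.0, lr=0.1, momentum=0.9, steps=200):
    v = 0.0
    for _ in range(steps):
        v = momentum * v - lr * grad(w)  # accumulate a velocity
        w = w + v                        # step along the velocity
    return w

print(descend())  # converges toward the minimum at w = 3
```

With `momentum=0.0` this reduces to plain gradient descent; the velocity term smooths the trajectory and speeds up progress along shallow directions.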
Wed, 26 Jul 2017 00:00:00 +0100
http://flennerhag.github.io/2017-07-26-gradient-descent/
Ensemble learning
Explaining how and what ensembles learn -
You may have wondered why there are so many different algorithms for machine learning - after all, would a single algorithm not suffice? A famous result is the no free lunch theorem, which tells us that no one algorithm will be optimal for every problem. Intuitively, the reason...
Tue, 18 Apr 2017 00:00:00 +0100
http://flennerhag.github.io/2017-04-18-introduction-to-ensembles/
How to use LaTeX in Markdown
A quick how-to guide -
A pain-free recipe. Perhaps you've noticed that the standard Markdown library doesn't render LaTeX. Turns out the workaround is really simple. All you need is the MathJax plugin for kramdown. MathJax is essentially a JavaScript rendering engine that turns your LaTeX snippets into high-quality typeset math. To...
Sat, 14 Jan 2017 00:00:00 +0000
http://flennerhag.github.io/2017-01-14-latex/
Wrapping sklearn classes
Bend sklearn to your will -
What’s the fuss about? The Scikit-learn, or sklearn, library is perhaps the primary reason I use Python. With a common API, adopted by many other libraries, it is possible to build complex machine learning systems that can be integrated into cross-validation, grid searches, learning curves and many other diagnostics....
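The wrapping pattern the post explores boils down to subclassing an estimator and overriding a method while keeping the `fit`/`predict` contract intact. Sketched below with a tiny stand-in estimator (my own, so the snippet stays self-contained without sklearn installed) that follows sklearn's conventions: `fit` returns `self`, fitted attributes end in an underscore.

```python
class MeanEstimator:
    """Stand-in for an sklearn estimator: predicts the training-set mean."""

    def fit(self, X, y):
        self.mean_ = sum(y) / len(y)  # trailing underscore: set during fit
        return self                   # sklearn convention: fit returns self

    def predict(self, X):
        return [self.mean_ for _ in X]


class ClippedMean(MeanEstimator):
    """A wrapped estimator: override predict to clip outputs to a range."""

    def __init__(self, low=0.0, high=1.0):
        self.low, self.high = low, high

    def predict(self, X):
        preds = super().predict(X)
        return [min(max(p, self.low), self.high) for p in preds]


model = ClippedMean(low=0.0, high=1.0).fit([[0], [1], [2]], [0.0, 1.0, 5.0])
print(model.predict([[3]]))  # mean is 2.0, clipped down to 1.0
```

Because the subclass keeps the same `fit`/`predict` signature, anything that consumes the common API (cross-validation, grid search, pipelines) can use the wrapped class unchanged.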
Sun, 08 Jan 2017 00:00:00 +0000
http://flennerhag.github.io/2017-01-08-Recursive-Override/