By Julian Zilly and others

We examine the influence of input data representations on learning complexity. For learning, we posit that each model implicitly uses a candidate model distribution for unexplained variations in the data, its noise model. If the model distribution is not well aligned to the true distribution, then even relevant variations will...
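The point about representations and implicit noise models can be illustrated with a small toy sketch (not from the paper; the data, transform, and metric here are illustrative assumptions). A linear model with an additive-noise assumption fits exponential data poorly in the raw representation, but the same relationship becomes easy once the inputs are re-represented on a log scale, where the multiplicative noise also becomes approximately additive:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: y = exp(2x) with multiplicative (log-normal) noise.
x = rng.uniform(0.0, 2.0, size=200)
y = np.exp(2.0 * x) * rng.lognormal(mean=0.0, sigma=0.1, size=x.size)

def linear_fit_r2(inputs, targets):
    """R^2 of an ordinary least-squares line fit to (inputs, targets)."""
    A = np.column_stack([inputs, np.ones_like(inputs)])
    coef, *_ = np.linalg.lstsq(A, targets, rcond=None)
    residuals = targets - A @ coef
    return 1.0 - residuals.var() / targets.var()

# Raw representation: a straight line is a poorly aligned candidate model
# for exponential growth, so much of the relevant variation is unexplained.
r2_raw = linear_fit_r2(x, y)

# Log representation: the same relationship is now linear, and the noise
# matches the model's implicit additive-Gaussian assumption far better.
r2_log = linear_fit_r2(x, np.log(y))

print(f"R^2 raw: {r2_raw:.3f}, R^2 log: {r2_log:.3f}")
```

The model class is identical in both cases; only the representation of the targets changes, yet the effective task complexity for the linear learner differs substantially.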

December 19, 2019



Quantifying the effect of representations on task complexity