Dec 10, 2018

Deep learning on time-series


  • RNN: "From my personal experience, recurrent nets are good only when we deal with rather short sequences (10–100 time steps) with multiple variables on each time step (can be multivariate time series or word embeddings)." (A minimal LSTM sketch follows this list.)
  • CNN: simple 1D convolution filters (replacing the 2D ones) in a ResNet-style architecture. "I have benchmarked them a lot of times and they’re mostly superior to RNNs." (A 1D residual block sketch appears after this list.)
  • Autoregressive feedforward network: models the last N steps using dilated convolutions, as in WaveNet (see the dilated causal convolution sketch after this list).

[Figure: WaveNet dilated convolution architecture]

  • GAN: "we can rely on deep learning to embed our data into the new space with autoencoders, or we can use GANs (generative adversarial networks) as anomaly detectors, exploiting the discriminator network as the anomaly scorer" (check more details and code here). A short discriminator-scoring sketch follows this list.
  •  Hybrid: 
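
For the RNN point, here is a minimal sketch, assuming PyTorch; the model name, sequence length (50 steps) and feature count (8 variables) are made-up illustrations of the "short multivariate sequence" case, not from the post.

```python
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """One LSTM layer over a short multivariate sequence, predicting one value."""
    def __init__(self, n_features=8, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, time, n_features)
        out, _ = self.lstm(x)             # out: (batch, time, hidden)
        return self.head(out[:, -1])      # predict from the last time step

x = torch.randn(32, 50, 8)               # 32 sequences, 50 steps, 8 variables
print(LSTMForecaster()(x).shape)          # torch.Size([32, 1])
```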
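For the CNN point, a rough sketch of the "1D ResNet" idea: a residual block built from Conv1d layers instead of Conv2d. The channel count and kernel size are assumptions for illustration, not benchmarked settings.

```python
import torch
import torch.nn as nn

class ResBlock1d(nn.Module):
    """Residual block with two 1D convolutions and a skip connection."""
    def __init__(self, channels=64, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2                                     # "same" padding
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.bn1 = nn.BatchNorm1d(channels)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.bn2 = nn.BatchNorm1d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):                                          # x: (batch, channels, time)
        h = self.relu(self.bn1(self.conv1(x)))
        h = self.bn2(self.conv2(h))
        return self.relu(h + x)                                    # skip connection

x = torch.randn(32, 64, 128)
print(ResBlock1d()(x).shape)                                       # torch.Size([32, 64, 128])
```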
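For the autoregressive feedforward point, a WaveNet-flavoured sketch: a stack of causal Conv1d layers with doubling dilation, so each output step depends only on the last N past steps. Depth, channels and the dilation schedule are assumptions, not the actual WaveNet configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedCausalStack(nn.Module):
    """Stack of dilated convolutions with left-only padding (causal)."""
    def __init__(self, channels=32, n_layers=4, kernel_size=2):
        super().__init__()
        self.layers = nn.ModuleList()
        self.pads = []
        for i in range(n_layers):
            dilation = 2 ** i                                      # 1, 2, 4, 8 ...
            self.pads.append((kernel_size - 1) * dilation)
            self.layers.append(
                nn.Conv1d(channels, channels, kernel_size, dilation=dilation))
        self.relu = nn.ReLU()

    def forward(self, x):                                          # x: (batch, channels, time)
        for pad, conv in zip(self.pads, self.layers):
            # pad on the left only, so no step ever sees the future
            x = self.relu(conv(F.pad(x, (pad, 0))))
        return x

x = torch.randn(8, 32, 100)
print(DilatedCausalStack()(x).shape)                               # torch.Size([8, 32, 100])
```

With kernel size 2 and dilations 1, 2, 4, 8, the receptive field of the last layer covers the previous 16 steps, which is the "last N steps" idea from the bullet.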
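For the GAN point, a hedged sketch of the "discriminator as anomaly detector" idea: after training a GAN on normal data, windows that the discriminator scores as unrealistic are flagged as anomalous. The discriminator architecture, window length and threshold below are assumptions; the linked post has the actual code.

```python
import torch
import torch.nn as nn

# Toy discriminator over a flattened window of 50 time steps (hypothetical shape).
discriminator = nn.Sequential(
    nn.Linear(50, 64), nn.ReLU(),
    nn.Linear(64, 1), nn.Sigmoid(),       # probability the window is "real"
)

def anomaly_score(window, threshold=0.5):
    """Higher score = less realistic window = more anomalous."""
    with torch.no_grad():
        realness = discriminator(window).item()
    score = 1.0 - realness
    return score, score > threshold

score, is_anomaly = anomaly_score(torch.randn(1, 50))
```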



