Time-Discounting Convolution for Event Sequences with Ambiguous Timestamps
Atsushi Suzuki, Michiharu Kudo, Takayuki Osogami, Akira Koseki, Takayuki Katsuki, Masaki Makino, Masaki Ono