Projects
Geometric Deep Learning
Deep Learning Ensuring Laws of Physics

For modeling physical dynamical systems, a model that reflects their geometric properties captures the laws of physics rather than merely the superficial dynamics. The automatic discrete gradient algorithm makes the discrete gradient method applicable to neural networks and strictly preserves the energy conservation and dissipation laws in discrete time. A molecular representation invariant to position and orientation yields dynamics equivariant to translation and rotation.

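The sketch below illustrates only the underlying discrete gradient idea, using a Gonzalez-type midpoint discrete gradient of a neural-network energy rather than the automatic discrete differentiation algorithm developed in the papers below; all names (EnergyNet, dg_step, the structure matrix S) are hypothetical.

```python
# Hedged sketch of a discrete gradient step for a neural-network energy.
# This is NOT the automatic discrete differentiation algorithm of the papers
# below; names and sizes are hypothetical.
import torch

class EnergyNet(torch.nn.Module):
    """Scalar energy H_theta: R^d -> R given by a small MLP."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1))

    def forward(self, x):
        return self.net(x).squeeze(-1)

def grad_H(H, x):
    """Ordinary gradient of H at x via autograd."""
    x = x.detach().requires_grad_(True)
    return torch.autograd.grad(H(x).sum(), x)[0]

def discrete_gradient(H, x, x_new, eps=1e-12):
    """Gonzalez midpoint discrete gradient: satisfies the discrete chain rule
    H(x_new) - H(x) = <dg, x_new - x> exactly, for any differentiable H."""
    g = grad_H(H, 0.5 * (x + x_new))
    dx = x_new - x
    corr = (H(x_new) - H(x) - (g * dx).sum(-1)) / (dx.pow(2).sum(-1) + eps)
    return g + corr.unsqueeze(-1) * dx

def dg_step(H, x, S, h=0.1, iters=100):
    """One implicit step x_new = x + h * dg(x, x_new) @ S.T, solved by naive
    fixed-point iteration. A skew-symmetric S conserves H exactly in discrete
    time; a negative semi-definite S (e.g. S = -I) enforces dissipation."""
    x_new = x.clone()
    for _ in range(iters):
        x_new = (x + h * discrete_gradient(H, x, x_new) @ S.T).detach()
    return x_new

# Usage: a 2D state (q, p) with the canonical symplectic structure matrix.
H = EnergyNet(dim=2)
S = torch.tensor([[0., 1.], [-1., 0.]])
x0 = torch.randn(1, 2)
x1 = dg_step(H, x0, S)
print((H(x1) - H(x0)).item())  # ~0 up to the fixed-point tolerance
```

At the exact fixed point, H(x_new) - H(x) = h * dg^T S dg = 0 for skew-symmetric S, which is the discrete-time counterpart of energy conservation.
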
- Y. Chen, T. Matsubara, and T. Yaguchi, “KAM Theory Meets Statistical Learning Theory: Hamiltonian Neural Networks with Non-Zero Training Loss,” Proc. of The Thirty-Sixth AAAI Conference on Artificial Intelligence (AAAI2022), Feb., 2022.
- T. Matsubara, Y. Miyatake, and T. Yaguchi, “Symplectic Adjoint Method for Exact Gradient of Neural ODE with Minimal Memory,” Advances in Neural Information Processing Systems 34 (NeurIPS2021), 2021.
- Y. Chen, T. Matsubara, and T. Yaguchi, “Neural Symplectic Form: Learning Hamiltonian Equations on General Coordinate Systems,” Advances in Neural Information Processing Systems 34 (NeurIPS2021), Dec., 2021. (Spotlight)
- T. Matsubara, A. Ishikawa, and T. Yaguchi, “Deep Energy-based Modeling of Discrete-Time Physics,” Advances in Neural Information Processing Systems (NeurIPS), 2020. (Oral)
- K. Shimamura, S. Fukushima, A. Koura, F. Shimojo, M. Misawa, R. Kalia, A. Nakano, P. Vashishta, T. Matsubara, and S. Tanaka, “Guidelines for Creating Artificial Neural Network Empirical Interatomic Potential from First-Principles Molecular Dynamics Data under Specific Conditions and Its Application to α-Ag2Se,” Journal of Chemical Physics, 2019.
Topology-aware Data Generation

A data distribution or an object shape has its own topological structure, whereas a deep generative model often assumes a single map from a simple distribution, ignoring the difference in topology. Our ChartPointFlow assigns a conditioned map to each continuous subset of a point cloud, analogous to a chart of a manifold, and thereby generates point clouds with different topologies cleanly.

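The sketch below conveys only the chart idea under simplifying assumptions: a discrete chart label is drawn per point, and a label-conditioned decoder stands in for the conditional normalizing flow of ChartPointFlow, so this is not the authors' model and every name is hypothetical.

```python
# Hedged sketch of chart-conditioned point cloud generation; not the
# ChartPointFlow implementation, all names are hypothetical.
import torch
from torch.distributions import Categorical

class ChartConditionedGenerator(torch.nn.Module):
    """Generates a point cloud as a union of chart-conditioned pieces."""
    def __init__(self, n_charts=4, latent_dim=2, hidden=64):
        super().__init__()
        self.n_charts, self.latent_dim = n_charts, latent_dim
        # Label-conditioned map from noise to a 3D point; ChartPointFlow uses an
        # invertible conditional flow here, a plain MLP only stands in for it.
        self.decoder = torch.nn.Sequential(
            torch.nn.Linear(latent_dim + n_charts, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, 3))
        self.chart_logits = torch.nn.Parameter(torch.zeros(n_charts))  # chart usage prior

    def sample(self, n_points):
        # Draw a chart label per point, then decode chart-conditioned noise.
        charts = Categorical(logits=self.chart_logits).sample((n_points,))
        onehot = torch.nn.functional.one_hot(charts, self.n_charts).float()
        z = torch.randn(n_points, self.latent_dim)
        return self.decoder(torch.cat([z, onehot], dim=-1)), charts

points, charts = ChartConditionedGenerator().sample(2048)  # (2048, 3) point cloud
```
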
- T. Kimura, T. Matsubara, and K. Uehara, “ChartPointFlow for Topology-Aware 3D Point Cloud Generation,” Proc. of ACM International Conference on Multimedia (ACMMM2021), 2021. (Oral)
Bayesian Deep Learning
Deep Generative Model-based Classifier

Compared with an ordinary classifier, a generative classifier must extract even fine-grained features to reconstruct its input, and it is therefore less likely to overfit to a salient subset of features. Prior knowledge about the dependencies between factors can be built into the model structure, which yields interpretable results.

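A minimal sketch of the generative-classifier idea, assuming a conditional VAE in which a class label and an individual-variability latent jointly generate the observation and classification selects the label that best explains it; this is not the exact architecture of the papers below, and all names and sizes are hypothetical.

```python
# Hedged sketch; not the exact model of the papers below.
import torch
import torch.nn.functional as F

class GenerativeClassifier(torch.nn.Module):
    """Conditional VAE: class label c and individual latent z generate x."""
    def __init__(self, x_dim=100, z_dim=8, n_classes=2, hidden=128):
        super().__init__()
        self.n_classes = n_classes
        self.enc = torch.nn.Sequential(
            torch.nn.Linear(x_dim + n_classes, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, 2 * z_dim))
        self.dec = torch.nn.Sequential(
            torch.nn.Linear(z_dim + n_classes, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, x_dim))

    def elbo(self, x, c_onehot):
        mu, logvar = self.enc(torch.cat([x, c_onehot], -1)).chunk(2, -1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()         # reparameterization
        recon = self.dec(torch.cat([z, c_onehot], -1))
        rec = -(recon - x).pow(2).sum(-1)                             # Gaussian log-likelihood (up to const.)
        kl = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(-1)  # KL(q(z|x,c) || N(0,I))
        return rec - kl

    def classify(self, x):
        # Evaluate the ELBO under every candidate label and pick the one that
        # explains the observation best (a uniform label prior is assumed).
        labels = [F.one_hot(torch.full((x.shape[0],), c, dtype=torch.long),
                            self.n_classes).float() for c in range(self.n_classes)]
        return torch.stack([self.elbo(x, y) for y in labels], dim=-1).argmax(-1)
```
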
- T. Matsubara, K. Kusano, T. Tashiro, K. Ukai, and K. Uehara, “Deep Generative Model of Individual Variability in fMRI Images of Psychiatric Patients,” IEEE Transactions on Biomedical Engineering, 2020.
- T. Matsubara, T. Tashiro, and K. Uehara, “Deep Neural Generative Model of Functional MRI Images for Psychiatric Disorder Diagnosis,” IEEE Transactions on Biomedical Engineering, 2019.
Uncertainty-Aware Anomaly Detection

A typical anomaly-detection score is sensitive to the apparent ambiguity of a given sample, even though this ambiguity is unrelated to whether the sample is anomalous. By removing this sensitivity, anomalies can be detected robustly regardless of variety in appearance.

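A minimal sketch of the scoring idea, under the assumption of a Gaussian decoder with a learned per-dimension variance: the full negative log-likelihood mixes the reconstruction error with a log-variance term that tracks apparent ambiguity, whereas the variance-normalized error alone is insensitive to it. This is not the paper's unregularized score, only an illustration with hypothetical names.

```python
# Hedged sketch, not the paper's exact score; names are hypothetical.
import torch

def anomaly_scores(x, dec_mean, dec_logvar):
    """x, dec_mean, dec_logvar: (batch, dim) tensors from a generative model."""
    err = (x - dec_mean).pow(2)
    nll = 0.5 * (err / dec_logvar.exp() + dec_logvar).sum(-1)  # full NLL: mixes error with ambiguity
    score = 0.5 * (err / dec_logvar.exp()).sum(-1)             # normalized error: ambiguity removed
    return nll, score
```
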
- T. Matsubara, K. Sato, K. Hama, R. Tachibana, and K. Uehara, “Deep Generative Model using Unregularized Score for Anomaly Detection with Heterogeneous Complexity,” IEEE Transactions on Cybernetics, 2020.