1 code implementation • 25 Jul 2023 • Chao Huang, Diptesh Das, Koji Tsuda
Random forest is effective for prediction tasks, but the randomness of tree generation hinders interpretability in feature importance analysis.
no code implementations • 23 Jun 2023 • Takumi Yoshida, Hiroyuki Hanada, Kazuya Nakagawa, Kouichi Taji, Koji Tsuda, Ichiro Takeuchi
Predictive pattern mining is an approach used to construct prediction models when the input is represented by structured data, such as sets, graphs, and sequences.
1 code implementation • 27 Apr 2023 • Ryo Tamura, Koji Tsuda, Shoichi Matsuda
Newly created modules for AI and robotic experiments can be added easily to extend the functionality of the system.
1 code implementation • 9 Mar 2022 • Dai Hai Nguyen, Koji Tsuda
We present a framework for embedding graph-structured data into a vector space by incorporating both node features and graph topology into the optimal transport (OT) problem.
no code implementations • 29 Sep 2021 • Andrejs Tucs, Koji Tsuda, Adnan Sljoka
This chapter describes the application of constrained geometric simulations for prediction of antibody structural dynamics.
no code implementations • 9 Jun 2021 • Diptesh Das, Vo Nguyen Le Duy, Hiroyuki Hanada, Koji Tsuda, Ichiro Takeuchi
Automated high-stake decision-making such as medical diagnosis requires models with high interpretability and reliability.
no code implementations • 7 Jun 2021 • Dai Hai Nguyen, Koji Tsuda
We propose a generative model to generate molecules via multi-step chemical reaction trees.
no code implementations • 30 Apr 2021 • Syun Izawa, Koki Kitai, Shu Tanaka, Ryo Tamura, Koji Tsuda
As QA is specialized for binary optimization problems, a continuous vector has to be encoded into binary variables, and the QA solution has to be translated back.
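One common way to realize such an encoding is fixed-point binary expansion; the sketch below illustrates the idea and is an assumption for illustration, not necessarily the encoding scheme used in the paper.

```python
# Fixed-point binary encoding of a continuous value in [0, 1),
# mapping it onto the binary variables a quantum annealer handles.
# Illustrative scheme only; the paper may use a different encoding.

def encode(x, n_bits):
    """Encode x in [0, 1) as n_bits binary digits, most significant first."""
    bits = []
    for _ in range(n_bits):
        x *= 2
        bit = int(x)   # next binary digit of the fractional expansion
        bits.append(bit)
        x -= bit
    return bits

def decode(bits):
    """Translate a binary solution back to a continuous value."""
    return sum(b * 2 ** -(i + 1) for i, b in enumerate(bits))
```

For dyadic rationals such as 0.625 the round trip is exact; a generic value incurs a quantization error below 2^-n_bits, which is the basic trade-off between precision and the number of binary variables.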
no code implementations • NeurIPS Workshop DL-IG 2020 • Mahito Sugiyama, Koji Tsuda, Hiroyuki Nakahara
We present a lightweight variant of Boltzmann machines via sample space truncation, called a truncated Boltzmann machine (TBM), which has not been investigated before but can be naturally introduced from the log-linear model viewpoint.
1 code implementation • 25 Oct 2019 • Xiaolin Sun, Zhufeng Hou, Masato Sumita, Shinsuke Ishihara, Ryo Tamura, Koji Tsuda
Machine learning applications in materials science are often hampered by a shortage of experimental data.
1 code implementation • 6 Dec 2018 • Kei Terayama, Ryo Tamura, Yoshitaro Nose, Hidenori Hiramatsu, Hideo Hosono, Yasushi Okuno, Koji Tsuda
Furthermore, we show that using the US approach, an undetected new phase can be rapidly found, and a smaller number of initial sampling points is sufficient.
Materials Science · Computational Physics
no code implementations • 21 May 2018 • Mahito Sugiyama, Koji Tsuda, Hiroyuki Nakahara
We present transductive Boltzmann machines (TBMs), which are the first to achieve transductive learning of the Gibbs distribution.
1 code implementation • 6 Apr 2018 • Naruki Yoshikawa, Kei Terayama, Teruki Honma, Kenta Oono, Koji Tsuda
Automatic design with machine learning and molecular simulations has shown a remarkable ability to generate new and promising drug candidates.
Chemical Physics · Biomolecules
1 code implementation • NeurIPS 2018 • Mahito Sugiyama, Hiroyuki Nakahara, Koji Tsuda
We present a novel nonnegative tensor decomposition method, called Legendre decomposition, which factorizes an input tensor into a multiplicative combination of parameters.
no code implementations • ICLR 2018 • Mahito Sugiyama, Koji Tsuda, Hiroyuki Nakahara
We achieve bias-variance decomposition for Boltzmann machines using an information geometric formulation.
2 code implementations • 29 Sep 2017 • Xiufeng Yang, Jinzhe Zhang, Kazuki Yoshizoe, Kei Terayama, Koji Tsuda
Automatic design of organic materials requires black-box optimization in a vast chemical space.
Chemical Physics · Computational Engineering, Finance, and Science
no code implementations • ICML 2017 • Shinya Suzumura, Kazuya Nakagawa, Yuta Umezu, Koji Tsuda, Ichiro Takeuchi
Finding statistically significant high-order interactions in predictive modeling is an important but challenging task because the number of possible high-order interactions is extremely large (e.g., $> 10^{17}$).
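The scale of this search space follows directly from a binomial count; the sizes below (10,000 features, interactions up to order 5) are illustrative assumptions of mine, not figures from the paper.

```python
from math import comb

d, max_order = 10_000, 5  # illustrative sizes, not taken from the paper

# Number of candidate feature interactions of order 1 through max_order:
# sum over k of "d choose k".
n_interactions = sum(comb(d, k) for k in range(1, max_order + 1))
print(f"{n_interactions:.2e}")  # already exceeds 10^17
```

Even modest feature counts thus rule out exhaustive significance testing of all interactions, which is what motivates the pruning strategy of the paper.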
1 code implementation • ICML 2017 • Mahito Sugiyama, Hiroyuki Nakahara, Koji Tsuda
To theoretically prove the correctness of the algorithm, we model tensors as probability distributions in a statistical manifold and realize tensor balancing as projection onto a submanifold.
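Tensor balancing generalizes classical matrix balancing; the order-2 special case is the well-known Sinkhorn-Knopp iteration, sketched below as a baseline. The paper's own algorithm is a Newton-type projection, so this sketch shows the problem being solved rather than the paper's method.

```python
def sinkhorn_knopp(a, n_iter=200):
    """Balance a strictly positive square matrix into a doubly stochastic
    one by alternately normalizing rows and columns (the order-2 special
    case of tensor balancing)."""
    m = [row[:] for row in a]
    n = len(m)
    for _ in range(n_iter):
        for row in m:                       # scale each row sum to 1
            s = sum(row)
            for j in range(n):
                row[j] /= s
        for j in range(n):                  # scale each column sum to 1
            s = sum(m[i][j] for i in range(n))
            for i in range(n):
                m[i][j] /= s
    return m
```

For any strictly positive matrix the iteration converges, leaving all row and column sums equal to 1; the projection viewpoint in the paper recasts exactly this fixed point as a point on a submanifold.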
no code implementations • 15 Feb 2016 • Shinya Suzumura, Kazuya Nakagawa, Mahito Sugiyama, Koji Tsuda, Ichiro Takeuchi
The main obstacle in this problem is the difficulty of taking into account the selection bias, i.e., the bias arising from the fact that patterns are selected from an extremely large number of candidates in databases.
no code implementations • 15 Feb 2016 • Kazuya Nakagawa, Shinya Suzumura, Masayuki Karasuyama, Koji Tsuda, Ichiro Takeuchi
The SPP method allows us to efficiently find a superset of all the predictive patterns in the database that are needed for the optimal predictive model.
no code implementations • 27 Oct 2015 • Kazuki Yoshizoe, Aika Terada, Koji Tsuda
Upcoming many-core processors are expected to employ a distributed memory architecture similar to that of currently available supercomputers, but parallel pattern mining algorithms amenable to such architectures have not been comprehensively studied.
no code implementations • 26 Jun 2015 • Kazuya Nakagawa, Shinya Suzumura, Masayuki Karasuyama, Koji Tsuda, Ichiro Takeuchi
An SFS rule has the property that, if a feature satisfies the rule, the feature is guaranteed to be non-active in the LASSO solution, meaning that it can be safely screened out prior to LASSO training.
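The same "satisfies the rule ⇒ guaranteed zero coefficient" property is shared by the classic SAFE rule of El Ghaoui et al., which the safe-screening literature builds on; the sketch below implements that baseline rule, not the paper's own SFS construction.

```python
from math import sqrt

def safe_screen(X_cols, y, lam):
    """Classic SAFE rule for the LASSO  min_b 0.5*||y - X b||^2 + lam*||b||_1.
    Returns indices of features guaranteed to be non-active (zero coefficient).
    X_cols: list of feature columns, each a list of floats."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    norm = lambda u: sqrt(dot(u, u))
    corr = [abs(dot(x, y)) for x in X_cols]
    lam_max = max(corr)          # smallest lam giving the all-zero solution
    y_norm = norm(y)
    screened = []
    for j, x in enumerate(X_cols):
        # If this bound holds over the whole dual feasible region,
        # feature j cannot be active at the optimum.
        if corr[j] < lam - norm(x) * y_norm * (lam_max - lam) / lam_max:
            screened.append(j)
    return screened
```

At lam = lam_max the rule discards every feature except the most correlated one, and for small lam the bound becomes vacuous and nothing is screened; sharper rules such as SFS discard more features between those extremes.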
no code implementations • NeurIPS 2009 • Yoshinobu Kawahara, Kiyohito Nagano, Koji Tsuda, Jeff A. Bilmes
Several key problems in machine learning, such as feature selection and active learning, can be formulated as submodular set function maximization.