Search Results for author: Toshiyasu Matsushima

Found 8 papers, 0 papers with code

Boosting-Based Sequential Meta-Tree Ensemble Construction for Improved Decision Trees

no code implementations · 9 Feb 2024 · Ryota Maniwa, Naoki Ichijo, Yuta Nakahara, Toshiyasu Matsushima

Thus, ensembles of meta-trees are expected to improve predictive performance more than a single meta-tree; however, no previous study has constructed multiple meta-trees within a boosting framework.
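The sequential, boosting-based construction mentioned in the snippet can be illustrated with a generic AdaBoost-style reweighting step. This is a minimal sketch of standard boosting reweighting, not the paper's meta-tree method; the function name and ±1 label convention are assumptions for illustration.

```python
import math

def adaboost_weight_update(weights, labels, preds):
    """One AdaBoost-style reweighting step.

    weights: normalized sample weights; labels/preds: values in {-1, +1}.
    Returns the renormalized weights and the learner's vote weight alpha.
    (Illustrative sketch only, not the paper's meta-tree construction.)
    """
    # Weighted error of the current weak learner.
    err = sum(w for w, y, p in zip(weights, labels, preds) if y != p)
    err = max(min(err, 1 - 1e-12), 1e-12)  # guard degenerate cases
    alpha = 0.5 * math.log((1 - err) / err)
    # Up-weight misclassified samples, down-weight correct ones.
    new = [w * math.exp(-alpha * y * p)
           for w, y, p in zip(weights, labels, preds)]
    z = sum(new)
    return [w / z for w in new], alpha
```

After one update the misclassified samples carry half of the total weight, which is what forces the next learner (here, the next meta-tree) to focus on them.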

Prediction Algorithms Achieving Bayesian Decision Theoretical Optimality Based on Decision Trees as Data Observation Processes

no code implementations · 12 Jun 2023 · Yuta Nakahara, Shota Saito, Naoki Ichijo, Koki Kazama, Toshiyasu Matsushima

In the field of decision trees, most previous studies have difficulty ensuring the statistical optimality of predictions on new data and suffer from overfitting, because trees are usually used only to represent prediction functions constructed from given data.

Batch Updating of a Posterior Tree Distribution over a Meta-Tree

no code implementations · 17 Mar 2023 · Yuta Nakahara, Toshiyasu Matsushima

Previously, we proposed a probabilistic data generation model represented by an unobservable tree and a sequential updating method to calculate a posterior distribution over a set of trees.
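The sequential updating described in the snippet can be sketched as ordinary Bayesian updating over a finite candidate set, where batch and sequential updates yield the same posterior. The hypotheses stand in for the set of trees; the function names and abstract likelihood vectors are assumptions for illustration, not the paper's meta-tree representation.

```python
def sequential_update(prior, likelihoods_per_datum):
    """Update a posterior over a finite set of models one datum at a time."""
    post = list(prior)
    for liks in likelihoods_per_datum:  # one likelihood vector per datum
        post = [p * l for p, l in zip(post, liks)]
        z = sum(post)
        post = [p / z for p in post]  # renormalize after each datum
    return post

def batch_update(prior, likelihoods_per_datum):
    """Multiply all likelihoods at once; equivalent to sequential updating."""
    post = list(prior)
    for i in range(len(post)):
        for liks in likelihoods_per_datum:
            post[i] *= liks[i]
    z = sum(post)
    return [p / z for p in post]
```

Because normalization commutes with the product of likelihoods, both routes give identical posteriors; the practical question addressed by batch updating is efficiency, not correctness.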

Stochastic 2D Signal Generative Model with Wavelet Packets Basis Regarded as a Random Variable and Bayes Optimal Processing

no code implementations · 26 Jan 2022 · Ryohei Oka, Yuta Nakahara, Toshiyasu Matsushima

When the basis is unknown, the number of candidate bases grows exponentially with the signal size.

Probability Distribution on Rooted Trees

no code implementations · 24 Jan 2022 · Yuta Nakahara, Shota Saito, Akira Kamatsuka, Toshiyasu Matsushima

The hierarchical and recursive expressive capability of rooted trees is applicable to represent statistical models in various areas, such as data compression, image processing, and machine learning.

Data Compression

Probability Distribution on Full Rooted Trees

no code implementations · 27 Sep 2021 · Yuta Nakahara, Shota Saito, Akira Kamatsuka, Toshiyasu Matsushima

Its parametric representation is suitable for calculating properties of our distribution, such as the mode, expectation, and posterior distribution, using recursive functions.

Data Compression · Model Selection
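The recursive style of computation mentioned for distributions on full rooted trees can be sketched with a toy model: each node up to a depth bound becomes internal with probability g, independently, and the expected number of leaves follows from one recursion per depth. The parameterization here is a simplified assumption for illustration, not the paper's exact distribution.

```python
def expected_leaves(g, k, depth, max_depth):
    """Expected leaf count of a random full k-ary tree.

    Toy model (illustrative assumption): each node at depth < max_depth
    becomes internal with probability g, independently; nodes at
    max_depth are always leaves.
    """
    if depth == max_depth:
        return 1.0  # forced leaf at the depth bound
    # With prob (1 - g) the node is a leaf (1 leaf); with prob g it is
    # internal and contributes k independent subtrees one level deeper.
    return (1 - g) * 1.0 + g * k * expected_leaves(g, k, depth + 1, max_depth)
```

The same recursive pattern (one local rule per node, combined bottom-up) is what makes quantities like the mode and the posterior computable without enumerating the exponentially many trees.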

Theoretical Analysis of the Advantage of Deepening Neural Networks

no code implementations · 24 Sep 2020 · Yasushi Esaki, Yuta Nakahara, Toshiyasu Matsushima

We propose two new criteria to understand the advantage of deepening neural networks.
