no code implementations • 25 Apr 2024 • Masahiro Kobayashi, Kazuho Watanabe
For loss functions defined by a monotonically increasing function $f$ and the inverse divergence, we clarify the conditions on the statistical model and on $f$ under which the estimating equation is unbiased.
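As a rough sketch of this setting (schematic only; the symbol $D$ and the exact form of the loss are assumptions for illustration, not taken from the abstract), an M-estimator of this type solves an estimating equation of the form

$$ \sum_{i=1}^{n} \frac{\partial}{\partial \theta}\, f\bigl(D(x_i, \theta)\bigr) = 0, $$

and the equation is called unbiased when its population version vanishes at the data-generating parameter, i.e. $\mathbb{E}_{p(x \mid \theta)}\bigl[\tfrac{\partial}{\partial \theta} f\bigl(D(X, \theta)\bigr)\bigr] = 0$ for all $\theta$; the stated conditions on the model and on $f$ are what guarantee this.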
no code implementations • 3 Oct 2021 • Masato Kikuchi, Kento Kawakami, Kazuho Watanabe, Mitsuo Yoshida, Kyoji Umemura
A naive estimation approach that uses only $N$-gram frequencies is sensitive to low-frequency (rare) $N$-grams and not applicable to zero-frequency (unobserved) $N$-grams; these are known as the low- and zero-frequency problems, respectively.
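As a minimal illustration of the naive approach described here (hypothetical helper names and toy corpus, not code from the paper), the relative-frequency estimator gives noisy values for rare $N$-grams and exactly zero for unobserved ones:

```python
from collections import Counter

def ngrams(tokens, n):
    """Return successive n-grams (as tuples) from a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def naive_ngram_prob(corpus_tokens, query, n=2):
    """Relative-frequency (maximum-likelihood) estimate of an n-gram probability.

    Rare n-grams get noisy estimates (the low-frequency problem) and
    unseen n-grams get probability exactly 0 (the zero-frequency problem).
    """
    counts = Counter(ngrams(corpus_tokens, n))
    total = sum(counts.values())
    return counts[tuple(query)] / total if total else 0.0

# Hypothetical toy corpus, for illustration only.
tokens = "the cat sat on the mat".split()
print(naive_ngram_prob(tokens, ("the", "cat")))  # observed bigram -> 0.2
print(naive_ngram_prob(tokens, ("the", "dog")))  # unseen bigram  -> 0.0
```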
no code implementations • 23 Oct 2020 • Masahiro Kobayashi, Kazuho Watanabe
We discuss unbiased estimating equations for a class of objective functions constructed from a monotonically increasing function $f$ and a Bregman divergence.
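For reference, the Bregman divergence generated by a differentiable strictly convex function $\phi$ is

$$ B_{\phi}(x, y) = \phi(x) - \phi(y) - \langle \nabla \phi(y),\, x - y \rangle, $$

so an objective in this class has the schematic form $\sum_{i} f\bigl(B_{\phi}(x_i, \mu_\theta)\bigr)$ for some model quantity $\mu_\theta$ (this parameterization is an assumption for illustration; the abstract does not specify it).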
no code implementations • 29 Apr 2020 • Daisuke Kaji, Kazuho Watanabe, Masahiro Kobayashi
Clustering algorithms have a wide range of applications and play an important role in many areas of data analysis, including time series analysis.
no code implementations • 31 Jan 2019 • Masahiro Kobayashi, Kazuho Watanabe
It is therefore vulnerable to outliers in the data and can produce large maximum distortion within clusters.
no code implementations • NeurIPS 2014 • Shinichi Nakajima, Issei Sato, Masashi Sugiyama, Kazuho Watanabe, Hiroko Kobayashi
Latent Dirichlet allocation (LDA) is a popular generative model of various objects, such as texts and images, in which an object is expressed as a mixture of latent topics.
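In the standard LDA formulation, "expressed as a mixture of latent topics" means that the word distribution of a document $d$ decomposes over $K$ topics,

$$ p(w \mid d) = \sum_{k=1}^{K} p(w \mid z = k)\, p(z = k \mid d) = \sum_{k=1}^{K} \beta_{kw}\, \theta_{dk}, $$

where the topic proportions $\theta_d$ and the topic-word distributions $\beta_k$ are given Dirichlet priors (standard notation recalled only to unpack the sentence; the paper's own notation may differ).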
no code implementations • 28 Jan 2014 • Andrew Barron, Teemu Roos, Kazuho Watanabe
The normalized maximized likelihood (NML) provides the minimax regret solution in universal data compression, gambling, and prediction, and it plays an essential role in the minimum description length (MDL) method of statistical modeling and estimation.
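For a parametric model class with maximum-likelihood estimator $\hat\theta(\cdot)$, the NML distribution over length-$n$ sequences is, in its standard form,

$$ p_{\mathrm{NML}}(x^n) = \frac{p\bigl(x^n \mid \hat\theta(x^n)\bigr)}{\sum_{y^n} p\bigl(y^n \mid \hat\theta(y^n)\bigr)}, $$

whose regret $\log p\bigl(x^n \mid \hat\theta(x^n)\bigr) - \log p_{\mathrm{NML}}(x^n)$ equals the logarithm of the normalizing sum for every sequence; this constant worst-case regret is exactly the minimax regret property referred to above (standard definition recalled for context).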