no code implementations • EMNLP 2021 • Arman Zharmagambetov, Magzhan Gabidolla, Miguel A. Carreira-Perpinan
Classification problems with thousands of classes or more arise naturally in NLP, for example in language modeling or document classification.
no code implementations • CVPR 2023 • Miguel Á. Carreira-Perpiñán, Magzhan Gabidolla, Arman Zharmagambetov
However, unlike for most other models, such as neural networks, optimizing forests or trees is not easy, because they define a non-differentiable function.
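As a small illustration of why gradient-based optimization fails here (a minimal NumPy sketch, not the paper's TAO algorithm), a decision tree's prediction is piecewise constant in its split threshold, so the gradient with respect to that threshold is zero almost everywhere:

```python
import numpy as np

def stump_predict(x, threshold, left_value, right_value):
    """Depth-1 decision tree (a 'stump'): piecewise-constant in x."""
    return np.where(x < threshold, left_value, right_value)

x = np.linspace(0.0, 1.0, 200)
y = stump_predict(x, threshold=0.5, left_value=-1.0, right_value=1.0)

# Finite-difference 'gradient' w.r.t. the threshold: it is zero unless
# a sample happens to cross the split, so gradient descent on the
# threshold receives no useful signal.
eps = 1e-6
y_plus = stump_predict(x, 0.5 + eps, -1.0, 1.0)
grad = (y_plus - y) / eps
print(np.count_nonzero(grad))  # 0: no sample lies in (0.5, 0.5 + eps)
```

This is the difficulty the excerpt refers to: the loss surface over split parameters is flat with jumps, which is why tree optimization relies on alternating or combinatorial methods rather than plain gradient descent.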
no code implementations • CVPR 2022 • Magzhan Gabidolla, Miguel Á. Carreira-Perpiñán
Ensemble methods based on decision trees, such as Random Forests or boosted forests, have long been established as some of the most powerful off-the-shelf machine learning models, and have been widely used in computer vision and other areas.
no code implementations • 29 Sep 2021 • Yerlan Idelbayev, Arman Zharmagambetov, Magzhan Gabidolla, Miguel A. Carreira-Perpinan
We show that neural nets can be further compressed by replacing some of their layers with a special type of decision forest.
no code implementations • 8 Nov 2019 • Arman Zharmagambetov, Suryabhan Singh Hada, Miguel Á. Carreira-Perpiñán, Magzhan Gabidolla
This paper presents a detailed comparison of a recently proposed algorithm for optimizing decision trees, tree alternating optimization (TAO), with other popular, established algorithms.