no code implementations • 3 Feb 2024 • Yiming Sun, Yuhe Gao, Runxue Bao, Gregory F. Cooper, Jessi Espino, Harry Hochheiser, Marian G. Michaels, John M. Aronis, Chenxi Song, Ye Ye
Transfer learning has become a pivotal technique in machine learning and has proven to be effective in various real-world applications.
no code implementations • 18 Jul 2022 • Bryan Andrews, Gregory F. Cooper, Thomas S. Richardson, Peter Spirtes
The m-connecting imset and factorization criterion provide two new statistical tools for learning and inference with ADMG models.
no code implementations • 29 Mar 2020 • Jonathan D. Young, Bryan Andrews, Gregory F. Cooper, Xinghua Lu
We developed a deep learning model, which we call a redundant input neural network (RINN), with a modified architecture and a regularized objective function to find causal relationships between input, hidden, and output variables.
no code implementations • 22 Dec 2017 • Fattaneh Jabbari, Mahdi Pakdaman Naeini, Gregory F. Cooper
In this paper, we introduce a novel framework to derive calibrated probabilities of causal relationships from observational data.
no code implementations • 16 Nov 2015 • Mahdi Pakdaman Naeini, Gregory F. Cooper
The method can be considered an extension of BBQ, a recently proposed calibration method, as well as of the commonly used calibration method based on isotonic regression.
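As background for the isotonic-regression baseline mentioned above, here is a minimal sketch (not code from the paper) of the Pool Adjacent Violators algorithm, which computes the least-squares non-decreasing fit used for isotonic calibration; the function name and example data are illustrative:

```python
def pav(values):
    """Pool Adjacent Violators: least-squares non-decreasing fit.

    Running this on 0/1 labels sorted by classifier score yields
    monotone calibrated probability estimates.
    """
    # Each block stores [sum, count]; block means must be non-decreasing.
    blocks = []
    for v in values:
        blocks.append([v, 1])
        # Merge backwards while a block's mean exceeds its successor's
        # (compare means via cross-multiplication to avoid division).
        while len(blocks) > 1 and \
                blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    # Expand each pooled block back to per-example fitted values.
    out = []
    for s, c in blocks:
        out.extend([s / c] * c)
    return out

# Labels ordered by increasing classifier score (hypothetical data):
print(pav([0, 0, 1, 0, 1, 1]))  # → [0, 0, 0.5, 0.5, 1, 1]
```

The out-of-order pair (1, 0) in the middle is pooled into two values of 0.5, giving a monotone calibration map.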
no code implementations • 9 Jul 2014 • Shyam Visweswaran, Gregory F. Cooper
Learning Markov blanket (MB) structures has proven useful in performing feature selection, learning Bayesian networks (BNs), and discovering causal relationships.
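For readers unfamiliar with the structure being learned: in a Bayesian network, the Markov blanket of a node is its parents, its children, and its children's other parents (spouses). A minimal sketch, with a hypothetical DAG encoded as a child-to-parents map:

```python
def markov_blanket(node, parents):
    """Markov blanket of `node` in a DAG given as {child: set_of_parents}:
    the node's parents, children, and children's other parents."""
    mb = set(parents.get(node, set()))        # parents
    for child, ps in parents.items():
        if node in ps:
            mb.add(child)                     # children
            mb |= ps - {node}                 # spouses (co-parents)
    return mb

# Hypothetical DAG: A -> C <- B, C -> D
parents = {"C": {"A", "B"}, "D": {"C"}}
print(sorted(markov_blanket("C", parents)))  # ['A', 'B', 'D']
print(sorted(markov_blanket("A", parents)))  # ['B', 'C']
```

Conditioned on its Markov blanket, a node is independent of every other variable in the network, which is why the MB is exactly the feature set relevant for predicting that node.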
no code implementations • 14 Jan 2014 • Mahdi Pakdaman Naeini, Gregory F. Cooper, Milos Hauskrecht
We prove three theorems showing that, using a simple histogram binning post-processing method, it is possible to make a classifier well calibrated while retaining its discrimination capability.
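A minimal sketch of histogram binning post-processing (illustrative only, not the paper's exact construction): the calibration map is learned by recording, for each score bin, the empirical event frequency on held-out data; new scores are then replaced by their bin's frequency.

```python
def fit_histogram_binning(scores, labels, n_bins=10):
    """Learn a per-bin empirical event frequency from held-out data."""
    counts = [0] * n_bins
    positives = [0] * n_bins
    for s, y in zip(scores, labels):
        b = min(int(s * n_bins), n_bins - 1)
        counts[b] += 1
        positives[b] += y
    # Fall back to the bin midpoint when a bin received no data.
    return [positives[b] / counts[b] if counts[b] else (b + 0.5) / n_bins
            for b in range(n_bins)]

def calibrate(score, bin_freqs):
    """Map a raw score to its bin's empirical frequency."""
    n_bins = len(bin_freqs)
    return bin_freqs[min(int(score * n_bins), n_bins - 1)]

# Hypothetical over-confident raw scores and their true labels:
scores = [0.9, 0.95, 0.92, 0.1, 0.15, 0.05]
labels = [1, 0, 1, 0, 0, 0]
freqs = fit_histogram_binning(scores, labels, n_bins=5)
print(calibrate(0.93, freqs))  # top bin's empirical frequency, 2/3
```

Because the mapping is monotone within each bin and depends only on bin membership, the ranking of examples across bins (and hence most of the classifier's discrimination) is preserved while probabilities move toward observed frequencies.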
no code implementations • 13 Jan 2014 • Mahdi Pakdaman Naeini, Gregory F. Cooper, Milos Hauskrecht
A set of probabilistic predictions is well calibrated if the events that are predicted to occur with probability p do in fact occur about a fraction p of the time.
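This definition can be checked empirically. A minimal sketch (illustrative names and data): group predictions into bins and compare each bin's mean predicted probability with the observed event frequency.

```python
def calibration_table(probs, outcomes, n_bins=10):
    """Per bin: (mean predicted probability, observed frequency, count)."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, outcomes):
        bins[min(int(p * n_bins), n_bins - 1)].append((p, y))
    table = []
    for b in bins:
        if b:
            mean_p = sum(p for p, _ in b) / len(b)
            freq = sum(y for _, y in b) / len(b)
            table.append((mean_p, freq, len(b)))
    return table

# A perfectly calibrated predictor: events predicted with probability
# 0.2 occur 2 times in 10; events predicted with 0.8 occur 8 in 10.
probs = [0.2] * 10 + [0.8] * 10
outcomes = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0] + [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
for mean_p, freq, n in calibration_table(probs, outcomes):
    print(f"predicted {mean_p:.2f}  observed {freq:.2f}  (n={n})")
```

Large gaps between the two columns indicate miscalibration, which is what post-processing methods such as those in the entries above are designed to remove.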
no code implementations • 27 Mar 2013 • R. Martin Chavez, Gregory F. Cooper
Using standard methods drawn from the theory of computational complexity, workers in the field have shown that the problem of probabilistic inference in belief networks is difficult and almost certainly intractable.
no code implementations • 27 Mar 2013 • R. Martin Chavez, Gregory F. Cooper
KNET is a general-purpose shell for constructing expert systems based on belief networks and decision networks.
no code implementations • 27 Mar 2013 • Gregory F. Cooper
A method for computing probabilistic propositions is presented.
no code implementations • 27 Mar 2013 • Jaap Suermondt, Gregory F. Cooper, David Heckerman
Cutset conditioning and clique-tree propagation are two popular methods for performing exact probabilistic inference in Bayesian belief networks.
no code implementations • 27 Mar 2013 • Eric J. Horvitz, Jaap Suermondt, Gregory F. Cooper
We introduce a graceful approach to probabilistic inference called bounded conditioning.
no code implementations • 27 Mar 2013 • Gregory F. Cooper
This paper demonstrates a method for using belief-network algorithms to solve influence diagram problems.
no code implementations • 27 Mar 2013 • Homer L. Chin, Gregory F. Cooper
This paper examines Bayesian belief network inference using simulation as a method for computing the posterior probabilities of network variables.
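The simplest simulation scheme of this kind is forward (logic) sampling with rejection: simulate the network from the priors, discard samples that contradict the evidence, and estimate the posterior from the survivors. A minimal sketch on a hypothetical two-node network (Rain → WetGrass), not the networks studied in the paper:

```python
import random

def rejection_sample_posterior(n_samples=100_000, seed=0):
    """Estimate P(Rain | WetGrass = true) by forward simulation,
    discarding samples inconsistent with the evidence."""
    rng = random.Random(seed)
    kept = rain_and_wet = 0
    for _ in range(n_samples):
        rain = rng.random() < 0.2              # P(Rain) = 0.2
        p_wet = 0.9 if rain else 0.1           # P(Wet | Rain), P(Wet | ~Rain)
        wet = rng.random() < p_wet
        if wet:                                # keep only Wet = true samples
            kept += 1
            rain_and_wet += rain
    return rain_and_wet / kept

# Exact posterior: 0.2*0.9 / (0.2*0.9 + 0.8*0.1) = 0.18/0.26 ≈ 0.692
print(rejection_sample_posterior())
```

The known weakness of rejection is that almost all samples are thrown away when the evidence is unlikely, which motivates weighted schemes such as likelihood weighting in the last entry below.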
no code implementations • 27 Mar 2013 • Jaap Suermondt, Gregory F. Cooper
This paper focuses on probability updates in multiply-connected belief networks.
no code implementations • 27 Mar 2013 • R. Martin Chavez, Gregory F. Cooper
In recent years, researchers in decision analysis and artificial intelligence (AI) have used Bayesian belief networks to build models of expert opinion.
no code implementations • 27 Mar 2013 • Edward H. Herskovits, Gregory F. Cooper
Kutato is a system that takes as input a database of cases and produces a belief network that captures many of the dependence relations represented by those data.
no code implementations • 27 Mar 2013 • Michael Shwe, Gregory F. Cooper
We analyzed the convergence properties of likelihood-weighting algorithms on a two-level, multiply connected belief-network representation of the QMR knowledge base of internal medicine.
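A minimal sketch of the likelihood-weighting idea (illustrative numbers, a single disease-finding pair rather than the QMR network): instead of rejecting samples that disagree with the evidence, each sample is kept and weighted by the probability of the observed findings given the sampled state.

```python
import random

def likelihood_weighting(n_samples=100_000, seed=0):
    """Estimate P(Disease | Finding = present) in a tiny two-level
    network: sample the disease from its prior, then weight each
    sample by the likelihood of the observed finding."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        disease = rng.random() < 0.01          # prior P(Disease) = 0.01
        w = 0.8 if disease else 0.05           # P(Finding | Disease state)
        num += w * disease                     # weight contributes to numerator
        den += w                               # and always to the normalizer
    return num / den

# Exact posterior: 0.01*0.8 / (0.01*0.8 + 0.99*0.05) ≈ 0.139
print(likelihood_weighting())
```

Because every sample carries weight, no simulation effort is wasted on rejected samples; the trade-off, relevant to the convergence analysis the entry describes, is that a few high-weight samples can dominate the estimate when the evidence is improbable.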