Estimating directed information to infer causal relationships between neural spike trains and approximating discrete probability distributions with causal dependence trees
Quinn, Christopher J.
Permalink
https://hdl.handle.net/2142/15989
Issue Date
2010-05-18
Advisor(s)
Coleman, Todd P.
Kiyavash, Negar
Department of Study
Electrical & Computer Engineering
Discipline
Electrical & Computer Engineering
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
M.S.
Degree Level
Thesis
Keyword(s)
information theory
causality
computational neuroscience
Bayesian networks
Abstract
"This work examines an information theoretic quantity known as directed information,
which measures statistically causal influences between processes. It is shown to be a general quantity, applicable to arbitrary probability distributions. It is interpreted in terms of prediction, communication with feedback,
source coding with feed forward, control over noisy channels, and other settings. It is also shown to be consistent with Granger's philosophical definition. The concepts of direct and indirect causation in a network of processes
are formalized. Next, two applications of directed information are investigated.
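For reference, the directed information discussed above is standardly defined following Massey as below; the precise notation is an assumption here, since the thesis may index the process histories slightly differently.

% Directed information from a process X^N to a process Y^N (Massey's definition),
% where X^n = (X_1, ..., X_n) and Y^{n-1} = (Y_1, ..., Y_{n-1}).
I(X^N \to Y^N) = \sum_{n=1}^{N} I\!\left(X^n ;\, Y_n \mid Y^{n-1}\right)
% Unlike mutual information I(X^N; Y^N), only the past and present of X enter
% the n-th term, so the quantity is asymmetric in X and Y and reflects the
% direction of influence.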
Neuroscience researchers have been attempting to identify causal relationships between neural spike trains in electrode recordings, but have largely done so with correlation measures and measures based on Granger causality. We discuss why these methods are not robust and lack statistical guarantees. We use a point process generalized linear model (GLM) and the minimum description length (MDL) principle, as a model order selection tool, for consistent estimation of directed information between neural spike trains. We have successfully applied this methodology to a network of simulated neurons and to electrode array recordings.
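As a rough illustration of the kind of point process GLM computation involved, the sketch below fits a Bernoulli GLM per time bin and uses the gain in predictive log-likelihood from adding the other neuron's spike history as a plug-in proxy for the directed information rate. This is a minimal sketch, not the thesis's estimator: the fixed history order, the logistic-regression fit, and the log-likelihood-difference step are illustrative assumptions, and the MDL-based order selection described above is omitted.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

def history_matrix(spikes, order):
    """Stack the previous `order` bins of a 0/1 spike train as regressors."""
    T = len(spikes)
    return np.column_stack([np.r_[np.zeros(k + 1), spikes[:T - k - 1]]
                            for k in range(order)])

def directed_info_rate_estimate(x, y, order=5):
    """Plug-in estimate (bits/bin) of how much X's past improves prediction
    of Y beyond Y's own past: a GLM-based proxy for the directed info rate."""
    Hy = history_matrix(y, order)                    # Y's own history
    Hxy = np.hstack([Hy, history_matrix(x, order)])  # plus X's history
    m_self = LogisticRegression(max_iter=1000).fit(Hy, y)
    m_full = LogisticRegression(max_iter=1000).fit(Hxy, y)
    # Difference in average predictive log-likelihood, converted to bits.
    ll_self = -log_loss(y, m_self.predict_proba(Hy)[:, 1])
    ll_full = -log_loss(y, m_full.predict_proba(Hxy)[:, 1])
    return (ll_full - ll_self) / np.log(2)

# Toy example: two synthetic Bernoulli spike trains where x drives y.
rng = np.random.default_rng(0)
x = (rng.random(5000) < 0.2).astype(int)
y = (rng.random(5000) < 0.05 + 0.4 * np.r_[0, x[:-1]]).astype(int)
print(directed_info_rate_estimate(x, y, order=3))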
This work then develops a procedure, similar to Chow and Liu's, for finding the "best" approximation (in terms of KL divergence) of a full joint distribution over a set of random processes, using a causal dependence tree distribution. Chow and Liu's procedure had been shown to be equivalent to maximizing a sum of mutual informations, and the procedure presented here is shown to be equivalent to maximizing a sum of directed informations. An algorithm is presented for efficiently finding the optimal causal tree, similar to that in Chow and Liu's work.