University of Pittsburgh

Tensor Mixed Graphical Model

Friday, April 17, 2020 - 1:00pm - 1:30pm

We consider graphical models with continuous variables, discrete variables, and tensor variables. Lee and Hastie (2013) proved that a pseudo-likelihood combining nodewise regressions with L1 regularizers can learn a sparse and meaningful graph structure. Here we extend this work to tensor variables, which can be seen as high-dimensional multi-way arrays. Unlike previous works, which flatten the tensor into a long vector and treat each entry as a separate variable, we take the tensor itself as a variable to preserve its structural information. With low-rank and sparsity constraints on the connectivity matrices (tensors), we can define new types of edges for tensor-to-tensor, tensor-to-continuous, and tensor-to-discrete variable pairs. In the end we will show some preliminary results demonstrating that the model can learn meaningful linear relations between variables.
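To make the pseudo-likelihood idea concrete, the following is a minimal sketch (not the speaker's implementation, and without the tensor extension) of the nodewise L1-regularized regression approach in the style of Lee and Hastie (2013): each variable is regressed on all the others with a lasso penalty, and nonzero coefficients become edges. The `lasso_ista` solver, the chain-graph data, and the regularization level `lam=0.1` are all illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    # proximal operator of the L1 norm
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    # ISTA for min_b  0.5/n * ||y - Xb||^2 + lam * ||b||_1
    n, p = X.shape
    b = np.zeros(p)
    L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        b = soft_threshold(b - grad / L, lam / L)
    return b

def neighborhood_graph(X, lam):
    # nodewise lasso regressions; symmetrize with the OR rule:
    # edge (i, j) if either regression selects the other variable
    n, p = X.shape
    A = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        b = lasso_ista(X[:, others], X[:, j], lam)
        for idx, k in enumerate(others):
            A[j, k] |= b[idx] != 0
    return A | A.T

# synthetic chain graph x0 - x1 - x2, with x3 independent
rng = np.random.default_rng(0)
n = 2000
x0 = rng.normal(size=n)
x1 = 0.8 * x0 + rng.normal(size=n, scale=0.5)
x2 = 0.8 * x1 + rng.normal(size=n, scale=0.5)
x3 = rng.normal(size=n)
X = np.column_stack([x0, x1, x2, x3])
X = (X - X.mean(0)) / X.std(0)  # standardize before penalizing

A = neighborhood_graph(X, lam=0.1)
print(A)
```

The recovered adjacency matrix should connect the chain neighbors (x0, x1) and (x1, x2) while leaving x3 isolated. The talk's extension replaces the vector coefficients above with connectivity tensors under low-rank and sparsity constraints, so that a tensor variable enters each regression as a whole rather than entry by entry.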