Drawing causal conclusions from observed statistical dependencies is a fundamental problem. Conditional independence-based causal discovery methods (e.g., PC or FCI) cannot be used when there are no observed conditional independencies. Alternative methods rest on a different set of assumptions, namely restrictions on the function class, e.g., additive noise models. In this talk, I will present two causal inference methods that employ assumptions of a different kind than the above.

The first is a method to infer the existence of, and identify, a finite confounder of a set of observed dependent variables. It is based on a kernel method for identifying finite mixtures of nonparametric product distributions. In the second part, I will focus on the problem of causal inference in the two-variable case (assuming no confounders). A known assumption is used, namely that if X causes Y, then P(X) contains no information about P(Y|X), whereas P(Y) may contain information about P(X|Y). This asymmetry has implications for common machine learning tasks such as semi-supervised and unsupervised learning, which are employed by the proposed causal inference method.
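As a rough illustration of the function-class restrictions mentioned above (and not of the talk's own methods), the additive-noise-model idea can be sketched in a few lines: regress each variable on the other and check in which direction the residual is independent of the input. The polynomial regression, the HSIC dependence score, and the toy data below are all illustrative assumptions.

```python
# Toy sketch of additive-noise-model (ANM) causal direction inference.
# Illustrative only -- not the methods presented in this talk.
import numpy as np

def rbf_gram(v):
    """Gaussian kernel Gram matrix, bandwidth set by the median-distance heuristic."""
    d2 = (v[:, None] - v[None, :]) ** 2
    bw = np.median(d2[d2 > 0])
    return np.exp(-d2 / bw)

def hsic(a, b):
    """Biased HSIC estimate of dependence between samples a and b (0 = independent)."""
    n = len(a)
    H = np.eye(n) - 1.0 / n               # centering matrix
    K, L = H @ rbf_gram(a) @ H, H @ rbf_gram(b) @ H
    return np.trace(K @ L) / n ** 2

def anm_residual_dependence(cause, effect, deg=3):
    """Fit effect ~ f(cause) with a polynomial and score residual/input dependence."""
    coeffs = np.polyfit(cause, effect, deg)
    resid = effect - np.polyval(coeffs, cause)
    return hsic(cause, resid)

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 400)
y = x ** 3 + 0.05 * rng.normal(size=400)  # ground truth: X causes Y

# In the causal direction the residual is (nearly) independent of the input;
# in the anticausal direction it is not -- this asymmetry picks the direction.
score_xy = anm_residual_dependence(x, y)
score_yx = anm_residual_dependence(y, x)
direction = "X->Y" if score_xy < score_yx else "Y->X"
print(direction)
```

The asymmetry exploited here (an independent additive noise term exists in one direction but generically not in the other) is one concrete instance of the broader idea that the cause's distribution carries no information about the mechanism.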

Eleni Sgouritsa is in the last year of her PhD at the Max Planck Institute for Intelligent Systems in Tübingen, Germany, under the supervision of Dominik Janzing and Bernhard Schölkopf. She is a member of the causality group, which aims at developing novel methods for inferring causal relations from data. Prior to this, she received an MSc and a BSc degree from the University of Houston and the University of Athens, respectively.