Commit: updated
kimmo1019 authored and kimmo1019 committed Apr 13, 2024
1 parent 9a0deeb commit 71f7ab2
Showing 3 changed files with 6 additions and 6 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -6,7 +6,7 @@
[![Documentation Status](https://readthedocs.org/projects/causalegm/badge/?version=latest)](https://causalegm.readthedocs.io)


- # <a href='https://causalegm.readthedocs.io/'><img src='https://raw.githubusercontent.com/SUwonglab/CausalEGM/main/docs/source/logo.png' align="left" height="60" /></a> CausalEGM: An encoding generative modeling approach to dimension reduction and covariate adjustment in causal inference
+ # <a href='https://causalegm.readthedocs.io/'><img src='https://raw.githubusercontent.com/SUwonglab/CausalEGM/main/docs/source/logo.png' align="left" height="60" /></a> CausalEGM: An Encoding Generative Modeling Approach in Causal Inference


<a href='https://causalegm.readthedocs.io/'><img align="left" src="https://github.com/SUwonglab/CausalEGM/blob/main/model.jpg" width="350">
7 changes: 3 additions & 4 deletions docs/source/about.rst
@@ -9,20 +9,19 @@ Background and Challengings

Given data in an observational study, a central problem in causal inference is to estimate the effect of one variable (e.g., treatment) on another variable (e.g., outcome) in the presence of a covariate vector that represents all other variables observed in the study.
Under the well-known “unconfoundedness” condition, valid estimates of the desired effect of treatment on outcome can be obtained by alternative approaches, including matching, weighting, stratification, and regression-based methods.
- Covariate adjustment plays an important role in these methods. When the covariate is of high dimension, as is often the case in modern applications, covariate adjustment becomes difficult because of the ``curse of dimensionality".
+ Covariate adjustment plays an important role in these methods. When the covariate is of high dimension, as is often the case in modern applications, covariate adjustment becomes difficult because of the curse of dimensionality.

Core Idea in CausalEGM
~~~~~~~~~~~~~~~~~~~~~~~
In observational studies, many methods have been proposed for covariate adjustment under the potential outcome model (`Rubin et al. 1974 <https://www.fsb.muohio.edu/lij14/420_paper_Rubin74.pdf>`_), which often involves estimating the expectation of the outcome conditional on the treatment and the covariate.
CausalEGM simultaneously learns to (1) embed the high-dimensional covariates into a low-dimensional latent space where the distribution of the embeddings (latent covariate features) is pre-specified, and (2) build generative models for the treatment given the latent features and for the outcome given the treatment and the latent features.
The key idea of this method is to partition the latent feature vector into different independent components that play different roles in the above two generative models. This partitioning then allows us to identify a minimal latent covariate feature subvector that affects both treatment and outcome.
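As a rough illustration of this partitioning (the dimensions, block sizes, and variable names below are hypothetical, not the package's actual configuration), the latent vector produced by the encoder can be split into independent blocks with distinct roles:

```python
import numpy as np

# Hypothetical 10-dimensional latent vector z for a batch of 4 samples.
z = np.random.default_rng(0).normal(size=(4, 10))

# Illustrative partition into four blocks of widths 2, 3, 3, 2.
z0, z1, z2, z3 = np.split(z, [2, 5, 8], axis=1)
# z0 : affects both treatment and outcome (the latent confounders to adjust for)
# z1 : affects only the treatment model
# z2 : affects only the outcome model
# z3 : affects neither; captures the remaining variation in the covariates
```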
- Once the latent confounding variable $Z_0$ can be learned, the average dose-response function $\mu(x)$ can be estimated by the following formula:
+ Once the latent confounding variable :math:`Z_0` can be learned, the average dose-response function :math:`\mu(x)` can be estimated by the following formula:

.. math::

   \begin{align}
   \mu(x)=\int \mathbb{E}(Y|X=x,Z_0=z_0)p_{Z_0}(z_0)dz_0,
   \end{align}
- where $X$ and $Y$ are the treatment and outcome variables, respectively. We show that the original high-dimensional covariate $V$ can be replaced by a low-dimensional covariate feature.
+ where :math:`X` and :math:`Y` are the treatment and outcome variables, respectively. We show that the original high-dimensional covariate :math:`V` can be replaced by a low-dimensional latent confounding variable :math:`Z_0`.

See `Liu et al. (2022) <https://arxiv.org/abs/2212.05925>`_ for a detailed exposition of the methods.
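The formula above suggests a simple plug-in estimator: map the observed covariates to latent confounder values z_0, fit a model for E(Y | X = x, Z_0 = z_0), and average its predictions over the empirical distribution of Z_0. The sketch below only illustrates that averaging step with toy stand-ins for the learned encoder output and outcome model; it is not the CausalEGM implementation or API.

```python
import numpy as np

def average_dose_response(x, z0_samples, outcome_model):
    """Plug-in Monte Carlo estimate of mu(x) = E_{Z_0}[ E(Y | X = x, Z_0) ].

    z0_samples    : (n, d) array of latent confounder values, e.g. encoder outputs
                    on the observed covariates (a hypothetical encode(V)).
    outcome_model : callable (x_batch, z0_batch) -> predicted outcomes, standing in
                    for the learned conditional mean E(Y | X, Z_0).
    """
    n = z0_samples.shape[0]
    x_batch = np.full(n, float(x))           # hold the treatment fixed at x for every sample
    y_hat = outcome_model(x_batch, z0_samples)
    return float(np.mean(y_hat))             # average over the empirical distribution of Z_0

# Toy check with a known outcome model (purely illustrative):
rng = np.random.default_rng(0)
z0 = rng.normal(size=(10_000, 3))                      # pretend encoder outputs
toy_outcome = lambda x, z: 2.0 * x + z.sum(axis=1)     # conditional mean with effect 2 per unit of x
print(average_dose_response(1.5, z0, toy_outcome))     # close to 3.0, since E[z.sum()] = 0
```

In CausalEGM the encoder and the outcome model are neural networks trained jointly; here they are replaced by fixed toy functions only to keep the example self-contained.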
3 changes: 2 additions & 1 deletion docs/source/index.rst
@@ -9,7 +9,8 @@ CausalEGM - An Encoding Generative Modeling Approach to Dimension Reduction and

.. include:: _key_contributors.rst

- Causal inference has been increasingly essential in modern observational studies with rich covariate information. However, it is often challenging to estimate the causal effect with high-dimensional covariates.
+ Causal inference has been increasingly essential in modern observational studies with rich covariate information. However, it is often challenging to estimate the causal effect with high-dimensional covariates
+ due to the “curse of dimensionality”.

We develop **CausalEGM**, a deep learning framework for nonlinear dimension reduction and generative modeling of the dependency among covariate features affecting treatment and response.
The key idea is to identify a latent covariate feature set (e.g., latent confounders) that affects both treatment and outcome. By conditioning on these features,
