
Updated index
dscolby committed Jan 16, 2024
1 parent f36bcbe commit e7eea0c
Showing 1 changed file with 25 additions and 15 deletions.
40 changes: 25 additions & 15 deletions docs/src/index.md
@@ -38,26 +38,36 @@ prediction accuracy, generalization, ease of implementation, speed, and interpretability

### What's New?
* Added support for dataframes
* Permutation of continuous treatments draws from a continuous, rather than a discrete,
uniform distribution during randomization inference
* Estimators can handle any array whose values are <:Real
* Estimator constructors are now called with model(X, T, Y) instead of model(X, Y, T)
* Improved documentation
* causalELM has a new logo

### Comparison with Other Packages
Other packages, mainly EconML, DoWhy, and CausalML, have similar functionality. Besides being
written in Julia rather than Python, the main differences between CausalELM and these
libraries are:

* causalELM uses extreme learning machines instead of tree-based, linear, or deep learners
* causalELM performs cross validation during training
* causalELM performs inference via asymptotic randomization inference rather than
bootstrapping
* causalELM does not require you to instantiate a model and pass it into a separate class
or struct for training
* causalELM creates train/test splits automatically
* causalELM does not have external dependencies: all the functions it uses are in the
Julia standard library
* causalELM is simpler to use but has less flexibility than the other libraries
### What makes causalELM different?
Other packages, mainly EconML, DoWhy, CausalAI, and CausalML, have similar functionality.
Besides being written in Julia rather than Python, the main differences between CausalELM and
these libraries are:
* Simplicity is core to causalELM's design philosophy. causalELM uses only one type of
machine learning model, extreme learning machines (with optional L2 regularization), and
does not require you to import any other packages, initialize machine learning models,
pass machine learning structs to causalELM's estimators, convert dataframes or arrays to
a special type, or one-hot encode categorical treatments. By trading a little bit of
flexibility for a simple API, all of causalELM's functionality can be used with just
four lines of code.
* As part of this design principle, causalELM's estimators handle all of the work of
finding the best number of neurons during estimation. They create folds, or rolling
windows for time series data, and use an extreme learning machine interpolator to find
the best number of neurons.
* causalELM's validate method, which is specific to each estimator, allows you to validate
or test the sensitivity of an estimator to possible violations of identifying assumptions.
* Unlike packages that do not provide p-values and standard errors, estimate them via
bootstrapping, or rely on incorrect hypothesis tests, all of causalELM's estimators
provide p-values and standard errors generated via approximate randomization inference.
* causalELM strives to be lightweight while still being powerful and therefore does not
have external dependencies: all the functions it uses are in the Julia standard library.
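To make the four-line claim above concrete, here is a minimal sketch using synthetic data. The estimator type (`DoubleMachineLearning`) and the function names (`estimate_causal_effect!`, `summarize`) are assumptions about the package's API and may differ across versions; consult the estimator documentation for the types that actually ship.

```julia
# Hypothetical four-line workflow with synthetic data. The estimator type
# and function names below are assumptions, not verified against this
# version of the package.
using CausalELM

X, T, Y = rand(100, 5), rand(0:1, 100), rand(100)  # covariates, treatment, outcome
m = DoubleMachineLearning(X, T, Y)   # note the (X, T, Y) argument order
estimate_causal_effect!(m)
summarize(m)
```

Note that the constructor takes its arguments as `model(X, T, Y)`, matching the argument-order change listed under "What's New?" above.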

### Installation
causalELM requires Julia version 1.7 or greater and can be installed from the REPL as shown
