
Edward Roadmap #464

Open
10 of 16 tasks
dustinvtran opened this issue Feb 15, 2017 · 5 comments
Comments

@dustinvtran
Member

dustinvtran commented Feb 15, 2017

Following the 2017 TensorFlow Dev Summit, here is an outline of Edward's direction going forward, at least for Spring 2017. Comments are always welcome, and I'm happy to change priorities subject to interest. Related issues are noted in parentheses.

Here are features I probably won't have time for. They are very much on my mind, however, and would be impactful additions if someone wants to take the helm. (Help with any of the above is also appreciated!)

@PKU-YYang

Hi Dustin,

Just wondering: are any control variates (#4) tricks already implemented in Edward? I noticed there is a branch called rao-blakcwellization; I'm not sure whether it works properly at the moment.

Thanks.

@dustinvtran
Member Author

Not at the moment. I was working out Rao-Blackwellization, as you pointed out, which turns out to be quite involved. Control variates are much easier to implement; I haven't looked into them quite yet.
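For anyone curious what the control-variate trick looks like, here is a minimal NumPy sketch of the general idea (illustrative only, not Edward's API): subtract a correlated quantity with known mean, scaled by the variance-minimizing coefficient.

```python
import numpy as np

# Monte Carlo estimate of E[e^U], U ~ Uniform(0, 1); true value is e - 1.
rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)
f = np.exp(u)   # samples of the target
g = u           # control variate with known mean E[g] = 0.5

# Optimal coefficient c* = Cov(f, g) / Var(g) minimizes the variance.
c = np.cov(f, g)[0, 1] / np.var(g)

plain = f.mean()
controlled = (f - c * (g - 0.5)).mean()

# Both estimators are unbiased for e - 1, but the controlled one
# has much lower variance per sample.
```

The same structure carries over to score-function (REINFORCE-style) gradient estimators, where the control variate is typically a baseline times the score.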

@ip01

ip01 commented Mar 5, 2017

Hi Dustin, do you, or perhaps Francisco, or whoever, plan to implement the Generalized Reparameterization Gradient in Edward? If so, could it be useful as an effective black-box technique for large-scale topic models (Dirichlet G-REP'ed via exp-covariance, Multinomial via Gumbel-Softmax)?

@dustinvtran
Member Author

dustinvtran commented Mar 5, 2017

I've never experimented with it, so I'm not sure. Pinging @franrruiz. I do think there's no one-size-fits-all solution, especially for discrete random variables, whether it be score functions + control variates, G-REP, Gumbel-Softmax, or reparameterization gradients. However, it would certainly be useful to have them all available to experiment with.
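For reference, the Gumbel-Softmax relaxation mentioned above is only a few lines; a minimal NumPy sketch (not Edward's implementation; `tau` is the temperature, and the function name is my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau, rng):
    """Draw one relaxed one-hot sample from a categorical with given logits."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1) noise
    y = (logits + gumbel) / tau
    y = np.exp(y - y.max())  # numerically stable softmax
    return y / y.sum()

probs = np.array([0.1, 0.3, 0.6])
sample = gumbel_softmax(np.log(probs), tau=0.5, rng=rng)
# `sample` lies on the simplex; as tau -> 0 it approaches a one-hot vector.
```

Note that `argmax(sample)` is an exact draw from the categorical regardless of `tau` (the Gumbel-max trick); the temperature only controls how soft the relaxed sample is, which is what makes the estimator reparameterizable.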

@RoyiAvital

Is anyone still working on this?
