
Bidirectional lstm example #2093 #2096

Merged
merged 4 commits into apache:master on May 12, 2016

Conversation

xlvector
Contributor

This is just a simple example of a bidirectional LSTM.
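
For readers of the thread, the core idea of the example: one LSTM is unrolled over the input sequence left-to-right and a second one over the reversed sequence, and their hidden states are concatenated at every time step before the output layer. A minimal sketch of that concatenation with the symbolic API (the per-step hidden symbols and the sequence length below are placeholders, not the example's actual variable names):

import mxnet as mx

seq_len = 5  # placeholder sequence length
# forward_h[t]: hidden state of the LSTM reading the sequence left-to-right
# backward_h[t]: hidden state of the LSTM reading it right-to-left
forward_h = [mx.sym.Variable('fwd_h%d' % t) for t in range(seq_len)]
backward_h = [mx.sym.Variable('bwd_h%d' % t) for t in range(seq_len)]
# the bidirectional representation at step t concatenates both directions
bi_hidden = [mx.sym.Concat(f, b, dim=1) for f, b in zip(forward_h, backward_h)]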

@piiswrong
Contributor

@pluskid
We need an LSTM symbol implemented in Python under mx.sym.LSTM so that it will be easier to replace the LSTM in all the examples with the cuDNN v5 LSTM once we support it.
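
For reference, the simplified cell that the current ptb/speech examples unroll by hand looks roughly like the sketch below (adapted from the existing example code; the namedtuple-based parameter passing is what an mx.sym.LSTM helper would presumably hide, so treat the signature as illustrative rather than the proposed API):

from collections import namedtuple
import mxnet as mx

LSTMState = namedtuple('LSTMState', ['c', 'h'])
LSTMParam = namedtuple('LSTMParam', ['i2h_weight', 'i2h_bias', 'h2h_weight', 'h2h_bias'])

def lstm(num_hidden, indata, prev_state, param, seqidx, layeridx):
    # project the input and the previous hidden state onto all four gates at once
    i2h = mx.sym.FullyConnected(data=indata, weight=param.i2h_weight, bias=param.i2h_bias,
                                num_hidden=num_hidden * 4,
                                name='t%d_l%d_i2h' % (seqidx, layeridx))
    h2h = mx.sym.FullyConnected(data=prev_state.h, weight=param.h2h_weight, bias=param.h2h_bias,
                                num_hidden=num_hidden * 4,
                                name='t%d_l%d_h2h' % (seqidx, layeridx))
    gates = i2h + h2h
    slice_gates = mx.sym.SliceChannel(gates, num_outputs=4,
                                      name='t%d_l%d_slice' % (seqidx, layeridx))
    in_gate = mx.sym.Activation(slice_gates[0], act_type='sigmoid')
    in_transform = mx.sym.Activation(slice_gates[1], act_type='tanh')
    forget_gate = mx.sym.Activation(slice_gates[2], act_type='sigmoid')
    out_gate = mx.sym.Activation(slice_gates[3], act_type='sigmoid')
    next_c = (forget_gate * prev_state.c) + (in_gate * in_transform)
    next_h = out_gate * mx.sym.Activation(next_c, act_type='tanh')
    return LSTMState(c=next_c, h=next_h)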

@pluskid
Contributor

pluskid commented May 10, 2016

@piiswrong Yes, agreed. The LSTM currently used in the ptb and speech examples is a variant with some simplifications. An implementation of the full version needs the broadcast multiplication feature. We also need to check which variant of the LSTM cell cuDNN v5 provides. What is our current status of supporting cuDNN v5? I am reluctant to upgrade, having heard that it breaks a lot of APIs... also, on the servers I do not have permission to upgrade the CUDA driver to v8.
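
To make the "needs broadcast multiplication" point concrete: the full (peephole) LSTM lets the previous cell state enter the input and forget gates through per-unit diagonal weights, so a (1, num_hidden) weight row has to be multiplied element-wise against the (batch_size, num_hidden) cell state. A rough sketch of just those extra terms, assuming a broadcast_mul operator is available (the variable names here are illustrative only):

import mxnet as mx

prev_c = mx.sym.Variable('prev_c')                  # (batch_size, num_hidden)
peep_in = mx.sym.Variable('peep_in_weight')         # (1, num_hidden) diagonal weights
peep_forget = mx.sym.Variable('peep_forget_weight')

# peephole contributions, broadcast over the batch dimension; these would be
# added to the pre-activation input/forget gates before the sigmoid
in_peep = mx.sym.broadcast_mul(peep_in, prev_c)
forget_peep = mx.sym.broadcast_mul(peep_forget, prev_c)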

@antinucleon
Contributor

@pluskid We now support everything but the LSTM cell in cuDNN v5. CUDA 7.5 is fine for cuDNN v5.

@@ -28,33 +28,33 @@ export NVCC = nvcc
DEBUG = 0

# the additional link flags you want to add
-ADD_LDFLAGS =
+ADD_LDFLAGS = -L/disk1/deeplearning/local_install/lib
Contributor

This file should not be modified.

Contributor Author

Thanks, I will modify it.

@xlvector
Contributor Author

Glad to hear that MXNet will have full support for LSTM.

Another thing I want to ask: is it easy to integrate Baidu's warp-ctc without using the torch module?

@tqchen
Member

tqchen commented May 11, 2016

This is ready to be merged after combining the commits; see the instructions at http://mxnet.readthedocs.io/en/latest/how_to/contribute.html#how-to-combine-multiple-commits-into-one

@antinucleon merged commit 6d99054 into apache:master on May 12, 2016
@futurely mentioned this pull request on May 13, 2016
@xlvector
Contributor Author

Sorry for the late response. I combined the commits but found it was already merged.

Thanks all.

@giorgioercixu
Contributor

@xlvector I am working on making this example Python 2/3 compatible. However, I find that the data folder and gen_data.py are missing from the current master branch. Could you take a look at this?

@xlvector
Contributor Author

xlvector commented Dec 4, 2016

Thanks. The gen_data.py script is very simple; it just generates whitespace-separated numbers on every line.

Something like:

import random
# each line: 10 random integers in [0, 100], sorted and space-separated
for i in range(1000):
    print(' '.join(str(x) for x in sorted(random.randint(0, 100) for k in range(10))))

You can add this script in your PR.

@giorgioercixu
Contributor

OK, I will put this gen_data code into lstm_sort.py directly. Thanks.
