dilated-cnn-ner

This code implements the models described in the paper "Fast and Accurate Entity Recognition with Iterated Dilated Convolutions" by Emma Strubell, Patrick Verga, David Belanger and Andrew McCallum.
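The model's central building block is a stack of dilated ("atrous") convolutions over the token sequence, with the dilation rate doubling at each layer so the receptive field grows exponentially with depth. As a quick orientation, here is a minimal TF 1.x sketch of such a stack. It is illustrative only, not the repo's actual implementation, and all names and sizes in it are made up:

    import tensorflow as tf

    batch, seq_len, emb_dim, hidden = 8, 50, 100, 64
    dilations = [1, 2, 4]  # receptive field doubles at each layer

    # Token embeddings, with a dummy height dimension so atrous_conv2d applies.
    tokens = tf.placeholder(tf.float32, [batch, seq_len, emb_dim])
    x = tf.expand_dims(tokens, 1)  # [batch, 1, seq_len, emb_dim]

    in_dim = emb_dim
    for i, rate in enumerate(dilations):
        w = tf.get_variable("w%d" % i, [1, 3, in_dim, hidden])  # width-3 filter
        b = tf.get_variable("b%d" % i, [hidden])
        # Dilated convolution: the filter taps are spaced `rate` tokens apart,
        # so deeper layers see exponentially wider context.
        x = tf.nn.relu(tf.nn.atrous_conv2d(x, w, rate, padding="SAME") + b)
        in_dim = hidden

    features = tf.squeeze(x, [1])  # [batch, seq_len, hidden]: one vector per token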

Requirements

This code uses TensorFlow >= v1.0 and Python 2.7.

It will probably train on a CPU, but we haven't tried it, and we highly recommend training on a GPU.

Setup

  1. Set up environment variables. For example, from the root directory of this project:
    export DILATED_CNN_NER_ROOT=`pwd`
    export DATA_DIR=/path/to/conll-2003
  2. Get some pretrained word embeddings, e.g. SENNA or GloVe embeddings. The code expects a space-separated file with one word and its embedding per line, e.g.:

    word 0.45 0.67 0.99 ...
    

    Make a directory for the embeddings:

    mkdir -p data/embeddings
    

    and place the file there.
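    For concreteness, a file in this format can be read with a few lines of Python. This loader is illustrative only; the repo's own loading lives in its preprocessing code:

        import numpy as np

        def load_embeddings(path):
            """Parse space-separated 'word v1 v2 ...' lines into a dict of vectors."""
            vectors = {}
            with open(path) as f:
                for line in f:
                    parts = line.rstrip().split(" ")
                    vectors[parts[0]] = np.array(parts[1:], dtype=np.float32)
            return vectors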

  3. Perform all data preprocessing for a given configuration. For example:

./bin/preprocess.sh conf/conll/dilated-cnn.conf

This calls preprocess.py, which loads the data from text files, maps the tokens, labels, and any other features to integers, and writes them to TensorFlow tfrecords.
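To make that output concrete, here is an illustrative TF 1.x snippet that serializes one integer-mapped sentence to a tfrecord. The field names ("tokens", "labels") and the filename are hypothetical, not necessarily what preprocess.py uses:

    import tensorflow as tf

    def int64_feature(values):
        return tf.train.Feature(int64_list=tf.train.Int64List(value=values))

    # One sentence, already mapped to integer ids.
    example = tf.train.Example(features=tf.train.Features(feature={
        "tokens": int64_feature([4, 17, 2, 9]),  # token ids
        "labels": int64_feature([0, 1, 2, 0]),   # tag ids
    }))
    writer = tf.python_io.TFRecordWriter("train.tfrecord")
    writer.write(example.SerializeToString())
    writer.close()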

Training

Once data preprocessing is complete, you can train a tagger:

./bin/train-cnn.sh conf/conll/dilated-cnn.conf

Evaluation

By default, the trainer will save the model that achieved the best F1 on the dev set. To evaluate a saved model on the dev set:

./bin/eval-cnn.sh conf/conll/dilated-cnn.conf --load_model path/to/model

To evaluate a saved model on the test set:

./bin/eval-cnn.sh conf/conll/dilated-cnn.conf test --load_model path/to/model

Configs

Configuration files (conf/*) specify the data paths, model hyperparameters, and other settings for an experiment.
