Commit 5f8ef69 (no message)
THUCSTHanxu13 committed Mar 30, 2017
1 parent 30332f5
Showing 4 changed files with 12 additions and 7 deletions.
7 changes: 6 additions & 1 deletion README.md
@@ -1,6 +1,11 @@
 # TensorFlow-TransX
 
-The implementation of TransE [1], TransH [2], TransR [3], TransD [4] for knowledge representation learning (KRL). The overall framework is based on TensorFlow.
+The implementation of TransE [1], TransH [2], TransR [3], TransD [4] for knowledge representation learning (KRL). The overall framework is based on TensorFlow. We use C++ to implement underlying operations such as data preprocessing and negative sampling. Each specific model is implemented in TensorFlow with Python interfaces, providing a convenient platform for running models on GPUs.
+
+# Customizing Your Own Model
+
+If you have a new idea and want to implement it, you only need to change the Python interfaces for your customized model. Reading the code, you will find that modifying the TransXModel class meets your needs.
 
 # Evaluation Results
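All of the TransX models the README lists score a triple (head, relation, tail) with simple vector operations. As a hedged illustration of the simplest case, here is a minimal NumPy sketch of the TransE scoring idea; the function name and toy vectors are invented for illustration and are not taken from the repository:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE scores a triple (h, r, t) by the distance ||h + r - t||_1.
    Lower scores mean the triple is considered more plausible."""
    return np.linalg.norm(h + r - t, ord=1)

# Toy embeddings: the relation vector translates the head exactly onto the tail.
h = np.array([1.0, 0.0])
r = np.array([0.0, 1.0])
t = np.array([1.0, 1.0])
good = transe_score(h, r, t)   # perfect translation, distance 0
bad = transe_score(h, r, -t)   # corrupted tail, larger distance
assert good < bad
```

TransH, TransR, and TransD keep this translation idea but first project the entity embeddings into a relation-specific space.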
4 changes: 2 additions & 2 deletions transD.py
@@ -20,7 +20,7 @@ def __init__(self):
 		self.trainTimes = 3000
 		self.margin = 1.0
 
-class TransEModel(object):
+class TransDModel(object):
 
 	def calc(self, e, t, r):
 		return e + tf.reduce_sum(e * t, 1, keep_dims = True) * r
@@ -88,7 +88,7 @@ def main(_):
 		with sess.as_default():
 			initializer = tf.contrib.layers.xavier_initializer(uniform = False)
 			with tf.variable_scope("model", reuse=None, initializer = initializer):
-				trainModel = TransEModel(config = config)
+				trainModel = TransDModel(config = config)
 
 			global_step = tf.Variable(0, name="global_step", trainable=False)
 			optimizer = tf.train.GradientDescentOptimizer(0.001)
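The `calc` method shown in transD.py implements the TransD projection e_perp = e + (e . t) * r, where t is the entity transfer vector and r the relation transfer vector. A NumPy mirror of that one-liner (variable values are invented for illustration):

```python
import numpy as np

def transd_project(e, t, r):
    """NumPy equivalent of TransDModel.calc:
    e + tf.reduce_sum(e * t, 1, keep_dims=True) * r.
    Row-wise: e_perp = e + (e . t) * r, projecting the entity
    embedding into the relation space."""
    return e + np.sum(e * t, axis=1, keepdims=True) * r

e = np.array([[1.0, 2.0]])   # entity embedding (batch of 1)
t = np.array([[0.5, 0.0]])   # entity transfer vector
r = np.array([[0.0, 1.0]])   # relation transfer vector
proj = transd_project(e, t, r)
# (e . t) = 0.5, so proj = [1, 2] + 0.5 * [0, 1] = [1.0, 2.5]
assert np.allclose(proj, [[1.0, 2.5]])
```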
4 changes: 2 additions & 2 deletions transH.py
@@ -20,7 +20,7 @@ def __init__(self):
 		self.trainTimes = 3000
 		self.margin = 1.0
 
-class TransEModel(object):
+class TransHModel(object):
 
 	def calc(self, e, n):
 		norm = tf.nn.l2_normalize(n, 1)
@@ -85,7 +85,7 @@ def main(_):
 		with sess.as_default():
 			initializer = tf.contrib.layers.xavier_initializer(uniform = False)
 			with tf.variable_scope("model", reuse=None, initializer = initializer):
-				trainModel = TransEModel(config = config)
+				trainModel = TransHModel(config = config)
 
 			global_step = tf.Variable(0, name="global_step", trainable=False)
 			optimizer = tf.train.GradientDescentOptimizer(0.001)
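The `calc` in transH.py normalizes the hyperplane normal n and projects the entity onto that hyperplane: e_perp = e - (e . w) * w with w = n / ||n||. A NumPy mirror of that computation (the toy vectors are invented for illustration):

```python
import numpy as np

def transh_project(e, n):
    """NumPy equivalent of TransHModel.calc: normalize the hyperplane
    normal n, then remove e's component along it:
    e_perp = e - (e . w) * w, with w = n / ||n||."""
    w = n / np.linalg.norm(n, axis=1, keepdims=True)
    return e - np.sum(e * w, axis=1, keepdims=True) * w

e = np.array([[3.0, 4.0]])
n = np.array([[0.0, 2.0]])   # normalizes to [0, 1]
proj = transh_project(e, n)
# the component along the normal is removed: [3, 4] -> [3, 0]
assert np.allclose(proj, [[3.0, 0.0]])
# the projection lies on the hyperplane, i.e. it is orthogonal to n
assert np.allclose(np.sum(proj * n, axis=1), 0.0)
```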
4 changes: 2 additions & 2 deletions transR.py
@@ -21,7 +21,7 @@ def __init__(self):
 		self.trainTimes = 3000
 		self.margin = 1.0
 
-class TransEModel(object):
+class TransRModel(object):
 
 	def __init__(self, config):
 
@@ -84,7 +84,7 @@ def main(_):
 		with sess.as_default():
 			initializer = tf.contrib.layers.xavier_initializer(uniform = False)
 			with tf.variable_scope("model", reuse=None, initializer = initializer):
-				trainModel = TransEModel(config = config)
+				trainModel = TransRModel(config = config)
 
 			global_step = tf.Variable(0, name="global_step", trainable=False)
 			optimizer = tf.train.AdamOptimizer(0.001)
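The transR.py hunks above rename the class and switch to AdamOptimizer but do not show TransR's projection itself. For context, TransR in general maps entity embeddings into a relation-specific space with a per-relation matrix M_r; this generic NumPy sketch (names and shapes invented, not code from this file) shows the idea:

```python
import numpy as np

def transr_project(e, M):
    """Generic TransR projection: map an entity embedding e from the
    entity space into the relation space with a relation-specific
    matrix M. (Illustrative sketch, not taken from transR.py.)"""
    return e @ M

e = np.array([[1.0, 2.0]])           # entity in entity space (dim 2)
M = np.array([[1.0, 0.0, 1.0],       # relation matrix mapping dim 2 -> dim 3
              [0.0, 1.0, 1.0]])
proj = transr_project(e, M)
assert proj.shape == (1, 3)
assert np.allclose(proj, [[1.0, 2.0, 3.0]])
```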
