Update generated Python Op docs.
Change: 141220667
tensorflower-gardener committed Dec 7, 2016
1 parent b2393de commit 93062d9
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion tensorflow/g3doc/api_docs/python/contrib.legacy_seq2seq.md
@@ -529,7 +529,7 @@ Weighted cross-entropy loss for a sequence of logits (per example).
  * <b>`weights`</b>: List of 1D batch-sized float-Tensors of the same length as logits.
  * <b>`average_across_timesteps`</b>: If set, divide the returned cost by the total
    label weight.
- * <b>`softmax_loss_function`</b>: Function (inputs-batch, labels-batch) -> loss-batch
+ * <b>`softmax_loss_function`</b>: Function (labels-batch, inputs-batch) -> loss-batch
    to be used instead of the standard softmax (the default if this is None).
  * <b>`name`</b>: Optional name for this operation, default: "sequence_loss_by_example".

@@ -10,7 +10,7 @@ Weighted cross-entropy loss for a sequence of logits (per example).
  * <b>`weights`</b>: List of 1D batch-sized float-Tensors of the same length as logits.
  * <b>`average_across_timesteps`</b>: If set, divide the returned cost by the total
    label weight.
- * <b>`softmax_loss_function`</b>: Function (inputs-batch, labels-batch) -> loss-batch
+ * <b>`softmax_loss_function`</b>: Function (labels-batch, inputs-batch) -> loss-batch
    to be used instead of the standard softmax (the default if this is None).
  * <b>`name`</b>: Optional name for this operation, default: "sequence_loss_by_example".

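For context, the corrected docstring means a custom `softmax_loss_function` receives the labels batch first and the logits batch second. Below is a minimal sketch of how such a function might be wired into `tf.contrib.legacy_seq2seq.sequence_loss_by_example`, assuming a TensorFlow 1.x-era runtime; the function name and toy shapes are illustrative and not part of this commit:

```python
import tensorflow as tf

# Custom loss matching the documented (labels-batch, inputs-batch) order:
# takes a 1-D batch of integer labels and a 2-D batch of logits, and
# returns a 1-D batch of per-example losses.
def custom_softmax_loss(labels, logits):
  # Keyword arguments avoid any ambiguity about argument order.
  return tf.nn.sparse_softmax_cross_entropy_with_logits(
      labels=labels, logits=logits)

# Toy shapes for illustration: 3 timesteps, batch size 2, vocabulary of 5.
logits = [tf.random_normal([2, 5]) for _ in range(3)]
targets = [tf.constant([1, 4]), tf.constant([0, 2]), tf.constant([3, 3])]
weights = [tf.ones([2]) for _ in range(3)]

loss = tf.contrib.legacy_seq2seq.sequence_loss_by_example(
    logits, targets, weights,
    softmax_loss_function=custom_softmax_loss)
```

Naming the parameters `labels` and `logits` keeps the callback usable whether the library invokes it positionally in the documented order or by keyword.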

