Conversation

@wangkuiyi wangkuiyi commented Aug 26, 2017

Variable definition and type inference belong to Block (formerly known as NetOp), not to Scope.

@wangkuiyi (Collaborator, Author)

This PR summarizes the discussion from a few hours ago. Please feel free to complete it.

```protobuf
  LOD_TENSOR = 6;
}
Type type = 1;
optional LODTensorDesc lod_tensor = 2; // when type == LOD_TENSOR
```
Collaborator

LOD_TENSOR is just a description of a tensor's sequence information. It could be a float LOD tensor, an int LOD tensor, etc.

Collaborator Author

Yes. We have LODTensor::element_type.


```protobuf
message VarDesc {
  enum Type {
```
Collaborator

Also, the Variable type looks strange. If it describes a variable, the type should be

  • Tensor
  • LODTensor
  • StepScopes
  • Etc.

Not int/float; those are element types of tensors.

Collaborator Author

Our Variable is dynamically typed, like a Python variable: it can hold a value of any type. So VarDesc needs a type field.

A variable can hold a single int, so enum Type needs an INT entry.

A tensor can hold elements of any type, so TensorDesc needs an element_type field.

I am changing the text accordingly.

```protobuf
}

message LODTensorDesc {
  repeated int dims = 1; // [UNK, UNK, 6000] is saved as [-1, -1, 6000]
```
@reyoung reyoung (Collaborator), Aug 27, 2017

Saving it as [0, 0, 6000] would be better, if UNK is short for batch size.

Collaborator Author

0 is a valid dim size. Consider that IfOp needs to split the input for its left and right branches. It could be that the input for the left or right branch happens to have 0 instances.

```c++
class Block : public OperatorBase {
 private:
  map<string, VarDesc> vars_;
```
Collaborator

The shape information is stored in paddle::framework::Tensor; there is no need to store it again in VarDesc.

```c++
class OperatorBase {
 private:
  const OpDesc& desc_;
```
Collaborator

Many operators are not created from an OpDesc, e.g., those generated by the backward and optimization passes. So OperatorBase does not need to store an OpDesc, and in those cases cannot.

```c++
  InferShape(block);
}

 private:
  virtual void InferShape(Block* block) = 0;
```
Collaborator

Why should InferShape be redesigned again?

@wangkuiyi (Collaborator, Author)

Closing this PR since it is too old.

@wangkuiyi wangkuiyi closed this Apr 9, 2018