
Conversation

@inadob (Contributor) commented Feb 3, 2020

Zero-padding can't be done directly when the model is quantized, so we have to set
pad_value = zero_point

* zero padding can't be applied directly if the model is quantized,
  since pad = 0 has a different meaning in uint8 space
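To illustrate why a literal 0 is the wrong pad value, here is a minimal sketch of affine (uint8) quantization; the `scale` and `zero_point` values are made up for the example:

```python
import numpy as np

def quantize(x, scale, zero_point):
    # Affine quantization: q = round(x / scale) + zero_point, clamped to uint8
    return np.clip(np.round(x / scale) + zero_point, 0, 255).astype(np.uint8)

def dequantize(q, scale, zero_point):
    # Inverse mapping back to real values
    return (q.astype(np.float32) - zero_point) * scale

scale, zero_point = 0.05, 128  # hypothetical quantization parameters

# A real-valued 0.0 maps to the zero_point (128), not to integer 0
print(quantize(np.array([0.0]), scale, zero_point))

# Padding with a literal uint8 0 would instead represent (0 - 128) * 0.05 = -6.4
print(dequantize(np.array([0], dtype=np.uint8), scale, zero_point))
```

So padding with `zero_point` is what actually represents 0.0 in the quantized tensor; padding with 0 injects a large negative value at the borders.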
@anijain2305 (Contributor)
Excellent find! I missed this somehow.

I think the current test case failure might be because we give random input to a quantized network, which leads to values being close to each other in the final output. We can try giving a real input to the quantized network, and then check whether the correct label is present in the top 5.
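The top-5 check suggested above could be sketched as follows; `label_in_top5` is a hypothetical helper, not part of the actual test suite:

```python
import numpy as np

def label_in_top5(output, label):
    # Indices of the 5 largest scores in the network's output vector
    top5 = np.argsort(output)[-5:]
    return label in top5

# Example: index 1 has the highest score, so it is in the top 5
scores = np.array([0.1, 0.9, 0.3, 0.2, 0.8, 0.7, 0.6, 0.5])
print(label_in_top5(scores, 1))  # True
```

A check like this is more robust than exact output comparison when quantization makes the scores nearly equal.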

@anijain2305 (Contributor)
Let me take a look at this. You might have to wait for some time as I am a little busy, but I will get to it within two weeks.

@anijain2305 (Contributor)
@inadob You can close this one. #4816 covers this.
