Conversation
@BruceDai (Contributor) commented Mar 23, 2023

This PR is based on #37.

@fdwr @huningxin PTAL at the third commit, thanks.

@BruceDai changed the title from "Add more reduction" to "Implement more reduction" on Mar 23, 2023
  const outputShape = input.shape.slice();
  for (let i = 0; i < axesLen; ++i) {
-   const axis = axes[i] >= 0 ? axes[i] : axes[i] + rank;
+   const axis = axes[i];
Aah, nice, we can simplify the testing and implementations after Ningxin's removal of the negative axes policy.
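
As an aside, here is a minimal sketch of the normalization step that now lives outside WebNN: a higher-level framework could resolve negative axes itself before calling in (the helper name is hypothetical, not from this PR):

// Hypothetical helper a calling framework might use to normalize
// negative axes before passing them to the WebNN API.
function resolveAxes(axes, rank) {
  return axes.map((axis) => (axis >= 0 ? axis : axis + rank));
}

resolveAxes([-1, 0], 4); // [3, 0]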

 * @return {Tensor}
 */
export function reduceSumSquare(input, options = {}) {
  return reduceSum(pow(input, new Scalar(2)), options);
}
Formulas all look correct, but wasn't there a GitHub issue with ReduceL2 ambiguity?

The formula I recall was ReduceL2 = sqrt(a1^2 + a2^2 + ... + an^2).

(btw, I always have to re-lookup the formulas because I keep thinking "reduceSumSquare" means x.sum().square(), but actually it means x.square().sum() - so square and then reduce sum)
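
For reference, a minimal sketch of reduceL2 composed from the same primitives (assuming the sqrt, pow, and reduceSum helpers follow the pattern above; this is an illustration, not the PR's code):

// reduceL2(x) = sqrt(sum(x^2)), i.e. sqrt applied after reduceSumSquare.
export function reduceL2(input, options = {}) {
  return sqrt(reduceSum(pow(input, new Scalar(2)), options));
}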

@BruceDai (Contributor, Author)

but wasn't there a GitHub issue with ReduceL2 ambiguity?

Are there other definitions of ReduceL2 besides ReduceL2 = sqrt(a1^2 + a2^2 + ... + an^2)?

Reference: https://en.wikipedia.org/wiki/Norm_(mathematics)#p-norm
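
For context, the general p-norm from that reference, of which ReduceL2 is the p = 2 case:

\|x\|_p = \left( \sum_{i=1}^{n} |x_i|^p \right)^{1/p},
\qquad
\|x\|_2 = \sqrt{x_1^2 + x_2^2 + \cdots + x_n^2}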

@fdwr commented Apr 5, 2023

Oh never mind, it was actually L2 pooling, not reduction: webmachinelearning/webnn#278

  let output = new Tensor(input.shape);
  const shape = new Array(input.rank).fill(1);
- shape[axis] = -1;
+ shape[axis] = null;
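
Roughly, the single null entry is inferred so that the element count is preserved; a hedged sketch of that resolution (helper name and exact placement are assumptions, not this PR's code):

// Resolve one null entry in newShape from the old shape's element count.
function resolveShape(oldShape, newShape) {
  const totalSize = oldShape.reduce((a, b) => a * b, 1);
  const knownSize = newShape.reduce((a, b) => a * (b ?? 1), 1);
  return newShape.map((d) => (d === null ? totalSize / knownSize : d));
}

resolveShape([2, 3, 4], [1, null, 1]); // [1, 24, 1]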
@fdwr commented Mar 31, 2023

🤔 Ningxin's PR removed the special handling of negative axes, but we still have this special case with reshape here. I wonder whether this too should be a policy resolved by higher-level frameworks first, so that they pass the fully resolved shape before it reaches the WebNN API. (Or does keeping reshape's special null-axis handling make composing other WebNN ops simpler, like your instanceNormalization implementation, which utilizes reshape?) *This comment is not blocking either way.
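
To illustrate the composition benefit being referenced (a sketch with assumed variable names, not code from this PR): the null axis lets a 1-D scale tensor be reshaped for broadcasting without the caller computing the channel count first.

// Reshape 1-D scale/bias of shape [c] to [1, null, 1, 1] so they
// broadcast over NCHW input; reshape infers the null entry as c.
const scale4D = reshape(scale, [1, null, 1, 1]);
const bias4D = reshape(bias, [1, null, 1, 1]);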

@BruceDai (Contributor, Author)

to simplify the implementation

Sorry, I didn't catch your point; could you please explain how it simplifies the implementation?

I also submitted a PR to the spec, webmachinelearning/webnn#367; please review it as well, thanks.


👍 Commented on the other PR.

@huningxin merged commit 1ff3bbb into webmachinelearning:main on Apr 10, 2023