Fix GL reference in headless WebGL test code. (tensorflow#1832)
* Fix GL reference in headless WebGL test code.

* Cleanup & README
nkreeger authored Jul 12, 2019
1 parent c028b3c commit c4c21a9
Showing 2 changed files with 48 additions and 5 deletions.
44 changes: 44 additions & 0 deletions tfjs-backend-nodegl/demo/README.md
@@ -0,0 +1,44 @@
# MobileNet tfjs-backend-nodegl Demo

*This is an early demo showing how `tfjs-backend-nodegl` can be used for headless WebGL acceleration.*

To run this demo, perform the following:

1. Move into `tfjs-backend-nodegl` (parent directory of this demo folder):
```sh
$ cd tfjs-backend-nodegl
```

2. Build the package and compile the TypeScript sources:
```sh
$ yarn && yarn tsc
```

3. Move into the demo directory:
```sh
$ cd demo
```

4. Prepare and build the demo:
```sh
$ yarn
```

5. Run the demo:
```sh
$ node run_mobilenet_inference.js dog.jpg
```

Expected output:
```sh
$ node run_mobilenet_inference.js dog.jpg
Platform node has already been set. Overwriting the platform with [object Object].
- gl.VERSION: OpenGL ES 3.0 (ANGLE 2.1.0.9512a0ef062a)
- gl.RENDERER: ANGLE (Intel Inc., Intel(R) Iris(TM) Plus Graphics 640, OpenGL 4.1 core)
- Loading model...
- Mobilenet load: 6450.763924002647ms
- Coldstarting model...
- Mobilenet cold start: 297.92842200398445ms
- Running inference (100x) ...
- Mobilenet inference: (100x) : 35.75772546708584ms
```
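
For orientation, the script behind this output follows a standard tfjs pattern: decode the JPEG with `jpeg-js`, drop the alpha channel, apply MobileNet-style preprocessing, then load the model and time repeated predictions. The sketch below covers the image-preparation half; the `@tensorflow/tfjs` require, the helper name, and the exact shapes are assumptions for illustration, not the verbatim contents of `run_mobilenet_inference.js`:

```js
// Illustrative sketch only; see the note above about which parts are assumed.
const fs = require('fs');
const jpeg = require('jpeg-js');
const tf = require('@tensorflow/tfjs');
require('./../dist/index');  // assumed to register the headless nodegl backend built in step 2

const NUMBER_OF_CHANNELS = 3;
const PREPROCESS_DIVISOR = 255 / 2;

function imageToInput(path) {
  // Decode the JPEG into raw RGBA bytes.
  const image = jpeg.decode(fs.readFileSync(path), true);

  // Keep only the RGB channels.
  const numPixels = image.width * image.height;
  const values = new Float32Array(numPixels * NUMBER_OF_CHANNELS);
  for (let i = 0; i < numPixels; i++) {
    for (let c = 0; c < NUMBER_OF_CHANNELS; c++) {
      values[i * NUMBER_OF_CHANNELS + c] = image.data[i * 4 + c];
    }
  }

  // Scale pixel values from [0, 255] to [-1, 1] and add a batch dimension.
  return tf.tidy(() => {
    const pixels = tf.tensor3d(
        values, [image.height, image.width, NUMBER_OF_CHANNELS]);
    return pixels.sub(PREPROCESS_DIVISOR).div(PREPROCESS_DIVISOR).expandDims(0);
  });
}
```

The model half (the `Mobilenet load`, `cold start`, and `(100x)` lines above) then loads MobileNet, runs one warm-up prediction, and loops 100 timed predictions on this input; those details are omitted here.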
9 changes: 4 additions & 5 deletions tfjs-backend-nodegl/demo/run_mobilenet_inference.js
```diff
@@ -22,13 +22,12 @@ const jpeg = require('jpeg-js');
 
 const backendNodeGL = require('./../dist/index');
 
-console.log(` - gl.VERSION: ${
-    backendNodeGL.gl.getParameter(backendNodeGL.gl.VERSION)}`);
-console.log(` - gl.RENDERER: ${
-    backendNodeGL.gl.getParameter(backendNodeGL.gl.RENDERER)}`);
+const gl = tf.backend().getGPGPUContext().gl;
+console.log(` - gl.VERSION: ${gl.getParameter(gl.VERSION)}`);
+console.log(` - gl.RENDERER: ${gl.getParameter(gl.RENDERER)}`);
 
 const NUMBER_OF_CHANNELS = 3;
-const PREPROCESS_DIVISOR = tf.scalar(255 / 2);
+const PREPROCESS_DIVISOR = 255 / 2;
 
 function readImageAsJpeg(path) {
   return jpeg.decode(fs.readFileSync(path), true);
```
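
The fix itself is small: GL parameters are now read from the backend tfjs actually has active, via `tf.backend().getGPGPUContext().gl`, instead of from the package's `gl` export, and `PREPROCESS_DIVISOR` becomes a plain number, which tfjs ops accept directly. Below is a minimal sketch of the fixed pattern; the `@tensorflow/tfjs` require and the backend registration happening as a side effect of requiring the built package are assumptions, since the hunk only shows the changed lines.

```js
// Minimal sketch under the assumptions stated above.
const tf = require('@tensorflow/tfjs');

// Assumed side effect: requiring the built package registers the headless
// nodegl WebGL backend.
const backendNodeGL = require('./../dist/index');

// Read GL info through the active backend rather than the package export;
// this is the reference the commit fixes.
const gl = tf.backend().getGPGPUContext().gl;
console.log(` - gl.VERSION: ${gl.getParameter(gl.VERSION)}`);
console.log(` - gl.RENDERER: ${gl.getParameter(gl.RENDERER)}`);

// A plain number works here because tfjs ops accept numeric operands, e.g.
// pixels.sub(PREPROCESS_DIVISOR).div(PREPROCESS_DIVISOR), so no tf.scalar
// needs to be created at module load time.
const PREPROCESS_DIVISOR = 255 / 2;
```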
