Closed
Bug 1445021
Opened 7 years ago
Closed 7 years ago
Using shaders to apply color in WebGL broken
Categories
(Developer Documentation Graveyard :: API: Miscellaneous, enhancement, P1)
Tracking
(Not tracked)
RESOLVED
FIXED
People
(Reporter: jack.z.k.davies, Unassigned)
Details
(Whiteboard: [specification][type:bug])
What did you do?
================
1. Copied inline code from the tutorial at https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Tutorial/Using_shaders_to_apply_color_in_WebGL
What happened?
==============
Uncaught TypeError: Failed to execute 'attachShader' on 'WebGLRenderingContext': parameter 2 is not of type 'WebGLShader'.
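For context on why a shader *compile* error surfaces as a TypeError on `attachShader`: the tutorial's `loadShader()` helper returns `null` when compilation fails, and `attachShader()` rejects `null` because it is not a `WebGLShader`. Below is a minimal sketch of that pattern using a hypothetical mocked context (the mock object and its always-failing `getShaderParameter` are stand-ins, not the real WebGL API) so the failure path can be followed outside a browser:

```javascript
// Sketch of the tutorial's loadShader() helper: on compile failure it
// deletes the shader and returns null, which the caller then passes to
// attachShader(), producing the TypeError reported above.
function loadShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    gl.deleteShader(shader); // compile failed (e.g. GLSL ES 3.00 syntax on WebGL1)
    return null;             // ...so the caller receives null instead of a WebGLShader
  }
  return shader;
}

// Hypothetical mock context standing in for a WebGL1 context that was fed
// GLSL ES 3.00 source: compilation always "fails".
const failingGl = {
  VERTEX_SHADER: 0x8b31,
  COMPILE_STATUS: 0x8b81,
  createShader: (type) => ({ type }),
  shaderSource: () => {},
  compileShader: () => {},
  getShaderParameter: () => false, // report compile failure
  deleteShader: () => {},
};

console.log(loadShader(failingGl, failingGl.VERTEX_SHADER, "smooth out vec4 vColor;"));
// → null, which attachShader() then rejects as "not of type 'WebGLShader'"
```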
What should have happened?
==========================
The example should run without errors and render the colored square.
Is there anything else we should know?
======================================
The code on the page and the code in the repo are not in sync; if the page is updated to use the code from the repo, the example works.
FROM TUTORIAL
const vsSource = `
...
smooth out vec4 vColor;
...
`;
FROM REPO
const vsSource = `
...
varying lowp vec4 vColor;
...
`;
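The tutorial's `smooth out vec4 vColor;` is GLSL ES 3.00 (WebGL2) syntax, while the repo's `varying lowp vec4 vColor;` is the GLSL ES 1.00 form a WebGL1 context requires. As a rough illustration of the difference, here is a hypothetical lint-style helper (the function name and regexes are my own, not part of the tutorial or any library) that flags GLSL ES 3.00 declarations in a shader source string:

```javascript
// Hypothetical check: does this shader source use GLSL ES 3.00 syntax
// (a "#version 300 es" directive, or `in`/`out` qualifiers on globals)
// that a WebGL1 context will refuse to compile?
function usesGlsl300Syntax(source) {
  return /^\s*#version\s+300\s+es/m.test(source) ||
         /^\s*(?:smooth\s+|flat\s+)?(?:out|in)\s+\w+\s+\w+\s*;/m.test(source);
}

const fromTutorial = `
  smooth out vec4 vColor;
`;
const fromRepo = `
  varying lowp vec4 vColor;
`;

console.log(usesGlsl300Syntax(fromTutorial)); // true  — would fail to compile on WebGL1
console.log(usesGlsl300Syntax(fromRepo));     // false — valid GLSL ES 1.00
```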
Updated•7 years ago
Component: Wiki pages → API: Miscellaneous
Product: developer.mozilla.org → Developer Documentation
Updated•7 years ago
Priority: -- → P1
Comment 1•7 years ago
gmanpersona already fixed this. Someone had (on March 12) attempted to update the code to use WebGL2 features, but this is not a WebGL2 example.
Status: UNCONFIRMED → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED