Need a Raw Shader solution for WebGPURenderer #29781

Open

sketchpunk opened this issue Oct 31, 2024 · 7 comments

@sketchpunk
Description

I need access to certain aspects of 3js like TransformFeedback/Compute with WebGL. To do so, I need to run things with WebGPURenderer with forceWebGL set to true. The hitch is that RawShaderMaterial, or something like it, is no longer supported. This is a requirement because I have almost a decade's worth of raw shaders, plus I need a simple way to keep shader code free & portable. TSL isn't a solution when there is a need to work in various webGL/openGL environments, be it shadertoys, pixiJS, babylonjs, godot, etc. I've even grabbed raw glsl code out of the blender project to use in 3js projects.

So far, my attempts at getting TSL to work with raw shaders don't really work. Even using uniforms seems broken: even though I gave one a name, it changed the name and threw it into a UBO with an unknown name instead of emitting a simple uniform. I understand the idea is to match how WebGPU works, but there should be more predictable control over uniforms in general, without having to dump the generated GLSL to see what was really created.

I don't know what the final solution should be: just bring back raw shaders for both GLSL & WGSL, or provide some nodes that allow a big text dump containing code, uniforms, attributes, and varyings. It would also be nice to still have easy access to some of the main matrices like model, view & perspective; cameraPosition was always nice to have around too. I do want to use UBOs more, but I want them to be optional if possible, to better match WebGL / OpenGL shader standards.

As a side note, I even tried to extend NodeMaterial to see if I could hijack the builder and swap out the vertex/fragment shaders before they're compiled into a program, but that wasn't successful either.

@sunag

const uColor = THREE.uniform( new THREE.Color( 0x00ff00 ) );
uColor.name  = 'uColor';

const fCode  = THREE.code( `
    vec4 test2(){
        return vec4( 0.0, 1.0, 0.0, 1.0 ); // vec4( uColor, 1.0 );
    }
`);

const fMain = THREE.glslFn(`
    vec4 vertMain(){
        return vec4( 1.0, 0.0, 0.0, 1.0 ); // test2();
    }
`, [ fCode, uColor ] );

const mat = new THREE.NodeMaterial();
mat.fragmentNode = fMain();

const geo   = new THREE.PlaneGeometry( 1, 1 );
const mesh  = new THREE.Mesh( geo, mat );
App.debugMaterial( mesh ).then( sh=>console.log( sh.fragmentShader ) );

Outputs

layout( std140 ) uniform fragment_object {
	vec3 f_uColor;
};

        vec4 test2(){
            return vec4( 0.0, 1.0, 0.0, 1.0 ); // vec4( uColor, 1.0 );
        }
    
vec4 vertMain (  ){
            return vec4( 1.0, 0.0, 0.0, 1.0 ); // test2();
        }
layout( location = 0 ) out vec4 fragColor;

void main() {
	fragColor = vertMain(  );
}

Solution

xxx

Alternatives

xxx

Additional context

No response

@sunag
Collaborator

sunag commented Nov 1, 2024

TSL isn't a solution when there is a need to work in various webGL/openGL environments, be it shadertoys, pixiJS, babylonjs, godot, etc. I've even grabbed raw glsl code out of the blender project to use in 3js projects.

One of the main advantages of TSL is that we are doing things so that shaders communicate without hacks and work on multiple backends, which will allow your code to be easily added to third-party projects without breaking on updates, while respecting any optimizations and changes made in the library.

TSL is not only about shader language; TSL is also related to rendering manipulation, so we won't see functions like viewportTexture() being this simple in GLSL or WGSL. This single function call allows you to sample what has already been rendered, with mipmaps if you prefer, letting the user write just one line of code for this. We still have a lot to explore in this regard; the TSL post-processing is a good example of this.
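For illustration only (not code from this comment), that one-liner would look roughly like the sketch below, assuming viewportTexture()/viewportMipTexture() are reachable on the same THREE namespace used in the snippet above, and that assigning the result to colorNode is an acceptable minimal usage:

// Hedged sketch: sample whatever has already been rendered in the viewport.
const material = new THREE.MeshBasicNodeMaterial();
material.colorNode = THREE.viewportTexture();       // plain viewport sample
// material.colorNode = THREE.viewportMipTexture(); // same idea, with mipmaps available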

Even using uniforms seems broken: even though I gave one a name, it changed the name and threw it into a UBO with an unknown name instead of emitting a simple uniform.

The groups are controlled by the system and make customization simpler in TSL via uniform().setGroup( uniformGroup ): some uniform groups can be updated manually or just once per render call, while others are updated every time the object is rendered.
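A rough, hedged illustration of that grouping (not code from this thread; it assumes renderGroup and uniformGroup() are exposed alongside uniform(), and follows the .label()/.setGroup() chaining shown in the node source quoted further down):

// Built-in render group: this uniform is refreshed once per render call.
const uTime = THREE.uniform( 0 ).label( 'uTime' ).setGroup( THREE.renderGroup );

// Custom group: uniforms in it can be updated manually, as described above.
const staticGroup = THREE.uniformGroup( 'staticGroup' );
const uColor = THREE.uniform( new THREE.Color( 0x00ff00 ) ).label( 'uColor' ).setGroup( staticGroup );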

I need access to certain aspects of 3js like TransformFeedback/Compute with WebGL. To do so, I need to run things with WebGPURenderer with forceWebGL set to true

The first official implementation of TransformFeedback for Three.js came with WebGPURenderer.compute() using the WebGLBackend fallback. RawShader makes copy and paste simple, but it is not organized at the system level, which breaks with future updates. The Three.js Transpiler has the role of bringing raw shaders to TSL, so that GLSL code can also be supported in the WebGPU backend; it only requires some extra care from the user to route the functionality of their shaders through nodes.
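For context, a minimal compute sketch in the spirit of what is described above (adapted from the style of the official webgpu compute examples, not from this thread; the buffer size and names are illustrative, and exact helper availability depends on the three.js revision):

// Hedged sketch: a storage buffer updated by a compute node, on either backend.
const count = 1024;
const positionBuffer  = new THREE.StorageInstancedBufferAttribute( count, 3 );
const positionStorage = THREE.storage( positionBuffer, 'vec3', count );

const computeUpdate = THREE.Fn( () => {
    // nudge every element along x on each dispatch
    const position = positionStorage.element( THREE.instanceIndex );
    position.addAssign( THREE.vec3( 0.01, 0.0, 0.0 ) );
} )().compute( count );

const renderer = new THREE.WebGPURenderer( { forceWebGL: true } ); // WebGLBackend → transform feedback
await renderer.init();
await renderer.computeAsync( computeUpdate );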

I don't know what the final solution should be: just bring back raw shaders for both GLSL & WGSL, or provide some nodes that allow a big text dump containing code, uniforms, attributes, and varyings... It would also be nice to still have easy access to some of the main matrices like model, view & perspective; cameraPosition was always nice to have around too.

I think for raw shaders we could have a class like ShaderNodeMaterial, where we could do the appropriate on-the-fly renaming of the uniforms, buffers, etc. that will be managed by the node system.

We actually have a lot of nodes to handle this in TSL, but I think you'd want to use them as key phrases in the GLSL string, similar to what we do in TSL.

export const cameraNear = /*@__PURE__*/ uniform( 'float' ).label( 'cameraNear' ).setGroup( renderGroup ).onRenderUpdate( ( { camera } ) => camera.near );
export const cameraFar = /*@__PURE__*/ uniform( 'float' ).label( 'cameraFar' ).setGroup( renderGroup ).onRenderUpdate( ( { camera } ) => camera.far );
export const cameraProjectionMatrix = /*@__PURE__*/ uniform( 'mat4' ).label( 'cameraProjectionMatrix' ).setGroup( renderGroup ).onRenderUpdate( ( { camera } ) => camera.projectionMatrix );
export const cameraProjectionMatrixInverse = /*@__PURE__*/ uniform( 'mat4' ).label( 'cameraProjectionMatrixInverse' ).setGroup( renderGroup ).onRenderUpdate( ( { camera } ) => camera.projectionMatrixInverse );
export const cameraViewMatrix = /*@__PURE__*/ uniform( 'mat4' ).label( 'cameraViewMatrix' ).setGroup( renderGroup ).onRenderUpdate( ( { camera } ) => camera.matrixWorldInverse );
export const cameraWorldMatrix = /*@__PURE__*/ uniform( 'mat4' ).label( 'cameraWorldMatrix' ).setGroup( renderGroup ).onRenderUpdate( ( { camera } ) => camera.matrixWorld );
export const cameraNormalMatrix = /*@__PURE__*/ uniform( 'mat3' ).label( 'cameraNormalMatrix' ).setGroup( renderGroup ).onRenderUpdate( ( { camera } ) => camera.normalMatrix );
export const cameraPosition = /*@__PURE__*/ uniform( new Vector3() ).label( 'cameraPosition' ).setGroup( renderGroup ).onRenderUpdate( ( { camera }, self ) => self.value.setFromMatrixPosition( camera.matrixWorld ) );

export const modelDirection = /*@__PURE__*/ nodeImmutable( ModelNode, ModelNode.DIRECTION );
export const modelWorldMatrix = /*@__PURE__*/ nodeImmutable( ModelNode, ModelNode.WORLD_MATRIX );
export const modelPosition = /*@__PURE__*/ nodeImmutable( ModelNode, ModelNode.POSITION );
export const modelScale = /*@__PURE__*/ nodeImmutable( ModelNode, ModelNode.SCALE );
export const modelViewPosition = /*@__PURE__*/ nodeImmutable( ModelNode, ModelNode.VIEW_POSITION );
export const modelNormalMatrix = /*@__PURE__*/ uniform( new Matrix3() ).onObjectUpdate( ( { object }, self ) => self.value.getNormalMatrix( object.matrixWorld ) );
export const modelWorldMatrixInverse = /*@__PURE__*/ uniform( new Matrix4() ).onObjectUpdate( ( { object }, self ) => self.value.copy( object.matrixWorld ).invert() );
export const modelViewMatrix = /*@__PURE__*/ cameraViewMatrix.mul( modelWorldMatrix ).toVar( 'modelViewMatrix' );

@sketchpunk
Author

ShaderNodeMaterial sounds like it could be a good compromise if it gives the end user more control over the GLSL/WGSL and keeps most of it portable. I understand I'll need to do some extra tweaking to port all my raw shaders over, but I don't want it to go so far that the result can't be ported back to any other WebGL/WebGPU based system.

The next question is: how feasible is it to get ShaderNodeMaterial working? The raw shader issue is a big blocker for me, as I need it for things I work on for my employer, and all of my visual debugging tools are written with raw shaders.

@Spiri0
Contributor

Spiri0 commented Nov 1, 2024

@sketchpunk I build very complex apps with raw WGSL code, but maybe I don't understand your concern well enough.
Do you want to use WebGPU only because of the compute shaders, but largely use WebGL, and therefore set forceWebGL = true?
I'm mostly just a user myself, and I see this more as misusing the system. I think three.webgpu.js is intended for targeted use of WebGPU, not for 10% WebGPU because of the compute shaders and then, with forceWebGL = true, 90% WebGL.
I don't want to use this issue as an advertising platform, but I have a repo on GitHub that uses a lot of raw WGSL.

https://github.com/Spiri0/Threejs-WebGPU-IFFT-Ocean-V2?tab=readme-ov-file

I also have several CodePens that show other use cases for raw WGSL. Maybe I can help you switch entirely to WebGPU if you are absolutely dependent on WebGPU functionality, because I don't think deliberately mixing WebGPU with WebGL is a good solution. I personally see the WebGL functionality in three.webgpu.js as a fallback layer and not as a targeted usage option. My expectation is that in the future even this fallback will disappear rather than expand, because WebGL and WebGPU are simply too different to harmonize with each other.

@sketchpunk
Author

@Spiri0
Part of the issue is that the standard WebGLRenderer does not have any support for TransformFeedback or a way to use it to modify GL buffers on the GPU. The WebGL fallback for the WebGPURenderer DOES use transform feedback to provide compute-shader-like functionality, similar to WebGPU's compute.

At the moment I have no real need for WebGPU, but I do need more functionality available for WebGL, as it's more widely available & still runs more stably than WebGPU; I don't know how many times WebGPU examples in three.js have crashed my graphics card by this point.

For both my personal & professional work, I use compute shaders in WebGL through transform feedback, but I have to use a modified version of the library to get GL references to the buffer objects so I can modify them with transform feedback.

In my more public example, I built a mesh autoskinning prototype that runs on a modified 3js library, but everything from data textures to shader compiling and transform feedback execution I had to write from scratch using the raw WebGL API, with three.js really just rendering the results. Ideally, I shouldn't have to do that, since the feature has been around for almost a decade & yet 3js had no support for it... until now, in the new backend, which in the process lost the ability to execute raw shaders, a big requirement if you really want to squeeze out performance or do niche things.

https://sketchpunklabs.github.io/autoskinning/

As a side note... in the new backend, how do I go about grabbing the GL reference of an attribute buffer? I'd like to slowly transition my stuff to the new renderer, but I'd like to do it in incremental steps away from raw WebGL and onto the compute functionality.

@vlucendo
Contributor

vlucendo commented Nov 1, 2024

I think for raw shaders, we could have a class like ShaderNodeMaterial where we could do the on the fly renaming appropriate for the uniforms, buffers, etc. that will be managed by the node system.

Currently, with wgslFn in fragment or vertex nodes, we need to pass all the attributes, varyings, uniforms, dependencies, etc., and it seems to generate a function with all those arguments (I believe there's a 255 limit in WebGPU?). I wonder if things could be simplified with a class that behaves a bit like the current ShaderMaterial: automatically including commonly used matrices, detecting attributes, taking care of uniforms, etc.

@Spiri0
Contributor

Spiri0 commented Nov 1, 2024

@vlucendo I'm far from the binding limit, but bundling data efficiently is a topic in itself. That's why I'm thinking about structs: a struct with 12 bundled uniforms uses only one binding instead of 12 individual ones, and is more efficient.

@sketchpunk I assume this is what you mean with your last point about the matrices. This is pretty similar to a raw shader:

const vertexShaderParams = {
    projectionMatrix: cameraProjectionMatrix,
    cameraViewMatrix: cameraViewMatrix,
    modelWorldMatrix: modelWorldMatrix,
    position: attribute("position"),
}
       
const vertexShader = wgslFn(`
    fn main_vertex(
      	projectionMatrix: mat4x4<f32>,
      	cameraViewMatrix: mat4x4<f32>,
      	modelWorldMatrix: mat4x4<f32>,
      	position: vec3<f32>,
    ) -> vec4<f32> {

      	var outPosition = projectionMatrix * cameraViewMatrix * modelWorldMatrix * vec4f(position, 1);
      
      	return outPosition;
    }
`);

const material = new THREE.MeshBasicNodeMaterial();
material.vertexNode = vertexShader(vertexShaderParams);

and with the positionNode:

const vertexShaderParams = {
    position: attribute("position"),
}
       
const vertexShader = wgslFn(`
    fn main_vertex(
      	position: vec3<f32>,
    ) -> vec4<f32> {

      	var outPosition = vec4f(position, 1);
      
      	return outPosition;
    }
`);

const material = new THREE.MeshBasicNodeMaterial();
material.positionNode = vertexShader(vertexShaderParams);  // the positionNode takes care of the matrices

The big advantage of the matrix nodes is that you no longer have to worry about updates; the node system handles that. Of course, you can also assign the matrices directly from the camera and the model and update them manually, like in WebGL.
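A hedged sketch of that manual alternative (illustrative only; camera and mesh stand for your own scene objects, and uniform( new Matrix4() ) follows the same pattern as modelWorldMatrixInverse in the node source quoted earlier):

// Manual variant: plain uniform() nodes holding matrices you keep in sync yourself.
const uProjection = uniform( camera.projectionMatrix );  // same Matrix4 the camera mutates
const uView  = uniform( new THREE.Matrix4() );
const uModel = uniform( new THREE.Matrix4() );

// before each render:
uView.value.copy( camera.matrixWorldInverse );
uModel.value.copy( mesh.matrixWorld );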

@sketchpunk
Author

@sketchpunk I assume this is what you mean with your last point about the matrices. This is pretty similar to a raw shader.

Yes, RawShaderMaterial was able to update those sorts of values automatically; for anything else that was custom, I had to handle the updates myself.
