Everything posted by havenphillip

  1. Yeah looks pretty straightforward. You use grayscale images. I wonder if you could scroll one of them without moving the others. This would be great for PBR, putting the roughness, metalness, and AO all in one texture. I get it.
  2. Yeah I don't know how to do channels in GIMP but I've seen videos on it. But I'm pretty sure that's the set-up. In the shader it would just look like this:

float col1 = texture(texture0,ex_texcoords0).r;
float col2 = texture(texture0,ex_texcoords0).g;
float col3 = texture(texture0,ex_texcoords0).b;

...where each of the color channels would be a different noise or whatever (sketched below).
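A minimal sketch of that idea, and of the independent-scrolling question above. The texture slot, the time uniform name, and the scroll speed are placeholders, not anything from a real Leadwerks shader:

//fragment stage sketch: three grayscale maps packed into one RGB texture
#version 400
uniform sampler2D texture0;   // the packed texture (placeholder slot)
uniform float currenttime;    // assumed time uniform, in milliseconds
in vec2 ex_texcoords0;
out vec4 fragData0;

void main()
{
	float col1 = texture(texture0, ex_texcoords0).r;                   // stays put
	float col2 = texture(texture0, ex_texcoords0).g;                   // stays put
	vec2 scrolled = ex_texcoords0 + vec2(currenttime * 0.0001, 0.0);   // scroll X only
	float col3 = texture(texture0, scrolled).b;                        // this channel drifts
	fragData0 = vec4(col1, col2, col3, 1.0);                           // just to visualize the three maps
}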
  3. I want to make a PBR shader for these. I've seen some tutorials around. I just haven't gotten to it. I don't know how well Leadwerks would handle it, but I do know that the more textures you add to a shader, the more it eats at your FPS. Albedo is like a diffuse map with the baked-in shading reduced, which is better: you want the lighting and normals to handle the shadows, not the diffuse. Your specular is probably the roughness, or the inversion of the roughness. PBR stuff looks really cool, though.
  4. If you liked vertex displacement, you're going to love tessellation. Tessellation is very similar, but incorporates the control and evaluation stages of the shader. The Leadwerks box brush creates a box that has two triangles per face. If you want to use some kind of vertex displacement on it you're going to have a bad time. With tessellation it's different, because you are able to subdivide the faces with the shader, so you can use the shader to create more structurally complex models. The control stage in the shader does the subdivision, and the evaluation stage can be thought of as a post-control vertex stage - it just resets everything after tessellation has taken place. This "control" over the number of vertices makes the tessellation shader superior, because you can take very simple models and subdivide them to exactly the number of vertices you need without going back and forth between Leadwerks and some modeling program. You can set it to exactly the number you need.

You can see this Leadwerks box here in the material editor. It has two triangles per face. I subdivided it by 64. You can see how much detail you can get with only two triangles: two triangles yield a lot. There is a limit, though. I'm fairly certain 64 is the maximum number of subdivisions you can do, and just like the vertex displacement, the size of the object matters, because the more space you have between vertices, the less crammed the details are. They get too spread out and the tessellation can't keep up with the smaller details. So if you need more than 64 will allow, you will have to subdivide your model a little bit more before you port it into Leadwerks.

One thing I want to mention. I added this uncommented line in the vertex stage:

vec4 modelvertexposition = entitymatrix_ * vec4(vertex_position - vertex_normal * 0.1,1.0);

Notice the above and below picture: subtracting the vertex_normal (multiplied by some small number) from the vertex_position allows you to reduce this gap. That's just a little trick I picked up along the way. I still don't know how to close those seams. I think Josh tried to explain it to me once but I'm not a technical guy; I couldn't make sense of it. But just swap that line in instead of the standard modelvertexposition line and adjust the number. Don't go too far in, though, because otherwise your plane will go beneath the surface of the mesh boundary and it will cast shadows on itself. You can disable Cast Shadows in the material editor and that will go away, but obviously only do that if you don't want to cast shadows at all.

Here's the noobalyzed Leadwerks tessellation shader. It's a nice little Leadwerks gem: 41_tessellation.zip

So think of your control stage as "controlling" the subdivisions, and the evaluation stage has the heightmap and the height strength and all the other things you would have put in the vertex stage in a vertex displacement shader. You can add a time element to your heightmap in the evaluation stage to create a wind effect using textures: 42_sheet in the wind.zip

The sheet in the wind shader is a bit oddball because I haven't yet figured out how to make it two-sided. The problem is that on a two-sided object the height goes in the opposite direction on the back face, so with this shader you get two sheets instead of one consistent sheet. Maybe it's a modeling problem. I don't know. You can set it to two-sided in the material editor, but one of the sides won't react to the flashlight. That kind of thing irks me. It's not enough of a solution for me.
But potentially this could be used for clothes or any other thing you want to blow in the wind. If and when I figure it out I'll let you know. I put the heightmap normals information in the fragment shader from 37_normal from heightmap in both of these shaders. That's what gives the sheet any normals at all. You can sort of see it in the soft shadows in the sheet above. You'll see what I mean if you grab this shader and run it in your game. Also, one important note: I usually set my heightmaps to uncompressed in the texture editor. That makes a huge difference. Otherwise you get this weird sort of voxelized look. Happy shading! Next: Geometry...
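To make the two extra stages a little more concrete, here is a bare-bones sketch of them - not the Leadwerks shader from the zip. The tessellation amount and the comments about where the heightmap goes are placeholders:

//control stage - "controls" how many times each triangle gets subdivided
#version 400
layout(vertices = 3) out;
uniform float tessamount = 16.0;   // placeholder; 64 is roughly the practical ceiling

void main()
{
	// pass the patch corners straight through
	gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
	if (gl_InvocationID == 0)
	{
		gl_TessLevelInner[0] = tessamount;
		gl_TessLevelOuter[0] = tessamount;
		gl_TessLevelOuter[1] = tessamount;
		gl_TessLevelOuter[2] = tessamount;
	}
}

//evaluation stage - behaves like a post-control vertex stage, so this is where
//the heightmap and height strength from the vertex displacement shaders would go
#version 400
layout(triangles, equal_spacing, ccw) in;

void main()
{
	// build the new vertex by blending the three patch corners
	gl_Position = gl_TessCoord.x * gl_in[0].gl_Position
	            + gl_TessCoord.y * gl_in[1].gl_Position
	            + gl_TessCoord.z * gl_in[2].gl_Position;
	// (a real shader would interpolate UVs and normals the same way,
	//  sample the heightmap, and push the vertex out along the normal)
}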
  5. Note: In my last post shaders 36 and 37 weren't working correctly so I replaced them. If you got the old ones you can throw those out and re-download the new ones on the same links.

These shaders are based on the Leadwerks soft particle shader, which I was directed to by Marcousik, and which I noobalyzed into a model shader. What the soft particle shader does is create a soft falloff in opacity depending on how close objects are behind it. To clarify, imagine looking down at a water plane with a terrain beneath it. How far away the terrain is from the water plane determines how much of the effect you see. If the water plane is near the ground along the shore, it is invisible. But on the deep end, where the ground and the water surface are far apart, you get more opacity in the effect. The core of what's going on is that you're getting the distance from the eye or camera to the water plane, and the distance from the eye to the terrain surface, and then you're finding the difference between the two (a rough sketch of that calculation follows at the end of this post).

This shader makes nice fog effects. Just make a plane, set the shader on it, and adjust the height to your liking: 39_ground fog.zip

If you invert the effect using "1-diff" you get a strong opacity near objects, which then fades to alpha the more space there is between its surface and the surface behind it. That's how I was able to make this forcefield shader: 40_forcefield.zip

Happy shading! Next: Tessellation...
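Here is that core calculation as a rough sketch. The depth texture slot, the camera near/far uniform, and the falloff scale are assumptions, not the exact Leadwerks soft particle code:

//fragment stage sketch of the depth-difference falloff
#version 400
uniform sampler2D texture4;   // scene depth buffer (assumed slot)
uniform vec2 camerarange;     // assumed near/far plane distances
uniform vec2 buffersize;      // assumed screen size in pixels
out vec4 fragData0;

// turn a nonlinear depth-buffer value into a linear eye-space distance
float lineardepth(float d)
{
	return (camerarange.x * camerarange.y) / (camerarange.y - d * (camerarange.y - camerarange.x));
}

void main()
{
	vec2 screencoord = gl_FragCoord.xy / buffersize;
	float scenedepth = lineardepth(texture(texture4, screencoord).r);   // the surface behind the plane
	float planedepth = lineardepth(gl_FragCoord.z);                     // this fragment (the fog/water plane)
	float diff = clamp((scenedepth - planedepth) * 0.5, 0.0, 1.0);      // 0 at the shore, 1 in deep water
	fragData0 = vec4(vec3(1.0), diff);   // use diff (or 1.0 - diff for the forcefield look) as your alpha
}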
  6. Simplistically speaking, you could think of the vertex stage of a shader as the "position" of the object, and the fragment stage as the "color." All you really need is position and color. Vertex displacement manipulates the position to warp or move the object. There are many ways to do it, but the whole trick revolves around the ex_vertexposition. You may notice that every model shader has these two lines in the vertex stage:

ex_vertexposition = entitymatrix_ * vec4(vertex_position,1.0);
gl_Position = projectioncameramatrix * ex_vertexposition;

Roughly, the first of these lines takes the vertex out of model space and places it in the world - that becomes the ex_vertexposition - and the second line projects it through the camera and outputs it to the screen. So anything you do with the ex_vertexposition between these two lines is "vertex displacement." I have three examples in this post. The first manipulates the ex_vertexposition using sin(time) to create a blowing wind effect (there's a bare-bones sketch of this at the end of this post). The second uses a texture to displace the vertices. And the third uses the player position to influence the ex_vertexposition. But since whatever you do with the ex_vertexposition between these two lines qualifies as vertex displacement, you can relegate it more to the realm of art than science. It's kind of just whatever "looks good." In the first vertex displacement example, if you go into the vertex stage of the shader and find the "wind" section, there are these long, obnoxious lines. They yield a pretty convincing wind but they're not set in stone. I think a good rule is that it's only wrong if it looks wrong.

First, since we're talking about wind, I thought this would be a good place to drop a subsurface scattering shader. It's not true subsurface scattering, but it's a pretty good approximation. Subsurface scattering is that phenomenon that occurs when you put your hand over a flashlight and you can see the light coming through your hand. The light hits the one side of your hand and "scatters" under the surface of your skin, and comes out the other side as a glow. Set the shrubbery in the scene and then in the Scene tab/General tab scale it to about 5x5x5. Then walk around it to view it from all sides. You'll see the effect: 33_subsurface scattering.zip

You may notice your shadows are blocky and not picking up the leaves. Use this shader to fix that by putting it in the "Shadow" slot in the material editor: 34_leaf shadows.zip

Here's the first vertex displacement shader. The effect occurs in time and I don't know how to make videos yet, so I didn't bother making a picture. If you grab this shader you can just see it for yourself. Put the tree in the scene and scale it to something like 7x7x7. By the way, I made the tree and the shrubbery with TreeIt, which is a free program you can download somewhere. It's pretty cool and it makes the process of making trees and stuff much easier once you understand the program: 35_vertex displacement.zip

Next is a displace by texture shader. You use a simple heightmap in the vertex stage to warp the shape and it yields pretty cool results. A few points to note: vertex displacement warps the shape but it doesn't change the collision information. So don't expect to lay down a plane, slap a displacement shader on it and have an instant terrain (I wish!). Apparently shaders are handled on the GPU and collisions are handled on the CPU, so they don't mix.
But you can still make cool-looking stuff and there is a broad usefulness to it anywhere you want to add 3D effects. It's slower than parallax because you need actual vertices to move, but it doesn't "slip" the way parallax does and you can use it on complex shapes.

Another thing to note about vertex displacement is that the number of vertices and the space between vertices matters. If you have few vertices and they're far apart you're not going to get much of an effect. See the cube - which has only two triangles per face - compared to the sphere, which has a lot of vertices close together: 36_displace by texture.zip

Another thing to note is that the normals don't change with the displacement. You can have a warped plane, but the light reacts to it as if it were still flat. So in the above shader, and here, I added a section which creates a normal map from the heightmap. I then just mix the regular normals and the heightmap normals in the normals section of the fragment shader. You can change the resolution line, but the range of about 200 - 500 appears to be the sweet spot: 37_normal from heightmap.zip

Lastly we have the interactive displacement shader, which is pretty cool, if a bit rudimentary. You know how when you're playing a 3rd person game and you run through a bush, the bush moves? This is a noob version of that. Set the shader on the shrubbery and then attach the script to the same shrubbery. Then drag the player into the slot. When you run through it, it moves a bit. You can adjust the interact power and the radius of the effect. No picture: 38_interactive displacement.zip

Happy shading! Next: Depth Buffer...
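Here is the bare-bones sketch of the sin(time) wind mentioned in the first example. The time uniform name and the wind numbers are placeholders, not the values from 35_vertex displacement:

//vertex stage sketch - everything between the two standard lines is the "displacement"
#version 400
uniform mat4 entitymatrix_;
uniform mat4 projectioncameramatrix;
uniform float currenttime;          // assumed time uniform, in milliseconds
in vec3 vertex_position;
out vec4 ex_vertexposition;

void main()
{
	ex_vertexposition = entitymatrix_ * vec4(vertex_position, 1.0);

	// simple wind: push the vertex sideways by two overlapping sine waves,
	// scaled by the vertex height so the base of the mesh stays planted
	float t = currenttime * 0.001;
	float sway = sin(t * 1.7 + ex_vertexposition.x) * 0.05
	           + sin(t * 3.1 + ex_vertexposition.z) * 0.02;
	ex_vertexposition.xz += sway * max(vertex_position.y, 0.0);

	gl_Position = projectioncameramatrix * ex_vertexposition;
}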
  7. I think so. I set the mask size to 0.1 and I don't think I'm seeing any repetition. At ground level it looks good. The textures themselves repeat.
  8. I don't deal with noise shaders that much but there are a lot of examples on Shadertoy. It has a lot of uses depending on what you want, but very often I see noise being used with raymarching to create 3D scenes. I prefer using textures because I can get what I want out of them and noise is a lot of math (noise shaders can get very large very quickly), but I figured a good shader series ought to have a noise function somewhere. I got this noise from iq of Shadertoy fame. Just a nice little simplex noise: 30_noise.zip You can make a mask out of your noise and mix two textures: 31_noise mask.zip Here's a simple hash function: 32_glitchy console.zip At this point I've almost got all 52 shaders. I may slow down a bit until I know I can plot the rest of the way. Just want to make sure I've got them right before I post them. Happy shading! Next: Vertex Displacement...
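For anyone who just wants to see the shape of a noise function before opening the zips, here is the textbook hash-based value noise. This is a generic sketch, not necessarily the exact code in the shaders above:

// cheap hash: turns a 2D coordinate into a pseudo-random float in [0,1]
float hash(vec2 p)
{
	return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}

// value noise: hash the four corners of the cell and blend between them
float noise(vec2 p)
{
	vec2 i = floor(p);
	vec2 f = fract(p);
	vec2 u = f * f * (3.0 - 2.0 * f);   // smoothstep curve
	return mix(mix(hash(i),                 hash(i + vec2(1.0, 0.0)), u.x),
	           mix(hash(i + vec2(0.0, 1.0)), hash(i + vec2(1.0, 1.0)), u.x),
	           u.y);
}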
  9. Parallax

    Parallax is a cheap way to add a lot of 3D detail to any scene. The fewer vertices on a mesh the better, because parallax doesn't actually add any geometric information. And it requires only the bare minimum number of vertices to work. That makes it fast. And the effect is believable. All you really need is a heightmap and a dream... It's a lot of fun trying different textures with it, and if I could I'd use it for everything.

However, there are a few limitations to parallax. The first problem you run into is that it has trouble with the curve. It works best on a flat plane or a slightly curved surface. I've tried this on boulders and such and there is always a problem somewhere at the grazing angles (see below at the parallax occlusion shader). Secondly, parallax sort of "sinks in" to the texture, so your depth is slightly off, thus when you use this texture on the ground and then put objects on it, you get a slight sense that things are always floating a bit. I've done what I can to reduce this problem and you can get away with using it on floors so long as you are not trying to get extreme parallax depth out of it. POM is awesome, but for floors you may also consider the pixel depth offset shader or the parallax + splat shader.

Noob tips: I think a good general rule of thumb for using parallax is to think in terms of how far your texture details are meant to stick out. In the case of the bricks you're looking at something like an inch from the surface. In that case you can get away with using the base or the classic parallax. But if your details are meant to stick out further (up to about 5 or 6 inches) then you may want to consider parallax occlusion. You can also improve the speed of your parallax shader by scaling down your normal, specular, and height maps in a 2D image manipulation program such as GIMP or Photoshop, because the parallax effect is doing some of the work. And if you're not using a lot of depth, or you're not catching the grazing angles, you can reduce your parallax steps in the shaders for speed.

So on to the shaders: Classic parallax is the first shader in this noob_shader series where we actually do something within the vertex stage of the shader, by adding what's called a "TBN matrix," which allows the texture to shift slightly according to the view angle (see link in shader for more on this). This classic parallax effect should be seen as an extension to the normals because the parallax itself is subtle. I haven't messed with this one much, but since parallax is so cheap this could be used to replace the base shader in many cases to add just a little extra to the scene detail, and I reused this brick texture to imply that connection to the base shader here: 26_classic parallax.zip

Next we have parallax occlusion (or POM). Parallax occlusion takes parallax to a whole new level. Here we use a different version of the TBN matrix - and one I prefer - because it allows me to adjust the degree of the effect at the grazing angle. Increase the value in this line below (in the vertex stage of the shader) and the effect will flatten along the sides relative to the view angle, and this allows you to use parallax on more rounded objects. Smooth, rolling hills, for instance. But you may have to go back and adjust the parallax depth after raising this:

TBN_Matrix[2][2] -= 0.25; //lateral reduction

POM creates several layers or slices of the texture according to the heightmap and stacks them with a "for" loop, creating a 3D effect down "into" the texture.
It then slides these layers around according to the heightmap and the view angle (there's a stripped-down sketch of the loop at the end of this post). You may notice that there are still sharp corners and edges on the mesh. This is because the parallax effect doesn't cross those bounds. It's a trick of the eye, not geometry. Note all the tiny little details you can get in there: 27_parallax occlusion.zip

Next is my attempt at UE4's pixel depth offset (PDO) using gl_FragDepth in GLSL. You can see in the circled area what this shader does. It is essentially the POM, but the heightmap is used to overlap objects in the scene, which furthers the impression of a non-flat surface...using textures on a flat surface. I haven't tried this on anything but a flat floor but I suspect it will work similarly to the POM shader, so it may work on rolling hills and such. Previous versions of this shader worked fairly well on stuff like that. The rocks overlapping the wall near the bottom are actually just the flat texture. One word of note: the gl_FragDepth affects the plane slightly, so when you set objects on this surface, manually push them down into it a little bit. 28_parallax pixel depth offset.zip

Here's a parallax + splat shader. I kind of love this thing. Want to put bullet holes on a stop sign or cracks in the sidewalk for cheap? This is the shader to do that. I sometimes refer to it as the "damage" shader because that's essentially the effect this shader yields. And this is probably the most widely useful of these four because the majority of the surface is at the correct depth. I've tried this with success even on very angled surfaces and it works pretty well even there. Note the bool in the shader. This allows you to "puncture" the mesh or not. Use true for bullet holes and such, or false if you don't want something to go all the way through. Also make sure your alpha masks are set to something other than DXT1 and you're not using the .jpg image format. 29_parallax + splat.zip

Happy shading! Next: Simplex Noise...
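Here is that stripped-down sketch of the POM loop. The heightmap slot, step count, and depth scale are placeholders, and the real shaders do more (the TBN matrix, the lateral reduction, a final refinement step, and so on):

// fragment stage: march down into the heightmap along the tangent-space view direction
uniform sampler2D texture3;        // heightmap (placeholder slot)
const int   numsteps = 16;         // more steps = smoother but slower
const float depthscale = 0.05;     // how deep the effect sinks into the surface

vec2 ParallaxOcclusion(vec2 uv, vec3 viewdir_tangent)
{
	float layerdepth = 1.0 / float(numsteps);
	vec2  shift      = viewdir_tangent.xy / viewdir_tangent.z * depthscale / float(numsteps);

	float currentdepth = 0.0;
	float heightsample = 1.0 - texture(texture3, uv).r;
	// step through the "slices" until the ray dips below the heightmap
	for (int i = 0; i < numsteps; i++)
	{
		if (currentdepth >= heightsample) break;
		uv           -= shift;
		heightsample  = 1.0 - texture(texture3, uv).r;
		currentdepth += layerdepth;
	}
	return uv;   // use these shifted coordinates for all your texture lookups
}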
  10. Masking is pretty easy and useful for adding some variety to your shaders. Masks are usually just some greyscale image that is turned into a float. In shaders floats have a very broad utility because you can add them just about anywhere. Super useful. And they can be used to blend textures or define areas in your material. Adding a mask in code is the same as with any texture. The greyscale quality allows you to use a single channel of that greyscale as a representative of all the channels. For instance the red channel. In code it looks something like:

float mask = texture(texture5,ex_texcoords0).r;

In blending two textures you need the two textures you want to blend and a third mask texture to blend them by. Once you've established them in the shader you simply do this in the output line:

fragData0 = mix(outcolor,outcolor2,mask);

That's it. The mix function is pretty much the whole trick. But there are other things you can do. One variation of texture blending is by using an alpha mask in what I believe is referred to as "texture splatting." If you set your texture with an alpha channel in your image manipulation program, and then in the texture editor set the texture compression to something other than DXT1, you can get an alpha blended texture with a smooth falloff (the alpha won't affect the whole object, just the texture). Make sure when you're doing this you don't use .jpg image types, as jpegs don't carry the alpha channel across. I use .png. In code that looks like this:

float mask = texture(texture5,ex_texcoords0).a;

Note I used ".a" at the end to signify the alpha channel. Here's a nice moldy example of that: 21_texture splat.zip

Below is your standard two-color blend. It gets a little tricky adding the normals. I think in this one I just did the entire normals section twice, but it really just comes down to mixing the normals by the mask, so I believe you could get away with something like this:

vec3 normal = ex_normal;
normal = normalize(texture(texture1,ex_texcoords0).xyz * 2.0 - 1.0);
vec3 normal2 = normalize(texture(texture6,ex_texcoords0).xyz * 2.0 - 1.0);
normal = mix(normal,normal2,mask);
normal = ex_tangent * normal.x + ex_binormal * normal.y + ex_normal * normal.z;
normal = normalize(normal);

22_two-color blend.zip

This three-color mask actually uses an RGB mask rather than a greyscale. The shader looks exhausting but really I'm just doing the same thing over again three times - once for each color (see the sketch after this post). You may notice on the right there that I ran out of spaces so I just use a generic method for the specular. It's kind of cool that you can just change where the colors show by changing the mask. Make sure in making masks for this that you use really, really blue, really, really red, and really, really green, otherwise your textures will blend into areas you don't want them: 23_three-color blend.zip

Here I used a cubemap to get some reflective little puddles in the mix, and mixed the specular and normals by the mask without changing the diffuse: 24_bepuddled terrain.zip

This one doesn't even use a mask but I think it's cool. It uses "normal.y" as a means of mixing textures. So if you want to add snow to the top of a rock, for instance, you can use this shader to do so. The blend settings are adjustable: 25_slope blend.zip

Open your two-color blend shader and find the "float mask =.." line around line 54 and change it to " float mask = 1.0-normal.y; " ("1.0 - something" just inverts the thing), compile (press the "play" button on the Script Editor), and watch what happens.
Or change that line to " float mask = gl_FragCoord.w + 0.35; " and check that out. Once you figure out your float you can pretty much mix your textures by anything. Happy shading! Next: Parallax
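Here's roughly what the heart of the three-color blend looks like when you strip it down. The texture slots are placeholders, and the real shader also repeats this for the normals and specular:

// fragment stage: blend three diffuse textures by the R, G and B of a mask texture
uniform sampler2D texture0;   // diffuse 1 (placeholder slots)
uniform sampler2D texture6;   // diffuse 2
uniform sampler2D texture8;   // diffuse 3
uniform sampler2D texture5;   // RGB mask

vec4 BlendByMask(vec2 uv)
{
	vec3 mask = texture(texture5, uv).rgb;
	vec4 col1 = texture(texture0, uv);
	vec4 col2 = texture(texture6, uv);
	vec4 col3 = texture(texture8, uv);
	vec4 outcolor = col1;                    // start with the first texture
	outcolor = mix(outcolor, col2, mask.g);  // green areas show texture 2
	outcolor = mix(outcolor, col3, mask.b);  // blue areas show texture 3
	return outcolor;
}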
  11. Creek

    One of those things I wanted to make - and a big reason I think many people get involved in shaders - is water. If you've been following along and grabbing these shaders you now have almost all the parts you need to make a basic water plane. If you understand these parts then you can manipulate them to customize your water as you see fit. I'm going to spend a little time in explanation here.

In making this shader I start with the base shader and add a texture scroll effect on the normals (02_texture scroll). I then duplicate this texture and scroll it in the opposite direction to give the impression of rippling water (a rough sketch of this double scroll sits at the end of this post). I then add a creek direction so they both flow together down river as they criss-cross each other, and that gives the impression of flowing water. I add a refraction effect to further the illusion and use the normal.z to adjust the refraction amount. Normal.z sort of intensifies the drama of your normals in terms of depth. Using it with refraction warps the pixels behind the plane to a greater degree. With the refraction effect added you get a nice creek base.

Here I've added a noobalyzed Leadwerks refraction shader, cutting some of the fat, so to speak, so that it is a bit more palatable to those wanting to learn shaders. Thank God Leadwerks includes a refraction shader, because I think the process of working it out would have been ridiculous: 18_refraction.zip

However, the Leadwerks refraction shader multiplies the normals by a camerainversenormalmatrix, which causes problems when you start trying to add things to it. So I came up with a solution that makes refraction work on the top of a box plane, but not necessarily on any of the sides. If you're making a water plane then that works perfectly, and it's a lot simpler than the above shader. I call it 19_creek base: 19_creek base.zip

In developing a creek from this, the first thing I did was find the refraction line and multiply it by some color. And then I multiply that by some float to give me control over the brightness of the water color:

vec4 refraction = texture(texture10, screencoord);
vec3 water_color = vec3(0.8,1.0,1.0);
refraction *= vec4(water_color,1.0);
refraction *= 0.8;

I then add a Phong effect to get a light reflection off the normals. Phong is very similar to Blinn-Phong but is half the size, and is eerily similar to a fresnel effect. The difference is that in Blinn-Phong the light moves a bit with the eye. We don't really want that since we're simulating a distant sun lighting the plane. Under the refraction section I add this:

//Phong
vec3 lightvec = normalize(lightposition - projectionmatrix[3].xyz);
float phong = dot(normal,lightvec);
phong = pow(phong, 2000);

... and change the output line to this so I can see what's going on:

... fragData0 = mix(vec4(vec3(phong),1.0),refraction,0.5);

You can see above I subtracted the projectionmatrix (ex_vertexposition.xyz also works) from the light position, got the dot product of this light direction and the normals, and multiplied the phong by some power. I use a ridiculously high number to sharpen up the effect.
Increase the light position.y to about 8.0 or 10.0 to shine better on the top of the plane and you've got this effect (honestly it looks better in motion).

Next I want to get a reflection, so I add a cubemap (14_cubemap reflection) for that:

//Cubemap
vec3 eyevec = normalize(ex_vertexposition.xyz - cameraposition);
vec3 reflection = normalize(reflect(eyevec,normal));
vec4 cube = texture(texture7,reflection);

The cubemap seemed a little dull to me, so I use this trick to quickly increase the contrast in the texture, and you may see this around in some of my shaders:

float contrast = dot(cube.xyz,vec3(1.0))-0.5;
cube.xyz *= contrast;

After I have done this I mix the Phong and the cubemap together so that I can adjust them together. Multiply each by some number to increase their brightness and I have a nice - albeit metallic-looking - creek:

cube = mix(cube*1.5,vec4(vec3(phong),1)*2,0.5);
...fragData0 = mix(cube,refraction,0.5);

To resolve this metallic issue I use a fresnel (09_fresnel) to mix the cubemap reflection and the refraction, to produce a smooth gradient falloff of transparency to reflectivity according to the camera view of the water. That sounds more complex than it is:

//Fresnel
float ndotv = dot(eyevec,normal);
ndotv = pow(ndotv, 0.55);
...fragData0 = mix(cube,refraction,ndotv);

Note that I didn't establish an eyevec in the fresnel because I can reuse the eyevec established in the cubemap reflection. End result: 20_creek.zip

(The above pictures show the Leadwerks cubemap being used in place of the one I provided, and it looks a lot better because there is more going on in the sky.)

Lastly, you can take some of these numbers you've added and make variables of them at the top so they're all together. The water_color, for instance, feels nicer to me when I can put it at the top, open the shader and adjust it right away without having to go find it in the shader.

So there's a creek, or a basic water. Not that much at all once you're familiar with each of the parts. And if you can understand these parts and how to fit them together you are a true noob shader...er. Happy shading! Next: Masks and Texture Blending...
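One part described above but not quoted is the double texture scroll on the normals. A rough sketch of that piece, with an assumed time uniform and made-up speeds and directions:

// fragment stage: two copies of the normal map drifting down-river at different rates
uniform sampler2D texture1;    // normal map
uniform float currenttime;     // assumed time uniform, in milliseconds

vec3 CreekNormal(vec2 uv)
{
	vec2 creekdir = vec2(0.0, 1.0);                                            // direction of flow
	vec2 uv1 = uv + creekdir * currenttime * 0.00005;                          // layer 1
	vec2 uv2 = uv * 1.3 + (creekdir + vec2(0.15, 0.0)) * currenttime * 0.00003; // layer 2, different scale and angle
	vec3 n1 = texture(texture1, uv1).xyz * 2.0 - 1.0;
	vec3 n2 = texture(texture1, uv2).xyz * 2.0 - 1.0;
	return normalize(n1 + n2);   // criss-crossing ripples
}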
  12. It's cube mapping https://adventuresinrendering.wordpress.com/2014/02/06/texture-mapping-and-image-based-lighting/
  13. Another simple shader technique is the skybox (or cubemap) reflection. Cubemaps are useful for creating quick and easy reflective surfaces for objects, and work particularly well on curved surfaces or objects with bumpy normals. You can create an impression of a metallic, wet, or polished look quite easily (provided you have a way to generate cubemaps). In code, they are about as easy as fresnel, which gives them something of a broad utility. Once you have the technique, you can treat it as a "part" to add to other shaders. Here's the cubemap section in the shader:

vec3 eyevec = normalize(ex_vertexposition.xyz - cameraposition);
vec3 reflection = normalize(reflect(eyevec, normal));
vec4 cube = texture(texture7,reflection);

Not much at all. Below is my noob skybox reflection shader. Note the vec3 effect is put in "fragData2", which handles specular and emission. Basic noob specular output can be written like this: "fragData2 = vec4(0,0,0,1);". Technically, it is only the alpha channel in the output which handles the specular. The ".xyz" of the output handles emission. So anything you put there will become emissive, like a light. Try setting fragData2 to something like this: "fragData2 = vec4(1,0,0,1);". Now your shader glows red and you've learned emission in shaders - you shader artist, you: 14_skybox reflection.zip

Add a fresnel to a skybox reflection and you get a metallic chrome effect (roughly sketched after this post): 15_metal.zip

I used a little contrast trick I learned from the desaturation shader and created a specular reflectance effect using a cubemap and a blinn-phong. It yields a polished look, like wet or waxed paint: 16_polished.zip

Below I combined the fresnel, blinn-phong and cubemap reflection to revisit the pearl shader. As you can see, for the most part, shaders are just a lot of smaller parts combined to produce some effect. It's a lot of mixing and matching. But once you have those parts you can reuse them to your heart's content. It is my goal that by the end of this series I will have given enough of these parts that those wanting to learn shaders can do just about anything they can think of, and if I succeed in getting one guy up that initial step then this endeavor has been a success. Happy shading! 17_pearl revisited.zip

Next: Creek...
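Putting the fresnel and the cubemap together for the metal look goes roughly like this. The cubemap slot, the fresnel power, and the mix amounts are placeholders, not the exact 15_metal code:

// fragment stage: cubemap reflection pushed toward a chrome look with a fresnel term
uniform samplerCube texture7;    // cubemap (placeholder slot)
uniform vec3 cameraposition;
in vec4 ex_vertexposition;

vec4 MetalColor(vec4 outcolor, vec3 normal)
{
	vec3 eyevec     = normalize(ex_vertexposition.xyz - cameraposition);
	vec3 reflection = normalize(reflect(eyevec, normal));
	vec4 cube       = texture(texture7, reflection);

	float ndotv   = clamp(dot(normal, -eyevec), 0.0, 1.0);
	float fresnel = pow(1.0 - ndotv, 2.0);             // stronger reflection at grazing angles

	return mix(outcolor, cube, 0.5 + 0.5 * fresnel);   // always somewhat reflective, more at the edges
}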
  14. Here's the post where he shows the script. I copied it down:

window = Window:Create("terrain example",0,0,800,600,Window.Titlebar+Window.Center)
context = Context:Create(window)
world = World:Create()

camera = Camera:Create()
camera:SetPosition(0,5,-5)
camera:SetMultisampleMode(8)
skybox = Texture:Load("Materials/Sky/skybox_texture.tex")
camera:SetSkybox(skybox)

light = DirectionalLight:Create()
light:SetRotation(35,35,0)

terrain = Terrain:Create(64,true)
terrain:SetLayerTexture(0, Texture:Load("Materials/Developer/Bluegrid.tex"), 0)
terrain:SetScale(1,100,1)

ball = Model:Sphere(32)
ballmat = Material:Load("Materials/Developer/Orangegrid.mat")
ball:SetMaterial(ballmat)
ball:SetScale(4,4,4)
shape = Shape:Sphere(0,0,0, 0,0,0, 1,1,1)
ball:SetShape(shape)
shape:Release()
ball:SetCollisionType(Collision.Prop)
ball:SetSweptCollisionMode(true)
ball:SetPosition(24,6,0)
ball:SetMass(1)

camera:Point(ball)

while window:KeyDown(Key.Escape)==false do
    if window:Closed() then break end

    pos = ball:GetPosition(true)
    camera:SetPosition(pos.x,6,pos.z-10)
    camera:Point(ball)

    if terrain~=nil then
        for x = 0, 64 do
            local y = Time:Millisecs()/600
            for z = 0, 64 do
                local height = math.sin(y+x) / 40
                terrain:SetHeight(x,z,height)
            end
        end
        terrain:UpdateNormals()
    end

    Time:Update()
    world:Update()
    world:Render()
    context:Sync(true)
end
  15. That's possible. I don't know how to do it. But I guess you could write the waves in the script. Just a sin wave on the surface. If we could do that we'd have wave height. Macklebee did something like that:
  16. If you took a bowl full of water and looked straight down into the bowl, the water would be clear. But if you were to then move the bowl up to your eye and look from the side along the surface of the water, you'd see it becomes highly reflective. That's fresnel. It's a cool effect. And it's easy to code. You add a uniform vec3 cameraposition towards the top and a fresnel color, and then a fresnel effect is little more than this:

vec3 eyevec = normalize(cameraposition - ex_vertexposition.xyz);
float ndotv = dot(normal,eyevec);

After that all you need to do is output it. I put it in the specular output (fragData2), but you can put it in the diffuse output as well. You just mix these two colors by the fresnel effect (or the 1.0-fresnel effect):

fragData0 = mix(outcolor,fresnelcolor,ndotv);

...and there it is. Effects like this are nice because you can just keep tweaking them to make different things: 09_fresnel.zip

So if fresnel is a step up from the base shader, blinn-phong seems to me like a logical second step. Blinn-phong puts a "dot" on an object which simulates a light. I find this useful in particular if you want to z-sort a material to gain semi-transparency. For instance, glass. The problem is once you z-sort it you lose specular information. So you can add a blinn-phong to your object and maintain specular. You can also adjust the size of the dot to give the impression of "shininess" like you can with specular. But one thing you can do quite easily with this is change the color of the light. Later I would like to write a shader that allows for multiple blinn-phong lights. I haven't gotten around to it.

So in the fresnel you established the normalized difference between the cameraposition (eye) and the object position (ex_vertexposition). With blinn-phong you also want to establish the normalized difference between the object position and a light source (projectionmatrix). You then add the fresnel to this, get your dot product just like in the fresnel, and then mix this into an output line (fragDatasomething). There's a bare-bones sketch of it after this post. Blinn-Phong: 10_blinn-phong.zip

If you stagger the light falloff in the blinn-phong you can create a toon lighting effect: 11_toon lighting.zip

Here you have two colors mixed by a fresnel effect to create a pearl effect: 12_pearl.zip

Here's a blinn-phong with alpha: 13_blinn-phong + alpha.zip

Next: Skybox Reflections...
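Here is that bare-bones blinn-phong sketch. The light position, color, and shininess exponent are made-up placeholder numbers, not the values from the zip:

// fragment stage: a single blinn-phong highlight
uniform vec3 cameraposition;
in vec4 ex_vertexposition;

vec3 BlinnPhong(vec3 normal, vec3 lightcolor)
{
	vec3 lightposition = vec3(0.0, 10.0, 0.0);                           // placeholder light
	vec3 lightvec = normalize(lightposition - ex_vertexposition.xyz);    // surface-to-light
	vec3 eyevec   = normalize(cameraposition - ex_vertexposition.xyz);   // surface-to-eye
	vec3 halfvec  = normalize(lightvec + eyevec);                        // the "blinn" half vector

	float ndoth = max(dot(normal, halfvec), 0.0);
	float spec  = pow(ndoth, 64.0);   // bigger exponent = smaller, sharper dot
	return lightcolor * spec;
}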
  17. Leadwerks 4 is one of the best purchases I've ever made. I've spent thousands of hours on it, and most of that time has been spent dinking around with shaders. I'm not great at modeling and animation. I'm ok with scripts. What seems to draw me the most is shaders. I love shaders. I guess in some way here I just want a place to show off my accomplishments, but I also want to feel like I benefit other people, so my philosophy here is that it's cool to be one guy who can write shaders, but it's cooler to be one guy who can help several other guys write shaders. So I had this idea that I would attempt to make one shader a week for a year - fifty-two of them - and then post them in the community. So that's what this blog will be. And who knows, maybe I'll keep going.

In the process of writing these I've scoured the internet and tried to keep track of any sources for those who might want to investigate some principle further. Many of these sources are tutorials and such. I numbered the shaders roughly with the intent to order them from basic to more complex principles. I commented sections in the shaders to help make them a bit easier to read and understand, and I tried to keep the language consistent from one shader to the next so that it can easily be seen how different parts are used in various effects. Apart from a few post-effect shaders, I start every shader from a "noobalyzed" version of the Leadwerks "diffuse + normal + specular" shader as a base. With all that, I still consider myself a noob as far as coders go - hence "noob shaders." So if you're also a shader noob then follow this blog, grab the shaders, read the information from the links, tweak the numbers, dink around, and eventually it will start to click together. At the least you'll have a nice little stash of shaders.

Ok, so let me start with a few preliminaries. I want to bunch them up together and get them out of the way. First off, the base shader, which I consider the "bare bones" shader. It is a "noobalyzed" version of the Leadwerks "diffuse + normal + specular" shader. 00_base.zip

Here are a few quick effects I picked up from various places. They're not very useful, perhaps, but they're a good start on manipulating variables to create effects. 01_quick effects.zip

And here are a few post-effect shaders. The contrast, desaturation, and gamma correct are all found in the quick effects shader. You can see that all these shaders look extremely similar. The effects are the only difference. In some sense all that matters is what you set in your output line (out vec4 fragData0). Simplistically speaking, that's the way shaders are. They're all pretty much the same. It's the stuff you add that makes them interesting.

Edit: forgot to add shader 2. Texture scroll: 02_texture scroll.zip

Possibly the worst contrast ever. But hey, maybe you want that: 03_contrast.zip

Desaturate dulls the colors toward grey (there's a generic sketch of this after this post): 04_desaturation.zip

Saturation intensifies the colors: 05_saturation.zip

Gamma correct is the way the light "is supposed to look": 06_gamma correct.zip

Color bleed is the idea that colors ought to bleach out a bit in direct light: 07_color bleed.zip

Sobel because why not: 08_sobel filter.zip

Next: Fresnel and Blinn-Phong...
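As a taste of how small these post effects really are, here is the generic desaturation idea mentioned above. The texture slot, varying name, and slider uniform are placeholders, not the exact 04_desaturation code:

//post-effect fragment stage sketch: pull the scene color toward grey
#version 400
uniform sampler2D texture1;    // rendered scene color (placeholder slot)
uniform float amount = 0.5;    // 0 = untouched, 1 = fully grey
in vec2 vTexCoords0;           // placeholder screen UV varying
out vec4 fragData0;

void main()
{
	vec4 scene = texture(texture1, vTexCoords0);
	float grey = dot(scene.rgb, vec3(0.299, 0.587, 0.114));   // standard luminance weights
	fragData0 = vec4(mix(scene.rgb, vec3(grey), amount), scene.a);
}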
  18. Randomly generated? That's crazy. It totally feels natural.
  19. Ok I got it working. No script needed. Just slap it on an object. Works with posteffects now and no crazy watermode acrobatics. It's a repurpose of the Leadwerks soft particle shader. I set the material settings to blend mode:Alpha, uncheck cast shadows, check two-sided, check depth test, uncheck depth mask, check z-sort, check pick mode. forcefield.zip
  20. Yeah I'm just stockpiling right now. I have a bunch of shaders I'll put out in a while. I don't think it will matter that much since I think everyone's moved on from LE4. I get the benefit of having put in the effort, though.
  21. That's ok, man. I got the soft particle thing and the forcefield working using this method. The problem I was having was because I had other shaders in my scene using texture4 for something other than depth. They work perfectly now. I don't know why Josh wouldn't just set up a "depth mode" as a permanent thing so that people could use depth buffers for cool stuff. Leadwerks' aversion to shaders is self-defeating, imo.
  22. I made this forcefield shader using the watermode depth buffer which works great. I replaced the Leadwerks water shader with this dummy shader:

//fragment
#version 400
//Uniforms
uniform sampler2D texture4;// depth

void main(void)
{
    float depth = textureProj(texture4,vec3(1)).r;
}

//vertex
#version 400
#define MAX_INSTANCES 256

void main()
{
}

... and I used this script to turn on watermode and set the waterplane height:

function Script:Start()
    world = World:GetCurrent()
    world:SetWaterMode(true)
    world:SetWaterQuality(1)
end

function Script:UpdatePhysics()
    world:SetWaterHeight(3.0)
end

... everything works great and the shader works with post effects, which is something I couldn't get accomplished before, AND I'm getting excellent FPS. The problem is I lose the effect as soon as the water plane goes off camera. So I'm thinking if there is a way to create a "depth mode" which keeps the (invisible) water plane on camera at all times that would open up a nice little corner for using the depth buffer for things like a forcefield, water foam, water fog, ground fog, and soft particle falloff effects such as seen here: http://blog.wolfire.com/2010/04/Soft-Particles
  23. That's ok. I'm not a shader expert either. I appreciate you taking a stab at it. It would be nice to solve it just because it irritates me to find this apparent limitation. I'm hoping someone will read this and be able to tell me at least why it happens. It seems if I set the player (camera) in the scene and then create the depth buffer object, I can get away with putting the camera variable in the Start() function. But when I turn the player's head the depth buffer wobbles around in odd ways. If I remove the player and put him back in the scene, the depth buffer stops moving around, but I then have to grab the camera in the UpdateWorld() function. I don't know if that has to do with the problem and might offer a clue, or if it's just a whole other problem.