Render-To-Texture with Vulkan in Leadwerks 5

Josh

In Leadwerks 4, render-to-texture was accomplished with the SetRenderTarget command, which let a camera draw directly to a specified texture while hiding the underlying framebuffer object (FBO). The new engine handles this behavior a bit more explicitly, largely due to Vulkan's bindless design, which greatly improves on OpenGL's context-binding model. The Leadwerks "Buffer" class was never documented or officially supported because the underlying OpenGL functionality made the system pretty messy, but Vulkan's design simplifies this aspect of graphics.

We have seen that the Framebuffer class replaces the LE4 context. I've added a TextureBuffer class, which is created in a similar manner:

shared_ptr<TextureBuffer> CreateTextureBuffer(const int width, const int height, const int colorcomponents = 1, const bool depthcomponent = true, const int samples = 0);
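
For example, this creates a 512x512 texture buffer with one color texture and a depth texture, matching the defaults above:

auto texbuffer = CreateTextureBuffer(512, 512, 1, true);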

Once a TextureBuffer is created, you can set a camera to target it for rendering:

camera->SetRenderTarget(texbuffer);

You can also apply its color component(s) to a material:

material->SetTexture(texbuffer->GetColorBuffer(0), TEXTURE_BASE);

You could also retrieve the depth buffer and apply that to a material, rendering the scene from the top down and using the depth in a rain or snow shader, for example.
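
Here is a minimal sketch of that idea. This assumes a GetDepthBuffer() accessor analogous to GetColorBuffer(), and the topcam and rainmaterial objects are hypothetical; a real rain or snow shader would sample the depth texture from whatever slot you bind it to:

//Render the scene from above into a texture buffer with a depth attachment
auto depthbuffer = CreateTextureBuffer(1024, 1024, 1, true);
topcam->SetRenderTarget(depthbuffer);
topcam->SetRotation(90, 0, 0); //point the camera straight down

//Apply the captured depth to the material (GetDepthBuffer() is assumed here)
rainmaterial->SetTexture(depthbuffer->GetDepthBuffer(), TEXTURE_BASE);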

This functionality will later be used to render the GUI system to a texture for use in VR or with in-game menus painted onto 3D surfaces.

Like everything with Vulkan, this involved a very long process of figuring out exactly what we need to use, discarding what we don't, and packaging it all up in a structure the end user can actually work with. Once all that is done, however, we have a very powerful system that is optimized for exactly the way modern GPUs work. Here is a small sample of my code, just to give you an idea of how complicated this stuff is:

		for (auto pair : visset->cameravislists)
		{
			auto cam = pair.first;
			clear[1].color = { cam->clearcolor.r, cam->clearcolor.g, cam->clearcolor.b, cam->clearcolor.a };

			auto light = dynamic_pointer_cast<RenderLight>(cam);
			if (light == nullptr and cam->rendertarget == nullptr) continue;
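			//Select render passes: the shadow pass for lights, a clear-color/clear-depth pass for texture targets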
			renderpass[0] = device->shadowpass;
			renderpass[1] = device->renderpass[CLEAR_COLOR | CLEAR_DEPTH];							
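			//Point lights draw six cube faces, unless a single-pass cubemap path handles all faces at once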
			int faces = 1;
			if (light)
			{
				if (light->description.type == LIGHT_POINT) faces = 6;
			}
			if (MULTIPASS_CUBEMAP) faces = 1;
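			//Begin one render pass per face, targeting either the light's shadow buffer or the camera's texture buffer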
			for (int face = 0; face < faces; ++face)
			{
				renderPassBeginInfo.clearValueCount = 2;				
				if (light)
				{
					renderPassBeginInfo.renderPass = device->shadowpass->pass;
					if (light->description.type == LIGHT_POINT and MULTIPASS_CUBEMAP == true)
					{
						renderPassBeginInfo.renderPass = device->cubeshadowpass->pass;
					}
					renderPassBeginInfo.framebuffer = light->shadowbuffer[face]->framebuffer;
					renderPassBeginInfo.renderArea.extent.width = light->shadowbuffer[face]->size.width;
					renderPassBeginInfo.renderArea.extent.height = light->shadowbuffer[face]->size.height;
				}
				else
				{
					renderpass[0] = device->renderpass[CLEAR_COLOR | CLEAR_DEPTH];
					int cc = cam->rendertarget->CountColorTextures();
					renderPassBeginInfo.renderPass = device->rendertotexturepass[cc][int(cam->rendertarget->depthtexture != nullptr)]->pass;
					renderPassBeginInfo.framebuffer = cam->rendertarget->framebuffer;
					renderPassBeginInfo.renderArea.extent.width = cam->rendertarget->size.width;
					renderPassBeginInfo.renderArea.extent.height = cam->rendertarget->size.height;
				}
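				//Record this camera's visible objects into the render pass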
				vkCmdBeginRenderPass(commandbuffers[currentFrame]->commandbuffer, &renderPassBeginInfo, VK_SUBPASS_CONTENTS_INLINE);
				RecordDraw(currentFrame, cam, pair.second, renderpass[0], face);
				commandbuffers[currentFrame]->EndRenderPass();
				if (light) commandbuffers[currentFrame]->BindResource(light->shadowbuffer[face]);

				//Copy output to render texture
				if (cam->rendertarget)
				{
					for (int n = 0; n < cam->rendertarget->colortarget.size(); ++n)
					{
						if (cam->rendertarget->colortarget[n] != nullptr)
						{
							commandbuffers[currentFrame]->TransitionImageLayout(cam->rendertarget->colortexture[n], VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL, VK_IMAGE_LAYOUT_TRANSFER_SRC_OPTIMAL, VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT, VK_PIPELINE_STAGE_TRANSFER_BIT, -1);
							commandbuffers[currentFrame]->TransitionImageLayout(cam->rendertarget->colortarget[n], VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL, VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL, VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT, VK_PIPELINE_STAGE_TRANSFER_BIT, -1);

							VkImageCopy regions = {};
							regions.dstOffset = {0u,0u,0u};
							regions.extent = { uint32_t(cam->rendertarget->colortarget[n]->size.x), uint32_t(cam->rendertarget->colortarget[n]->size.y), 1u};
							regions.srcOffset = regions.dstOffset;
							regions.dstSubresource.aspectMask = VK_IMAGE_ASPECT_COLOR_BIT;
							regions.dstSubresource.baseArrayLayer = 0;
							regions.dstSubresource.layerCount = 1;
							regions.dstSubresource.mipLevel = 0;
							regions.srcSubresource = regions.dstSubresource;
							vkCmdCopyImage(commandbuffers[currentFrame]->commandbuffer, cam->rendertarget->colortexture[n]->vkimage, VK_IMAGE_LAYOUT_TRANSFER_SRC_OPTIMAL, cam->rendertarget->colortarget[n]->vkimage, VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL, 1, &regions);
							commandbuffers[currentFrame]->TransitionImageLayout(cam->rendertarget->colortarget[n], VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL, VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL, VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT, VK_PIPELINE_STAGE_TRANSFER_BIT, -1);
						}
					}
				}
			}
		}

Below is a simple Lua program that sets up a scene with two cameras and renders one of them to a texture buffer, which is then displayed on the box in the middle of the scene.

--Get the primary display
local displaylist = ListDisplays()
local display = displaylist[1];
if display == nil then DebugError("Primary display not found.") end
local displayscale = display:GetScale()

--Create a window
local window = CreateWindow(display, "Render to Texture", 0, 0, math.min(1280 * displayscale.x, display.size.x), math.min(720 * displayscale.y, display.size.y), WINDOW_TITLEBAR)

--Create a rendering framebuffer
local framebuffer = CreateFramebuffer(window);

--Create a world
local world = CreateWorld()

--Create second camera
local texcam = CreateCamera(world)
texcam:SetClearColor(1,0,1,1)

--Create a camera
local camera = CreateCamera(world)
camera:Move(0,0,-2)
camera:SetClearColor(0,0,1,1)

--Create a texture buffer
local texbuffer = CreateTextureBuffer(512,512,1,true)
texcam:SetRenderTarget(texbuffer)

--Create scene
local box = CreateBox(world)

local cone = CreateCone(world)
cone:SetPosition(2,0,0)
cone:SetColor(1,0,0,1)

local sphere = CreateSphere(world)
sphere:SetPosition(-2,0,0)
sphere:SetColor(0,1,0,1)

--Create render-to-texture material
local material = CreateMaterial()
local tex = texbuffer:GetColorBuffer()
material:SetTexture(tex, TEXTURE_BASE)
box:SetMaterial(material)

--Create a light
local light = CreateLight(world,LIGHT_DIRECTIONAL)
light:SetRotation(35,-55,0)

--Main loop
while window:Closed() == false do

	texcam:SetPosition(0,0,0)
	texcam:Turn(0,1,0)
	texcam:Move(0,0,-2)

	world:Update()
	world:Render(framebuffer)

end

Here is the result. Look how simple it is to control this powerful system!

[Screenshot: the rotating camera's view rendered onto the box in the center of the scene]
