
GUI Resolution Independence 2

Josh


Here are some screenshots showing more complex interface items scaled at different resolutions. First, here is the interface at 100% scaling:

[Screenshot: the interface at 100% scaling]

And here is the same interface at the same screen resolution, with the DPI scaling turned up to 150%:

[Screenshot: the same interface at 150% DPI scaling]

The code to control this is somewhat complex, and I'm fine with that. GUI resolution independence is a complicated thing, so the goal should be to create a system that does what it is supposed to do reliably, not to make complicated things simpler at the expense of functionality.

function widget:Draw(x,y,width,height)
	--The GUI scale factor (1.0 at 100% DPI scaling, 1.5 at 150%, etc.)
	local scale = self.gui:GetScale()

	--Primitives 1 and 2 are the panel body; resize them to fill the area below the tab strip
	self.primitives[1].size = iVec2(self.size.x, self.size.y - self.tabsize.y * scale)
	self.primitives[2].size = iVec2(self.size.x, self.size.y - self.tabsize.y * scale)

	--Tabs: each tab uses three primitives (border rect, fill rect, text)
	local tabpos = 0
	for n = 1, #self.items do
		local tw = self:TabWidth(n) * scale

		--Create this tab's primitives if they don't exist yet
		if n * 3 > #self.primitives - 2 then
			self:AddRect(iVec2(tabpos,0), iVec2(tw, self.tabsize.y * scale), self.bordercolor, false, self.itemcornerradius * scale)
			self:AddRect(iVec2(tabpos+1,1), iVec2(tw, self.tabsize.y * scale) - iVec2(2 * scale,-1 * scale), self.backgroundcolor, false, self.itemcornerradius * scale)
			self:AddTextRect(self.items[n].text, iVec2(tabpos,0), iVec2(tw, self.tabsize.y*scale), self.textcolor, TEXT_CENTER + TEXT_MIDDLE)
		end

		if self:SelectedItem() == n then
			--Selected tab: slightly taller, flush with the top, drawn with the selected colors
			self.primitives[2 + (n - 1) * 3 + 1].position = iVec2(tabpos, 0)
			self.primitives[2 + (n - 1) * 3 + 1].size = iVec2(tw, self.tabsize.y * scale) + iVec2(0,2)
			self.primitives[2 + (n - 1) * 3 + 2].position = iVec2(tabpos + 1, 1)
			self.primitives[2 + (n - 1) * 3 + 2].color = self.selectedtabcolor
			self.primitives[2 + (n - 1) * 3 + 2].size = iVec2(tw, self.tabsize.y * scale) - iVec2(2,-1)
			self.primitives[2 + (n - 1) * 3 + 3].color = self.hoveredtextcolor
			self.primitives[2 + (n - 1) * 3 + 3].position = iVec2(tabpos,0)
		else
			--Unselected tab: nudged down two pixels, hover color only when the mouse is over it
			self.primitives[2 + (n - 1) * 3 + 1].size = iVec2(tw, self.tabsize.y * scale)
			self.primitives[2 + (n - 1) * 3 + 2].color = self.tabcolor
			self.primitives[2 + (n - 1) * 3 + 2].size = iVec2(tw, self.tabsize.y * scale) - iVec2(2,2)
			if n == self.hovereditem then
				self.primitives[2 + (n - 1) * 3 + 3].color = self.hoveredtextcolor
			else
				self.primitives[2 + (n - 1) * 3 + 3].color = self.textcolor
			end
			self.primitives[2 + (n - 1) * 3 + 1].position = iVec2(tabpos,2)
			self.primitives[2 + (n - 1) * 3 + 2].position = iVec2(tabpos + 1, 3)
			self.primitives[2 + (n - 1) * 3 + 3].position = iVec2(tabpos,2)
		end

		--Keep the tab caption in sync and advance, overlapping neighboring borders by two pixels
		self.primitives[2 + (n - 1) * 3 + 3].text = self.items[n].text
		tabpos = tabpos + tw - 2
	end

end
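
The widget script only reads the scale factor; something has to decide what that factor is, presumably based on the display's DPI setting. Here is a minimal sketch of that wiring, using the engine's ListDisplays() and display GetScale() queries, and assuming a CreateGUI constructor and a gui SetScale method (those two names are hypothetical; only gui:GetScale() appears in the code above):

local displaylist = ListDisplays()
local display = displaylist[1]
local displayscale = display:GetScale() --1.0 at 100% DPI scaling, 1.5 at 150%

--Hypothetical: create the GUI and tell it which scale to use.
--Widget scripts then read it back with self.gui:GetScale() in their Draw methods.
local gui = CreateGUI(framebuffer)
gui:SetScale(displayscale.y)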

 



4 Comments



Quote: "so the goal should be to create a system that does what it is supposed to do reliably, not to make complicated things simpler at the expense of functionality"


Things don't have to be complicated to be functional.  How many functions you create and how many arguments they have are in your hands.  Though I'm sure someone will be happy to create a wrapper.

Edit: you could also have a visual GUI designer but I know that's pushing it.


The code above is the widget script, which you don't have to touch. Usage to create a tabbed panel would be like this:

local tabber = CreateWidget(parent,"",0,0,400,300)
tabber:SetScript("Scripts/GUI/tabber.lua")
tabber:AddItem("Item 1", true)
tabber:AddItem("Item 2")
tabber:AddItem("Item 3")

 


Yeah, it seems the only new/harder part is creating a Sprite layer for the window I/O and passing a window pointer to a GUI object. Everything so far seems comprehensible.


