
GUI Resolution Independence

Josh


DPI scaling in the 2D drawing and GUI system was something I was a bit concerned about, but I think I have it worked out. This all goes back to the multi-monitor support I designed back in September. Part of that system lets you retrieve the DPI scale for each display, which gives you another piece of information in addition to the raw screen resolution. The display scale is the percentage at which the user expects to see vector graphics drawn, with 100% being what you would expect on a regular HD monitor. If we scale our GUI elements and font sizes by the display scale, we can adjust for screens with any pixel density.
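
In practice this just means multiplying any design-time pixel value by the display scale before using it. Here is a minimal sketch of the idea, using the same display scale property the menu code later in this post relies on; the window and button objects are assumed to already exist:

		--Assuming 'window' was created on the target display and 'button' is an existing widget
		local scale = window.display.scale.y --1.0 at 100%, 1.25 at 125%, 1.5 at 150%
		button:SetFontSize(14 * scale)       --text designed at 14 points is drawn at 21 points on a 150% display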

This shot shows 1920x1080 fullscreen with DPI scaling set to 100%:

[Screenshot: 1920x1080, 100% scaling]

Here we see the same resolution, with scaling set to 125%:

[Screenshot: 1920x1080, 125% scaling]

And this is with scaling set to 150%:

[Screenshot: 1920x1080, 150% scaling]

The effect of this is that whether the player is using a 4K, 8K, or any other type of monitor, your game can display finely detailed text at the size the user expects to see. It also means that user interfaces can be rendered at any resolution for VR.

Rather than trying to automatically scale GUI elements, I am giving you full control over the raw pixels. That means you have to decide how your widgets will be scaled and program it into the game interface yourself, but nothing is hidden from the developer. Here is the code I am currently working with to create a simple game menu. Also notice there is no CreatePanel(), CreateButton(), etc. anymore; there is just one widget you create and set a script on (a sketch of what such a widget script might look like follows the menu code). I might add an option for C++ actors as well, but since these run on the main logic thread there is not really a downside to running the code in Lua.

		--Get the active window and its framebuffer
		local window = ActiveWindow()
		if window == nil then return end
		local framebuffer = window:GetFramebuffer()
		if framebuffer == nil then return end
		self.gui = CreateGUI(self.guispritelayer)
		
		--Main background panel
		self.mainpanel = CreateWidget(self.gui,"",0,0,framebuffer.size.x,framebuffer.size.y)
		self.mainpanel:SetScript("Scripts/GUI/Panel.lua", true)

		--Display scale and menu layout values
		local scale = window.display.scale.y
		local w = 120
		local h = 24
		local sep = 36
		local x = framebuffer.size.x / 6
		local y = framebuffer.size.y / 2 - sep * 3
		
		--Menu buttons
		self.resumebutton = CreateWidget(self.mainpanel,"RESUME GAME",x,y,w,h)
		self.resumebutton:SetScript("Scripts/GUI/Hyperlink.lua", true)
		self.resumebutton:SetFontSize(14 * scale)
		y=y+sep*2

		self.label2 = CreateWidget(self.mainpanel,"OPTIONS",x,y,w,h)
		self.label2:SetScript("Scripts/GUI/Hyperlink.lua", true)
		self.label2:SetFontSize(14 * scale)
		y=y+sep*2

		self.quitbutton = CreateWidget(self.mainpanel,"QUIT", x,y, w,h)
		self.quitbutton:SetScript("Scripts/GUI/Hyperlink.lua", true)
		self.quitbutton:SetFontSize(14 * scale)

		--Options panel, hidden until needed
		w = 400 * scale
		h = 550 * scale
		self.optionspanel = CreateWidget(self.mainpanel, "", (framebuffer.size.x - w) * 0.5, (framebuffer.size.y - h) * 0.5, w, h)
		self.optionspanel:SetScript("Scripts/GUI/Panel.lua", true)
		self.optionspanel.color = Vec4(0.2,0.2,0.2,1)
		self.optionspanel.border = true
		self.optionspanel.radius = 8 * scale
		self.optionspanel.hidden = true
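
For reference, a widget script is just a Lua file that defines properties and event hooks on the widget table, which the engine calls as input arrives. Below is a stripped-down, hypothetical sketch of what something like Hyperlink.lua could contain; the actual scripts may differ:

		--Scripts/GUI/Hyperlink.lua (hypothetical sketch)
		widget.hovered = false
		widget.textcolor = Vec4(1,1,1,1)

		function widget:MouseEnter(x,y)
			self.hovered = true
			self:Redraw()
		end

		function widget:MouseLeave(x,y)
			self.hovered = false
			self:Redraw()
		end

		function widget:MouseUp(button,x,y)
			--Fire the widget action event if the cursor is still over the link
			if button == MOUSE_LEFT and self.hovered then
				EmitEvent(EVENT_WIDGET_ACTION, self)
			end
		end

		function widget:Start()
			--Create the persistent text primitive for the link caption
			self:AddTextRect(self.position, self.size, self.textcolor, TEXT_LEFT + TEXT_MIDDLE)
		end

The game code then only needs to listen for EVENT_WIDGET_ACTION events from these widgets to know when RESUME GAME, OPTIONS, or QUIT was clicked.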

 



2 Comments


Recommended Comments

I see you're really focused on the GUI now. I'm taking that as you want to get started on the new editor, which I will not stop you from doing. 🙂

8 hours ago, reepblue said:

I see you're really focused on the GUI now. I'm taking that as you want to get started on the new editor, which I will not stop you from doing. 🙂

I'm trying to make the engine feature-complete ASAP because there are companies that want to use it in VR projects right now.

The beta testers really help make this possible!



