
Everything posted by Josh

  1. Josh

    2D Drawing to Texture

    Here is what I came up with:

    void Camera::SetRealTime(const bool realtimemode)
    void Camera::Refresh()

    Refresh will cause a non-realtime camera to render once before it is disabled automatically, until the next refresh.
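    A minimal sketch of how I expect this to be used with a render-to-texture camera. The creation calls mirror the API shown elsewhere in these posts; the surrounding setup is illustrative, not a confirmed example:

    // Camera that renders 2D content into a texture buffer.
    auto texcam = CreateCamera(world);
    auto texbuffer = CreateTextureBuffer(512, 512, 1, true);
    texcam->SetRenderTarget(texbuffer);

    // Take the camera out of real-time mode so it no longer renders every frame:
    texcam->SetRealTime(false);

    // Whenever the 2D content changes, request a single render.
    // The camera disables itself again afterwards, until the next Refresh().
    texcam->Refresh();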
  2. Technically, 4,294,967,295 vertices (the 32-bit unsigned integer limit) is the max, but there will be a lower limit than that depending on your hardware. Very roughly, I would aim for no more than 10,000-20,000 polys for any particular model.
  3. LE5 beta updated. Added the 3D GUI example and the missing Newton double-float lib.

  4. Josh

    3D GUI

    Putting all the pieces together, I was able to create a GUI with a sprite layer, attach it to a camera with a texture buffer render target, and render the GUI onto a texture applied to a 3D surface. Then I converted the picked UV coords into mouse coordinates and sent user events to the GUI. This can be used for GUIs rendered onto surfaces in your game, or for a user interface that can be interacted with in VR. This example will be included in the next beta update.
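    Roughly, the conversion from picked UV coordinates to GUI mouse coordinates looks like the sketch below. It is written by analogy with the Lua Pick / GetTexCoords example from the 2D Drawing to Texture post, so treat the exact C++ signatures as assumptions, and the final event call is only a placeholder for however the GUI ends up receiving input:

    // Sketch: pick the 3D surface under the mouse, then map the hit point's UVs
    // into the 2D coordinate space of the texture buffer the GUI is rendered to.
    auto pick = camera->Pick(framebuffer, mousepos.x, mousepos.y, 0, true, 0);
    if (pick != nullptr)
    {
        auto texcoords = pick->GetTexCoords();

        // Scale the normalized UVs up to the GUI's pixel resolution:
        int guix = int(texcoords.x * texbuffer->size.x);
        int guiy = int(texcoords.y * texbuffer->size.y);

        // Placeholder: forward the position to the GUI event system.
        // gui->SendMouseEvent(guix, guiy);
    }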
  5. GUI is working in 3D space. Never did this before.

    1. gamecreator

      How much of these new features are a challenge you enjoy versus a frustrating necessity?

    2. Josh

      The features are pretty fun, actually. I planned all this out to make this specific functionality work. Vulkan has been some of the most frustrating, difficult stuff I have ever worked on (not as bad as the Android / iOS stuff, though), but I think the hard part of that is behind me.

  6. Josh

    Voxel Code

    If you keep developing this I would be interested in licensing the code to add a Minecraft-type game template to the new engine.
  7. A dynamic cast will return NULL if the conversion is not valid. (A static cast can potentially return an invalid object.) So you might want to also check if the result of the cast is NULL before continuing.
  8. Cast the Actor object to your custom actor object and the function will be available. See C++ dynamic casting.
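     For example, a minimal sketch (PlayerActor and DoSomething are made-up names; the pattern itself is standard C++, not engine-specific):

     // PlayerActor is a hypothetical custom actor class derived from Actor.
     class PlayerActor : public Actor
     {
     public:
         void DoSomething();
     };

     // "actor" is the base Actor pointer you already have.
     auto player = dynamic_cast<PlayerActor*>(actor);
     if (player != NULL) // NULL means the object is not actually a PlayerActor
     {
         player->DoSomething();
     }

     If your actors are held in shared_ptrs, use std::dynamic_pointer_cast the same way and check the result against nullptr.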
  9. Both of those code examples do the exact same thing.
  10. Do you want a Lua or C++ function to be called?
  11. Josh

    Voxel Code

    https://www.leadwerks.com/learn?page=API-Reference_Object_Thread
  12. I don’t mean to pick on you in particular but this is the sort of thing I mean. It sounds like the things you read online are making you unhappy and affecting your life in a negative way.
  13. A low-carb diet is healthy. What I am saying is they start with something valid, but the tendency is for these things to grow into an all-encompassing belief system.
  14. A friend recently told me they deleted all their social and forum accounts for no particular reason. They just didn't want to be online anymore. It got me thinking.

      The common view of the Internet is that it is a digital mirror of the real world, and the sum of all human knowledge. This is not true. In the late 1990s, Internet message boards were a novel thing. You could discuss various interests online and it was interesting and organic. I don't think that is the case today, and it has not been for many years; most interaction has been channeled into a few big platforms. Outside of some technical topics, I think the Internet has become:

      • Corporate and government(s) astroturfing, probably a great deal of which is generated by AI.
      • MLM marketers and cult leaders. (The former is monetized and the latter is not.)
      • Mentally ill compulsives posting variations of the same joke on YouTube over and over.

      When I think about the intelligent, wise people I know in my life, none of them ever write anything online, and I never find their good advice online, about anything. Instead, I see various cults for health, fitness, money, politics, business, every aspect of life you can imagine, and all of it is very bad advice that has nothing to do with reality. For example, for something as simple as food, there is a cult for eating meat. There is also a cult for not eating meat. Humans throughout history have always had some vulnerability to this type of thinking, but it seems like these things are arising faster than ever before in history, and it is totally drowning out normal organic conversations. One sign: if they have their own made-up terms or acronyms for everything, it is probably all bullshit.

      The only sources of information I have ever found useful in my life are:

      • Other people I have conversations with in real life.
      • My own intuition.
      • Some books.

      I am saying we are going back to a time before the Internet, because the only people spouting their ideas online are the mentally ill. In reality, we have probably been there for 5-10 years. Outside of technical information or booking a hotel, I think it is time to turn off the Internet as much as you can. Now is a good time to read some books, pray or meditate, and pay attention to the people around you. Avoid people who seem like they are getting sucked into some kind of ideological framework or health fad of any kind, or at least call them out on it. I have seen this in several people I know in my life. Having a head full of nonsense is worse than ignorance. By reading the content that is on the Internet, you are not becoming more informed. You are becoming more neurotic.
  15. Josh

    2D Drawing to Texture

    That is actually something I will have to figure out for the editor, too.
  16. Josh

    2D Drawing to Texture

    Hide the camera and it won’t render. There isn’t strict synchronization of the rendering and main threads so I am not sure how you would make sure it just rendered one single frame. Something to think about.
  17. New LE5 beta uploaded with render-to-texture.

  18. The new car looks a million times better, by the way.
  19. Maybe you need to move the vehicle and all tires?
  20. I have been working on 2D rendering off and on since October. Why am I putting so much effort into something that was fairly simple in Leadwerks 4? I have been designing a system in anticipation of some features I want to see in the GUI, namely VR support and in-game 3D user interfaces. These are both accomplished with 2D drawing performed on a texture. Our system of sprite layers, cameras, and sprites was necessary in order to provide enough control to accomplish this.

      I now have 2D drawing to a texture working, this time as an officially supported feature. In Leadwerks 4, some draw-to-texture features were supported, but only through undocumented commands, due to the complex design of shared resources between OpenGL contexts. Vulkan does not have this problem because everything, including contexts (or rather, the VK equivalent), is bound to an abstract VkInstance object. Here is the Lua code that makes this program:

      --Get the primary display
      local displaylist = ListDisplays()
      local display = displaylist[1];
      if display == nil then DebugError("Primary display not found.") end
      local displayscale = display:GetScale()

      --Create a window
      local window = CreateWindow(display, "2D Drawing to Texture", 0, 0, math.min(1280 * displayscale.x, display.size.x), math.min(720 * displayscale.y, display.size.y), WINDOW_TITLEBAR)

      --Create a rendering framebuffer
      local framebuffer = CreateFramebuffer(window);

      --Create a world
      local world = CreateWorld()

      --Create second camera
      local texcam = CreateCamera(world)

      --Create a camera
      local camera = CreateCamera(world)
      camera:Turn(45,-45,0)
      camera:Move(0,0,-2)
      camera:SetClearColor(0,0,1,1)

      --Create a texture buffer
      local texbuffer = CreateTextureBuffer(512,512,1,true)
      texcam:SetRenderTarget(texbuffer)

      --Create scene
      local box = CreateBox(world)

      --Create render-to-texture material
      local material = CreateMaterial()
      local tex = texbuffer:GetColorBuffer()
      material:SetTexture(tex, TEXTURE_BASE)
      box:SetMaterial(material)

      --Create a light
      local light = CreateLight(world,LIGHT_DIRECTIONAL)
      light:SetRotation(55,-55,0)
      light:SetColor(2,2,2,1)

      --Create a sprite layer. This can be shared across different cameras for control over which cameras display the 2D elements
      local layer = CreateSpriteLayer(world)
      texcam:AddSpriteLayer(layer)
      texcam:SetPosition(0,1000,0)--put the camera really far away

      --Load a sprite to display
      local sprite = LoadSprite(layer, "Materials/Sprites/23.svg", 0, 0.5)
      sprite:MidHandle(true,true)
      sprite:SetPosition(texbuffer.size.x * 0.5, texbuffer.size.y * 0.5)

      --Load font
      local font = LoadFont("Fonts/arial.ttf", 0)

      --Text shadow
      local textshadow = CreateText(layer, font, "Hello!", 36 * displayscale.y, TEXT_LEFT, 1)
      textshadow:SetColor(0,0,0,1)
      textshadow:SetPosition(50,30)
      textshadow:SetRotation(90)

      --Create text
      text = textshadow:Instantiate(layer)
      text:SetColor(1,1,1,1)
      text:SetPosition(52,32)
      text:SetRotation(90)

      --Main loop
      while window:Closed() == false do
          sprite:SetRotation(CurrentTime() / 30)
          world:Update()
          world:Render(framebuffer)
      end

      I have also added a GetTexCoords() command to the PickInfo structure. This will calculate the tangent and bitangent for the picked triangle and then calculate the UV coordinate at the picked position. It is necessary to calculate the non-normalized tangent and bitangent to get the texture coordinate, because the values that are stored in the vertex array are normalized and do not include the length of the vectors.
      local pick = camera:Pick(framebuffer, mousepos.x, mousepos.y, 0, true, 0)
      if pick ~= nil then
          local texcoords = pick:GetTexCoords()
          Print(texcoords)
      end

      Maybe I will make this into a Mesh method like GetPolygonTexCoord(), which would work just as well but could potentially be useful for other things. I have not decided yet.

      Now that we have 2D drawing to a texture, and the ability to calculate texture coordinates at a position on a mesh, the next step will be to set up a GUI displayed on a 3D surface, and to send input events to the GUI based on the user's interactions in 3D space. The texture could be applied to a computer panel, like many of the interfaces in the newer DOOM games, or it could be used as a panel floating in the air that can be interacted with using VR controllers.
  21. Josh

    Voxel Code

    You could perform your mesh algorithm on another thread. You can't create the Leadwerks mesh on another thread, but you can get all the data for the mesh ready.
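    A sketch of what I mean, using standard C++ threads. The CreateLeadwerksMesh call at the end is a placeholder, not a real engine function:

    #include <cstdint>
    #include <thread>
    #include <vector>

    // Plain data produced by the meshing algorithm; no engine objects involved.
    struct MeshData
    {
        std::vector<float> positions; // x, y, z triples
        std::vector<uint32_t> indices;
    };

    void BuildVoxelChunk()
    {
        MeshData meshdata;

        // Run the expensive voxel meshing work on a worker thread.
        std::thread worker([&meshdata]()
        {
            // ... fill meshdata.positions and meshdata.indices here ...
        });

        // In a real game you would queue finished chunks instead of joining immediately.
        worker.join();

        // Back on the main thread, feed the prepared data into the engine.
        // CreateLeadwerksMesh() is a placeholder for the actual mesh creation calls.
        // CreateLeadwerksMesh(meshdata);
    }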
  22. If it’s not in the documentation it’s potentially subject to change in the future. There are also some engine functions that are not documented that the user should not call.
  23. Josh

    Shader Families

    I will take a look!
  24. FYI, the particle system has a built-in collision detection system you can use for rain if you want to.
  25. In Leadwerks 4, render-to-texture was accomplished with the SetRenderTarget command, which allowed a camera to draw directly to a specified texture, while hiding the underlying framebuffer object (FBO). In the new engine we have a bit more explicit handling of this behavior. This is largely due to the use of Vulkan's bindless design, which greatly improves on the context-binding design of OpenGL. The Leadwerks "Buffer" class was never documented or officially supported because the underlying OpenGL functionality made the system pretty messy, but the design of Vulkan simplifies this aspect of graphics.

      We have seen that the Framebuffer class replaces the LE4 context. I've added a TextureBuffer class which can be created similarly:

      shared_ptr<TextureBuffer> CreateTextureBuffer(const int width, const int height, const int colorcomponents = 1, const bool depthcomponent = true, const int samples = 0);

      Once a TextureBuffer is created, you can set a camera to target it for rendering:

      camera->SetRenderTarget(texbuffer);

      You can also apply its color component(s) to a material:

      material->SetTexture(texbuffer->GetColorBuffer(0), TEXTURE_BASE);

      You could also retrieve the depth buffer and apply that to a material, rendering the scene from the top down and using the depth in a rain or snow shader, for example. This functionality will later be used to render the GUI system to a texture, for use in VR or with in-game menus painted onto 3D surfaces.

      Like everything with Vulkan, this involved a very long process of figuring out everything we need to use, discarding the things we don't, and packaging it up in a structure that is actually usable by the end user. However, once all that is done we have a very powerful system that is optimized for exactly the way modern GPUs work.
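      To tie those pieces together, here is a minimal C++ sketch. Only the three calls shown above are confirmed; the GetDepthBuffer() accessor in the comment is an assumption, named by analogy with GetColorBuffer():

      // Render a secondary camera into a texture buffer and display the result on a material.
      auto texbuffer = CreateTextureBuffer(512, 512, 1, true);
      texcam->SetRenderTarget(texbuffer);

      auto material = CreateMaterial();
      material->SetTexture(texbuffer->GetColorBuffer(0), TEXTURE_BASE);
      box->SetMaterial(material);

      // Hypothetical: bind the depth component instead, e.g. a top-down depth pass
      // driving a rain or snow shader. GetDepthBuffer() is an assumed name.
      // rainmaterial->SetTexture(texbuffer->GetDepthBuffer(), TEXTURE_BASE);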
      Here is a small sample of some of my code, just to give you an idea of how complicated this stuff is:

      for (auto pair : visset->cameravislists)
      {
          auto cam = pair.first;
          clear[1].color = { cam->clearcolor.r, cam->clearcolor.g, cam->clearcolor.b, cam->clearcolor.a };
          auto light = dynamic_pointer_cast<RenderLight>(cam);
          if (light == nullptr and cam->rendertarget == nullptr) continue;
          renderpass[0] = device->shadowpass;
          renderpass[1] = device->renderpass[CLEAR_COLOR | CLEAR_DEPTH];
          int faces = 1;
          if (light)
          {
              if (light->description.type == LIGHT_POINT) faces = 6;
          }
          if (MULTIPASS_CUBEMAP) faces = 1;
          for (int face = 0; face < faces; ++face)
          {
              renderPassBeginInfo.clearValueCount = 2;
              if (light)
              {
                  renderPassBeginInfo.renderPass = device->shadowpass->pass;
                  if (light->description.type == LIGHT_POINT and MULTIPASS_CUBEMAP == true)
                  {
                      renderPassBeginInfo.renderPass = device->cubeshadowpass->pass;
                  }
                  renderPassBeginInfo.framebuffer = light->shadowbuffer[face]->framebuffer;
                  renderPassBeginInfo.renderArea.extent.width = light->shadowbuffer[face]->size.width;
                  renderPassBeginInfo.renderArea.extent.height = light->shadowbuffer[face]->size.height;
              }
              else
              {
                  renderpass[0] = device->renderpass[CLEAR_COLOR | CLEAR_DEPTH];
                  int cc = cam->rendertarget->CountColorTextures();
                  renderPassBeginInfo.renderPass = device->rendertotexturepass[cc][int(cam->rendertarget->depthtexture != nullptr)]->pass;
                  renderPassBeginInfo.framebuffer = cam->rendertarget->framebuffer;
                  renderPassBeginInfo.renderArea.extent.width = cam->rendertarget->size.width;
                  renderPassBeginInfo.renderArea.extent.height = cam->rendertarget->size.height;
              }
              vkCmdBeginRenderPass(commandbuffers[currentFrame]->commandbuffer, &renderPassBeginInfo, VK_SUBPASS_CONTENTS_INLINE);
              RecordDraw(currentFrame, cam, pair.second, renderpass[0], face);
              commandbuffers[currentFrame]->EndRenderPass();
              if (light) commandbuffers[currentFrame]->BindResource(light->shadowbuffer[face]);

              //Copy output to render texture
              if (cam->rendertarget)
              {
                  for (int n = 0; n < cam->rendertarget->colortarget.size(); ++n)
                  {
                      if (cam->rendertarget->colortarget[n] != nullptr)
                      {
                          commandbuffers[currentFrame]->TransitionImageLayout(pair.first->rendertarget->colortexture[n], VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL, VK_IMAGE_LAYOUT_TRANSFER_SRC_OPTIMAL, VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT, VK_PIPELINE_STAGE_TRANSFER_BIT, -1);
                          commandbuffers[currentFrame]->TransitionImageLayout(pair.first->rendertarget->colortarget[n], VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL, VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL, VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT, VK_PIPELINE_STAGE_TRANSFER_BIT, -1);
                          VkImageCopy regions = {};
                          regions.dstOffset = { 0u, 0u, 0u };
                          regions.extent = { uint32_t(cam->rendertarget->colortarget[n]->size.x), uint32_t(cam->rendertarget->colortarget[n]->size.y), 1u };
                          regions.srcOffset = regions.dstOffset;
                          regions.dstSubresource.aspectMask = VK_IMAGE_ASPECT_COLOR_BIT;
                          regions.dstSubresource.baseArrayLayer = 0;
                          regions.dstSubresource.layerCount = 1;
                          regions.dstSubresource.mipLevel = 0;
                          regions.srcSubresource = regions.dstSubresource;
                          vkCmdCopyImage(commandbuffers[currentFrame]->commandbuffer, cam->rendertarget->colortexture[n]->vkimage, VK_IMAGE_LAYOUT_TRANSFER_SRC_OPTIMAL, cam->rendertarget->colortarget[n]->vkimage, VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL, 1, &regions);
                          commandbuffers[currentFrame]->TransitionImageLayout(pair.first->rendertarget->colortarget[n], VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL, VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL, VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT, VK_PIPELINE_STAGE_TRANSFER_BIT, -1);
                      }
                  }
              }
          }
      }

      Below is a simple Lua program that sets up a scene with two cameras, and renders one camera to a texture buffer which is displayed on the middle box itself.

      --Get the primary display
      local displaylist = ListDisplays()
      local display = displaylist[1];
      if display == nil then DebugError("Primary display not found.") end
      local displayscale = display:GetScale()

      --Create a window
      local window = CreateWindow(display, "Render to Texture", 0, 0, math.min(1280 * displayscale.x, display.size.x), math.min(720 * displayscale.y, display.size.y), WINDOW_TITLEBAR)

      --Create a rendering framebuffer
      local framebuffer = CreateFramebuffer(window);

      --Create a world
      local world = CreateWorld()

      --Create second camera
      local texcam = CreateCamera(world)
      texcam:SetClearColor(1,0,1,1)

      --Create a camera
      local camera = CreateCamera(world)
      camera:Move(0,0,-2)
      camera:SetClearColor(0,0,1,1)

      --Create a texture buffer
      local texbuffer = CreateTextureBuffer(512,512,1,true)
      texcam:SetRenderTarget(texbuffer)

      --Create scene
      local box = CreateBox(world)
      local cone = CreateCone(world)
      cone:SetPosition(2,0,0)
      cone:SetColor(1,0,0,1)
      local sphere = CreateSphere(world)
      sphere:SetPosition(-2,0,0)
      sphere:SetColor(0,1,0,1)

      --Create render-to-texture material
      local material = CreateMaterial()
      local tex = texbuffer:GetColorBuffer()
      material:SetTexture(tex, TEXTURE_BASE)
      box:SetMaterial(material)

      --Create a light
      local light = CreateLight(world,LIGHT_DIRECTIONAL)
      light:SetRotation(35,-55,0)

      --Main loop
      while window:Closed() == false do
          texcam:SetPosition(0,0,0)
          texcam:Turn(0,1,0)
          texcam:Move(0,0,-2)
          world:Update()
          world:Render(framebuffer)
      end

      Here is the result. Look how simple it is to control this powerful system!