Blog Entries posted by Josh

  1. Josh

    Articles
    In Leadwerks, required files were always a slightly awkward issue. The engine requires a BFN texture and a folder of shaders, in order to display anything. One of my goals is to make the Ultra Engine editor flexible enough to work with any game. It should be able to load the folder of an existing game, even if it doesn't use Ultra Engine, and display all the models and scenes with some accuracy. Of course the Quake game directory isn't going to include a bunch of Ultra Engine shaders, so what to do?
One solution could be to load shaders and other files from the editor directory, but this introduces other issues. My solution is to build shaders, shader families, and the default BRDF texture into the engine itself. This is done with a utility that reads a list of files to include, then loads each one and turns it into an array in C++ code that gets compiled into the engine. The code looks like this:
    if (rpath == RealPath("Shaders/Sky.json")) { static const std::array<uint64_t, 62> data = {0x61687322090a0d7bULL,0x6c696d6146726564ULL,0xd7b090a0d3a2279ULL,0x746174732209090aULL,0x9090a0d3a226369ULL,0x66220909090a0d7bULL,0xa0d3a2274616f6cULL,0x9090a0d7b090909ULL,0x555141504f220909ULL,0x909090a0d3a2245ULL,0x90909090a0d7b09ULL,0x6c75616665642209ULL,0x909090a0d3a2274ULL,0x909090a0d7b0909ULL,0x6573616222090909ULL,0x90909090a0d3a22ULL,0x909090a0d7b0909ULL,0x7265762209090909ULL,0x5322203a22786574ULL,0x532f737265646168ULL,0x762e796b532f796bULL,0x227670732e747265ULL,0x9090909090a0d2cULL,0x6d67617266220909ULL,0x5322203a22746e65ULL,0x532f737265646168ULL,0x662e796b532f796bULL,0x227670732e676172ULL,0x909090909090a0dULL,0x9090909090a0d7dULL,0x7d090909090a0d7dULL,0xd2c7d0909090a0dULL,0x756f64220909090aULL,0x90a0d3a22656c62ULL,0x909090a0d7b0909ULL,0x45555141504f2209ULL,0x90909090a0d3a22ULL,0x9090909090a0d7bULL,0x746c756166656422ULL,0x90909090a0d3a22ULL,0x90909090a0d7b09ULL,0x2265736162220909ULL,0x9090909090a0d3aULL,0x90909090a0d7b09ULL,0x7472657622090909ULL,0x685322203a227865ULL,0x6b532f7372656461ULL,0x34365f796b532f79ULL,0x732e747265762e66ULL,0x9090a0d2c227670ULL,0x7266220909090909ULL,0x3a22746e656d6761ULL,0x7265646168532220ULL,0x6b532f796b532f73ULL,0x72662e6634365f79ULL,0xd227670732e6761ULL,0x7d0909090909090aULL,0x7d09090909090a0dULL,0xd7d090909090a0dULL,0x90a0d7d0909090aULL,0xa0d7d090a0d7d09ULL,0xcdcdcdcdcdcdcd7dULL }; auto buffer = CreateBuffer(489); buffer->Poke(0,(const char*)data.data(),489); return CreateBufferStream(buffer); } An unsigned 64-bit integer is used for the data type, as this results in the smallest generated code file size.
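The bake utility itself is not shown, but the packing scheme is easy to reproduce. Below is a minimal standalone sketch (PackBytes and UnpackBytes are my own names, not the engine's) that packs a byte buffer into uint64_t words the way the generated array stores them, then unpacks it to verify the round trip. Note this matches the generated literals on a little-endian machine:

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Pack raw bytes into 64-bit words (little-endian), zero-padding the tail.
// This mirrors the idea behind the generated arrays: fewer, wider literals
// keep the emitted C++ source file small.
std::vector<uint64_t> PackBytes(const std::string& bytes)
{
    std::vector<uint64_t> words((bytes.size() + 7) / 8, 0);
    std::memcpy(words.data(), bytes.data(), bytes.size());
    return words;
}

// Recover the original bytes, given the original length in bytes.
std::string UnpackBytes(const std::vector<uint64_t>& words, size_t length)
{
    std::string bytes(length, '\0');
    std::memcpy(&bytes[0], words.data(), length);
    return bytes;
}
```

As a sanity check, a JSON file beginning with `{`, CR, LF, tab, `"sha` packs to 0x61687322090a0d7b, the first literal in the generated array above.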
    Files are searched for in the following order:
  • A file on the hard drive in the specified path.
  • A file from a loaded package with the specified relative path.
  • A file built into the engine.

Therefore, if your game includes a modified version of a shader, the shader module will still be loaded from the file in your game directory. However, if you don't include any shaders at all, the engine will just fall back on its own set of shaders compiled into the core engine.
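The lookup order is simple to express. Here is a standalone sketch of the fallback chain, using std::map stand-ins for the disk, loaded packages, and embedded data (the real engine API differs):

```cpp
#include <map>
#include <optional>
#include <string>

using FileSet = std::map<std::string, std::string>;

// Resolve a path against the three sources in priority order:
// a loose file on disk, then loaded packages, then data built into the engine.
std::optional<std::string> ResolveFile(const std::string& path,
    const FileSet& disk, const FileSet& packages, const FileSet& builtin)
{
    for (const FileSet* source : { &disk, &packages, &builtin })
    {
        auto it = source->find(path);
        if (it != source->end()) return it->second;
    }
    return std::nullopt;
}
```

A modified shader shipped with the game wins; a file missing from the game directory silently falls back to the built-in copy.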
    This gives Ultra Engine quite a lot more flexibility in loading scenes and models, and allows creation of 3D applications that can work without any required files at all, while still allowing for user control over the game shaders.
    The screenshot here shows the Ultra Engine editor loading a Leadwerks project folder and displaying 3D graphics using the Ultra Engine renderer, even though the Leadwerks project does not contain any of the shaders and other files Ultra Engine needs to run:

  2. Josh

    Articles
    At last I have been able to work the plugin system into the new editor and realize my dreams.
The editor automatically detects supported file formats and generates thumbnails for them. (Thumbnails are currently compatible with the Leadwerks system, so Leadwerks can read these thumbnail files and vice-versa.) If no support for a file format is found, the program just defaults to whatever icon or thumbnail Windows shows.
    The options dialog includes a tab where you can examine each plugin in detail. I plan to allow disabling of individual plugins, like how it works in 3ds Max.
     
    It's completely possible that this editor could be used to mod existing games with the right set of plugins. I want to try doing this with Source games and see how easily I can load levels up. In the image below, the new editor is browsing an unmodified Leadwerks project, using a plugin to provide support for loading Leadwerks TEX files.


    I wrote about some of these ideas a while ago:
     
  3. Josh

    Articles
The new editor is being designed to be flexible enough to work with any game, so it can be used for modding as well as game development with our new 3D engine. Each project has configurable settings that control what the editor actually does when you run the game. In the case of a game like Quake, this will involve running a few executables to first compile the map you are working on into a BSP structure, then calculate lightmaps and pre-compute visibility.

    You can also set up your own custom workflow to automatically convert textures and models, using either the import / export capabilities of the editor plugins, or an external executable. In Leadwerks, this was all hard-coded with the FBX to MDL converter and a few other converters, but in the new editor it's totally configurable.

  4. Josh

    Articles
    Many games store 3D models, textures, and other game files in some type of compressed package format. These can be anything from a simple ZIP file to a custom multi-file archive system. This has the benefit of making the install size of the game smaller, and can prevent users from accessing the raw files. Often times undocumented proprietary file formats are used to optimize loading time, although with DDS and glTF this is not such a problem anymore.
    Leadwerks uses built-in support for encrypted ZIP files. In our new engine I wanted the ability to load game files from a variety of package formats, so our tools would have compatibility with many different games. To support this I implemented a new plugin type for package files. Packages can be used like this:
auto pak = LoadPackage("data.zip");
auto dir = pak->LoadDir("");
for (auto file : dir)
{
    if (pak->FileType(file) == 1)
    {
        auto stream = pak->ReadFile(file);
    }
}
I created a package plugin for loading Valve package files and added support for browsing packages in the new editor, alongside regular old folders. Ever wanted to see what the insides of some of your favorite games look like? Now you can:

    This can work not just for textures, but for materials, 3D models, and even scene or map files.
    I envision this system being flexible enough to support a wide variety of games, so that the new editor can be used not just as a game development tool but as a tool for modding games, even for games that don't have any official tools. All it takes is the right set of plugins to pull all those weird specialized file formats into our editor and export again in a game-ready format.
  5. Josh

    Articles
    I've been wracking my brain trying to decide what I want to show at the upcoming conference, and decided I should get the new editor in a semi-workable state. I started laying out the interface two days ago. To my surprise, the whole process went very fast and I discovered some cool design features along the way.
    With the freedom and control I have with the new user interface system, I was able to make the side panel extend all the way to the top and bottom of the window client area. This gives you a little more vertical space to work with.
    The object bar on the left also extends higher and goes all the way down the side, so there is room for more buttons now.
    The toolbar only spans the width of the viewport area, and has only the most common buttons you will need.
    Right now, I am showing all files in the project, not just game files. If it's a model or texture file the editor will generate a rendered thumbnail, but for other files it just retrieves the thumbnail image from Windows for that file type.

    All in all I am very pleased with the design and pleasantly surprised how quickly I am able to implement editor features.
  6. Josh

    Articles
    Ultra App Kit 1.2 is now available on our site and on Steam. This is a bug fix update that resolves numerous small issues reported in the bug reports forum.
    To download the latest version, see My Purchases.
  7. Josh

    Articles
    One of my goals in Ultra Engine is to avoid "black box" file formats and leave all game assets in common file formats that can be easily read in a variety of programs. For this reason, I put a lot of effort into the Pixmap class and the DDS load and save capabilities.
    In Ultra Engine animated textures can be stored in a volume texture. To play the animation, the W component of the UVW texcoord is scrolled. The fragment shader will sample the volume texture between the nearest two slices on the Z axis of the texture, resulting in a smooth transition between frames using linear interpolation. There's no need to constantly swap a lot of textures in a material, as all animation frames are packed away in a single DDS file.
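The sampling behavior is plain linear interpolation between adjacent slices. Here is a standalone sketch with single-float "frames" standing in for texture slices (the real work happens in the fragment shader with hardware filtering; the wrap-around for looping playback is my assumption):

```cpp
#include <cmath>
#include <vector>

// Sample an animation stored as N frames at a fractional coordinate
// w in [0, 1), blending the two nearest frames just as linear filtering
// on the Z axis of a volume texture does.
float SampleAnimation(const std::vector<float>& frames, float w)
{
    float position = w * frames.size();
    int a = (int)position % frames.size();
    int b = (a + 1) % frames.size();          // wrap for looping playback
    float blend = position - std::floor(position);
    return frames[a] * (1.0f - blend) + frames[b] * blend;
}
```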
    The code below shows how multiple animation frames can be loaded and saved into a 3D texture:
int framecount = 128;
std::vector<shared_ptr<Pixmap> > pixmaps(framecount);
for (int n = 0; n < framecount; ++n)
{
    pixmaps[n] = LoadPixmap("https://raw.githubusercontent.com/Leadwerks/Documentation/master/Assets/Materials/Animations/water1_" + String(n) + ".png");
}
SaveTexture("water1.dds", TEXTURE_3D, pixmaps, framecount);
Here is the animation playing in the engine:

The resulting DDS file is 32 MB for a 256x256x128 RGBA texture:
    water1.zip
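The 32 MB figure is easy to verify: width × height × frame count × 4 bytes per RGBA8 texel. A quick check (the helper name is mine):

```cpp
#include <cstdint>

// Uncompressed size in bytes of a volume texture with no mipmaps.
uint64_t VolumeTextureSize(uint64_t width, uint64_t height, uint64_t depth, uint64_t bytesPerTexel)
{
    return width * height * depth * bytesPerTexel;
}
// 256 * 256 * 128 * 4 = 33,554,432 bytes = exactly 32 MiB.
```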
    You can open this DDS file in Visual Studio and view it. Note that the properties indicate this is the first slice of 128, verifying that our texture does contain the animation data:

    Adding Mipmaps
The DDS format supports mipmaps in volume textures. A volume mipmap is just a lower-resolution image of the original, with all dimensions half the size of the previous level, down to a minimum dimension of 1. They are stored in the DDS file in descending order. The code below is a little complicated, but it will reliably compute mipmaps for any volume texture. Note the code is creating another STL vector called "mipchain" where all slices of all mipmaps are stored in order:
auto plg = LoadPlugin("Plugins/FITextureLoader.dll");

int framecount = 128;
std::vector<shared_ptr<Pixmap> > pixmaps(framecount);
for (int n = 0; n < framecount; ++n)
{
    pixmaps[n] = LoadPixmap("https://raw.githubusercontent.com/Leadwerks/Documentation/master/Assets/Materials/Animations/water1_" + String(n) + ".png");
}

//Build mipmaps
iVec3 size = iVec3(pixmaps[0]->size.x, pixmaps[0]->size.y, pixmaps.size());
auto mipchain = pixmaps;
while (true)
{
    auto osize = size;
    size.x = Max(1, size.x / 2);
    size.y = Max(1, size.y / 2);
    size.z = Max(1, size.z / 2);
    for (int n = 0; n < size.z; ++n)
    {
        //Average each pair of adjacent slices
        auto a = pixmaps[n * 2 + 0];
        auto b = pixmaps[n * 2 + 1];
        auto mipmap = CreatePixmap(osize.x, osize.y, pixmaps[0]->format);
        for (int x = 0; x < osize.x; ++x)
        {
            for (int y = 0; y < osize.y; ++y)
            {
                int rgba0 = a->ReadPixel(x, y);
                int rgba1 = b->ReadPixel(x, y);
                int rgba = RGBA((Red(rgba0) + Red(rgba1)) / 2, (Green(rgba0) + Green(rgba1)) / 2, (Blue(rgba0) + Blue(rgba1)) / 2, (Alpha(rgba0) + Alpha(rgba1)) / 2);
                mipmap->WritePixel(x, y, rgba);
            }
        }
        mipmap = mipmap->Resize(size.x, size.y);
        pixmaps[n] = mipmap;
        mipchain.push_back(mipmap);
    }
    if (size == iVec3(1, 1, 1)) break;
}
SaveTexture("water1.dds", TEXTURE_3D, mipchain, framecount);
The resulting DDS file is a little bigger (36.5 MB) because it includes the mipmaps.
    water1_mipmaps.zip
    We can open this DDS file in Visual Studio and verify that the mipmaps are present and look correct:

    Texture Compression
    Volume textures can be stored in compressed texture formats. This is particularly useful for volume textures, since they are so big. Compressing all the mipmaps in a texture before saving can be easily done by replacing the last line of code in the previous example with the code below. We're going to use BC5 compression because this is a normal map.
//Compress all images
for (int n = 0; n < mipchain.size(); ++n)
{
    mipchain[n] = mipchain[n]->Convert(TEXTURE_BC5);
}
SaveTexture("water1.dds", TEXTURE_3D, mipchain, framecount);
The resulting DDS file is just 9.14 MB, about 25% the size of our uncompressed DDS file.
    water1_bc5.zip
When we open this file in Visual Studio, we can verify the texture format is BC5 and the blue channel has been removed. (Only the red and green channels are required for normal maps, as the Z component can be reconstructed in the fragment shader.) Other types of textures may use a different compression format:

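Dropping the blue channel works because a unit-length normal's Z component can be recomputed from X and Y. A standalone sketch of the reconstruction the fragment shader performs, written here in C++ rather than GLSL:

```cpp
#include <algorithm>
#include <cmath>

// Rebuild the Z component of a unit normal from its X and Y components,
// as a fragment shader does after sampling a two-channel (BC5) normal map.
float ReconstructNormalZ(float x, float y)
{
    // The clamp guards against compression error pushing x*x + y*y above 1.
    return std::sqrt(std::max(0.0f, 1.0f - x * x - y * y));
}
```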
    This method can be used to make animated water, fire, lava, explosions and other effects packed away into a single DDS file that can be easily read in a variety of programs.
  8. Josh

    Articles
    When it comes to complex projects I like to focus on whatever area of technology causes me the most uncertainty or worry. Start with the big problems, solve those, and as you continue development the work gets easier and easier. I decided at this stage I really wanted to see how well Vulkan graphics work on Mac computers.
Vulkan doesn't run natively on Mac, but is translated to Metal through a library called MoltenVK. How well does MoltenVK actually work? There was only one way to find out...
    Preparing the Development Machine
    The first step was to set up a suitable development machine. The only Mac I currently own is a 2012 MacBook Pro. I had several other options to choose from:
  • Use a remote service like MacInCloud to access a new Mac remotely running macOS Big Sur.
  • Buy a new Mac Mini with an M1 chip ($699).
  • Buy a refurbished Mac Mini ($299-499).

What are my requirements?
  • Compile universal binaries for use on Intel and ARM machines.
  • Metal graphics.

I found that the oldest version of Xcode that supports universal binaries is version 12.2. The requirements for running Xcode 12.2 are macOS Catalina...which happens to be the last version of OSX my 2012 MBP supports! I tried upgrading the OS with the Mac App Store but ran into trouble because the hard drive was not formatted with the new-ish APFS drive format. I tried running disk utility in a safe boot but the option to convert the file system to APFS was disabled in the program menu, no matter what I did. Finally, I created a bootable install USB drive from the Catalina installer and did a clean install of the OS from that.
    I was able to download Xcode 12.2 directly from Apple instead of the App Store and it installed without a hitch. I also installed the Vulkan SDK for Mac and the demos worked fine. The limitations on this Mac appear to be about the same as an Intel integrated chip, so it is manageable (128 shader image units accessible at any time). Maybe this is improved in newer hardware. Performance with this machine in macOS Catalina is actually great. I did replace the mechanical hard drive with an SSD years ago, so that certainly helps.
    Adding Support for Universal Binaries
    Mac computers are currently in another big CPU architecture shift from Intel x64 to arm64. They previously did this in 2006 when they moved from PowerPC to Intel processors, and just like now, they previously used a "universal binary" for static and shared libraries and executables.
    Compiling my own code for universal binaries worked with just one change. The stat64 structure seems to be removed for the ARM builds, but changing this to "stat" worked without any problems. The FreeImage texture loader plugin, on the other hand, required a full day's work before it would compile. There is a general pattern that when I am working with just my own code, everything works nicely and neatly, but when I am interfacing with other APIs productivity drops ten times. This is why I am trying to work out all this cross-platform stuff now, so that I can get it all resolved and then my productivity will skyrocket.
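The stat fix is not ARM-specific magic: on current desktop targets plain stat already provides a 64-bit st_size, so the transitional stat64 interface can simply be dropped. A minimal sketch of the pattern (FileSize is my own helper name, not engine API):

```cpp
#include <sys/stat.h>
#include <cstdint>

// Query a file's size with plain stat, which provides a 64-bit st_size
// on modern macOS (Intel and Apple Silicon alike) and Linux, making the
// removed stat64 structure unnecessary.
uint64_t FileSize(const char* path)
{
    struct stat info = {};
    if (stat(path, &info) != 0) return 0;
    return (uint64_t)info.st_size;
}
```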
    macOS File Access
Since Mojave, macOS has been moving in a direction of requiring the developer to explicitly request access to parts of the file system, or for the user to explicitly allow access. On one hand, it makes sense to not allow every app access to all your user files. On the other hand, this really cripples the functionality of Mac computers. ARM CPUs do not in and of themselves carry any restrictions I am aware of, but it does look like the Mac is planned to become another walled garden ecosystem like iOS.
    I had to change the structure of user projects so that the project folders are included in the build. All files and subfolders in the blue folders are packaged into your Xcode application automatically, and the result is a single app file (which is really a folder) ready to publish to the Mac App Store.

    However, this means the Leadwerks-style "publish" feature is not really appropriate for the new editor. Maybe there will be an optional extension that allows you to strip unused files from a project?
    There are still some unanswered questions about how this will work with an editor that involves creating and modifying large numbers of files, but the behavior I have detailed above is the best for games and self-contained applications.
macOS is now about as locked down as iOS. You cannot run code written on another machine unless it is signed with a certificate, which means Apple can probably turn anyone's program off at any time, or refuse to grant permission to distribute a program. So you might want to think twice before you buy into the Mac ecosystem.

    MoltenVK
Integration with the MoltenVK library actually went pretty smoothly. However, importing the library into an Xcode project will produce an error like "different teamID for imported library" unless you add yet another setting to your entitlements list, "com.apple.security.cs.disable-library-validation", and set it to YES.
I was disappointed to see the maximum accessible textures per shader is 16 on my Nvidia GeForce 750, but a fallback will need to be written for this no matter what, because Intel integrated graphics chips have the same issue.
    Finally, after years of wondering whether it worked and months of work, I saw the beautiful sight of a plain blue background rendered with Metal:

    It looks simple, but now that I have the most basic Vulkan rendering working on Mac things will get much easier from here on out.
  9. Josh
    We now accept popular cryptocurrencies in our store and Marketplace through Coinbase Commerce. That's right, you can now buy software like Ultra App Kit using Bitcoin, Ethereum, Litecoin, Dai, or Bitcoin Cash (if you can figure it out!). Right now it's a novelty, but it's worth trying. Maybe by 2022 it will be your only option?

    Sadly, Dogecoin is not one of the currently supported coins. Soon?
  10. Josh

    Articles
    There are three new features in the upcoming Ultra Engine (Leadwerks 5) that will make game input better than ever before.
    High Precision Mouse Input
Raw mouse input measures the actual movement of the mouse device, and has nothing to do with a cursor on the screen. Windows provides an interface to capture the raw mouse input so your game can use mouse movement with greater precision than the screen pixels. The code to implement this is rather complicated, but in the end it just boils down to one simple command exposed to the end user:
Vec2 Window::MouseMovement(const int dpi = 1000);
What's really hilarious about this feature is it actually makes mouse control code a lot simpler. For first-person controls, you just take the mouse movement value, and that is your camera rotation value. There's no need to calculate the center of the window or move the mouse pointer back to the middle. (You can if you want to, but it has no effect on the raw mouse input this command measures.)
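The simplification can be sketched in a few lines: the raw deltas simply accumulate into the camera angles, and no window-center math appears anywhere. This is a standalone illustration (Vec2f, FPSLook, and the sensitivity scale are my inventions, not the engine API):

```cpp
// Accumulate raw mouse movement directly into camera angles.
// There is no pointer recentering: the raw deltas are the rotation.
struct Vec2f { float x, y; };

struct FPSLook
{
    float pitch = 0.0f, yaw = 0.0f;
    float sensitivity = 0.1f;  // degrees per mouse count (tunable)

    void Update(Vec2f movement)
    {
        yaw   += movement.x * sensitivity;
        pitch += movement.y * sensitivity;
        // Keep the camera from flipping over backwards.
        if (pitch >  90.0f) pitch =  90.0f;
        if (pitch < -90.0f) pitch = -90.0f;
    }
};
```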
    The MousePosition() command is still available, but will return the integer coordinates the Windows mouse cursor system uses.
    High Frequency Mouse Look
    To enable ultra high-precision mouse look controls, I have added a new command:
void Camera::SetFreeLookMode(const bool mode, const float speed = 0.1f, const int smoothing = 0)
When this setting is enabled, the mouse movement will be queried in the rendering thread and applied to the camera rotation. That means real-time mouse looking at 1000 FPS is supported, even as the game thread is running at a slower frequency of 60 Hz. This was not possible in Leadwerks 4, as mouse looking would become erratic if it wasn't measured over a slower interval, due to the limited precision of integer coordinates.
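The smoothing parameter suggests something like an exponential moving average applied to the deltas each render frame. This is a guess at the behavior, as a standalone filter (the engine's actual smoothing implementation may differ):

```cpp
// Exponential smoothing of per-frame mouse deltas, as a render-thread
// free-look mode might apply it: a higher smoothing factor keeps more
// history and produces a softer, more damped camera motion.
struct SmoothedAxis
{
    float state = 0.0f;

    // smoothing in [0, 1): 0 passes raw input through unchanged.
    float Update(float rawdelta, float smoothing)
    {
        state = state * smoothing + rawdelta * (1.0f - smoothing);
        return state;
    }
};
```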
    XBox Controller Input
    I'm happy to say we will also have native built-in support for XBox controllers (both 360 and One versions). Here are the commands:
bool GamePadConnected(const int controller = 0)
bool GamePadButtonDown(const GamePadButton button, const int controller = 0)
bool GamePadButtonHit(const GamePadButton button, const int controller = 0)
Vec2 GamePadAxisPosition(const GamePadAxis axis, const int controller = 0)
void GamePadRumble(const float left, const float right, const int controller = 0)
To specify a button you use a button code:
GAMEPADBUTTON_DPADUP
GAMEPADBUTTON_DPADDOWN
GAMEPADBUTTON_DPADLEFT
GAMEPADBUTTON_DPADRIGHT
GAMEPADBUTTON_START
GAMEPADBUTTON_BACK
GAMEPADBUTTON_LTHUMB
GAMEPADBUTTON_RTHUMB
GAMEPADBUTTON_LSHOULDER
GAMEPADBUTTON_RSHOULDER
GAMEPADBUTTON_A
GAMEPADBUTTON_B
GAMEPADBUTTON_X
GAMEPADBUTTON_Y
GAMEPADBUTTON_RTRIGGER
GAMEPADBUTTON_LTRIGGER
And axes are specified with these codes:
GAMEPADAXIS_RTRIGGER
GAMEPADAXIS_LTRIGGER
GAMEPADAXIS_RSTICK
GAMEPADAXIS_LSTICK
These features will give your games new player options and a refined sense of movement and control.
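When reading stick axes for movement, one detail worth handling is a dead zone near the center; otherwise a controller at rest makes the player drift. Here is a standalone sketch of the usual rescaling (whether the engine applies one internally is not stated, so treat this as an application-side technique):

```cpp
#include <cmath>

// Rescale a stick axis value in [-1, 1] so positions inside the dead zone
// read as zero, while the remaining range still reaches full deflection.
float ApplyDeadZone(float value, float deadzone = 0.2f)
{
    float magnitude = std::fabs(value);
    if (magnitude < deadzone) return 0.0f;
    float scaled = (magnitude - deadzone) / (1.0f - deadzone);
    return value < 0.0f ? -scaled : scaled;
}
```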
  11. Josh

    Articles
    I have procrastinated testing of our new 3D engine on AMD hardware for a while. I knew it was not working as-is, but I was not too concerned. One of the promises of Vulkan is better support across-the-board and fewer driver bugs, due to the more explicit nature of the API. So when I finally tried out the engine on an R9 200 series card, what would actually happen? Would the promise of Vulkan be realized, or would developers continue to be plagued by problems on different graphics cards? Read on to find out how Vulkan runs on AMD graphics cards.
To test Vulkan on AMD graphics cards, the first thing I had to do was run the new engine on a machine with one installed. I removed the Nvidia card from my PC tower, inserted an AMD R9 200 series card into the PCI-E slot of the motherboard, and powered the machine up. What would happen? Would the AMD graphics card run Vulkan successfully?
    The first error I encountered while running the new 3D engine with Vulkan on an AMD graphics card was that the shadowmap texture format had been explicitly declared as depth-24 / stencil 8, and it should have checked the supported formats to find a depth format the AMD graphics card supported for Vulkan. That was easily fixed.
The second issue was that my push constants structure was too big. Vulkan 1.1 and 1.2 only guarantee a minimum of 128 bytes for the push constants structure. I had never encountered this limit before, but on the AMD graphics card the maximum was exactly 128. I was able to eliminate an unneeded vec4 value to bring the structure size down from 144 to 128 bytes.
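This class of bug is easy to guard against at compile time with a static_assert on the structure size. A sketch follows; the field layout here is invented for illustration, and only the 128-byte floor comes from the Vulkan specification (VkPhysicalDeviceLimits::maxPushConstantsSize is guaranteed to be at least 128):

```cpp
#include <cstdint>

// Push constants shared with the shaders. Vulkan only guarantees
// maxPushConstantsSize >= 128 bytes, so stay at or under that floor.
struct PushConstants
{
    float modelmatrix[16];    // 64 bytes
    float color[4];           // 16 bytes
    float texcoordoffset[4];  // 16 bytes
    uint32_t entityid[4];     // 16 bytes
    uint32_t flags[4];        // 16 bytes
};                            // total: 128 bytes

// Fail the build, not the end user's machine, if the struct ever grows.
static_assert(sizeof(PushConstants) <= 128, "push constants exceed the guaranteed 128-byte limit");
```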
With these issues fixed, the new engine with Vulkan ran perfectly on my AMD graphics card. The engine also works without a hitch on Intel graphics, with the exception that the number of shader image units is severely restricted on that hardware. Overall it appears the promise of fewer driver bugs under Vulkan is holding true, although there is very wide variability in hardware capabilities that requires rendering fallbacks.
  12. Josh

    Articles
    I recently fired up an install of Ubuntu 20.04 to see what the state of development on Linux is now, and it looks like things have improved dramatically since I first started working with Linux in 2013. Visual Studio Code looks and works great on Linux, and I was able to load the Ultra Engine source code and start compiling, although there is still much work to do. The debugger is fantastic, especially after using the Code::Blocks debugger on Linux, which was absolutely sadistic. (They're both using GDB under the hood but the interface on Code::Blocks was basically unusable.) Regardless of whatever their motivation was, Microsoft (or rather, the developers of the Atom editor VS Code was based on) has made a huge contribution to usability on the Linux desktop. 

Setting up C++ compilation in Visual Studio Code is rather painful. You have to edit a lot of configuration files by hand to specify the exact command line GCC should use, but it is possible:
    { "tasks": [ { "type": "cppbuild", "label": "C/C++: g++ build active file", "command": "/usr/bin/g++", "args": [ "-g", "-D_ULTRA_APPKIT", "./Libraries/Plugin SDK/GMFSDK.cpp", "./Libraries/Plugin SDK/MemReader.cpp", "./Libraries/Plugin SDK/MemWriter.cpp", "./Libraries/Plugin SDK/TextureInfo.cpp", "./Libraries/Plugin SDK/Utilities.cpp", "./Libraries/Plugin SDK/half/half.cpp", "./Libraries/freeprocess/freeprocess.c", "./Libraries/s3tc-dxt-decompressionr/s3tc.cpp", "./Libraries/stb_dxt/stb_dxt.cpp", "./Classes/Object.cpp", "./Classes/Math/Math_.cpp", "./Classes/Math/Vec2.cpp", "./Classes/Math/Vec3.cpp", "./Classes/Math/Vec4.cpp", "./Classes/Math/iVec2.cpp", "./Classes/Math/iVec3.cpp", "./Classes/Math/iVec4.cpp", "./Classes/String.cpp", "./Classes/WString.cpp", "./Classes/Display.cpp", "./Classes/IDSystem.cpp", "./Classes/JSON.cpp", "./Functions.cpp", "./Classes/GUI/Event.cpp", "./Classes/GUI/EventQueue.cpp", "./Classes/Language.cpp", "./Classes/FileSystem/Stream.cpp", "./Classes/FileSystem/BufferStream.cpp", "./Classes/FileSystem/FileSystemWatcher.cpp", "./Classes/GameEngine.cpp", "./Classes/Clock.cpp", "./Classes/Buffer.cpp", "./Classes/BufferPool.cpp", "./Classes/GUI/Interface.cpp", "./Classes/GUI/Widget.cpp", "./Classes/GUI/Panel.cpp", "./Classes/GUI/Slider.cpp", "./Classes/GUI/Label.cpp", "./Classes/GUI/Button.cpp", "./Classes/GUI/TextField.cpp", "./Classes/GUI/TreeView.cpp", "./Classes/GUI/TextArea.cpp", "./Classes/GUI/Tabber.cpp", "./Classes/GUI/ListBox.cpp", "./Classes/GUI/ProgressBar.cpp", "./Classes/GUI/ComboBox.cpp", "./Classes/GUI/Menu.cpp", "./Classes/Window/LinuxWindow.cpp", "./Classes/Timer.cpp", "./Classes/Process.cpp", "./Classes/FileSystem/StreamBuffer.cpp", "./Classes/Multithreading/Thread.cpp", "./Classes/Multithreading/Mutex.cpp", "./Classes/Multithreading/ThreadManager.cpp", "./Classes/Loaders/Loader.cpp", "./Classes/Loaders/DDSTextureLoader.cpp", "./Classes/Assets/Asset.cpp", "./Classes/Plugin.cpp", "./Classes/Assets/Font.cpp", 
"./Classes/FileSystem/Package.cpp", "./Classes/Graphics/Pixmap.cpp", "./Classes/Graphics/Icon.cpp", "./AppKit.cpp" ], "options": { "cwd": "${workspaceFolder}" }, "problemMatcher": [ "$gcc" ], "group": { "kind": "build", "isDefault": true }, "detail": "Task generated by Debugger." } ], "version": "2.0.0" }
Our old friend Steam of course looks great on Linux and runs perfectly.

    But the most amazing thing about my install of Ubuntu 20.04 is that it's running on the Windows 10 desktop:

    Hyper-V makes it easy to create a virtual machine and run Linux and Windows at the same time. Here are some tips I have found while working with it:
    First, make sure you are allocating enough hard drive space and memory, because you are unlikely to be able to change these later on. I tried expanding the virtual hard drive, and after that Ubuntu would not load in the virtual machine, so just consider these settings to be locked. I gave my new VM 256 GB hard drive space and used the dynamic memory setting to allocate memory as-needed.
    Second, by default Ubuntu is only going to give you a 1024x768 window. You can change this but it requires a little work. Open the terminal in Ubuntu and type this command:
sudo nautilus /etc/default/grub
This will open the file browser and select the GRUB file, which stores system settings. Open the file with a text editor. Since the file was opened from a nautilus window with super-user permissions, the text editor will also be launched with those permissions.
    Find this line of text:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
Change it to this. You can set the screen resolution to whatever value you prefer:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash video=hyperv_fb:1920x1080"
Close the text editor and the nautilus window. In the terminal, run this command:
sudo update-grub
Restart the virtual machine and your new screen resolution will work. However, the DPI scaling in Ubuntu seems to be not very reliable. My laptop uses a 1920x1080 screen, and at 17" this really needs to be scaled up to 125%. I could not get "fractional scaling" to work. The solution is to set your Windows desktop resolution to whatever value feels comfortable at 100% scaling. In my case, this was 1600x900. Then set Ubuntu's screen resolution to the same. When the window is maximized you will have a screen that is scaled the way you want.

Checkpoints are a wonderful feature that allow you to easily walk back any changes you make to the virtual machine. You can see the steps I went through to install all the software I wanted to work on Linux:

    If I ever mess anything up, I can easily revert back to whatever point I want. You can probably transfer the virtual machine between computers very easily, too, although I have not tried this yet. Changing the VM disk size, however, requires that you delete all checkpoints. So like I said, don't do that. My disk size is set to 256 GB, but in reality the whole VM is only taking up about 25 GB on my real hard drive.
    The "type clipboard text" feature in Hyper-V does not work at all with Linux, so don't even try.
    Pro tip: If you ever get stuck in the virtual machine, you can press Ctrl + Alt + Left arrow to bring the focus back to the Windows desktop.
    Hyper-V isn't going to allow you to play games with Vulkan or OpenGL, but for other software development it seems to work very well. It's a bit slow, but the convenience of running Linux on the Windows desktop and being able to switch back and forth instantly more than makes up for it.
    Super Bonus Tip
    I found that regardless of how much disk space you allocate for the virtual machine, the Windows default install of Ubuntu 20.04 will still only use 12 GB on the main drive partition. If you try to resize this in the built-in tools, an error will occur, something like "Failed to set partition size on device. Unable to satisfy all constraints on the partition."
    To resize the partition you must install gparted:
sudo apt install gparted
When you run the application it will look like this. Right-click on the ext4 partition and select the Resize/Move menu item:

    Set the new partition size and press the Resize button. 
    Extreme Double Bonus Tip
    If at any time you are running gparted you see an error like "Not all of the space available to /dev/sdb appears to be used, you can fix the GPT to use all of the space" you should press the Fix button. If you don't do this, the partition resize won't work and will display some errors.
    After the partition resize, the VM files on Windows are still only using about 21 GB, so it looks like Hyper-V doesn't allocate more disk space than it has to.
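    To confirm the resize took effect from inside the VM, a couple of standard commands (nothing Hyper-V specific, just my suggestion) will show what the root filesystem actually has available:

```shell
# Show the size and usage of the root filesystem
df -h /

# List the partitions on the virtual disk
lsblk -o NAME,SIZE,FSTYPE,MOUNTPOINT
```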
    Crazy Extra Tip
    Both VS Code and Brave browser (and probably Chrome) have an option to use a much better looking "custom titlebar" instead of the ugly GTK+ titlebar buttons.
  13. Josh
    Note: This article contains some referral links for affiliate systems that I added after writing it. My purpose for including them is so that I can learn how these systems work by participating in them, because I am interested in possibly implementing one of our own in the future. The article was written because these are all things I am using and recommend and I am very bored with the same old things.
    Except for VR, the last decade of technology has been pretty yawn-inducing. I think Silicon Valley effectively killed the entire tech industry by showering a few companies with unlimited free money, but the scam is running out and you can't hold back innovation forever. In this blog I will talk about three technologies you might not have heard of and explain what each one can do for you.
    Proton Mail
    Proton Mail is like the new Gmail, except better because it uses end-to-end encryption so your email is never visible on the company's servers. You can read your email by logging into the website, using the smartphone app, or by installing the bridge application to decrypt mail on your PC and integrating with Outlook (requires the paid plan, but that is only $4.99 a month). There's also a free VPN available so you can have secure encrypted connections on the coffee shop wifi.
    A quick search of our mailing list shows that a lot of you have already switched to Proton Mail, making Gmail the new AOL.

    It is possible to use Proton Mail's secure system with your own domain, which gives you the ability to separate your email from your website. This is pretty nice for a couple of reasons. First, if your website is down temporarily for maintenance, email keeps flowing without a hitch. Second, if some type of theoretical security breach were to hit your site, email would be unaffected by it, or vice-versa. I think it is also possible to set up directly with your DNS so you can have email on your own domain with no web hosting at all.
    Why you need it:
    - End-to-end encryption makes your email secure.
    - No data harvesting to show you annoying ads.
    - It just looks cool.
    Brave Browser
    On the surface, Brave browser looks nearly identical to Chrome, but underneath the hood it is a different beast. This is the browser that is really pushing new technology with features like built-in BitTorrent, cryptocurrency, Tor, and support for the peer-to-peer IPFS Internet protocol. Not all these technologies are going to pan out, but the possibilities of doing away with the DNS system or revolutionizing online commerce are tantalizing, and some of them are going to work out. With 20 million users Brave now has the size to benefit from network effects, so it will be interesting to see what kinds of things are possible with this browser.

    I feel I should warn you. Once you start using Brave it is difficult to go back, especially once you understand what that would involve.

    Why you need it
    - Built-in ad blocker.
    - No data harvesting.
    - Innovative new features.
    - Works with all Chrome extensions.
    BackBlaze B2
    BackBlaze B2 is a cloud storage system similar to Amazon S3. These types of systems are critical for us because they allow a separation of website files and content files. Our website is only a few gigabytes and lives on our server, while all the user-generated files including images, attachments, avatars, and any other uploaded content are stored on a backend file system. This makes backups of the website very small, and as our content grows our core site stays the same size. Invision Power Board does not yet support BackBlaze B2, but the moment it does I plan to switch and cut 75% off our file storage costs.

    Why you need it:
    - Like Amazon S3, at 25% the cost.
    Bonus Tip: KeyCDN
    KeyCDN is a content distribution network like Cloudflare, and is another Swiss company. You can set this up to deliver images or other files faster to the user, and the basic plan lets you get started for free. I plan to integrate this into our site to serve up images and other files faster in the future.

    Why you need it:
    - Try a CDN for free.
    - Very inexpensive paid plan.
    In Summary
    When we continue to rely on old familiar technologies and services we block ourselves from getting involved in new things. I had no idea what IPFS was until I started using Brave, and now I want to add support for it with our website. What are some of the interesting technologies, products, services, or websites you have seen popping up? I feel like a new era has begun.
  14. Josh
    Diving into the innermost workings of the Linux operating system is a pretty interesting challenge. Pretty much nobody nowadays implements anything with raw X11 anymore. They use Qt, SDL, or Cairo for drawing, with Pango for text rendering. The thing is, all of these libraries use X11 as the backend. I need full control over and understanding of what my code is doing, so I've opted to cut out the middleman and go directly to the innermost core of Linux.
     
    Today I improved our text rendering by implementing the Xft extension for X11. This library uses FreeType to make TrueType fonts renderable in a Linux window. Although the documentation looks intimidating, usage is actually very simple if you already have a renderable X11 window working.
     
    First, you create an XFT drawable:

    XftDraw* xftdraw = XftDrawCreate(display,drawable,DefaultVisual(display,0),DefaultColormap(display,0));
     
    Now, load your font:

    XftFont* xfont = XftFontOpen(Window::display, DefaultScreen(Window::display), XFT_FAMILY, XftTypeString, "ubuntu", XFT_SIZE, XftTypeDouble, 10.0, NULL);
     
    Drawing is fairly straightforward:

    XRenderColor xrcolor;
    XftColor xftcolor;
    xrcolor.red = 65535;
    xrcolor.green = 65535;
    xrcolor.blue = 65535;
    xrcolor.alpha = 65535;
    XftColorAllocValue(display, DefaultVisual(display,0), DefaultColormap(display,0), &xrcolor, &xftcolor);
    XftDrawString8(xftdraw, &xftcolor, xfont, x, y + GetFontHeight(), (unsigned char*)text.c_str(), text.size());
    XftColorFree(display, DefaultVisual(display,0), DefaultColormap(display,0), &xftcolor);
     
    And if you need to set clipping it's easy:

    XftDrawSetClipRectangles(xftdraw,0,0,&xrect,1);
     
    Here is the result, rendered with 11 point Arial font:

     
    As you can see below, sub-pixel antialiasing is used. Character spacing also seems correct:

     
    By using Xft directly we can avoid the extra dependency on Pango, and the resulting text looks great. Next I will be looking at the XRender extension for alpha blending. This would allow us to forgo use of the Cairo graphics library, if it works.
  15. Josh

    Articles
    I want to streamline some of this website. We went through a lot of changes since the release on Steam in 2014 and learned what works and what does not.
    The "Marketplace" is just called the default "Downloads" name now, and that literally is what it is for. It's a place to keep a permanent copy of your files. Paid files are still supported, and any purchased items are still available to download, but I do not have any aspirations of building this up unless it just happens spontaneously. Instead I will focus on DLCs and on just making the file format loading features as easy as possible, so you can just buy something on cgtrader.com and not have any conversion step at all. I have hidden all paid items that did not have any sales, but I will keep them on the server for now.
    Most of your games have been moved back to the downloads area. I was the one who wanted everyone to move to the game launcher on Steam, so I will move them back for you. It would have been a great idea if Steam brought in traffic. There was some success with that, but it was not at a level that justified all the complication of going through that system.
    I am finding that the gallery and videos do a much better job of keeping people's attention than having a bunch of custom game pages they have to click around to find something interesting, so I intend to move all the games, videos, and screenshots out of the games database and eventually phase that out. You are welcome to put links in your screenshots and videos, and it will bring traffic to your page.
    One thing that really got me thinking this way was I have noticed some people post screenshots here regularly. I see them posting the same images on some other more general gamedev websites, and they seem to come here because they get more attention and feedback than they do on the bigger sites. I have even seen people post videos here and get more views than they do on YouTube (assuming YouTube views and comments are even real, which they might very well not be).
    In general I am finding that pushing all the content together and serving it up at 25 dopamine hits per page works really well. I want to move the content out of the projects area and phase that out too, for this reason.
    The chat feature I installed last week is a massive success and fills a huge need without sending all our activity away somewhere else.
    I want the website to be more focused because that helps me think. I don't want any more changes. I don't really want to try any new ideas this year because I don't think it's necessary.
    Going forward, we know now what our strategy is. On the consumer side, I just need to listen to you, the users, about what features you need, and deliver improved overall quality over what we did with Leadwerks Engine. Although you don't have to do VR, focusing on VR gives me a niche of customers who I can help achieve some very interesting games, and it gives you guys a way to sell in a market that has a lot of demand and isn't saturated. Imagine if there were 100 indie VR games available in the Downloads area right now. That would be quite popular.
    On the business side, I plan to give them something they cannot say no to, which is why I have been spending so long developing this killer new technology.
    In the end, Steam didn't have anything to offer other than another place to sell software, and it made me do some really weird things. There is no advantage to using any of the features built into Steam, except maybe the P2P networking system if you need it.
    I don't think more "innovation" is really needed now. Mostly I just need to focus on giving you the features you need with good quality and documentation. Of course, the new engine allows me to do that in a way I could not before.
  16. Josh

    Articles
    You probably have noticed that our entire community has been migrated from the leadwerks.com domain to our new ultraengine.com domain. In this article I am going to describe recent changes to our website, the process by which these decisions were made, and the expected outcomes. These are not just superficial changes, but rather it is a fundamental restructuring of the way our business works. This is the execution of a plan I have had for several years.
    When we first started selling on Steam and I saw the revenue it was bringing in, my reaction was to focus entirely on Steam and try to align everything we did to that. The idea was that if I could bring that advantage to you, we would all benefit from the massive gaming traffic that goes through Steam. I replaced many of our community features with Steam-centric equivalents. Our website gallery, videos, and downloads were replaced by the Steam equivalent. To view this content on our website I used the Steam web API to hook into our application ID and retrieve content.
    The screenshots and videos were a success. Leadwerks Editor allows publishing a screenshot directly to Steam, and we have gained a very big collection of screenshots and videos on Steam. However, there are two problems that have caused me to rethink this approach:
    First, when it came to monetization, Steam seems to be a one-trick pony. Try as I might, Steam users just don't buy things through Steam Workshop. Our own sales in the web-based Marketplace are outperforming Steam.

    Second, Steam has flooded their marketplace with new titles. This incentivizes competition for digital shelf space. Instead of investing in Steam features for one application ID, it makes more sense to release multiple products on Steam and not tie yourself to one Steam application ID. Valve's position seems to be that you are responsible for driving traffic to your game, but if that is the case why are they still charging 30%? Our situation on Steam is still good, but I am not throwing everything 100% into Steam in the future and I prefer to drive traffic to our store on our own website.

    Based on what I have seen, it makes sense to move our center of activity back to our own website. Recent technological advances have made this easier. Cheap file storage backend systems have eliminated the expenses and technical problems of storing large amounts of user data. RSS feed importers have allowed us to hook into the Steam web API to feed all our Steam screenshots and videos into our own system.
    Videos
    We have a new video section on our site. You can paste a YouTube link through our web interface or upload a video file directly. Any videos you publish on Steam will be automatically fed in as well. You will notice in the tutorials section I am now hosting tutorial videos on our own site. They are also uploaded on YouTube, but I am not relying on YouTube anymore for video hosting.

    In the future I plan to support user-created paid video tutorials, with the same rules as paid items in the Marketplace.
    Gallery
    A new screenshot gallery is up, with images hosted on our own site again. I hope to find a way to migrate all our content on Steam into this system, like we did with videos. I also want to bulk upload all our old screenshots from before Steam.
    The Steam-based gallery and videos can still be viewed on the leadwerks.com website, as well as the Leadwerks documentation.
    Marketplace Games
    The Marketplace we have now is a 2.0 version of our original system before Steam, with a new Games category. Back in the days before Steam it always amazed me that Furious Frank had over 20,000 downloads. This was from the days before itch.io and gamejolt, and there was a big appetite for indie games. The Games database of our website never reached that level, and I think the reason was that we should have focused on the content. If people want to watch videos they will go to the videos section. If people want to download free games they will go to the Games category in the Marketplace. Having a customized page on our website with a lot of information and links all in one place is about as pointless as having a Facebook fan page. There's no reason for it, all it does is slow down the delivery of the actual content. It looks kind of cool, but I think the viewer just wants to get to the content (download the game, watch videos, view screenshots) instead of wading through a lot of custom CSS pages. If you want to drive traffic to your website or to your Steam page, post a video and put links in the description where you want the viewer to go next.
    In addition to uploading free games, you can now sell your games in the Marketplace. I have plans to attract a lot of new traffic to the site in 2021, so maybe your games can get more sales at the same time. The same 70/30 split we use for Marketplace assets applies to games.
    Furious Frank is starting over from zero and I realize he would have been enjoyed by over 100,000 players had I not pushed our traffic towards Steam so hard.
    Email Marketing (and leaving social media behind)
    At a startup event I attended years ago, one of the speakers told me that email was actually their best marketing tool. I was kind of surprised because to me it seemed archaic, but that conversation stuck in my mind. According to conventional wisdom, if you want to get the word out about your product you should create an Instagram account, upload your images, and then when you invariably get no traffic you should blame yourself because your content sucks. Who is pushing this "conventional wisdom"? It is being pushed by giant tech companies that seek to absorb and monetize all your content, and a network of parasitical "gurus" who want to sell you useless advice and then blame you when it fails. This is not the way online customer acquisition actually works.
    I started looking at traffic on our social media accounts and comparing it to email and web traffic, and the results are shocking. Our email newsletters regularly result in 30x more clicks than anything I write on social media. Not only are they failing to bring in an acceptable level of traffic, Twitter and even Steam are actively preventing us from reaching our customers by censoring our content.
    Email is the only direct line of communication you have with your own customers. All third-party middlemen have the potential to abuse their position. They will flood their marketplace with products, change their algorithms, arbitrarily censor or shadowban content. The only thing that will provide a counterweight to that abuse is a good BATNA. If you don't have that you can expect to be treated badly. (It's really a miracle that email was invented early enough to become a common open standard. This would never happen today. "Blue Sky" is probably a sabotage effort.)
    In that light, the advice I got makes a lot of sense. Once I realized this I stopped posting on Facebook and Twitter and just left a pinned message directing people to the mailing list:

    Our new website also features a mailing list sign-up form featured prominently in the website footer.

    Instead of wasting time shouting into the wind on (anti)social media I am going to instead focus on writing quality articles that will be posted on our site and sent out to the email list.
    Store
    Valve has made it clear that game developers should not rely on Steam alone to drive traffic to their game. Okay, well if I am responsible for bringing in traffic, I am going to direct it to my own store, not to Steam.
    The MBA in me realizes two things:
    - Getting a user's email address is good and has a quantifiable value.
    - Getting a user's credit card stored in our system is even better. (It doesn't actually store credit cards on our server, it stores a token that can only be used with our domain.)
    These two steps are serious hurdles to overcome for any web store. Now that I am selling Ultra App Kit directly on our own site, I have seen an increase in sales of items in our Marketplace. This is not a coincidence. People buy one thing and then it becomes a habit. A subscription option will be available for future software products. All new software I release is going to require an initial sign-in to your forum account. We have tens of thousands of users on Steam that I have no email addresses for or ability to contact, and that is not going to work going forward. (I will make sure an override is built into the software that can be activated by a dead man switch.)

    This system gives me the ability to offer software products under a subscription model for the first time ever. This is preferable from my point of view, but I understand it's not for everyone and a conventional software license will also be available.
    We also have an automated system to send out Steam keys, so I am now able to sell Steam keys directly in our store. When you order you will receive an email with a link to retrieve your key. Once you enter the key in Steam it is added to your Steam account just as if you bought it on Steam.
    To make payments easier we are now accepting PayPal and cryptocurrency payments, in addition to credit cards.
    (Valve clearly recognizes a problem with visibility on Steam and is desperately trying to convince you to stay engaged in their system so they can take their 30%. I don't mean to say they are bad guys, I am just saying in any partnership both parties will have some divergent interests and must stick up for themselves. The day that mine is the only company selling products on Steam is when I will consider going Steam-exclusive again. What indie developers should be doing right now is selling their own games on their own websites, in addition to third-party stores like Steam.)
    Rolling Out Ultra Engine (Leadwerks 5)
    Breaking off the GUI features of the new engine and releasing it early as the standalone product Ultra App Kit was a great idea because it allows me to implement all these things early, test them out, and ensures a smoother release of the finished 3D engine later this year. The basic theme we see here is that these social and gaming platforms are no longer doing their job effectively and we must build something of our own. Withdrawing our content from them and building up our own website only makes sense if there is a way to drive more traffic here. If Steam is flooded and social media is saturated, how can we attract traffic? Email is good for engaging customers you have already made contact with, but how do you attract new people in large numbers? I have an answer to that, but it really deserves a whole article itself.
    Conclusion
    Here is where we were before:
    - Domain name no one could pronounce or spell. It was literally impossible to tell someone to go to our website without explaining the spelling.
    - User content hosted on Steam and YouTube
    - Almost all sales through Steam
    - Ineffective outreach through social media
    - Single product for sale with perpetual free upgrades
    - No ability to collect customer information
    Here is where we are now:
    - Domain name that can easily spread by word of mouth
    - Most user content hosted on our own website
    - Direct sales through our own website and many third-party stores
    - Effective outreach through email (and other means I will talk about later)
    - Ability to sell annual paid updates or subscription model
    - Growing customer database of people I can directly contact
    I hope I have shown how all these changes were not just random decisions I made, but are part of a "holistic" strategy, for lack of a better word.
  17. Josh
    Gamers have always been fascinated with the idea of endless areas to roam.  It seems we are always artificially constrained within a small area to play in, and the possibility of an entire world outside those bounds is tantalizing.  The game FUEL captured this idea by presenting the player with an enormous world that took hours to drive across:
    In the past, I always implemented terrain with one big heightmap texture, which had a fixed size like 1024x1024, 2048x2048, etc. However, our vegetation system, featured in the book Game Engine Gems 3, required a different approach. There were far too many instances of grass, trees, and rocks to store them all in memory, and I wanted to do something really radical. The solution was to create an algorithm that could instantly calculate all the vegetation instances in a given area. The algorithm would always produce the same result, but the actual data would never be saved; it was just retrieved in the area where you needed it, when you needed it. So with a few modifications, our vegetation system is already set up to generate infinite instances far into the distance.
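    The article doesn't show the actual algorithm, but the idea can be sketched like this: derive every instance in a grid cell from a hash of the cell coordinates and a world seed, so the same cell always produces the same instances and nothing ever needs to be stored. (The function names and hash constants below are my own invention, not the engine's API.)

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

struct Instance { double x, z, rotation; };

// Mix the cell coordinates, world seed, and instance index into a hash
static uint64_t CellHash(int64_t cx, int64_t cz, uint64_t seed, uint64_t n) {
    uint64_t h = seed ^ ((uint64_t)cx * 0x9E3779B97F4A7C15ULL)
                      ^ ((uint64_t)cz * 0xC2B2AE3D27D4EB4FULL) ^ n;
    h ^= h >> 33; h *= 0xFF51AFD7ED558CCDULL; h ^= h >> 33;
    return h;
}

// Recompute the instances for one grid cell on demand. Calling this twice
// with the same arguments always yields the same instances.
std::vector<Instance> GetCellInstances(int64_t cx, int64_t cz,
                                       uint64_t seed, int count, double cellsize) {
    std::vector<Instance> result;
    for (int n = 0; n < count; ++n) {
        uint64_t h = CellHash(cx, cz, seed, (uint64_t)n);
        double u = (double)(h & 0xFFFF) / 65535.0;          // position within cell
        double v = (double)((h >> 16) & 0xFFFF) / 65535.0;
        double r = (double)((h >> 32) & 0xFFFF) / 65535.0;  // rotation
        result.push_back({ (cx + u) * cellsize, (cz + v) * cellsize, r * 360.0 });
    }
    return result;
}
```

    When an object moves, only the handful of cells around it would be generated and collided against, then discarded.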

    However, terrain is problematic.  Just because an area is too far away to see doesn't mean it should stop existing.  If we don't store the terrain in memory then how do we prevent far away objects from falling into the ground?  I don't like the idea of disabling far away physics because it makes things very complex for the end user.  There are definitely some tricks we can add like not updating far away AI agents, but I want everything to just work by default, to the best of my ability.
    It was during the development of the vegetation system that I realized the MISSING PIECE to this puzzle. The secret is in the way collision works with vegetation. When any object moves, all the collidable vegetation instances around it are retrieved and collision is performed on this fetched data. We can do the exact same thing with terrain. Imagine a log rolling across the terrain. We could use an algorithm to generate all the triangles it could potentially collide with, like in the image below.

    You can probably imagine how it would be easy to lay out an infinite grid of flat squares around the player, wherever he is standing in the world.

    What if we only save heightmap data for the squares the user modifies in the editor?  They can't possibly modify the entire universe, so let's just save their changes and make the default terrain flat.  It won't be very interesting, but it will work, right?
    What if instead of being flat by default, there was a function we had that would procedurally calculate the terrain height at any point?  The input would be the XZ position in the world and the output would be a heightmap value.

    If we used this, then we would have an entire procedurally generated terrain combined with parts that the developer modifies by hand with the terrain tools.  Only the hand-modified parts would have to be saved to a series of files that could be named "mapname_x_x.patch", i.e. "magickingdom_54_72.patch".  These patches could be loaded from disk as needed, and deleted from memory when no longer in use.
    The real magic would be in developing an algorithm that could quickly generate a height value given an XZ position.  A random seed could be introduced to allow us to create an endless variety of procedural landscapes to explore.  Perhaps a large brush could even be used to assign characteristics to an entire region like "mountainy", "plains", etc.
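    As a rough sketch of what such a function could look like (this is ordinary value noise, not Leadwerks code; the names and constants are mine), a deterministic height function takes the XZ position plus a seed and returns the same height every time it is asked:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Hash an integer lattice point and seed into a pseudo-random value in [0, 1]
static double HashToUnit(int64_t x, int64_t z, uint64_t seed) {
    uint64_t h = seed;
    h ^= (uint64_t)x * 0x9E3779B97F4A7C15ULL;
    h ^= (uint64_t)z * 0xC2B2AE3D27D4EB4FULL;
    h ^= h >> 33; h *= 0xFF51AFD7ED558CCDULL; h ^= h >> 33;
    return (double)(h & 0xFFFFFFFF) / 4294967295.0;
}

// Bilinearly interpolated value noise: a smooth height at any XZ position,
// recomputed on demand and never stored.
double TerrainHeight(double x, double z, uint64_t seed) {
    int64_t ix = (int64_t)std::floor(x), iz = (int64_t)std::floor(z);
    double fx = x - ix, fz = z - iz;
    fx = fx * fx * (3.0 - 2.0 * fx); // smoothstep fade
    fz = fz * fz * (3.0 - 2.0 * fz);
    double h00 = HashToUnit(ix, iz, seed),     h10 = HashToUnit(ix + 1, iz, seed);
    double h01 = HashToUnit(ix, iz + 1, seed), h11 = HashToUnit(ix + 1, iz + 1, seed);
    double a = h00 + (h10 - h00) * fx;
    double b = h01 + (h11 - h01) * fx;
    return a + (b - a) * fz; // value in [0, 1]; scale to world units as needed
}
```

    A real implementation would layer several octaves of this for interesting landscapes, but the key property is already here: the terrain exists everywhere without occupying any memory.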
    The possibilities of what we can do in Leadwerks Engine 5 are intriguing.  Granted I don't have all the answers right now, but implementing a system like this would be a major step forward that unlocks an enormous world to explore.  What do you think?

  18. Josh

    Articles
    Breaking the new engine up into sub-components and releasing them in several stages is something new. The reason for trying this was twofold:
    - Steam now incentivizes competition for digital shelf space, unlike when Leadwerks was first released on Steam in 2014, when I was trying to build up my quality of presence and promote one app ID on Steam.
    - I would like to get some new software out early before the full engine is finished.
    The beautiful thing is that Ultra App Kit all consists of things I had to get done anyways. It was not a distraction from the development of the 3D engine. Here are some of the features I was able to finish and document. All of these will be part of the new 3D engine when it comes out:
    - New documentation system using markdown pages on Github, integrated search, sitemap generation, caching, copy button in code boxes.
    - New project creation wizard
    - Finalized display and window classes
    - GUI system
    - Math
    - Memory
    - Process
    - Multithreading
    - Strings
    - Pixmap class
    - Icons
    - System Dialogs
    In addition, this allows me to establish a lot of my plans for marketing of the new engine and perform a "dry run" before the final release:
    - Community system migrated to new website
    - License management system collects Steam customer emails, allows subscription license model.
    - New videos section of site with a category for official video tutorials.
    - Affiliate / referral system
    - Partnerships with video creators, bloggers, and advertisers.
    The initial release will support C++ programming, but subsequent releases for Lua and C# will allow me to judge how much interest there is in these languages relative to one another.
    Maybe we will see Ultra 2D Kit this summer?
  19. Josh

    Articles
    A new update is available on Steam for Ultra App Kit.
    A TEXTFIELD_PASSWORD style flag has been added and is used for the password entry form:

    A WINDOW_CHILD style flag has been added. I found this was necessary while implementing a Vulkan 3D viewport in a GUI application. You can read more about that here.
    Pressing the Tab key will now switch the focus between widgets.
    The "Learn" tab in the project manager has been moved in front of the "Community" tab.
    The Visual Studio project is now using a property sheet to store the location of the headers and libs, the way Leadwerks 4 does.
    Build times have been sped up using incremental linking, /DEBUG:FASTLINK, multi-processor builds (which finally work with precompiled headers), and other tricks. Build times will typically be less than one second now if you are just modifying your own code. (This blog article from Microsoft was very helpful.)
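    For reference, these are the standard MSVC switches behind those tricks (the exact project configuration is my guess, not the shipped property sheet):

```text
Compiler: /MP (multi-processor build), /Yu"UltraEngine.h" (use precompiled header)
Linker:   /INCREMENTAL (incremental linking), /DEBUG:FASTLINK (faster debug info generation)
```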
    The precompiled header has been changed to "UltraEngine.h" which I find more intuitive and compatible than "pch.h".
    Ultra App Kit lets you easily build desktop GUI applications. If you don't have it already, you can get access to the beta by purchasing it now in our store.
  20. Josh

    Articles
    Before finalizing Ultra App Kit I want to make sure our 3D engine works correctly with the GUI system. This is going to be the basis of all our 3D tools in the future, so I want to get it right before releasing the GUI toolkit. This can prevent breaking changes from being made in the future after the software is released.
    Below you can see our new 3D engine being rendered in a viewport created on a GUI application. The GUI is being rendered using Windows GDI+, the same system that draws the real OS interface, while the 3D rendering is performed with Vulkan 1.1. The GUI is using an efficient event-driven program structure with retained mode drawing, while Vulkan rendering is performed asynchronously in real time, on another thread. (The rendering thread can also be set to render only when the viewport needs to be refreshed.)
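    A minimal sketch of that two-thread structure (plain std::thread here; the engine's real threading code is certainly more involved, and the names are mine):

```cpp
#include <atomic>
#include <cassert>
#include <chrono>
#include <thread>

// Hypothetical sketch: the GUI runs event-driven on the main thread while
// rendering happens continuously on its own thread until the app shuts down.
std::atomic<bool> running{true};
std::atomic<int> framesRendered{0};

void RenderLoop() {
    while (running) {
        // ... asynchronous Vulkan rendering would happen here ...
        ++framesRendered;
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
    }
}
```

    The main thread would process OS events, set running to false on exit, and join the render thread before destroying the rendering context.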

    The viewport resizes nicely with the window:

    During this process I learned there are two types of child window behavior. If a window is parented to another window it will appear on top of the parent, and it won’t have a separate icon appear in the Windows task bar. Additionally, if the WS_CHILD window style is used, then the child window coordinates will be relative to the parent, and moving the parent will instantly move the child window with it. We need both types of behavior. A splash screen is an example of the first, and a 3D viewport is an example of the second. Therefore, I have added a WINDOW_CHILD window creation flag you can use to control this behavior.
    This design has been my plan going back several years, and at long last we have the result. This will be a strong foundation for creating game development tools like the new engine's editor, as well as other ideas I have.
    This is what "not cutting corners" looks like.
  21. Josh

    Articles
    I'm finalizing the Ultra App Kit API, which is going to be the basis of the new engine's API. Naming commands is always a bit of an art unto itself. Some things are named a certain way because it is common convention, like Stream::Seek() instead of Stream::SetPosition() and ACos() instead of ArcCosine().
    Other times we have holdovers from old APIs or BASIC syntax like Stream::EOF() or String::Mid().
    Should a class that has no SetSize() method use GetSize() or Size() for a method that returns the size? I have to decide these things now and live with the repercussions for years.
    Take a look through the new documentation and tell me if you have any suggestions, or forever hold your peace.
  22. Josh
    An update for Ultra App Kit beta on Steam is now available. This finishes the user interface scaling to support HD, 4K, 8K, and other resolutions. My original plan was to force an application restart if the scale setting was changed, but I found a way to dynamically resize the interface in a manner that gives natural results, so it now supports dynamic rescaling. That is, if the user changes the Windows DPI setting, or if a window is dragged to a monitor with a different DPI setting, the application will receive an event and rescale the interface dynamically.
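    The underlying math is simple: Windows treats 96 DPI as 100% scale, so a sketch of the conversion (my own helper, not the Ultra API) looks like this:

```cpp
#include <cassert>

// Windows defines 96 DPI as 100% scale, so 144 DPI is 150% and 192 DPI is 200%.
// Layout coordinates are multiplied by the scale when converted to pixels.
int ScaleToDPI(int value, int dpi) {
    return value * dpi / 96;
}
```

    On a DPI-change event, the interface would re-run its layout with the new DPI value instead of requiring a restart.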
    Below you can see a sample interface scaled at 100% and 150%:


    Ultra App Kit can be pre-ordered in our store now. You will receive the finished software when it is complete, and a Steam key now to access the beta:
     
  23. Josh

    Articles
    2020 was the most intellectually challenging year in my career. Many major advancements were invented, and 2021 will see those items refined, polished, and turned into a usable software product. Here is a partial list of things I created:
    - Streaming hierarchical planet-scale terrain system with user-defined deformation and texture projection.
    - Vulkan post-processing stack and transparency with refraction.
    - Vulkan render-to-texture.
    - Major progress on voxel ray tracing.
    - Porting/rewrite of Leadwerks GUI with implementation in 3D, rendered to texture, and using system drawing.
    - Plugin system for loading and saving textures, models, packages, and processing image and mesh data.
    - Lua debugger with integration in Visual Studio Code.
    - Pixmap class for loading, modifying, compressing, and saving texture data.
    - Vulkan particle system with physics.
    - New documentation system.
    - Lua and C++ state serialization system with JSON.
    - C++ entity component system and preprocessor. (You don't know anything about this yet.)
    Not only was this a year of massive technical innovation, but it was also the year when my efforts were put to the test to see if my idea of delivering a massive performance increase for VR was actually possible, or if I was in fact, as some people from the Linux community have called me, "unbelievably delusional". Fortunately, it turned out that I was right, and an as-yet-unreleased side-by-side benchmark showing our performance against another major engine proves we offer significantly better performance for VR and general 3D graphics. More on this later...
    Additionally, I authored a paper on VR graphics optimization for a major modeling and simulation conference (which was unfortunately canceled but the paper will be published at a later time). This was another major test because I had to put my beliefs, which I mostly gain from personal experience, into a more quantifiable defensible scientific format. Writing this paper was actually one of the hardest things I have ever done in my life. (I would like to thank Eric Lengyel of Terathon Software for providing feedback on the final paper, as well as my colleagues in other interesting industries.)
    I'm actually kind of floored looking at this list. That is a massive block of work, and there's a lot of really heavy-hitting items. I've never produced such a big volume of output before. I'm expecting 2021 to be less about groundbreaking research and more about turning these technologies into usable polished products, to bring you the benefits all these inventions offer.
  24. Josh
    The beta testers and I are discussing game programming in the new engine. I want to see C++, Lua, and C# all take a near-identical approach that steals the best aspects of modern game programming and ditches the worst, to create something new and unique. To that end, we are developing the same simple game several times, with several different methodologies, to determine what really works best. One thing I realized quickly was we really need a way to load prefabs from files.
    I started implementing a JSON scene format using the wonderful nlohmann::json library, and found the whole engine can easily serialize all information into the schema. You can save a scene, or save a single entity as a prefab. They're really the same thing, except that prefabs contain a single top-level entity.
        {
            "scene":
            {
                "entities":
                [
                    {
                        "angularVelocity": [ 0.0, 0.0, 0.0 ],
                        "castShadows": true,
                        "collisionType": 0,
                        "color": [ 0.7529412508010864, 1.2352941036224365, 1.5, 1.0 ],
                        "floatPrecision": 32,
                        "guid": "53f8d368-24da-4fba-b343-22afb4237d1b",
                        "hidden": false,
                        "light":
                        {
                            "cacheShadows": true,
                            "coneAngles": [ 45.0, 35.0 ],
                            "range": [ 0.019999999552965164, 12.0 ],
                            "type": 0
                        },
                        "mass": 0.0,
                        "matrix": 52,
                        "name": "Point Light 2",
                        "physicsMode": 1,
                        "pickMode": 0,
                        "position": 0,
                        "quaternion": 24,
                        "rotation": 12,
                        "scale": 40,
                        "static": false,
                        "velocity": [ 0.0, 0.0, 0.0 ]
                    }
                ]
            }
        }
    For fast-loading binary data I save an accompanying .bin file. The values you see for position, rotation, etc. are offsets in the binary file where the data is saved.
    From there, it wasn't that much of a stretch to implement Lua state serialization. Lua tables align pretty closely to JSON tables. It's not perfect, but it's close enough that I would rather deal with the niggles of that than implement an ASCII data structure that doesn't show syntax highlighting in Visual Studio Code.
        "luaState":
        {
            "camera": "ab65bd91-153d-47fb-a11b-ff40c19cd8f4",
            "cameraheight": 1.7,
            "camerarotation": "<Vec3>::-4.2,39.1,0",
            "carriedobject": "9daf54a7-c3b5-4b4b-979b-6b034d6b80fd",
            "carriedobject_damping": "<Vec2>::0.1,0.1",
            "carriedobject_gravitymode": true,
            "carryposition": "<Vec3>::-0.259477,-0.372455,1.95684",
            "carryrotation": "<Quat>::0.0635833,0.755824,-0.649477,-0.0535437",
            "interactionrange": 2.5,
            "listener": "45c93683-ca3b-493a-ad06-b16fe14e4175",
            "looksmoothing": 2.0,
            "lookspeed": 0.1,
            "maxcarrymass": 10.0,
            "modelfile": "Models/Weapons/Ronan Rifle/scene.gltf",
            "mouselost": false,
            "mousemovement": "<Vec2>::-5.00474e-19,3.1102e-22",
            "movespeed": 5.0,
            "throwforce": 1500.0,
            "weapon": "2f8e3d74-26be-4828-9053-a455a9fd05fd",
            "weaponposition": "<Vec3>::0.12,-0.4,0.42",
            "weaponrotation": "<Vec3>::-89.9802,-0,0",
            "weaponswayspeed": 0.1,
            "weaponswayticks": 2443.5362155621197
        }
    As a result, we now have the ability to easily add quick save of any game, and loading of the game state, automatically, without any special code. The only exception is for entities that are created in code, since they do not have a GUID to trace back to the original loaded scene. This is easily handled with a LoadState() function that gets executed in Lua after a saved game is loaded. In my FPSPlayer script I create a kinematic joint to make the player carry an object when they select it by looking at it and pressing the E key. Since this joint is created in code, there is no way to trace it back to the original scene file. So what I do is first remove the existing joint, if an object is currently being picked up, and then create a new joint, if one has been loaded in the game save file.
        function entity:LoadState()
            if self.carryjoint ~= nil then
                self.carryjoint.child:SetGravityMode(self.carriedobject_gravitymode)
                self.carryjoint.child:SetDamping(self.carriedobject_damping.x, self.carriedobject_damping.y)
                self.carryjoint:Break()
                self.carryjoint = nil
            end
            if self.carriedobject ~= nil then
                local pos = TransformPoint(self.carryposition, self.camera, nil)
                self.carryjoint = CreateKinematicJoint(pos, self.carriedobject)
                self.carryjoint:SetFriction(1000, 1000)
            end
        end
    Here is the result. The player rotation, camera angle, and other settings did not have to be manually programmed. I just saved the scene and reloaded the entity info, and it just works. You can see even the weapon sway timing gets restored exactly the way it was when the game is reloaded from the saved state.
    For most of your gameplay, it will just work automatically. This is a game-changing feature because it enables easy saving and loading of your game state at any time, something that even AAA games sometimes struggle to support.
  25. Josh
    Light is made up of individual particles called photons. A photon is a discrete quantum of electromagnetic energy. Photons are special because they have properties of both a particle and a wave. Although photons have no rest mass, they carry momentum and can interact with physical matter. The phenomenon of "solar pressure" is caused by photons bombarding a surface and exerting force. (This force actually has to be accounted for in orbital mechanics.) However, light also has a wavelength and frequency, similar to sound and other wave phenomena.
    Things are made visible when millions of photons scatter around the environment and eventually go right into your eyes, interacting with photoreceptor cells on the inner back surface of the eyeball (the retina). A "grid" of receptors connects into the optic nerve, which travels into your brain to the rear of your skull, where an image is constructed from the stimulus, allowing you to see the world around you.
    The point of that explanation is to demonstrate that lighting is a scatter problem. Rendering, on the other hand, is a gather problem. We don't care about what happens to every photon emitted from a light source, we only care about the final lighting on the screen pixels we can see. Physically-based rendering is a set of techniques and equations that attempt to model lighting as a gather problem, which is more efficient for real-time rendering. The technique allows us to model some behaviors of lighting without calculating the path of every photon.
    One important behavior we want to model is the phenomenon of Fresnel refraction. If you have ever been driving on a desert highway and saw a mirage on the road in the distance, you have experienced this. The road in the image below is perfectly dry but appears to be submerged in water.

    What's going on here? Well, remember when I explained that every bit of light you see is shot directly into your eyeballs? Well, at a glancing angle, the light that is most likely to hit your eyes is going to be bouncing off the surface from the opposite direction. Since you have more light coming from one single direction, instead of being scattered from all around, a reflection becomes visible.
    PBR models this behavior using a BRDF image (Bidirectional reflectance distribution function). These are red/green images that act as a look-up table, given the angle between the camera-to-surface vector and the incoming light vector. They look something like this:

    You can have different BRDFs for leather, plastic, and all different types of materials. These cannot be calculated, but must be measured with a photometer from real-world materials. It's actually incredibly hard to find any collection of this data measured from real-world materials. I was only able to find one lab in Germany that was able to create these. There are also some online databases that provide these as text tables, though I have not tried converting any of them into images.
    Now with PBR lighting, the surrounding environment plays a more important role in the light calculation than it does with Blinn-Phong lighting. Therefore, PBR is only as good as the lighting environment data you have. For simple demos it's fine to use a skybox for this data, but that approach won't work for anything more complicated than a single model onscreen. In Leadwerks 4 we used environment probes, which create a small "skybox" for separate areas in the scene. These have two drawbacks. First, they are still 2D projections of the surrounding environment and do not provide accurate 3D reflections. Second, they are tedious to set up, so most of the screenshots you see in Leadwerks are not using them.

    Voxel ray tracing overcomes these problems. The 3D voxel structure provides better reflections with depth and volume, and it's dynamically constructed from the scene geometry around the camera, so there is no need to manually create anything.

    I finally got the voxel reflection data integrated into the PBR equation so that the BRDF is being used for the reflectance. In the screenshot below you can see the column facing the camera appears dull.

    When we view the same surface at a glancing angle, it becomes much more reflective, almost mirror-like, as it would in real life:

    You can observe this effect with any building that has a smooth concrete exterior. Also note the scene above has no ambient light. The shaded areas would be pure black if it wasn't for the global illumination effect. These details will give your environments a realistic lifelike look in our new engine.