Search the Community
Showing results for tags 'nvidia'.
-
I tried to use a probe in my dungeon level (example video made in Godot). I imported the level as a glb model, converted it to an mdl, and dragged and dropped the mdl into the level. I placed the probe and hit "baking global illumination"; this ends in a software crash without any error after around half a minute of loading. The textures used on this model are in 2K resolution, if that info matters.
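For reference, here is a minimal sketch of the same scene setup for anyone who wants to test the converted model outside the editor. The model path is made up, and the probe placement and GI bake are editor actions with no code equivalent shown here; LoadModel is assumed to accept the converted mdl path.

#include "UltraEngine.h"
using namespace UltraEngine;

int main(int argc, const char* argv[])
{
    auto displays = GetDisplays();
    auto window = CreateWindow("Ultra Engine", 0, 0, 1280, 720, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);
    auto framebuffer = CreateFramebuffer(window);
    auto world = CreateWorld();

    auto camera = CreateCamera(world);
    camera->Move(0, 2, -8);

    // Hypothetical path to the level converted from glb to mdl
    auto level = LoadModel(world, "Models/DungeonLevel.mdl");

    while (window->Closed() == false and window->KeyHit(KEY_ESCAPE) == false)
    {
        world->Update();
        world->Render(framebuffer);
    }
    return 0;
}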
-
The crash occurs after adding a probe, as soon as I launch "build global illumination".

CPU type: OctalCore Intel Core i9-9900K, 4700 MHz
System board: Gigabyte Z390 Aorus Pro WiFi (3 PCI-E x1, 3 PCI-E x16, 2 M.2, 4 DDR4 DIMM, Audio, Video, Gigabit LAN, WiFi)
System memory: 32000 MB (DDR4 SDRAM)
Video adapter: NVIDIA GeForce RTX 3070
Operating system: Microsoft Windows 10 Pro
-
Once any code file is opened in the script editor, the main window viewports can no longer resize correctly. This happens whether the script editor window is parented to the main window or is a top-level window. NVIDIA GeForce 1080 Ti, driver 32.0.15.7688. The issue seems to be caused by this line of code, which is used for the Scintilla widget control: SetWindowLongPtr(hwnd, GWLP_WNDPROC, (LONG_PTR)SyntaxEditor::CustomWndProc);
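For context, the usual way to subclass a window like this without breaking its original behavior is to save the previous WNDPROC returned by SetWindowLongPtr and forward unhandled messages through CallWindowProc. The sketch below is the standard Win32 pattern, not Ultra's actual code; s_prevProc and Subclass are my own names:

#include <windows.h>

// Previous window procedure, saved so unhandled messages can be forwarded.
static WNDPROC s_prevProc = nullptr;

static LRESULT CALLBACK CustomWndProc(HWND hwnd, UINT msg, WPARAM wparam, LPARAM lparam)
{
    switch (msg)
    {
    case WM_SIZE:
        // Custom resize handling would go here.
        break;
    }
    // Forward everything to the original Scintilla procedure so the control
    // keeps working; replacing the proc without chaining like this can
    // swallow messages that the parent window's layout depends on.
    return CallWindowProc(s_prevProc, hwnd, msg, wparam, lparam);
}

void Subclass(HWND hwnd)
{
    s_prevProc = (WNDPROC)SetWindowLongPtr(hwnd, GWLP_WNDPROC, (LONG_PTR)CustomWndProc);
}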
-
In my game, on a level with many objects, memory usage was increasing pretty fast, and at some point the FPS starts dropping even if nothing happens. For example, with a single light in release mode, memory usage increased from 150 MB to 300 MB over 5 minutes. With more lights and a higher framerate (vsync off, and maybe release mode as well) this happens much faster: around 10 MB per second.

#include "UltraEngine.h"
using namespace UltraEngine;

int main(int argc, const char* argv[])
{
    auto displays = GetDisplays();
    auto window = CreateWindow("Ultra Engine", 0, 0, 1280, 720, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);
    auto framebuffer = CreateFramebuffer(window);

    auto world = CreateWorld();
    world->RecordStats(true);

    auto camera = CreateCamera(world);
    camera->SetClearColor(0.125);
    camera->SetFov(70);
    camera->Move(0, 2, -8);

    auto ground = CreateBox(world, 20, 1, 20);
    ground->SetPosition(0, -0.5, 0);
    ground->SetColor(0, 1, 0);

    // A row of 20 point lights along the X axis
    vector<shared_ptr<PointLight>> lights;
    for (int i = -10; i < 10; i++)
    {
        auto light = CreatePointLight(world);
        lights.push_back(light);
        light->SetPosition(i, 0, 0);
    }

    //Main loop: the title bar shows the framerate and the process memory
    //usage, which keeps climbing even though the scene is static
    while (window->Closed() == false and window->KeyHit(KEY_ESCAPE) == false)
    {
        world->Update();
        world->Render(framebuffer, false);
        window->SetText("FPS: " + String(world->renderstats.framerate) + " RAM: " + WString(GetMemoryUsage() / 1024));
    }
    return 0;
}
-
After a swap chain is created, if post-processing effects are enabled, it takes three calls to World::Render before anything except a black screen appears. Related thread:

#include "UltraEngine.h"
using namespace UltraEngine;

int main(int argc, const char* argv[])
{
    EngineSettings settings;
    settings.asyncrender = false;
    Initialize(settings);

    //Get the displays
    auto displays = GetDisplays();

    //Create window
    auto window = CreateWindow("Ultra Engine", 0, 0, 1280, 720, displays[0], WINDOW_RESIZABLE | WINDOW_CENTER | WINDOW_TITLEBAR);

    //Create world
    auto world = CreateWorld();

    //Create framebuffer
    auto framebuffer = CreateFramebuffer(window);

    //Create a camera
    auto camera = CreateCamera(world, PROJECTION_ORTHOGRAPHIC);
    camera->SetClearColor(0, 0, 1, 1);
    camera->SetDepthPrepass(false);

    //Add a post-processing effect; with this enabled, the first renders show a black screen
    auto fx = LoadPostEffect("Shaders/FXAA.fx");
    camera->AddPostEffect(fx);

    //Render one frame per spacebar press; it takes three presses before the clear color appears
    while (window->Closed() == false and window->KeyDown(KEY_ESCAPE) == false)
    {
        auto e = WaitEvent();
        if (e.id == EVENT_KEYDOWN and e.data == KEY_SPACE)
        {
            camera->SetClearColor(Random(), Random(), Random());
            world->Update();
            world->Render(framebuffer);
        }
    }
    return 0;
}
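In the meantime, a crude workaround sketch, assuming the three-frame figure above holds: pump a few warm-up renders right after the post effect is added, inserted just before the main loop in the repro above, so the user never sees the black frames.

// Workaround sketch: flush the first renders after swap chain creation.
// The count of 3 comes from the behavior described above.
for (int i = 0; i < 3; i++)
{
    world->Update();
    world->Render(framebuffer);
}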