LordHippo (Members, 71 posts)
Everything posted by LordHippo

  1. I think in 5 years gamers will still be using consoles, just as they have for over 20 years.
  2. Mirror's Edge uses completely static lighting, according to DICE's presentations. They claim to use realtime GI in BF3, but I couldn't find a way to test it myself. Maybe Unreal Engine 4 can do GI in realtime, but it's not out yet.
  3. As far as I know, the only realtime GI that actually works is the one Crytek has implemented. There are some papers describing it, and you can find it in the NVIDIA DirectX samples, including source code and the papers. But this method is not being used in any real game, not even Crysis 2. Maybe you can find a non-global solution for your specific problem, though.
  4. Nice work! The whole particle thing is going to be great; I really like it. I hope to see the final version myself and have a chance to play with it soon.
  5. There is a good talk about animation-driven locomotion from GDC 2012. You can find it in the free section of the GDC Vault. But I think driving the character's position completely from animations is not a good idea, because, as you know, artists and designers don't always have the same interests.
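     The core of animation-driven locomotion can be sketched roughly like this. A minimal, hypothetical C++ illustration (none of these names come from a real engine): the root bone's translation is sampled per animation frame, and the delta between consecutive samples moves the character, instead of a designer-tuned velocity.

     ```cpp
     #include <cstdio>

     // Hypothetical sketch: each animation frame stores the root bone's
     // translation; the delta between consecutive frames drives the character.
     struct RootSample { float x, z; };

     struct Character {
         float posX = 0.0f, posZ = 0.0f;
         // Apply the root-motion delta between the previous and current sample.
         void applyRootMotion(const RootSample& prev, const RootSample& curr) {
             posX += curr.x - prev.x;
             posZ += curr.z - prev.z;
         }
     };

     int main() {
         // A short walk cycle: the root bone moves 0.4 units forward per frame.
         RootSample clip[] = { {0.0f, 0.0f}, {0.0f, 0.4f}, {0.0f, 0.8f} };
         Character c;
         for (int i = 1; i < 3; ++i)
             c.applyRootMotion(clip[i - 1], clip[i]);
         printf("%.1f\n", c.posZ); // character advanced by the clip's total motion
         return 0;
     }
     ```

     The downside, as noted above, is that the character only moves as fast as the artist animated it, which is exactly where designers and animators start to disagree.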
  6. Loading one instance of all your models and textures is a good idea only if you have no memory constraints. Also, FreeEntity decreases the instance count of that entity by one, and when the instance count reaches zero, the model is freed completely. So the next time you load that model, it will be loaded from the HDD again. Your only option is to hide that instance. By the way, you can only load a model or texture in the main thread. You can load one instance of all your models while loading your scene, then load any further instances you want each frame, but that has no obvious benefit over loading them all at once.
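     A minimal C++ sketch of the instance-counting behaviour described above (the names are hypothetical, not the actual Leadwerks API): the first load of a path hits the disk, further loads just bump a reference count, and freeing the last instance releases the model entirely, so the next load hits the disk again.

     ```cpp
     #include <cassert>
     #include <map>
     #include <string>

     // Hypothetical reference-counted model cache illustrating why freeing
     // the last instance forces a reload from disk.
     struct ModelCache {
         std::map<std::string, int> refCount;
         bool loadedFromDisk = false;           // true when the last load hit the HDD

         void loadModel(const std::string& path) {
             loadedFromDisk = (refCount[path]++ == 0); // only the first instance loads
         }
         void freeEntity(const std::string& path) {
             if (--refCount[path] == 0)
                 refCount.erase(path);          // count hit zero: memory released
         }
         bool isResident(const std::string& path) const {
             return refCount.count(path) > 0;
         }
     };

     int main() {
         ModelCache cache;
         cache.loadModel("oildrum.gmf");
         assert(cache.loadedFromDisk);          // first load comes from disk
         cache.loadModel("oildrum.gmf");
         assert(!cache.loadedFromDisk);         // second instance is free
         cache.freeEntity("oildrum.gmf");
         cache.freeEntity("oildrum.gmf");       // last instance freed
         assert(!cache.isResident("oildrum.gmf")); // next load would hit the HDD
         return 0;
     }
     ```

     Hiding the instance instead of freeing it keeps the count above zero, which is why that is the way to keep the model in memory.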
  7. Cubemaps can be rendered in realtime; I remember someone posted a sample about it. As I've mentioned, you need a low-res cubemap to get a glossy look. So simply render a low-res cubemap, something like 32x32, and assign it to your glossy material.
  8. Many developers use assembly in their game engines' code. It's not "100 times" faster or anything near that, but it's a must to get some parts of the engine done.
  9. Use a low-res cubemap to get the glossy look.
  10. You can read that back with glReadPixels, I think. It reads a rectangle of pixels from the framebuffer.
  11. It's probably because you're not loading or assigning a texture that the post-processing shader is using. So the font texture that was previously assigned for text rendering is being used by the shader instead.
  12. LE2 only has some parts of the runtime side of a game engine; it offers NO offline side. So if you're ready to fill in the gaps and make your own game engine, use LE. If not, go for Unity or UDK. Or you can wait for LE3 and see if it has the offline side of an engine.
  13. GPU based particles, using geometry shaders.
  14. 17 ms for frustum culling 10,000 objects is too high. Does this include occlusion culling?!
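     For context, plain frustum culling is just a handful of plane tests per object. A minimal sphere-vs-frustum sketch in C++ (hypothetical names, one plane shown for illustration instead of the usual six): a sphere is culled when it lies entirely behind any frustum plane.

     ```cpp
     #include <cassert>

     // Plane stored as (normal, d) with the normal pointing into the frustum.
     struct Plane { float nx, ny, nz, d; };
     struct Sphere { float x, y, z, r; };

     bool inFrustum(const Sphere& s, const Plane* planes, int count) {
         for (int i = 0; i < count; ++i) {
             float dist = planes[i].nx * s.x + planes[i].ny * s.y +
                          planes[i].nz * s.z + planes[i].d;
             if (dist < -s.r) return false;      // fully behind this plane: culled
         }
         return true;                            // inside or straddling: keep it
     }

     int main() {
         // A single "near" plane facing +z, located at z = 0.
         Plane planes[] = { {0.0f, 0.0f, 1.0f, 0.0f} };
         assert( inFrustum({0, 0,  5.0f, 1}, planes, 1));   // in front: visible
         assert(!inFrustum({0, 0, -5.0f, 1}, planes, 1));   // behind: culled
         assert( inFrustum({0, 0, -0.5f, 1}, planes, 1));   // straddling: kept
         return 0;
     }
     ```

     At a few dot products per object, 10,000 objects should take a fraction of a millisecond on a modern CPU, which is why 17 ms suggests something heavier (like occlusion queries) is hiding in that number.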
  15. The first presentation is the one from DICE that I've mentioned. The article was really nice. It shows the performance benefit in a simple way.
  16. I think my statement was misunderstood. I mean that object-oriented design is being replaced by data-oriented design, and it is no longer the best choice. Newly developed engines and games are not using it.
  17. Well, I have one important piece of advice for you: writing "future-proof" code is your biggest enemy! You might say that I'm wrong, but believe me, programming for games is different from other types of programming. With some research you'll see that no AAA game is made with an unmodified closed-source game engine: about 70% of AAA games are made with in-house engines, and the other 30% are made with the studios' own modified versions of commercial engines. So face the truth: there is NO general code in the game development industry. You have to make your own technology for your game. It is also good practice to write your code in a data-oriented way. That means you first figure out how your game data is transformed, then implement routines to accomplish those transformations. By data transformation I mean something like this: User input -> Game logic -> Physics calculations -> Render engine -> Monitor. To achieve this, make a very big flowchart of your game, then design your code specifically for the game; with every change in the game design, the flowchart needs to be modified. This method has three main advantages: 1- Better performance (because of memory access patterns) 2- Better project scheduling 3- More readable code. For more info, read the publications from DICE and Insomniac that cover this method in more depth, along with the many reasons object-oriented programming is no longer favoured in game programming.
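      The pipeline above (input -> logic -> physics) can be sketched in a data-oriented style like this. A minimal, hypothetical C++ illustration: game state lives in flat arrays, and each stage is a plain routine that transforms one array into the next, instead of each object updating itself.

      ```cpp
      #include <cassert>
      #include <vector>

      // Structure-of-arrays layout: all positions together, all velocities
      // together, so each stage streams through contiguous memory.
      struct World {
          std::vector<float> posX, velX;
      };

      // Stage 1: game logic writes desired velocities.
      void applyInput(World& w, float moveX) {
          for (float& v : w.velX) v = moveX;
      }

      // Stage 2: physics integrates positions from velocities.
      void integrate(World& w, float dt) {
          for (size_t i = 0; i < w.posX.size(); ++i)
              w.posX[i] += w.velX[i] * dt;   // contiguous, cache-friendly access
      }

      int main() {
          World w;
          w.posX = {0.0f, 10.0f};
          w.velX = {0.0f, 0.0f};
          applyInput(w, 2.0f);   // every entity moves at 2 units/s
          integrate(w, 0.5f);    // half a second of simulation
          assert(w.posX[0] == 1.0f && w.posX[1] == 11.0f);
          return 0;
      }
      ```

      The memory-access benefit comes from the layout: each stage touches one tightly packed array from start to end, rather than chasing pointers through a heap of heterogeneous objects.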
  18. You can calculate the global normal in "mesh.frag" like this: vec3 g_nml = gl_NormalMatrix * normal; Then you can combine textures based on this global normal.
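      A CPU-side sketch of what that blend could look like (hypothetical, mirroring what the shader would do): use the normal's upward component to weight two textures, e.g. grass on flat ground and rock on steep slopes.

      ```cpp
      #include <cassert>
      #include <cmath>

      // Returns the blend weight for the "flat ground" texture:
      // 1 on flat ground, 0 on a vertical wall, clamped to [0, 1].
      float upBlendWeight(float nx, float ny, float nz) {
          float len = std::sqrt(nx * nx + ny * ny + nz * nz);
          float up = ny / len;              // cosine of the angle to the up axis
          return up < 0.0f ? 0.0f : up;
      }

      int main() {
          assert(upBlendWeight(0, 1, 0) == 1.0f);   // flat ground: all grass
          assert(upBlendWeight(1, 0, 0) == 0.0f);   // vertical cliff: all rock
          float w = upBlendWeight(0, 1, 1);         // 45-degree slope
          assert(w > 0.70f && w < 0.72f);           // roughly an even mix
          return 0;
      }
      ```

      In the fragment shader the same weight would feed a mix() between the two texture samples.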
  19. Leadwerks already puts the lighting result in the bloom buffer and uses it in the bloom shader. But to be honest, the bloom shader itself is not good.
  20. Do you mean the global angle, so that the textures change if the object turns? Or the local angle of the vertices?
  21. It's already included with Leadwerks versions 2.32 and higher, but its name is still SSAO. Just enable the SSAO option in the editor, or use SetSSAO(1) in your code.
  22. If you just want to experiment with shaders, I highly recommend FX Composer or RenderMonkey. And if you want to do more DirectX work, you can use something like Ogre, which is a good render engine to test your ideas on.