
iOS Rendering Pipeline for LE3


golden_gate

Will the iOS implementation of LE3 allow developers to use the latest rendering APIs provided by Apple?

In iOS 5, the CVOpenGLESTextureCache API was introduced, which greatly improved rendering performance.

 

The layer of abstraction LE3 implements via Lua looks great. But in my experience, with game engines it sometimes all comes down to what APIs are implemented beneath these layers of abstraction utilized by scripting languages. It doesn't matter if it's C#, JavaScript, Lua... What good is the scripting if I can't tap into the full power of the native APIs?

 

I need a game engine that will allow me to use the CVOpenGLESTextureCache API and other native iOS APIs (e.g., AVFoundation).

 

If there are missing APIs, can you create plugins?

 

Congratulations on LE3! It looks extremely promising... :D


I have never heard of that API. My experience has taught me to stay away from vendor-specific extensions.

 

However, since Leadwerks is just a static library that gets included in an Xcode project, it gives you a lot of low-level access.

My job is to make tools you love, with the features you want, and performance you can't live without.


The API requires OpenGL ES 2.0 (the programmable pipeline); it does not work with the fixed-function pipeline. Optimal rendering performance on iOS requires this API.

 

I would seriously recommend understanding this API... In fact, there are some frameworks that require it.

 

Here are some iOS resources from when Apple migrated to this rendering pipeline implementation.

******************************************

For those registered iOS developers on the forum who are interested in this topic and want to perform their own technical due diligence, or see the technical merits of these APIs from Apple's perspective, I have provided the links below for you to review at your earliest convenience.

WWDC 2011 Session Videos

 

https://developer.ap...deos/wwdc/2011/

 

Session 419: Capturing From the Camera Using AV Foundation on iOS 5

API enhancements and performance improvements

 

Brad Ford, iOS Engineering

 

Summary: Highlights all of the benefits of CVOpenGLESTextureCache

 

https://developer.ap...deos/wwdc/2012/

Session 517: Real Time Media Effects and Processing During Playback

Simon Goldrei, Media Systems

 

Summary: This session is packed with good stuff. More importantly, Apple has introduced APIs to allow synchronization of audio with video (OpenGL ES 2.0).


It does use OpenGL ES 2.0, so you're good there. I'll be interested in seeing what you come up with.

 

We implemented a system to add multiple hooks into the engine, specifically for the purpose of expandability through plugins.
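Just to sketch the concept (these names are purely illustrative, not from the actual engine), a post-render hook might look something like this:

// Purely hypothetical sketch of the hook idea; the real LE3 hook names
// and signatures are not shown in this thread.
void MyPostRenderHook(Context* context)
{
    // Custom GL calls would run here, after the engine finishes its own drawing.
}

// Hypothetical registration call:
// System::AddHook(HOOK_POSTRENDER, MyPostRenderHook);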

 

What is this for, recording video and displaying it as a texture?

My job is to make tools you love, with the features you want, and performance you can't live without.


It does use OpenGL ES 2.0, so you're good there.

 

That's fantastic!

 

What is this for, recording video and displaying it as a texture?

 

1. Interactive 3D environments/virtual worlds with multimedia --> video textures. In my scenario I will be loading existing multimedia assets from a user's device (extremely powerful with iOS Core Image filters).

2. Advanced/real-time post-processing (extremely powerful with iOS Core Image filters).

3. A video texture atlas (as opposed to a static texture atlas that is built one time). A developer can then grab pixels in real time from the video texture and use them as needed (additional post-processing, etc.).

This new API basically replaces glTexImage2D. The developer no longer needs to pass a pointer to image data. Instead, you provide a pixel buffer and the API returns a texture handle that you can use as you see fit. Apple found a better way of doing the traditional memcpy of texture data from CPU to GPU.
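For contrast, here is a minimal sketch of the traditional upload path that the cache replaces (assuming pixelData points to BGRA bytes and the GL_APPLE_texture_format_BGRA8888 extension, which iOS provides):

// Traditional path: copy pixel data from CPU memory into a GL texture every frame.
GLuint texhandle;
glGenTextures(1, &texhandle);
glBindTexture(GL_TEXTURE_2D, texhandle);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, frameWidth, frameHeight, 0,
             GL_BGRA, GL_UNSIGNED_BYTE, pixelData);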

 

Technically, it's not a new API... it has been on Mac OS for a while.

 

As long as LE3 does not force me to use glTexImage2D, we should be able to use it...


Cool!

 

Use this to get the OpenGL texture handle:

Texture* texture = Texture::Create(512,512);
GLuint texhandle = ((OpenGLES2Texture*)texture)->glhandle;

 

And then you can do anything you want with it.

My job is to make tools you love, with the features you want, and performance you can't live without.


I just wanted to emphasize that the API returns a texture handle to you.

 

size_t frameWidth = CVPixelBufferGetWidth(pixelBuffer);   // pixel buffer produced by decoding video on the CPU
size_t frameHeight = CVPixelBufferGetHeight(pixelBuffer);

CVOpenGLESTextureRef texture = NULL; // the cache will create this CVOpenGLESTexture for us
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                            videoTextureCache,
                                                            pixelBuffer,
                                                            NULL,
                                                            GL_TEXTURE_2D,
                                                            GL_RGBA,
                                                            frameWidth,
                                                            frameHeight,
                                                            GL_BGRA,
                                                            GL_UNSIGNED_BYTE,
                                                            0,
                                                            &texture);

Notice that we passed in the actual pixel buffer along with its attributes (the frame width and height).

Note the '&texture' in the API call... the texture is passed back to you. You can also query the size of the texture you're given. The API determines the size of the texture because you passed in the attributes of the pixel buffer (as opposed to your snippet, where the size is chosen up front).

 

Now, since it exists, you can bind it as needed...

 

glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
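After binding, Apple's samples also set the filtering and wrap state, since decoded video frames are usually non-power-of-two, and release the frame when done. My sketch below, following that pattern:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); // required for NPOT textures in ES 2.0
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// Once you are finished with the frame, release it and flush the cache:
CFRelease(texture);
CVOpenGLESTextureCacheFlush(videoTextureCache, 0);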

 

Yes... it's a bit different... others call it weird. The API is giving the texture handle back to you instead of you submitting it.

 

Some game engines (e.g., Unity) have a rigid rendering pipeline that only allows for the traditional rendering approach I mentioned above using glTexImage2D. This approach is 'old school'... not as old as Bainer Hall from my UCD days... but still 'old school'.

 

Even though you may not have a Lua script for this, I am hoping that your rendering pipeline allows for this type of option.

 

It would be nice to have the Lua layer of abstraction... then you could use Lua for rapid prototyping of multimedia assets in virtual worlds...

Link to comment
Share on other sites

Ha, I went to UCD too.

 

Couldn't you simply replace the Leadwerks texture handle and target with those values, and delete the original handle? The engine wouldn't have any idea what happened.
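Off the top of my head, something like this (the 'gltarget' member here is hypothetical; check the actual header for the real names):

// Rough sketch of the handle swap; 'texture' is the CVOpenGLESTextureRef from your snippet.
OpenGLES2Texture* tex = (OpenGLES2Texture*)Texture::Create(frameWidth, frameHeight);
glDeleteTextures(1, &tex->glhandle);                  // delete the engine's original GL texture
tex->glhandle = CVOpenGLESTextureGetName(texture);    // swap in the Core Video handle
tex->gltarget = CVOpenGLESTextureGetTarget(texture);  // hypothetical member name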

 

Another (probably better) approach would be to just create a class that extends the OpenGLES2Texture class, and replace the Bind() function with your own. That's the beauty of OO.
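A rough sketch, assuming Bind() is virtual (the actual header would confirm that):

class VideoTexture : public OpenGLES2Texture
{
public:
    CVOpenGLESTextureRef cvtexture; // frame texture created by the Core Video cache

    virtual void Bind()
    {
        // Bind the Core Video texture instead of the engine-managed one.
        glBindTexture(CVOpenGLESTextureGetTarget(cvtexture), CVOpenGLESTextureGetName(cvtexture));
    }
};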

 

It's also possible to add your extended class into Lua using ToLua++. That's what we use to expose our API to Lua.
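The gist is writing a cleaned-up header into a .pkg file and running it through the tolua++ tool to generate the binding code. A sketch (file and class names hypothetical):

// videotexture.pkg -- hypothetical ToLua++ package file for the extended class
$#include "videotexture.h"

class VideoTexture : public OpenGLES2Texture
{
    virtual void Bind();
};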

My job is to make tools you love, with the features you want, and performance you can't live without.


Your response seems very promising. :) Especially "...The engine wouldn't have any idea what happened."

 

I think all of your options are very legitimate... but one last important point...

 

I will have to have access to the EAGLContext. In my previous post you will see a parameter in that extremely long API call named "videoTextureCache". It's created with the following function:

 

CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, oglContext, NULL, &videoTextureCache);

The function simply returns a handle to a new CVOpenGLESTextureCache. But notice we passed in an EAGLContext...

 

Eventually we have to render the contents to the display via [oglContext presentRenderbuffer:GL_RENDERBUFFER].

If I can get access to the EAGLContext, then we are good to go.
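For example, if the engine makes its context current on the rendering thread, something as simple as this would do it (my sketch, assuming that is the case):

// Assuming LE3's EAGLContext is current on this thread:
EAGLContext *oglContext = [EAGLContext currentContext];

CVOpenGLESTextureCacheRef videoTextureCache = NULL;
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                            oglContext, NULL, &videoTextureCache);
if (err != kCVReturnSuccess)
    NSLog(@"Error creating texture cache: %d", (int)err);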

 

 

This looks extremely promising. Please believe me... I know this API that Apple is using to get all of its advertised benchmarking numbers is complicated. If I can wrap this "rendering plumbing" in Lua and hide it, then we can just concentrate on quickly creating these 3D interactive scenes.


You actually get all the code that creates the context and everything, since it isn't part of the cross-platform Leadwerks API. It's all in the generated Xcode projects, so you can access everything.


My job is to make tools you love, with the features you want, and performance you can't live without.

