iOS OpenGL: Why has scaling my renderbuffer for retina shrunk my image, and how do I fix it?


I'm working on an augmented reality project using a retina iPad, but the two layers - the camera feed and the OpenGL overlay - are not making use of the high resolution screen. The camera feed is being drawn to a texture, which appears to be scaled and sampled, and the overlay is using a blocky 4-pixel scale-up:

(Screenshot: screen scale factor = 1)

I have looked through a bunch of similar questions and have added the following lines to my EAGLView class.

To initWithCoder, before calling setupFrameBuffer and setupRenderBuffer:

    self.contentScaleFactor = [[UIScreen mainScreen] scale];

And in setupFrameBuffer:

    float screenScale = [[UIScreen mainScreen] scale];
    float width = self.frame.size.width;
    float height = self.frame.size.height;

    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, width * screenScale, height * screenScale);
    ...
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width * screenScale, height * screenScale, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);

The last two lines are the ones modified to include the scale factor.
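(For reference, the colour renderbuffer is the one buffer that is not given an explicit size: it is sized from the layer by renderbufferStorage:fromDrawable:, which is why contentScaleFactor has to be set first. A minimal sketch of that step, assuming an EAGLContext ivar named context and a GLuint handle named colorRenderbuffer - names not taken from my actual code:)

    - (void)setupRenderBuffer {
        glGenRenderbuffers(1, &colorRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
        // The buffer takes its pixel size from the layer, so this only
        // produces a retina-sized buffer if contentScaleFactor was set first.
        [context renderbufferStorage:GL_RENDERBUFFER
                        fromDrawable:(CAEAGLLayer *)self.layer];
    }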

Running the code gives me the following result:

(Screenshot: screen scale factor = 2)

As you can see, the image only fills the lower left quarter of the screen, but I can confirm the image is scaled, not cropped. Can anyone help me work out why this is?

It's not actually being scaled: you are drawing the frames at the size that was defined before you allowed the render buffer to become 2x the size in both directions.

Most likely what is going on is that you have defined your sizing in terms of pixels rather than the more general OpenGL coordinate space, which runs from -1 to 1 in both the x and y directions (this is only true when working in 2D, as you are).
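Put concretely (a sketch against the code in the question, not your exact code): if the buffer is allocated at width * screenScale pixels but the viewport is still set from the point size, the rendered image occupies exactly one quarter of the 2x buffer. The viewport has to be given the same pixel dimensions as the buffer:

    // Buggy: viewport uses the point size while the buffer is 2x in each
    // direction, so the image lands in the lower left quarter.
    glViewport(0, 0, self.frame.size.width, self.frame.size.height);

    // Fixed: viewport matches the renderbuffer's actual pixel size.
    float screenScale = self.contentScaleFactor;
    glViewport(0, 0, self.frame.size.width * screenScale, self.frame.size.height * screenScale);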

also, calling:

    float width = self.frame.size.width;
    float height = self.frame.size.height;

will return the size in points, not the retina size in pixels. If you NSLog these values out, you will see that even on a retina-based device they return values based on the non-retina screen sizes; a point is more of a movement unit than a pixel.
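A quick way to see this (a sketch; the logged numbers assume a retina iPad in portrait orientation):

    NSLog(@"points: %.0f x %.0f", self.frame.size.width, self.frame.size.height);
    NSLog(@"scale:  %.1f", self.contentScaleFactor);
    // On a retina iPad this prints "points: 768 x 1024" and "scale: 2.0";
    // the drawable is points * scale, i.e. 1536 x 2048 pixels.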

The way I have chosen to obtain the view's actual size in pixels is:

    GLint myWidth  = 0;
    GLint myHeight = 0;

    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES,  &myWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &myHeight);

In iOS, I have been using the code below for setup:

    // Assumed definition of the macro used below; not shown in the original.
    #define DEGREES_TO_RADIANS(x) ((x) * M_PI / 180.0f)

    - (void)setupView:(GLView *)theView {
        // zNear must be > 0 for glFrustumf; a zero near plane is invalid.
        const GLfloat zNear = 0.01, zFar = 1000.0, fieldOfView = 45.0;
        GLfloat size;

        glEnable(GL_DEPTH_TEST);
        glMatrixMode(GL_PROJECTION);
        size = zNear * tanf(DEGREES_TO_RADIANS(fieldOfView) / 2.0);

        //CGRect rect = theView.bounds;
        GLint width, height;

        glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &width);
        glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &height);

        // NSLog(@"setupView rect width = %d, height = %d", width, height);

        glFrustumf(-size, size, -size / ((float)width / (float)height),
                   size / ((float)width / (float)height), zNear, zFar);
        glViewport(0, 0, width, height);
        glMatrixMode(GL_MODELVIEW);

        glLoadIdentity();
    }

The above routine is used within code I am testing on both retina and non-retina setups, and it is working fine. The setupView routine is overridable within my view controller, as sketched below.
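As a usage sketch (the controller wiring and the setupFramebuffer method name are assumptions, not the original code), the only requirement is that the framebuffer and its colour renderbuffer are created and bound before setupView: runs, so the size queries above report real pixels:

    - (void)viewDidLoad {
        [super viewDidLoad];
        [EAGLContext setCurrentContext:self.context]; // assumed EAGLContext property
        [self.glView setupFramebuffer];               // hypothetical: buffers created and bound here
        [self setupView:self.glView];                 // width/height queries now return pixel sizes
    }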

