NorthByNorthwest
Oct 9, 2012

Joda posted:

How would you go about changing the resolution of the framebuffer you're drawing to? I'm working on a project with a 500x500 viewport, but I need to encode a 10x10 texture with data that I compute on the GPU. I'm still a huge newbie when it comes to OpenGL, so I really can't figure out how to make that work. As far as I can tell, if I make a 10x10 texture and a 10x10 renderbuffer and draw the elements I want to the texture, it just takes the top-left-most 10x10 pixels of the 500x500 framebuffer, instead of drawing the entire scene in 10x10 texels as I would expect. That is to say, I bind a 10x10 texture to the framebuffer, draw my elements, and get the corner of the scene instead of the whole scene.

In my project, I used:
code:
        // Render into the 1024x1024 texture attached to the FBO
        glBindFramebuffer(GL_FRAMEBUFFER, framebufferID);
        glViewport(0, 0, 1024, 1024);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        /* ...draw to the 1024x1024 texture... */

        // Switch back to the default framebuffer at window resolution
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glViewport(0, 0, WINDOW_WIDTH, WINDOW_HEIGHT);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        /* ...draw the scene... */
Changing the viewport worked nicely for me.
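
For your 10x10 case specifically, the same idea should work: attach the 10x10 texture to its own FBO and set the viewport to 10x10 before drawing, so the whole scene gets rasterized into those 100 texels instead of just the corner. Rough, untested sketch (names like dataTex and dataFBO are just placeholders):
code:
        GLuint dataTex, dataFBO;

        // 10x10 texture that will receive the GPU-computed data
        glGenTextures(1, &dataTex);
        glBindTexture(GL_TEXTURE_2D, dataTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 10, 10, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

        // FBO with that texture as its color attachment
        glGenFramebuffers(1, &dataFBO);
        glBindFramebuffer(GL_FRAMEBUFFER, dataFBO);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, dataTex, 0);

        // Viewport has to match the attachment, not the window,
        // otherwise you only rasterize into a 10x10 corner of a 500x500 render
        glViewport(0, 0, 10, 10);
        glClear(GL_COLOR_BUFFER_BIT);
        /* ...draw the pass that computes your data... */

        // Back to the window for normal rendering
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glViewport(0, 0, 500, 500);
You'd probably also want to check glCheckFramebufferStatus(GL_FRAMEBUFFER) returns GL_FRAMEBUFFER_COMPLETE after setting up the attachment.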
