How can I change the position of the mouse cursor in OpenGL/GLUT?

I'm writing a simple game and I'm going to have the mouse control the camera (using glutPassiveMotionFunc). I'm going to pitch and yaw based on the mouse difference between callbacks; however, I think it would be a good idea to "force" the mouse back to the center of the screen every time it moves. That way the cursor won't end up stuck at the edge of the screen, unable to move any further in that direction. What GLUT/OpenGL command can I use to force the position of the mouse to the center?
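A minimal sketch of the recentering approach, in plain C++ with hypothetical names: accumulate yaw/pitch from the offset between the reported cursor position and the window center. In the actual GLUT passive-motion callback you would follow this with glutWarpPointer(centerX, centerY) to snap the cursor back; the sensitivity value and clamp limits here are illustrative assumptions.

```cpp
#include <cassert>
#include <cmath>

struct CameraAngles { float yaw; float pitch; };

// Turn a cursor position into camera angle deltas relative to the window
// center. After calling this, the real callback would warp the pointer back.
CameraAngles applyMouseDelta(CameraAngles a, int x, int y,
                             int centerX, int centerY, float sensitivity) {
    int dx = x - centerX;            // horizontal offset since the last warp
    int dy = y - centerY;            // vertical offset since the last warp
    a.yaw   += dx * sensitivity;
    a.pitch -= dy * sensitivity;     // moving the mouse up pitches the camera up
    if (a.pitch >  89.0f) a.pitch =  89.0f;   // clamp to avoid flipping over
    if (a.pitch < -89.0f) a.pitch = -89.0f;
    return a;
}
```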

OpenGL: how to batch the same square into a single glVertexPointer

I've read that to optimize drawing, one can draw a set of figures that share the same texture in one pass. But how do I connect my single squares together to form one figure to send to glVertexPointer? (Read in PowerVR MBX 3D Application Development Recommendations 1.0.67a, page 5.)

Why can't I know the state of OpenGL lights in GLSL?

Is there a way to find out in GLSL whether a light is enabled, without passing attributes or creating a ton of different shaders? What about NVIDIA's C for Graphics (Cg)? Can I do it with Cg? I am now convinced that you can't. But now I ask: why not?

OpenGL 3D geometry: how to align an object to a vector

I have an object in 3D space that I want to align with a vector. I already got the Y rotation by doing an atan2 on the X and Z components of the vector, but I would also like an X rotation to make the object look downwards or upwards. Imagine a plane doing its pitch and yaw, just without the roll. I am using OpenGL to set the rotations, so I will need a Y angle and an X angle.
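The pitch angle can be derived the same way as the yaw the question already computes, using the Y component against the length of the vector's projection onto the XZ plane. A sketch, with the caveat that the signs fed into glRotatef depend on your handedness convention:

```cpp
#include <cassert>
#include <cmath>

// Derive yaw (about Y) and pitch (about X) in degrees from a direction
// vector, matching the atan2(x, z) yaw mentioned in the question.
void anglesFromDirection(float x, float y, float z,
                         float& yawDeg, float& pitchDeg) {
    const float rad2deg = 180.0f / 3.14159265358979f;
    yawDeg = std::atan2(x, z) * rad2deg;        // rotation about the Y axis
    float horiz = std::sqrt(x * x + z * z);     // length in the XZ plane
    pitchDeg = std::atan2(y, horiz) * rad2deg;  // positive = looking upwards
}
```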

OpenGL GLSL: casting problem and weird bug

I'm here to get help with a strange behavior in my GLSL code when I cast a float to an int; I have never seen such a bug since I started GLSL. I'm trying to achieve mesh skinning in GLSL. I use an ATI Radeon HD 4850 (Gainward) and I work with OpenGL 2.1 on Windows XP. On the CPU side I gather bone indices and weights and send them to the shader as vertex attributes; then I multiply the bone matrices by the weights and use the result to compute the normal and gl_Position. Very usual.

OpenGL: in a GLSL fragment shader, how do I access the texel at a specific mipmap level?

I am using OpenGL to do some GPGPU computations through the combination of one vertex shader and one fragment shader. I need to do computations on an image at different scales. I would like to use mipmaps, since their generation can be automatic and hardware accelerated. However, I can't manage to access the mipmap textures in the fragment shader. I enabled automatic mipmap generation: glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE); and I tried using texture2DLod in the shader.

OpenGL: using Vertex Buffer Objects for a tile-based game and texture atlases

I'm creating a tile-based game in C# with OpenGL and I'm trying to optimize my code as best as possible. I've read several articles and book sections, and all come to the same conclusion (as you may know): using VBOs greatly increases performance. I'm not quite sure, however, how they work exactly. My game will have tiles on the screen; some will change and some will stay the same. To use a VBO for this, would I need to add the coordinates of each tile to an array?

OpenGL: trouble modifying the contents of the framebuffer

I am trying to rotate the contents of the framebuffer by copying it into a texture and then drawing it back into the buffer. I get a white rotated rectangle, so I suspect I am overlooking something obvious. Here is the code I am using: glReadBuffer(GL_AUX1); glEnable(GL_TEXTURE_2D); glBindTexture(GL_TEXTURE_2D, textures[1]); glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, rect.width(), rect.height(), 0); glDrawBuffer(GL_AUX1); glPushMatrix(); glRotatef(head - new_head, 0, 0, 1);

A problem with OpenGL GLUT glutStrokeCharacter: the code does not work

I wrote some OpenGL code on Ubuntu. I want to draw text on the screen, but the output() function does not seem to work. Can you tell me why? void output(GLfloat x, GLfloat y, char *text) { char *p; glPushMatrix(); glTranslatef(x, y, 0); for (p = text; *p; p++) glutStrokeCharacter(GLUT_STROKE_ROMAN, *p); glPopMatrix(); } void myDraw(void) { glClear(GL_COLOR_BUFFER_BIT);

OpenGL: points resulting from the intersection of three spheres, using GPU hardware

There are analytical expressions that permit calculating the curve resulting from the overlap of three intersecting spheres. There are also approximate methods that, using grids or other methodologies, calculate points belonging to this intersection with more or less accuracy. I wonder whether, for the latter, the calculation can somehow use special hardware functions of the GPU, with CUDA or OpenGL. I need it for a very computation-intensive number-crunching program.

Setting up OpenGL Multiple Render Targets

I've seen a lot of material on this subject, but there are some differences between the examples I've found, and I'm having a hard time getting a solid understanding of the correct process. Hopefully someone can tell me if I'm on the right track. I should also mention I'm doing this on OS X Snow Leopard and the latest version of Xcode 3. For the sake of example, let's say I want to write to two targets, one for normals and one for color. To do this I create one framebuffer and bind two textures to it.

OpenGL: 3D visualization of complex geometries in a GUI

I would like to develop small cross-platform (structured) mesh generation software (similar to Gmsh) and possibly 3D pre/post-processing (like Salome). In order to make things easier, I'd like to use ready-made libraries so I can focus on developing what I need. I need (1) geometric modelling capabilities, (2) a GUI, and (3) 3D visualization. I have been looking around, but the whole workflow seems a bit blurry. I think PyGTK and GLADE are good choices for me.

OpenGL: glOrtho in a 3D scene isn't working

I made a 3D scene, and I used glOrtho and gluOrtho2D to get things to stay on my screen when I move the camera to look around the 3D scene. But when I start to look around, the characters disappear. How do you get the characters to stay on the screen?

OpenGL: how to calculate normals in a triangle strip

I have a problem with the lighting in my OpenGL project. I program in C++ using Eclipse. I have a terrain constructed from triangle strips. The code of my renderer follows: HeightMap::~HeightMap(void) { } float HeightMap::getScaledGrayColor(double height) { float fColor = 0.0f; if (height == this->NODATA_value) { return 0.0f; } fColor = ((255.00f / max) * height) / 255.00f; return fColor; } double* HeightMap::getHeights(int rowNumber) { rowNumber = rowNumber + 6;
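The lighting piece the excerpt is missing can be sketched in isolation: a per-triangle normal is the normalized cross product of two edge vectors. One caveat specific to triangle strips is that every second triangle has reversed winding, so its normal must be flipped to stay consistent. The helper below is an illustrative sketch, not code from the question.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Normal of the triangle (a, b, c): cross(b - a, c - a), normalized.
Vec3 triangleNormal(const Vec3& a, const Vec3& b, const Vec3& c) {
    Vec3 e1 = { b.x - a.x, b.y - a.y, b.z - a.z };
    Vec3 e2 = { c.x - a.x, c.y - a.y, c.z - a.z };
    Vec3 n  = { e1.y * e2.z - e1.z * e2.y,
                e1.z * e2.x - e1.x * e2.z,
                e1.x * e2.y - e1.y * e2.x };
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    if (len > 0.0f) { n.x /= len; n.y /= len; n.z /= len; }
    return n;
}
```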

OpenGL: updating vertex data in a VBO (glBufferSubData vs glMapBuffer)

I want to update an object's list of vertices after a VBO has been created. I've seen both glBufferSubData and glMapBuffer, and they both appear to do similar things, which means I'm now unsure which one to use. My pseudo-workflow is: (1) create the object; (2) begin the vertex update (calls glBufferData with data = nullptr); (3) update the object's vertices; (4) end the vertex update (takes the updated vertices and calls either glBufferSubData or glMapBuffer).

OpenGL: draw polygons above a background image

My question is how to draw a polygon or some other OpenGL primitive above, or over, a background image. In summary, it is like painting in different layers, but I think there are no layers in OpenGL. Right now I'm doing some tests, trying to draw a triangle and a line over the background image. To draw the background I use a square the size of the OpenGL window and then apply the PNG image to this square as a texture. After that I try to paint the triangle and the line with different colors, but I don't see anything except the background.

OpenGL: rotate by quaternion

I implemented my quaternion class like this. I can convert the quaternion to a 3x3 rotation matrix, but then how should I apply that to my modelview matrix?
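One common route, sketched here under the assumption of a unit quaternion (w, x, y, z): expand it to a column-major 4x4 array, the layout the fixed-function pipeline expects, and then apply it to the modelview matrix with glMultMatrixf(m). The expansion below is the standard quaternion-to-rotation-matrix formula, not code from the question.

```cpp
#include <cassert>
#include <cmath>

// Expand a unit quaternion into a column-major 4x4 rotation matrix,
// suitable for glMultMatrixf. m[col * 4 + row] indexing.
void quatToMatrix(float w, float x, float y, float z, float m[16]) {
    m[0] = 1 - 2*(y*y + z*z); m[4] = 2*(x*y - w*z);     m[8]  = 2*(x*z + w*y);     m[12] = 0;
    m[1] = 2*(x*y + w*z);     m[5] = 1 - 2*(x*x + z*z); m[9]  = 2*(y*z - w*x);     m[13] = 0;
    m[2] = 2*(x*z - w*y);     m[6] = 2*(y*z + w*x);     m[10] = 1 - 2*(x*x + y*y); m[14] = 0;
    m[3] = 0;                 m[7] = 0;                 m[11] = 0;                 m[15] = 1;
}
```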

wxWidgets OpenGL textures

I've recently been learning wxWidgets (version 2.9.4) and have been having trouble getting textures to work correctly in an OpenGL window (glColor4f works fine, however). I've tried following the wxWidgets tutorials, including loading textures from a wxImage. I also tried using stb_image to load the texture, but no matter what I do the textures always remain white. Here's the code I'm using: wxImage* img = new wxImage(wxT("grass.png")); GLuint texture; glGenTextures(1, &texture);

OpenGL 3D graphics pipeline

That's what I understand about the (OpenGL) graphics pipeline so far; can someone confirm it? Object space: object (geometry) specific coords. World space: rotate, translate and scale the object into world space. Eye space: a frustum that limits the world view, with a projection matrix that projects 3D coords onto a 2D plane (the near plane?), scaling when necessary and clipping whatever is outside the frustum/eye space. I've read that eye space reverses the Z axis, so an original +Z vertex turns backwards.

C++ OpenGL Cube Map Perlin Noise to Sphere

I'm currently working on some planet generation, mainly for fun, and hoping to end up with some kick-ass planets. I'm using a cube which has been mapped onto a sphere by normalization. The terrain isn't textured properly yet in this picture; this is just the render from the heightmap. However, this is not my problem. When creating a sphere from a cube you are left with 6 faces bent to the shape of a sphere. Therefore, I do not have latitude and longitude that can be used to wrap a spherical heightmap.

OpenGL: executing a GLSL function only once

Is there a way to execute a function only once, with some kind of boolean check, in GLSL? Or am I missing something important? The code below doesn't seem to help. bool once = false; void DoOnce() { for (int i = 0; i < someConst; i++) { //do some things... } once = true; } void main() { if ( !once ) { DoOnce(); } //do other things } Or do I have to use a compute shader for this kind of behavior?

OpenGL: improper colors displayed when loading a 32-bit PNG using stb_image with GL_UNSIGNED_INT_8_8_8_8 as the glTexImage2D type parameter

I'm just learning how to texture in OpenGL and am a bit confused by some of the results I'm getting. I'm using stb_image to load the following checkerboard PNG image. When I saved the PNG image I explicitly chose to save it as 32 bit. That would lead me to believe that each component (RGBA) would be stored as 8 bits, for a total of 32 bits, the size of an unsigned int. However, I'm using the following code: unsigned char * texture_data = stbi_load("resources/graphics-scene/tut/textures/check

GLUT is working fine but OpenGL is not doing anything

The exact same code snippet works on another machine, but it's not working properly for me. GLUT is working absolutely fine, as it opens the created window, but the line segment is not shown in the window, which means there is a problem with OpenGL. It is not even changing the background color of the window. I even tested OpenGL on my Windows machine with a test application, and it works fine. #ifdef WIN32 #include <windows.h> #endif #include <GL/glut.h> #include <GL/gl.h>

OpenGL rendering pipeline and transformations

So I was studying the rendering pipeline, particularly transformations, when I came across this figure. I believe that the passage from camera space to homogeneous clip space happens in the T&L stage (or the vertex shader if we're using one), but I'm not sure whether the lighting calculations are made after projection or before (to be honest, I think they could already be made in world coordinates). Then, to get to NDC, we have to divide by w. Is this passage still considered part of the T&L calculations?

OpenGL normal blending with black alpha edges

I have a problem with blending text. It's not only a text problem but a global blending problem with the alpha value. I tried 2 different blending functions on the image (the text is an image too): glBlendFunc with different first parameters (I know how glBlendFunc works). There are black pixels around the text, and around every alpha image with smooth alpha edges. It doesn't look good with the text. The result with parameter GL_ONE is exactly what I want, but then I can't make the text fade with the alpha value.
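A common cause of exactly these dark fringes, sketched as a hypothesis: with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), the filter samples the (often black) RGB of fully transparent texels along the edge. Premultiplying RGB by alpha at load time and blending with glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) keeps clean edges while still allowing the text to fade via alpha. A load-time premultiply in plain C++:

```cpp
#include <cassert>

// Multiply each pixel's RGB by its alpha (rounded), in place.
// rgba is tightly packed RGBA8 data.
void premultiplyAlpha(unsigned char* rgba, int pixelCount) {
    for (int i = 0; i < pixelCount; ++i) {
        unsigned char a = rgba[i * 4 + 3];
        for (int c = 0; c < 3; ++c)
            rgba[i * 4 + c] = (unsigned char)((rgba[i * 4 + c] * a + 127) / 255);
    }
}
```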

OpenGL environment mapping + light sources

I found a good example of equirectangular environment mapping. Here's the code: VERTEX SHADER varying vec3 Normal; varying vec3 EyeDir; varying float LightIntensity; uniform vec3 LightPos; void main(void){ gl_Position = ftransform(); Normal = normalize(gl_NormalMatrix * gl_Normal); vec4 pos = gl_ModelViewMatrix * gl_Vertex; EyeDir =; LightIntensity = max(dot(normalize(LightPos - EyeDir), Normal), 0.0); } FRAGMENT SHADER

OpenGL: SpriteKit auto-generated atlas sizes are not powers of 2

So, I'm working on a project that has some big textures, and recently I decided to split them into different atlases by scene, so that when navigating through scenes SpriteKit can actually get rid of unused textures (since I cannot control memory usage manually, I hope SpriteKit is smart enough to know when a texture atlas is not being used at all and release it if required). Now, after making this change, I went to take a look at the resulting atlases, and to my surprise their sizes aren't powers of 2.

OpenGL: can't pass a matrix to the shader as a uniform

I'm currently implementing matrices in my engine. With the standard glTranslate and glRotate, and then ftransform() in the shader, it works; done manually, it does not. How I give the matrix to the shader: public static void loadMatrix(int location, Matrix4f matrix) { FloatBuffer buffer = BufferUtils.createFloatBuffer(16);; buffer.flip(); glUniformMatrix4(location, false, buffer); } Sending the viewMatrix: shaderEngine.loadMatrix(glGetUniformLocation(shaderEngine.standard, "viewMatrix"), viewMatrix);

Recursive subdivision of an octahedron in OpenGL

I have been referring to the post "Drawing Sphere in OpenGL without using gluSphere()?", which has helped me quite a lot, but I'm stumped now. I have an octahedron in my scene, and I would now like to recursively subdivide its triangles to create a sphere. I found this block of code, which is supposed to carry out the subdivision for me, but I don't understand it fully: void subdivide(GLfloat v1[3], GLfloat v2[3], GLfloat v3[3], int depth) { GLfloat v12[3], v23[3], v31[3]; int i; if (depth == 0) { drawTriangle(v1, v2, v3); return; }
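The part of subdivide that does the geometric work can be shown in isolation: each edge midpoint is pushed back onto the unit sphere by normalizing it, so every recursion level produces a better sphere approximation. A self-contained sketch of that step:

```cpp
#include <cassert>
#include <cmath>

// Midpoint of the edge (a, b), projected back onto the unit sphere.
// This is what subdivide computes for v12, v23 and v31 before recursing.
void midpointOnSphere(const float a[3], const float b[3], float out[3]) {
    for (int i = 0; i < 3; ++i) out[i] = (a[i] + b[i]) * 0.5f; // edge midpoint
    float len = std::sqrt(out[0]*out[0] + out[1]*out[1] + out[2]*out[2]);
    for (int i = 0; i < 3; ++i) out[i] /= len;                 // back to radius 1
}
```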

OpenGL: ellipsoid and paraboloid relation

My task was to create a paraboloid and an ellipsoid in C++ with OpenGL. I implemented the paraboloid as the set of points whose distance from the focal point equals the distance from a plane. After running the program, I got something that looked like an ellipsoid. The mistake was that I accidentally put a multiplier (around 1.1) on the distance from the focal point, so I got the equation |Q-F|*1.1 = |Q-P|, where Q is the point to be tested for membership in the ellipse/paraboloid.
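The observed behavior matches the focus-directrix definition of conics: |Q-F| = e * dist(Q, directrix) gives an ellipse for e < 1, a parabola for e = 1, and a hyperbola for e > 1. The accidental equation |Q-F| * 1.1 = |Q-P| corresponds to e = 1/1.1 < 1, hence the ellipsoid. As a small illustrative check:

```cpp
#include <cassert>
#include <string>

// Classify a conic by its eccentricity e, where the conic is the set of
// points with |Q - F| = e * dist(Q, directrix).
std::string classifyConic(double e) {
    if (e < 1.0) return "ellipse";
    if (e == 1.0) return "parabola";
    return "hyperbola";
}
```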

OpenGL: syncing image display with the screen refresh rate

What the program does: uses PyQt4 to display images (simple JPG/PNG files). The objective: have an image displayed/drawn on the screen in sync with the screen's refresh rate. A pseudo-code sample of what I would like to achieve: pixmap = set_openGL_pixmap(myPixmap) draw_openGL_pixmap(pixmap) doSomthingElse() Ideally, the draw_openGL_pixmap(pixmap) function should only return after the screen has refreshed and the image is displayed; then doSomthingElse() will be executed immediately.

OpenGL: shader function parameter performance

I'm trying to understand how parameter passing is implemented in shader languages. I've read several articles and documentation, but I still have some doubts. In particular, I'm trying to understand the differences from a C++ function call, with particular emphasis on performance. There are slight differences between HLSL, Cg and GLSL, but I guess the underlying implementation is quite similar. What I've understood so far: unless otherwise specified, a function parameter is always passed by value.

OpenGL not drawing correctly using shaders: an issue with matrix setup/initialization?

When I render my app, I'm expecting to see a number of rectangles surrounding the edges of the window; instead I'm seeing this. All objects will be at z == 0.0f. If I don't render my scene using shaders, all objects show fine, so I'm thinking it must be a matrix calculation issue. Does anyone know where I might be going wrong with my matrix setup? matrices is a custom class which contains the 3 matrices: public class MatrixUtils { /* The different matrices */ private Matrix4f modelMatrix;

OpenGL: GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT when using a texture with internal format GL_R16_SNORM

I am rendering 16-bit grayscale images using OpenGL and need to fetch the rendered data into a buffer without bit-depth decimation. My code works perfectly on Intel(R) HD Graphics 4000, but on NVIDIA graphics cards (NVIDIA Corporation GeForce G 105M/PCIe/SSE2, NVIDIA Corporation GeForce GT 640M LE/PCIe/SSE2) it fails with status GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT when I try to render to a texture with internal format GL_R16_SNORM. It works when I use the GL_R16 format, but then I also lose all negative values in the data.

OpenGL 4.3: cube map only works when bound as GL_TEXTURE_2D

I am trying to implement a skybox in OpenGL. It doesn't work unless I bind it as GL_TEXTURE_2D. This is how I load my cube map: void SkyBoxMaterial::CreateCubeMap(const char* front, const char* back, const char* top, const char* bottom, const char* left, const char* right, GLuint *tex_cube){ glGenTextures(ONE, tex_cube); assert(LoadCubeMapSide(*tex_cube, GL_TEXTURE_CUBE_MAP_NEGATIVE_Z, front)); assert(LoadCubeMapSide(*tex_cube, GL_TEXTURE_CUBE_MAP_POSITIVE_Z, back));

Camera Space OpenGL Tutorial

I'm following a tutorial which I believe was written by the user Nicol Bolas. Under the "Camera Perspective" topic, I am getting stuck: "Our perspective projection transform will be specific to this space. As previously stated, the projection plane shall be a region [-1, 1] in the X and Y axes, and at a Z value of -1. The projection will be from vertices in the -Z direction onto this plane."

OpenGL: how should you efficiently batch complex meshes?

What is the best way to render complex meshes? I wrote down different solutions below and wonder what your opinion of them is. Let's take an example: how would you render the Crytek Sponza mesh? (PS: I do not use an ubershader, only separate shaders.) If you download the mesh from the following link and load it in Blender, you'll see that the whole mesh is composed of about 400 sub-meshes, each with its own materials/textures.

OpenGL (core profile): wrong rendering of a model's triangles

Here is an issue in my project; please look at the screenshots (the problem vs. the original, correct result). My object is built in a wrong way: the vertices are not connected properly. I suspect that it has something to do with the indices of the model. Anyway, here is the code that constructs the mesh for me: mesh model::processMesh(aiMesh * mesh_, const aiScene * scene) { std::vector<vertex> vertices; std::vector<GLuint> indices; std::vector<texture> textures; //vertices

OpenGL: understanding the ModelView matrix

I want to analyze each component of my 4x4 ModelView matrix. I have learned that the upper 3x3 part of the ModelView matrix stores the rotation, so if I want my object to have no rotation with respect to the camera, my ModelView matrix looks like this. How do I change my ModelView matrix if I want NO translation or scaling? Can anyone explain the maths behind this?
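A sketch of the translation part, assuming OpenGL's column-major layout: the translation lives in elements 12, 13 and 14 of the 16-float array, so removing it is just zeroing those entries, leaving the upper 3x3 (rotation and scale) untouched. Removing scale additionally requires normalizing the three basis-vector columns, which is not shown here.

```cpp
#include <cassert>

// Zero the translation of a column-major 4x4 OpenGL matrix, in place.
void removeTranslation(float m[16]) {
    m[12] = 0.0f; // x translation
    m[13] = 0.0f; // y translation
    m[14] = 0.0f; // z translation
}
```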

OpenGL: how can I draw custom colors in the Cocos2d-x draw function?

I am using Cocos 2.2.6, the latest 2.x release shown on their site (I have to use 2.x for business purposes). I know how to draw shapes in OpenGL and how to color them, but unfortunately what I am trying to do is not working exactly. I am trying to give colors to vertices with glColor3f and glColor4f, but they just show up as monochrome. I tried to enable/disable materials, lighting, etc., and nothing worked. Is there a filter or something? If so, how can I disable it so I can use my own colors?

OpenGL | Render a vertex at its UV coordinate

I'm using OpenGL, and I need to render the vertices of a 3D model to an FBO at the UV coordinate of each vertex. To do that, I first have to convert the UV coordinate space to screen space. I came to the conclusion that uv.x * 2 - 1 and uv.y * 2 - 1 should do the trick. I used that in my vertex shader to place the vertices at those new positions. The result (first image) differs from what it should look like (second image): it seems like it's scaled up, and I don't know where the problem is.
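The remap itself is right: UV space is [0, 1] with the origin in a corner, NDC is [-1, 1] centered on the screen, so uv * 2 - 1 per axis is the correct conversion. When the output still looks scaled, the usual suspects (stated here as guesses, since the shader isn't shown) are the glViewport not matching the FBO size, or an extra projection matrix still being applied to the already-NDC position. The conversion in isolation:

```cpp
#include <cassert>

// Map a UV coordinate in [0,1]^2 to normalized device coordinates [-1,1]^2.
void uvToNdc(float u, float v, float& x, float& y) {
    x = u * 2.0f - 1.0f;
    y = v * 2.0f - 1.0f;
}
```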

OpenGL: correct relational operator/function for comparing vec3 values?

I'm trying to check whether vec3 values have the same components. int same = 0; vec3 v1 = vec3(1.0, 0.0, 0.0); vec3 v2 = vec3(0.0, 0.0, 0.0); if (v1 == v2) // <- this part { same = 1; } Is == the correct relational operator for the vec3 type? If not, what can I use (operators and functions are both welcome) to compare vec3 values?
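For reference: in GLSL, == on vectors does yield a single bool (true when all components are equal), while equal(v1, v2) returns a per-component bvec3 to be combined with all() or any(). For float data, though, exact comparison is fragile; a tolerance test is the safer pattern. A C++ sketch of that tolerance comparison (the shader equivalent would be all(lessThan(abs(v1 - v2), vec3(eps)))):

```cpp
#include <cassert>
#include <cmath>

// Component-wise comparison of two 3-vectors within a tolerance eps.
bool nearlyEqual(const float a[3], const float b[3], float eps) {
    for (int i = 0; i < 3; ++i)
        if (std::fabs(a[i] - b[i]) > eps) return false;
    return true;
}
```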

OpenGL: how to discard one of the line segments?

I'm using GL_LINES to draw lots of lines, but some of the lines should be hidden. Is there any way to discard a specific line segment? I can implement this by putting the lines in different buffers; however, the performance is bad when I have too many buffers. So currently I want to put them into a single buffer and discard the unnecessary ones.

OpenGL: how to rotate any vector by a quaternion in glm?

I'm experimenting with quaternion rotations and I can't seem to get it right. I'm trying to rotate each vector by a quaternion (3 vectors: fwVec3, upVec3, rightVec3), but the axes are not rotating correctly. For instance, rotating 90 degrees around the object's rightVec3 makes the object face downwards, meaning its up vector is now rotated 90 degrees; but when I then rotate around the object's new up vector, the object doesn't rotate around its own up vector; instead it rotates around the global world up vector.
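The symptom described usually means the rotation axis was taken in world space; to spin around the object's *own* up vector, the axis fed into the quaternion must be the current (already rotated) up vector. For reference, a dependency-free sketch of rotating a vector by a unit quaternion via the sandwich product q * v * conj(q), expanded in the common cross-product form (glm's own operator* does the equivalent):

```cpp
#include <cassert>
#include <cmath>

struct V3 { float x, y, z; };
struct Quat { float w, x, y, z; };   // assumed to be unit length

// v' = v + w*t + cross(q.xyz, t), where t = 2 * cross(q.xyz, v).
V3 rotate(const Quat& q, const V3& v) {
    V3 t = { 2*(q.y*v.z - q.z*v.y), 2*(q.z*v.x - q.x*v.z), 2*(q.x*v.y - q.y*v.x) };
    return { v.x + q.w*t.x + (q.y*t.z - q.z*t.y),
             v.y + q.w*t.y + (q.z*t.x - q.x*t.z),
             v.z + q.w*t.z + (q.x*t.y - q.y*t.x) };
}
```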

OpenGL: the same program targeting both desktop OpenGL and OpenGL ES 2.0

Can one run the same program (unmodified) on both "desktop" OpenGL and OpenGL ES 2.0, provided that the program only performs 2D accelerated rendering? An average Windows desktop PC and a Raspberry Pi will run the program. The GL context is obtained via the functions provided by the excellent SDL2 library, and a texture atlas will be used for the drawing routines. It would be convenient if a program could be developed and debugged on a PC and then simply be recompiled to run on the Raspberry Pi.

OpenGL: how to improve the look of a 2D light?

I'd like to have a nice light; I am not sure exactly what to call it, but you'll see what I mean when I show an example. What I mean by "nice" is something like this scene: it looks like the light has some area and it ends somewhere. What I tried to do is calculate the distance from the current pixel to the light position and use that distance as the light, but it does not look good at all. My last shot was to try 3D lighting techniques: I defined a normal for the scene, vec3(0.0f, 0.0f, 1.0f), and calculated the lighting from it.
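One way to get a light that visibly "ends somewhere", sketched as a suggestion rather than the asker's method: instead of using the raw distance, combine a constant/linear/quadratic attenuation with a smooth fade clamped to zero at a chosen radius. The coefficients below are illustrative assumptions; the same expression ports directly into a fragment shader.

```cpp
#include <cassert>

// Light intensity at a given distance: attenuation * smooth edge falloff.
// Returns 1.0 at the light and exactly 0.0 at (and beyond) the radius.
float lightFalloff(float dist, float radius) {
    if (dist >= radius) return 0.0f;                      // hard cutoff at the edge
    float atten = 1.0f / (1.0f + 0.7f * dist + 1.8f * dist * dist);
    float edge  = 1.0f - (dist / radius);                 // fades smoothly to 0
    return atten * edge * edge;
}
```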

OpenGL: convert NV12 to RGB24 using a shader

I tried to write an application to display a YUV image in OpenGL. I successfully converted YUV to RGB in C++ using this snippet (source): static long int crv_tab[256]; static long int cbu_tab[256]; static long int cgu_tab[256]; static long int cgv_tab[256]; static long int tab_76309[256]; static unsigned char clp[1024]; //for clipping in CCIR601 void init_yuv420p_table() { long int crv,cbu,cgu,cgv; int i,ind; static int init = 0; if (init == 1) return;
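The per-pixel math those lookup tables encode can be written out directly (BT.601 video range), which is also the form that moves cleanly into a fragment shader: sample Y from one texture and the interleaved NV12 UV plane from a second, half-resolution two-channel texture, then apply the same formula. A direct C++ sketch of the conversion:

```cpp
#include <cassert>

// Clamp a double to the 0..255 byte range, rounding to nearest.
static int clamp255(double v) {
    return v < 0.0 ? 0 : (v > 255.0 ? 255 : (int)(v + 0.5));
}

// BT.601 video-range YUV -> RGB for a single pixel (Y in 16..235).
void yuvToRgb(int y, int u, int v, int& r, int& g, int& b) {
    double yd = 1.164 * (y - 16);
    r = clamp255(yd + 1.596 * (v - 128));
    g = clamp255(yd - 0.391 * (u - 128) - 0.813 * (v - 128));
    b = clamp255(yd + 2.018 * (u - 128));
}
```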
