Some suggestions
You can get far more display flexibility by using quads instead of points. You basically put into the vertex data, for each particle, something like this:
//localX, localY, data for particle, data for particle, ...
-1, -1, gravityForParticle0, velocityForParticle0, etc..,
1, -1, gravityForParticle0, velocityForParticle0, etc..,
-1, 1, gravityForParticle0, velocityForParticle0, etc..,
1, 1, gravityForParticle0, velocityForParticle0, etc..,
-1, -1, gravityForParticle1, velocityForParticle1, etc..,
1, -1, gravityForParticle1, velocityForParticle1, etc..,
-1, 1, gravityForParticle1, velocityForParticle1, etc..,
1, 1, gravityForParticle1, velocityForParticle1, etc..,
-1, -1, gravityForParticle2, velocityForParticle2, etc..,
1, -1, gravityForParticle2, velocityForParticle2, etc..,
...
So the data for each particle is identical for each vertex of each quad. In other words you have
unit vertex #0, particle0 data
unit vertex #1, particle0 data
unit vertex #2, particle0 data
unit vertex #3, particle0 data
unit vertex #0, particle1 data
unit vertex #1, particle1 data
unit vertex #2, particle1 data
unit vertex #3, particle1 data
unit vertex #0, particle2 data
unit vertex #1, particle2 data
unit vertex #2, particle2 data
unit vertex #3, particle2 data
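A minimal sketch of building that buffer in JavaScript. The function names and the `[gravity, velocityX, velocityY]` layout are just illustrative; use whatever per-particle data your system needs:

```javascript
// Build interleaved vertex data: each particle contributes 4 vertices,
// each carrying a unit corner (localX, localY) plus a full copy of that
// particle's data. Layout per vertex: [localX, localY, ...particleData].
function makeQuadVertices(particles) {
  const corners = [[-1, -1], [1, -1], [-1, 1], [1, 1]];
  const verts = [];
  for (const p of particles) {
    for (const [x, y] of corners) {
      verts.push(x, y, ...p); // particle data duplicated on every corner
    }
  }
  return new Float32Array(verts);
}

// Index buffer: two triangles per quad, for drawing with gl.TRIANGLES.
function makeQuadIndices(numParticles) {
  const indices = [];
  for (let i = 0; i < numParticles; ++i) {
    const o = i * 4;
    indices.push(o, o + 1, o + 2,  o + 2, o + 1, o + 3);
  }
  return new Uint16Array(indices);
}

// Example: 2 particles, each with [gravity, velocityX, velocityY].
const verts = makeQuadVertices([[9.8, 1, 2], [9.8, 3, 4]]);
const indices = makeQuadIndices(2);
```

You would upload `verts` with `gl.bufferData` and draw with `gl.drawElements`. The data duplication costs memory, but it lets every vertex of a quad compute its own corner of the particle.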
Now you can rotate, scale, and orient the quad in your shader and offset it however you want, something you can't do with POINTS.
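Here is the per-corner math a quad vertex shader can do, written in plain JavaScript for concreteness (in practice this runs in GLSL with `size`, `angle`, and the center position coming from attributes or uniforms; all names here are my own):

```javascript
// Scale and rotate the unit corner (localX, localY), then translate it to
// the particle's position. Each of the quad's 4 vertices runs this with a
// different corner, producing an oriented, sized billboard.
function placeCorner(localX, localY, size, angle, centerX, centerY) {
  const c = Math.cos(angle), s = Math.sin(angle);
  const x = localX * size, y = localY * size;
  return [
    centerX + x * c - y * s, // 2D rotation, then translation
    centerY + x * s + y * c,
  ];
}

// With angle 0 the quad is axis-aligned around its center.
const corner = placeCorner(-1, -1, 2, 0, 10, 10); // → [8, 8]
```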
Also, if your particle system is deterministic (as in the position of any particle is based solely on time), then you can put all your variables in attributes and uniforms and just pass in the time as a uniform.
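This works because the shader can evaluate a closed-form expression instead of integrating state frame by frame. A sketch of that expression, in JavaScript for illustration (in the shader, `p0`, `v0`, and `a` would be per-particle attributes and `t` the time uniform):

```javascript
// Deterministic particle: its state at time t is a pure function of its
// starting attributes, so the only thing that changes per frame is t.
// Per component: p(t) = p0 + v0 * t + 0.5 * a * t^2
function positionAt(p0, v0, a, t) {
  return p0.map((p, i) => p + v0[i] * t + 0.5 * a[i] * t * t);
}

// A particle launched with velocity (3, 10) under gravity, at t = 2 s.
const pos = positionAt([0, 0], [3, 10], [0, -9.8], 2);
```

Other time-based effects (fading color, spin, sprite-sheet frame) follow the same pattern: a formula of `t` per particle, no per-frame buffer updates at all.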
You can see an example of this kind of system here. These particles run entirely on the GPU. The only things passed in are the time and the matrices for projection. The shaders handle orienting the particles in 3D, changing color over time, rotating over time, position with velocity and acceleration over time, even texture animation over time (see the numbers in the example).
On top of those techniques, if your particle system is not deterministic, meaning you have state that changes every frame, you can write state to a texture by attaching the texture to a framebuffer object. If your machine supports floating point textures, you can write to an RGBA/FLOAT texture and read that texture back as input in either the vertex or fragment shader in a subsequent draw call.
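The usual shape of this is "ping-pong": two state textures, each attached to its own framebuffer; every frame you read one and render the updated state into the other, then swap. Here is that swap logic in miniature, with plain arrays standing in for the two RGBA/FLOAT textures (in real WebGL the `update` step is a fragment shader pass into the framebuffer-attached texture):

```javascript
// Two buffers play the role of two float textures attached to FBOs.
// Each frame reads `src` (last frame's state) and writes `dst`, like a
// fragment shader pass, then the two swap roles for the next frame.
function step(src, dst, update) {
  for (let i = 0; i < src.length; ++i) dst[i] = update(src[i]);
  return [dst, src]; // swapped: dst becomes next frame's source
}

let a = new Float32Array([1, 2, 3]);
let b = new Float32Array(3);
// Two frames of a toy update rule (double each value).
[a, b] = step(a, b, (v) => v * 2);
[a, b] = step(a, b, (v) => v * 2); // a is now [4, 8, 12]
```

The swap is essential: you can't sample a texture that is currently attached to the framebuffer you're rendering into.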
There's an example here. You can even view the texture being used for calculations.