[Bf-committers] GLSL for Mac
significant.bit at gmail.com
Tue Jan 19 21:44:45 CET 2016
GLSL 1.2 (#version 120) corresponds to OpenGL 2.1, the lowest version we
support and the *highest* we support on Mac until the next version of
I thought EXT_gpu_shader4 would let you use uint but lemme check...
* Full signed integer and unsigned integer support in the OpenGL
- Integers are defined as 32 bit values using two's complement.
- Unsigned integers and vectors thereof are added.
Which to me means it should work! Maybe it doesn't have the uint keyword
yet. Try something like this in the shader:
#if __VERSION__ < 130
#define uint unsigned int
#endif
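Put together, a Mac-safe shader might look something like the sketch below. This is an untested illustration, not code from gpu_shader_material.glsl; the hash function and its constants are made up for the example, and it assumes EXT_gpu_shader4's unsigned literals (the `u` suffix) and shift/mask operators are available:

```glsl
#version 120
#extension GL_EXT_gpu_shader4 : enable

// GLSL 1.20 has no uint keyword; EXT_gpu_shader4 spells the
// type "unsigned int" instead, so map the newer keyword onto it.
#if __VERSION__ < 130
#define uint unsigned int
#endif

// Hypothetical example: bit mixing that needs unsigned semantics
// (logical right shift, wrapping multiply).
uint mix_bits(uint x)
{
	x ^= x >> 16u;
	x *= 2246822519u;
	x ^= x >> 13u;
	return x;
}

void main()
{
	uint h = mix_bits(12345u);
	gl_FragColor = vec4(float(h & 255u) / 255.0);
}
```

On a GLSL 1.30+ driver the `#define` drops out and the native `uint` type is used unchanged.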
musician, naturalist, pixel pusher, hacker extraordinaire
On Tue, Jan 19, 2016 at 2:14 PM, CGLabs <cglabs at mail.com> wrote:
> I am trying to use a modified “gpu_shader_material.glsl” which has
> variables defined as uint types.
> The GLSL version that Blender is picking up is 120:
> #version 120
> #extension GL_ARB_draw_instanced: enable
> #extension GL_EXT_gpu_shader4: enable
> #define GPU_NVIDIA
> I am getting a syntax error for uint when the OpenGL runtime compiles the
> dynamic source code.
> I am assuming that 120 correlates to OpenGL 1.20 and it seems that uint is
> not supported.
> Even more odd is that if I replace all uint with unsigned int, the GLSL
> compiler is “happy” and it works (despite the fact that GLSL 1.20 lists
> unsigned as a reserved word).
> Does anyone know what is going on with GLSL’s uint ?
> My machine: MacBook PRO with NVIDIA GeForce GT 750M 2048 MB
> Marcelo Varanda
> Bf-committers mailing list
> Bf-committers at blender.org