[Bf-committers] GLSL for Mac

Mike Erwin significant.bit at gmail.com
Tue Jan 19 21:44:45 CET 2016


Hi Marcelo,

GLSL 1.2 (#version 120) correlates to OpenGL 2.1, the lowest version we
support and the *highest* we support on Mac until the next version of
Blender.

I thought EXT_gpu_shader4 would let you use uint but lemme check...

https://www.opengl.org/registry/specs/EXT/gpu_shader4.txt

* Full signed integer and unsigned integer support in the OpenGL
  Shading Language:

  - Integers are defined as 32 bit values using two's complement.
  - Unsigned integers and vectors thereof are added.


Which to me means it should work! Maybe the extension just doesn't add the
uint keyword itself. Try something like this in the shader:

#if __VERSION__ < 130
#define uint unsigned int
#endif
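
To make that concrete, here's a minimal, hypothetical fragment shader showing
the macro in use under #version 120 with EXT_gpu_shader4 enabled (the uniform
name and the math are made up purely for illustration):

#version 120
#extension GL_EXT_gpu_shader4: enable

// GLSL 1.20 has no "uint" keyword; EXT_gpu_shader4 adds the type but
// spells it "unsigned int", so map the newer name onto the older one.
#if __VERSION__ < 130
#define uint unsigned int
#endif

uniform uint object_id;  // hypothetical uniform, for illustration only

void main()
{
    // expands to "unsigned int id = object_id;" when compiled as 1.20
    uint id = object_id;
    gl_FragColor = vec4(float(id) / 255.0, 0.0, 0.0, 1.0);
}

One thing to watch: since the macro expands to two tokens, constructor-style
casts like uint(x) may not survive the substitution cleanly and could need
rewriting by hand.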

Mike Erwin
musician, naturalist, pixel pusher, hacker extraordinaire

On Tue, Jan 19, 2016 at 2:14 PM, CGLabs <cglabs at mail.com> wrote:

> Hello,
>
> I am trying to use a modified “gpu_shader_material.glsl” which has
> variables defined as uint types.
> The GLSL version that Blender is picking up is 120:
>
> #version 120
> #extension GL_ARB_draw_instanced: enable
> #extension GL_EXT_gpu_shader4: enable
> #define GPU_NVIDIA
>>>>
> I am getting a syntax error for uint when the OpenGL runtime compiles the
> dynamic source code.
> I am assuming that 120 correlates to OpenGL 1.20, and it seems that uint is
> not supported.
>
> Even more odd is that if I replace all uint with unsigned int, the GLSL
> compiler is “happy” and it works (despite the fact that the 1.20 spec lists
> unsigned as a reserved word).
>
> Does anyone know what is going on with GLSL’s uint?
>
> My machine: MacBook PRO with NVIDIA GeForce GT 750M 2048 MB
>
> Regards,
> Marcelo Varanda

