[Bf-committers] GLSL for Mac
mv_mail at mail.com
Tue Jan 19 22:35:09 CET 2016
I have a build of the experimental branch (actually a separate copy) of Clement’s PBR for macOS:
> On Jan 19, 2016, at 3:44 PM, Mike Erwin <significant.bit at gmail.com> wrote:
> Hi Marcelo,
> GLSL 1.2 (#version 120) corresponds to OpenGL 2.1, the lowest version we
> support and the *highest* we support on Mac until the next version of
> I thought EXT_gpu_shader4 would let you use uint but lemme check...
> * Full signed integer and unsigned integer support in the OpenGL
> Shading Language:
> - Integers are defined as 32 bit values using two's complement.
> - Unsigned integers and vectors thereof are added.
> Which to me means it should work! Maybe it doesn't have the uint keyword
> yet. Try something like this in the shader:
> #if __VERSION__ < 130
> #define uint unsigned int
> #endif
> Mike Erwin
> musician, naturalist, pixel pusher, hacker extraordinaire
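To make the suggestion above concrete, here is a minimal, untested fragment-shader sketch built around that guard (the body is a made-up placeholder, not code from gpu_shader_material.glsl; it assumes the driver accepts the constructs GL_EXT_gpu_shader4 describes):

#version 120
#extension GL_EXT_gpu_shader4: enable

/* GLSL 1.20 predates the 'uint' keyword; EXT_gpu_shader4 spells the type
   'unsigned int', so alias the newer keyword to the older spelling. */
#if __VERSION__ < 130
#define uint unsigned int
#endif

void main()
{
    uint mask;                    /* preprocessed to: unsigned int mask;            */
    mask = 0xFu;                  /* per the extension spec, unsigned literals take */
    mask = (mask << 1u) & 0x1Fu;  /* a 'u' suffix, and bit-wise/shift ops are added */
    gl_FragColor = vec4(1.0, 0.0, 1.0, 1.0);
}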
> On Tue, Jan 19, 2016 at 2:14 PM, CGLabs <cglabs at mail.com> wrote:
>> I am trying to use a modified “gpu_shader_material.glsl” which has
>> variables defined as uint types.
>> The GLSL version that Blender is picking up is 120:
>> #version 120
>> #extension GL_ARB_draw_instanced: enable
>> #extension GL_EXT_gpu_shader4: enable
>> #define GPU_NVIDIA
>> I am getting a syntax error for uint when the OpenGL runtime compiles the
>> dynamic source code.
>> I am assuming that 120 correlates to OpenGL 1.20, and it seems that uint is
>> not supported.
>> Even more odd is that if I replace all uint with unsigned int, the GLSL
>> compiler is “happy” and it works (despite the fact that GLSL 1.20 lists
>> unsigned as a reserved word).
>> Does anyone know what is going on with GLSL’s uint?
>> My machine: MacBook PRO with NVIDIA GeForce GT 750M 2048 MB
>> Marcelo Varanda
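For what it's worth, the behaviour described above matches the wording of the extension spec quoted earlier: GL_EXT_gpu_shader4 adds the type under the spelling "unsigned int" rather than adding a "uint" keyword to GLSL 1.20. A contrived illustration (hypothetical declarations, not lines from the modified gpu_shader_material.glsl):

#version 120
#extension GL_EXT_gpu_shader4: enable

/* uint node_id;      <- rejected: 'uint' is not a keyword in GLSL 1.20 */
unsigned int node_id; /* accepted: the type name EXT_gpu_shader4 defines */

void main()
{
    gl_FragColor = vec4(0.0);
}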