Ok - so you have the latest.
That code snippet is nuts - all it does is make a CCClippingNode, add it to a scene and then remove it. I guess it's triggering some setup code that makes things work - not a good thing to rely on!
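For anyone following along, I'd guess the snippet is something along these lines (purely a hypothetical reconstruction - I haven't seen the actual code and the names are made up):

// Hypothetical sketch of the "magic" workaround - a throwaway CCClippingNode
// is added to the running scene and then removed straight away.
CCClippingNode *dummy = [CCClippingNode clippingNodeWithStencil:[CCNode node]];
[[CCDirector sharedDirector].runningScene addChild:dummy];
[dummy removeFromParent];

Whatever one-off setup that happens to trigger is what is being relied on, which is exactly why it feels fragile.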
Can you check that your CCDirector.m, around line 260, has a clear call like this:
[renderer enqueueClear:(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT) color:clearColour depth:1.0f stencil:0 globalSortOrder:NSIntegerMin];
And just to check - is this rendering to the main framebuffer? I had a load of problems rendering clipping nodes to off-screen buffers (CCRenderTextures) until I did a proper depth-included clear on them.
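If that is your situation, the rough shape of what fixed it for me was giving the render texture a packed depth-stencil attachment and clearing everything up front. This is only a sketch - the size and node are placeholders, and I'm assuming your cocos2d version still has the beginWithClear:...depth:stencil: variant:

// Render texture with a combined depth-stencil attachment so the clipping
// node actually has a stencil buffer to write into (512x512 is a placeholder).
CCRenderTexture *rt = [CCRenderTexture renderTextureWithWidth:512 height:512
                                                  pixelFormat:CCTexturePixelFormat_RGBA8888
                                           depthStencilFormat:GL_DEPTH24_STENCIL8_OES];

// Clear colour, depth AND stencil before visiting the clipped content.
[rt beginWithClear:0.0f g:0.0f b:0.0f a:0.0f depth:1.0f stencil:0];
[myNodeWithClipping visit];   // myNodeWithClipping is whatever node you are drawing
[rt end];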
You could maybe stick some logging into CCDirector to spit out a frame number and check that the CCClippingNode code isn't running too early.
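Something as crude as this, dropped in right next to that enqueueClear: call plus a log in your own node's onEnter, would show the ordering (just a sketch, the names are made up):

// In CCDirector.m, next to the enqueueClear: call - crude frame counter.
static NSUInteger frameNumber = 0;
NSLog(@"CCDirector drawing frame %lu", (unsigned long)++frameNumber);

// In your node subclass, so you can see which frame the clipping node appears on.
-(void)onEnter
{
    [super onEnter];
    NSLog(@"clipping node container onEnter");
}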
Also, you say your actual CCClippingNode is inside another node subclass - it doesn't use any custom shaders, does it? The CCClippingNode code has some hideous hacks where it replaces the shader on the clipping node, but the hack only works against relatively standard shaders.
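A quick way to check (assuming the 3.x CCNode shader property) is to dump the shader on everything involved and eyeball it for anything that isn't one of the stock CCShaders - a sketch:

// Sketch: recursively log which shader each node is using so custom ones stand out.
static void LogShaders(CCNode *node)
{
    NSLog(@"%@ -> shader: %@", node, node.shader);
    for(CCNode *child in node.children) LogShaders(child);
}

// Usage, e.g. from your subclass's onEnter:  LogShaders(self);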
32 and 64 bit devices will both have 8 bits of stencil buffer - it is paired with the depth buffer (which is 24 bits) to make a 32 bit entry.
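The catch is that you have to actually ask for that format when cocos2d sets up the GL view, so it's worth double checking your app delegate requests the packed depth-stencil buffer. From memory (hedged - check the key name against your cocos2d version) the stock 3.x setup looks like:

// In your CCAppDelegate subclass - request the packed 24/8 depth-stencil format
// so there is a stencil buffer for CCClippingNode to use.
-(BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    [self setupCocos2dWithOptions:@{
        CCSetupDepthFormat: @(GL_DEPTH24_STENCIL8_OES),
    }];
    return YES;
}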
It does sound like something is messing with the order of things and a setup process is going missing. Do you see the same behaviour on devices as on simulator? The OpenGL frame capture on device can help diagnose what is going on (and 3.4.X added some nice markers to help work out what part of the frame is what) but it is still pretty gnarly to dig through.
For reference, I use a load of Clipping Node stuff in my current app and see no difference between 32 and 64 bit targets, either on hardware or in the simulator - but it is one of the more fragile areas of the code, prone to breakage at any change....