I'm trying to apply a noise effect to my canvas, based on a codepen I saw, which in turn appears to be very similar to an SO answer.
I want to produce a "screen" of randomly transparent pixels, but instead of that I get a field that's completely opaque red. I'm hoping someone who is more familiar with either canvas or typed arrays can show me what I'm doing wrong, and maybe help me understand a few of the techniques at play.
I refactored the codepen code significantly, because (for now) I don't care about animating the noise:
/**
 * apply a "noise filter" to a rectangular region of the canvas
 * @param {Canvas2DContext} ctx - the context to draw on
 * @param {Number} x - the x-coordinate of the top-left corner of the region to noisify
 * @param {Number} y - the y-coordinate of the top-left corner of the region to noisify
 * @param {Number} width - how wide, in canvas units, the noisy region should be
 * @param {Number} height - how tall, in canvas units, the noisy region should be
 * @effect draws directly to the canvas
 */
function drawNoise( ctx, x, y, width, height ) {
  let imageData = ctx.createImageData(width, height)
  let buffer32 = new Uint32Array(imageData.data.buffer)
  for (let i = 0, len = buffer32.length; i < len; i++) {
    buffer32[i] = Math.random() < 0.5
      ? 0x00000088 // "noise" pixel
      : 0x00000000 // non-noise pixel
  }
  ctx.putImageData(imageData, x, y)
}
From what I can tell, the core of what's happening is that we wrap the ImageData's raw data representation (a series of 8-bit elements that reflect the red, green, blue, and alpha values for each pixel, in series) in a 32-bit array, which allows us to operate on each pixel as a single unit. We get an array with one element per pixel instead of four elements per pixel.
Then, we iterate through the elements in that array, writing RGBA values to each element (i.e. each pixel) based on our noise logic. The noise logic here is really simple: each pixel has a ~50% chance of being a "noise" pixel.
Noise pixels are assigned the 32-bit value 0x00000088, which (thanks to the 32-bit chunking provided by the array) is equivalent to rgba(0, 0, 0, 0.5), i.e. black, 50% opacity. Non-noise pixels are assigned the 32-bit value 0x00000000, which is black at 0% opacity, i.e. completely transparent.
Interestingly, we don't write the buffer32 to the canvas. Instead, we write the imageData that was used to construct the Uint32Array, leading me to believe that we're mutating the imageData object through some kind of pass-by-reference; I'm not clear exactly why this is. I know how value & reference passing works generally in JS (scalars are passed by value, objects are passed by reference), but in the non-typed-array world, the value passed to the array constructor just determines the length of the array. That's evidently not what's happening here.
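For reference, here's a minimal sketch of the two constructor forms as I understand them (which may be exactly where my mental model is off):

// Passing a number allocates a brand-new buffer with that many elements:
const fresh = new Uint32Array(4)        // 4 zeroed 32-bit elements (16 bytes)

// Passing an ArrayBuffer creates a *view* over the same memory, not a copy:
const bytes = new Uint8Array([1, 2, 3, 4])
const view  = new Uint32Array(bytes.buffer)
console.log(view.length)                // 1 -- one 32-bit element spanning the 4 bytes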
As noted, instead of a field of black pixels that are either 50% or 100% transparent, I get a field of all solid pixels, all red. Not only do I not expect to see the color red, there's zero evidence of the random color assignment: every pixel is solid red.
By playing with the two hex values, I've discovered that this produces a scattering of red on black that has the right kind of distribution:
buffer32[i] = Math.random() < 0.5
  ? 0xff0000ff // <-- I'd assume this is solid red
  : 0xff000000 // <-- I'd assume this is invisible red
But it's still solid red, on solid black. None of the underlying canvas data shows through the pixels that should be invisible.
Confusingly, I can't get any colors other than red or black. I also can't get any transparency other than 100% opaque. Just to illustrate the disconnect, I've removed the random element and tried writing each of these nine values to every pixel just to see what happens:
buffer32[i] = 0xRrGgBbAa
// EXPECTED // ACTUAL
buffer32[i] = 0xff0000ff // red 100% // red 100%
buffer32[i] = 0x00ff00ff // green 100% // red 100%
buffer32[i] = 0x0000ffff // blue 100% // red 100%
buffer32[i] = 0xff000088 // red 50% // blood red; could be red on black at 50%
buffer32[i] = 0x00ff0088 // green 50% // red 100%
buffer32[i] = 0x0000ff88 // blue 50% // red 100%
buffer32[i] = 0xff000000 // red 0% // black 100%
buffer32[i] = 0x00ff0000 // green 0% // red 100%
buffer32[i] = 0x0000ff00 // blue 0% // red 100%
What's going on?
EDIT: similar (bad) results after dispensing with the Uint32Array and the spooky mutation, based on the MDN article on ImageData.data:
/**
 * fails in exactly the same way
 */
function drawNoise( ctx, x, y, width, height ) {
  let imageData = ctx.createImageData(width, height)
  for (let i = 0, len = imageData.data.length; i < len; i += 4) {
    imageData.data[i + 0] = 0
    imageData.data[i + 1] = 0
    imageData.data[i + 2] = 0
    imageData.data[i + 3] = Math.random() < 0.5 ? 255 : 0
  }
  ctx.putImageData(imageData, x, y)
}
Your hardware is little-endian, and thus the correct hex format is 0xAABBGGRR, not 0xRRGGBBAA.
First let's explain the "magic" behind TypedArrays: ArrayBuffers.
An ArrayBuffer is a very special object which is directly linked to the device's memory. In itself, the ArrayBuffer interface doesn't offer many features, but when you create one, you actually allocate that many bytes of memory for your own script. That is, the JS engine won't reallocate it, move it somewhere else, chunk it, or perform all the other slow operations it does with usual JS objects. This makes it one of the fastest objects for manipulating binary data.
However, as said before, its interface is in itself quite limited. We have no way to access the data directly from the ArrayBuffer; to do so, we have to use a view object, which won't copy the data but really just offers a means of accessing it directly.
You can have different views over the same ArrayBuffer, but the data used will always be the ArrayBuffer's own, and if you edit the ArrayBuffer through one view, the change will be visible from the others:
const buffer = new ArrayBuffer(4);
const view1 = new Uint8Array(buffer);
const view2 = new Uint8Array(buffer);
console.log('view1', ...view1); // [0,0,0,0]
console.log('view2', ...view2); // [0,0,0,0]
// we modify only view1
view1[2] = 125;
console.log('view1', ...view1); // [0,0,125,0]
console.log('view2', ...view2); // [0,0,125,0]
There are different kinds of view objects, and each offers a different way to represent the binary data stored in the memory slot allocated by the ArrayBuffer.
TypedArrays like Uint8Array, Float32Array etc. are ArrayLike interfaces which offer an easy way to manipulate the data as an Array, representing the data in their own format (8-bit integers, 32-bit floats, etc.).
The DataView interface allows for more open manipulation, like reading the data in different formats even from normally invalid boundaries; however, it comes at the cost of performance.
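For instance, a DataView lets you choose the byte order explicitly on every read or write, which is one way to sidestep the endianness issue discussed below (a minimal sketch, not part of the original code):

const buf = new ArrayBuffer(4);
const dv = new DataView(buf);
// write the bytes 0x00 0x11 0x22 0x33, in that order
dv.setUint32(0, 0x00112233, false);               // false = big-endian byte order
console.log(dv.getUint32(0, false).toString(16)); // "112233"  (read back big-endian)
console.log(dv.getUint32(0, true).toString(16));  // "33221100" (same bytes, little-endian)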
The ImageData interface itself uses an ArrayBuffer to store its pixel data. By default, it exposes a Uint8ClampedArray view over this data: an ArrayLike object with each 32-bit pixel represented as values from 0 to 255 for each of the Red, Green, Blue and Alpha channels, in this order.
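To illustrate, a small sketch (assuming a browser environment where the ImageData constructor is available):

const img = new ImageData(2, 2);                     // 2×2 pixels
console.log(img.data instanceof Uint8ClampedArray);  // true
console.log(img.data.length);                        // 16 -- four bytes (R, G, B, A) per pixel
img.data[0] = 255;                                   // red channel of the first pixel
img.data[3] = 255;                                   // alpha channel of the first pixel → opaque red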
So your code is taking advantage of the fact that TypedArrays are only view objects, and that writing through another view over the underlying ArrayBuffer will modify it directly.
Its author chose to use a Uint32Array because it's a way to set a full pixel (remember, a canvas pixel is 32 bits) in a single shot, reducing the number of writes by a factor of four.
However, doing so means you start dealing with 32-bit values, and this can become a bit problematic, because now endianness matters.
The Uint8Array [0x00, 0x11, 0x22, 0x33] will be represented as the 32-bit value 0x00112233 on BigEndian systems, but as 0x33221100 on LittleEndian ones.
const buff = new ArrayBuffer(4);
const uint8 = new Uint8Array(buff);
const uint32 = new Uint32Array(buff);
uint8[0] = 0x00;
uint8[1] = 0x11;
uint8[2] = 0x22;
uint8[3] = 0x33;
const hex32 = uint32[0].toString(16);
console.log(hex32, hex32 === "33221100" ? 'LE' : 'BE');
Note that most personal hardware is LittleEndian, so it's no surprise if your computer is too.
So with all this, I hope you now know how to fix your code: to generate the color rgba(0, 0, 0, .5), you need to set the Uint32 value 0x80000000:
drawNoise(canvas.getContext('2d'), 0, 0, 300, 150);

function drawNoise(ctx, x, y, width, height) {
  const imageData = ctx.createImageData(width, height)
  const buffer32 = new Uint32Array(imageData.data.buffer)
  const LE = isLittleEndian();
  // LE ? 0xAABBGGRR : 0xRRGGBBAA
  const black = LE ? 0x80000000 : 0x00000080;
  const blue = LE ? 0xFFFF0000 : 0x0000FFFF;
  for (let i = 0, len = buffer32.length; i < len; i++) {
    buffer32[i] = Math.random() < 0.5
      ? black
      : blue
  }
  ctx.putImageData(imageData, x, y)
}

function isLittleEndian() {
  const uint8 = new Uint8Array(8);
  const uint32 = new Uint32Array(uint8.buffer);
  uint8[0] = 255;
  return uint32[0] === 0xFF;
}
<canvas id="canvas"></canvas>
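As an aside, if you'd rather not branch on whole 32-bit literals, a hypothetical helper (not part of the original answer) can pack the channel bytes for you, honouring the platform's byte order:

// Hypothetical helper: build the packed 32-bit pixel value from individual channel bytes.
function packPixel(r, g, b, a, littleEndian = isLittleEndian()) {
  return littleEndian
    ? ((a << 24) | (b << 16) | (g << 8) | r) >>> 0  // 0xAABBGGRR on little-endian
    : ((r << 24) | (g << 16) | (b << 8) | a) >>> 0  // 0xRRGGBBAA on big-endian
}

// e.g. packPixel(0, 0, 0, 0x80) === 0x80000000 on a little-endian machine (black, 50% alpha)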