How to calculate the number of bytes a 32-bit number will take when chunked into bytes


So we have this to break a 32-bit integer into 8-bit chunks:

var chunks = [
  (num & 0xff000000) >>> 24, // unsigned shift so a set top bit doesn't turn the chunk negative
  (num & 0x00ff0000) >> 16,
  (num & 0x0000ff00) >> 8,
  (num & 0x000000ff)
]
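
For example (sample values chosen just for illustration):

// num = 0x12345678  ->  chunks = [0x12, 0x34, 0x56, 0x78]
// num = 0x00001234  ->  chunks = [0x00, 0x00, 0x12, 0x34]  (always four entries)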

How can you tell how many chunks there will be before computing them? Basically I would like to know whether it will be 1, 2, 3, or 4 bytes before I chunk the number into the array, ideally with some bit trick on the 32-bit integer.

function countBytes(num) {
  // ???
}
Tags: javascript, bit-manipulation
asked on Stack Overflow Jul 3, 2020 by Lance Pollard

2 Answers


There are several approaches I can think of, depending on your preference and/or codebase style.

The first one uses more analytical maths than the other and, since it relies on floating-point logarithms, can perform a little worse than the strictly bitwise one below:

// We will need a logarithm with base 16 since you are working with hexadecimals
const LN_16 = Math.log(16);
const log16 = (num) => Math.log(num) / LN_16;

// This is a function that gives you the number of non-zero chunks you get out
const getNumChunks = (num) => {
  // Math.log(0) is -Infinity, so treat 0 as a single zero byte
  if (num === 0) return 1;
  // The floored base-16 logarithm, plus one, is the number of hex digits in the number
  const numHexDigits = Math.floor(log16(num)) + 1;
  // One byte is two hex digits, so divide that count by 2 and round up
  const numChunks = Math.ceil(numHexDigits / 2);

  return numChunks;
}
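
For example, a few sample calls (values chosen just for illustration):

getNumChunks(0x7f);       // 1
getNumChunks(0x1234);     // 2
getNumChunks(0xabcdef);   // 3
getNumChunks(0x12345678); // 4

Keep in mind that the floating-point logarithm can land a hair below an exact integer for values right at a power of 16, which is one more reason to prefer the strictly bitwise version below.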

The second one is strictly bitwise:

const getNumChunks = (num) => {
  num = num >>> 0;   // treat the input as an unsigned 32-bit value
  let probe = 0xff;
  let numChunks = 0;
  // Cap at 4 so a set top byte can't loop forever once the probe shifts out to 0
  while (numChunks < 4 && ((probe & num) || num > (probe >>> 0))) {
    probe = probe << 8;
    numChunks++;
  }

  return numChunks;
}
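
And a few boundary values for this version (the last one has its top byte set, which is exactly the case the unsigned handling above is there for):

getNumChunks(0xff);       // 1
getNumChunks(0x100);      // 2
getNumChunks(0xffffff);   // 3
getNumChunks(0xff000000); // 4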

JSFiddle here


Or this one-liner, which uses the Math.clz32 function to determine how many bytes of a 32-bit unsigned int are in use...

function numOfBytes( x ) {
    // 31 - Math.clz32(x) is the index of the highest set bit; dividing by 8 and
    // adding 1 turns that bit index into a byte count ("| 0" truncates to an integer)
    return x === 0 ? 1 : (((31 - Math.clz32( x )) / 8) + 1) | 0;
}
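
For instance:

numOfBytes(0);          // 1
numOfBytes(0xff);       // 1
numOfBytes(0x100);      // 2
numOfBytes(0xffffff);   // 3
numOfBytes(0xffffffff); // 4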
answered on Stack Overflow Jul 10, 2020 by Trentium

User contributions licensed under CC BY-SA 3.0