I have a program that reads in a character and a signed int. The program uses macros to set a flag based on the value in bit 32, then masks and shifts bits 31-24 to get a character value, and finally reads a count out of the last 23 bits. It is supposed to use preprocessor macros to solve this. I've been doing a lot of trial and error but can't figure out what is causing the wrong output.
#include <stdlib.h>
#include <stdio.h>
#include <unistd.h>
#include <iostream>
#include <string.h>

#define setflagbuff(x)   ((x) |= 0x80000000)
#define issetflagbuff(x) ((x) & 0x80000000)
#define getcharbuff(x)   ((x) = ((x) & 0x7F800000) >> 23)
#define getintbuff(x)    ((x) = (x) & 0x7FFFFF)
#define setcharbuff(x)   ((x) = (x) << 23)
#define buildbuff(c, i, x) ((c) = (setflagbuff(c) | setcharbuff(i) | (x)))

int main(int argc, char *argv[]) {
    int i;
    if (strcmp(argv[1], "d") == 0) {
        int num = (int)(argv[2][0]);
        std::cout << num << std::endl;
        int cbuf = getcharbuff(num);
        int ibuf = getintbuff(num);
        std::cout << argv[2] << " " << issetflagbuff(num) << " " << cbuf << " " << ibuf << std::endl;
    }
}
issetflagbuff should return 1 if bit 32 is set.
getcharbuff should mask out bits 24-31 and shift them down to a character value.
getintbuff should pull out bits 1-23.
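For reference, here is the decomposition I expect, written as a minimal, side-effect-free sketch using plain shifts and masks instead of my macros (the hard-coded constant is just the 32-bit pattern of the sample input below):

#include <iostream>

int main() {
    unsigned int v = 0xB280000Du;           // -1300234227 as a 32-bit pattern

    unsigned int flag = (v >> 31) & 0x1u;   // bit 32 (counting from 1) -> 1
    unsigned int ch   = (v >> 23) & 0xFFu;  // bits 24-31 -> 0x65, i.e. 'e'
    unsigned int low  = v & 0x7FFFFFu;      // bits 1-23  -> 13

    std::cout << flag << " " << (char)ch << " " << low << std::endl; // 1 e 13
}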
I know my problem is in the bitwise macros, but I'm not sure what's wrong.
Example input: d -1300234227
Expected output: -1300234227 1 e 13
Actual output right now: -1300234227 0 0 0
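As a sanity check on the expected output, the sample value can be rebuilt from its parts by hand. This sketch packs the flag, 'e', and 13 using plain expressions rather than buildbuff (whose embedded assignments make a quick test awkward); on a two's-complement machine it prints the sample input:

#include <iostream>

int main() {
    // flag in bit 32, 'e' in bits 24-31, 13 in bits 1-23 (counting from 1)
    unsigned int packed = 0x80000000u | ((unsigned int)'e' << 23) | 13u;

    // converting the out-of-range unsigned value back to int is
    // implementation-defined, but on two's-complement targets this
    // prints -1300234227
    std::cout << (int)packed << std::endl;
}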