Quite often when using hardware interfaces you have to read or write groups of bits without changing the rest of the bits. The interface description says something like: you get a System.UInt32; bit 0 is set if available; bits 1..7 hold the minimum value; bits 8..14 hold the maximum value; bits 15..17 hold the threshold, etc. I have to do this for a lot of values, each with its own start and stop bits.
That's why I'd like to create a class that converts the values (start bit; stop bit; raw UInt32 value) into the value those bits represent, and back.
So something like:
class RawParameterInterpreter
{
    public int StartBit { get; set; }  // counting from 0..31
    public int StopBit { get; set; }   // counting from 0..31

    UInt32 ExtractParameterValue(UInt32 rawValue);
    UInt32 InsertParameterValueToRawValue(UInt32 parameterValue,
        UInt32 rawValue);
}
I understand the part with handling the bits:

// example bits 4..7:
// extract parameter from raw value:
parameter = (rawValue & 0x000000F0) >> startBit;
// insert parameter into raw value:
rawValue = (parameter << startBit) | (rawValue & 0xFFFFFF0F);
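For instance, with startBit = 4 and a made-up raw value (illustration only):

UInt32 rawValue = 0x12345678;
UInt32 parameter = (rawValue & 0x000000F0) >> 4;         // parameter == 0x7
UInt32 updated = (0xAu << 4) | (rawValue & 0xFFFFFF0F);  // updated == 0x123456A8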
The problem is: how do I initialize the masks 0x000000F0 and 0xFFFFFF0F from the values startBit and stopBit? Is there a general method to calculate these values?
I would use something like this:

UInt32 bitPattern = 0;
for (int bitNr = startBit; bitNr <= stopBit; ++bitNr)
{
    bitPattern = (bitPattern << 1) | 1;
}
bitPattern = bitPattern << startBit;
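For comparison, the same mask can also be computed in closed form, without a loop. A sketch (the shift is done in 64 bits so a field spanning all 32 bits does not overflow the shift):

int width = stopBit - startBit + 1;
UInt32 fieldMask = (UInt32)(((1UL << width) - 1) << startBit);  // bits 4..7 -> 0x000000F0
UInt32 clearMask = ~fieldMask;                                  // bits 4..7 -> 0xFFFFFF0F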
I know the class System.Collections.BitArray. This would make it even easier to set the bits, but how do I convert the BitArray back to a UInt32?
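For what it's worth, one route back is BitArray.CopyTo into an int[]; a sketch, assuming the BitArray holds exactly 32 bits:

using System.Collections;

BitArray bits = new BitArray(32);
bits.Set(4, true);                 // example: set bit 4
int[] buffer = new int[1];
bits.CopyTo(buffer, 0);            // CopyTo accepts bool[], byte[], or int[] targets
UInt32 value = (UInt32)buffer[0];  // value == 0x00000010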
So my question is: what is the best method for this?
Well, your question is very general, but you could use an enum with a Flags attribute.
[Flags]
public enum BitPattern
{
    Start = 1,
    Stop = 1 << 31   // note: negative as a plain int; use ": uint" and 1u << 31 for an unsigned pattern
}
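Going a step further, here is a minimal sketch of the RawParameterInterpreter from the question, built on the closed-form mask shown above (an illustration, not a definitive implementation):

public class RawParameterInterpreter
{
    public int StartBit { get; set; }  // counting from 0..31
    public int StopBit { get; set; }   // counting from 0..31

    // Bits StartBit..StopBit set to 1, everything else 0.
    private UInt32 FieldMask =>
        (UInt32)(((1UL << (StopBit - StartBit + 1)) - 1) << StartBit);

    public UInt32 ExtractParameterValue(UInt32 rawValue) =>
        (rawValue & FieldMask) >> StartBit;

    public UInt32 InsertParameterValueToRawValue(UInt32 parameterValue, UInt32 rawValue) =>
        ((parameterValue << StartBit) & FieldMask) | (rawValue & ~FieldMask);
}

With StartBit = 4 and StopBit = 7 this reproduces the 0x000000F0 / 0xFFFFFF0F masks from the question.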