Seems like a dumb question, but the value it returns is not what I'm expecting... I need to emulate the crc32w instruction on an ARM7 chip (which doesn't support this instruction), so I need a C implementation that gets the same result. Everything I've tried differs. According to the documentation at http://www.keil.com/support/man/docs/armclang_asm/armclang_asm_awi1476352818103.htm, it should do:
CRC32 takes an input CRC value in the first source operand, performs a CRC on the input value in the second source operand, and returns the output CRC value. The second source operand can be 8, 16, or 32 bits. To align with common usage, the bit order of the values is reversed as part of the operation, and the polynomial 0x04C11DB7 is used for the CRC calculation.
That's nice, but if I run:
uint32_t crc=0xFFFFFFFF;
uint32_t val=100;
asm volatile("crc32w %w0, %w0, %w1": "+r" (crc): "r" (val) );
Then I get a CRC of 0x6aff40b7. If I plug the same numbers into http://www.sunshine2k.de/coding/javascript/crc/crc_js.html (or other online CRC web pages) I get 0x6B9B7A5D. I tried toggling the reverse-bits settings, etc., but I can't come up with 0x6aff40b7. So, my question is: what exactly does crc32w do?
Using the HAL (Hardware Abstraction Layer) with the ARM CRC32 peripheral, configured as follows:
/* CRC init function */
void MX_CRC_Init(void)
{
    hcrc.Instance = CRC;
    hcrc.Init.DefaultPolynomialUse = DEFAULT_POLYNOMIAL_ENABLE;
    hcrc.Init.DefaultInitValueUse = DEFAULT_INIT_VALUE_ENABLE;
    hcrc.Init.InputDataInversionMode = CRC_INPUTDATA_INVERSION_BYTE; // CRC_INPUTDATA_INVERSION_NONE;
    hcrc.Init.OutputDataInversionMode = CRC_OUTPUTDATA_INVERSION_ENABLE; // CRC_OUTPUTDATA_INVERSION_DISABLE;
    hcrc.InputDataFormat = CRC_INPUTDATA_FORMAT_BYTES;
    if (HAL_CRC_Init(&hcrc) != HAL_OK)
    {
        _Error_Handler(__FILE__, __LINE__);
    }
}
I am getting results that agree with the CRC-32/JAMCRC variant, i.e. calling
const char * ts3 = "123456789";
MX_CRC_Init();
crc32 = HAL_CRC_Calculate(&hcrc, (uint32_t *)ts3, strlen(ts3));
gives 0x340BC6D9 in crc32, which corresponds to CRC-32/JAMCRC. The bitwise NOT of it, 0xCBF43926, is the standard CRC-32. I have no idea why it is called JAM.
const char * ts2 = "The quick brown fox jumped over the lazy brown dog";
MX_CRC_Init();
crc32 = HAL_CRC_Calculate(&hcrc, (uint32_t *)ts2, strlen(ts2));
gave 0x038DD18F, whose bitwise NOT is 0xFC722E70, both of which agree with https://crccalc.com/
// This code assumes that unsigned is at least 4 bytes. If not, use
// unsigned long instead.
// When called with data == NULL, the initial CRC value is returned.
unsigned crc32jamcrc(unsigned crc, void const *mem, size_t len) {
    unsigned char const *data = mem;
    if (data == NULL)
        return 0xffffffff;
    while (len--) {
        crc ^= *data++;
        for (unsigned k = 0; k < 8; k++)
            crc = crc & 1 ? (crc >> 1) ^ 0xedb88320 : crc >> 1;
    }
    return crc;
}
If you are using the CRC32 engine in the STM32 and feeding it whole little-endian 32-bit words, then this C code will give you the same results as dropping uint32_t's on CRC->DR.
// This is C code that gives the same results
// as sending words to the STM32 CRC engine.
// Most online calculators apply CRCs to bit-reversed
// streams and will therefore not give the same results.
// Optimally this would be done via DMA in the background.
//
uint32_t stm32_crc32(uint32_t crc, uint32_t data) {
    int i;
    crc = crc ^ data;
    for (i = 0; i < 32; i++) {
        if (crc & 0x80000000)
            crc = (crc << 1) ^ 0x04C11DB7;
        else
            crc = crc << 1;
    }
    return crc;
}
User contributions licensed under CC BY-SA 3.0