Up-conversion from int32 to int64 appends 0xCCCCCCCC instead of 0x00000000


I have observed with my debugger that when a function call up-converts an int from int32 to long long, sometimes (consistently, not randomly) the upper 32 bits of the parameter are filled with 0xCCCCCCCC instead of 0x00000000. What follows is a simplified example of the problem; if you want the details, keep reading.

void writecolumn(int32 col, void *data)   /* col is 0x00000001 */
{
    column(col, data);                    /* col is still 0x00000001 */
}

void column(long long col, void *data)
{
    /* ...code... */                      /* col is now 0xCCCCCCCC00000001 */
}
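For reference, here is a complete standalone version of the same pattern (function names are placeholders, and plain int stands in for int32). Built on its own, with the prototype visible to the caller, this should print 0x0000000000000001:

    #include <stdio.h>

    void column(long long col, void *data);   /* prototype visible to the caller */

    void writecolumn(int col, void *data)     /* col is 0x00000001 */
    {
        column(col, data);                    /* col should be sign-extended to 64 bits here */
    }

    void column(long long col, void *data)
    {
        (void)data;
        printf("col = 0x%016llx\n", (unsigned long long)col);
    }

    int main(void)
    {
        writecolumn(1, NULL);
        return 0;
    }

In my real code, the equivalent of column() instead sees 0xCCCCCCCC00000001.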

Here are more details:

I am converting two libraries from 32-bit to 64-bit operation. This was originally done for Windows 7, but it broke on Windows 10. I am using Windows 10 x64 and Visual Studio 2019, with my solution set to compile for x64.

I have two DLLs, GFITS and CFITS, both open source. They share headers and both define:

typedef long long LONGLONG;
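As a sanity check (my own addition, not part of either library), a compile-time size assertion in the shared header should rule out the two DLLs disagreeing on the size of LONGLONG. The negative-array-size trick works in any C compiler:

    /* fails to compile if LONGLONG is not 8 bytes */
    typedef char assert_LONGLONG_is_8_bytes[(sizeof(LONGLONG) == 8) ? 1 : -1];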

GFITS is mostly a wrapper so that LabVIEW can call CFITS code, which uses datatypes unsupported in LabVIEW. Ignore LabVIEW for now, because I am debugging from within VS2019 and LabVIEW is not involved in this problem. (It was in others.)

GFITS has a function that calls a CFITS function. The GFITS side looks like:

EXPORT MgErr gfits_write_col(RefnumHdl handle
                            ,int32     datatype
                            ,int32     colnum
                            ,int32     row
                            ,int32     nelements
                            ,void     *data)
{

    /* ...stuff... */

        return ffpcl(refnum->fptr
                    ,datatype
                    ,colnum
                    ,row          /* -> LONGLONG firstrow  */
                    ,1            /* -> LONGLONG firstelem */
                    ,nelements    /* -> LONGLONG nelem     */
                    ,data
                    ,&status);
}

It calls a CFITS function that looks like this:

int ffpcl(  fitsfile *fptr,  /* I - FITS file pointer                       */
            int  datatype,   /* I - datatype of the value                   */
            int  colnum,     /* I - number of column to write (1 = 1st col) */
            LONGLONG  firstrow,  /* I - first row to write (1 = 1st row)        */
            LONGLONG  firstelem, /* I - first vector element to write (1 = 1st) */
            LONGLONG  nelem,     /* I - number of elements to write             */
            void  *array,    /* I - array of values that are written        */
            int  *status)    /* IO - error status                           */
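One workaround I am considering is casting explicitly at the call site in gfits_write_col; if the compiler really sees the ffpcl prototype above, these casts should change nothing:

        return ffpcl(refnum->fptr
                    ,datatype
                    ,colnum
                    ,(LONGLONG)row         /* force the widening at the call site */
                    ,(LONGLONG)1
                    ,(LONGLONG)nelements
                    ,data
                    ,&status);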

All the details about what goes wrong are in my first, simplified example. The last detail is that the corruption only happens on firstrow and nelem, never on firstelem (the hardcoded 1).

I have never seen this before. Both libraries were compiled in the same solution for x64. The error I ultimately get is "index out of bounds", because my index is 0xCC... and therefore negative.
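To illustrate why the index goes negative, here is a quick standalone check using the value from the debugger:

    #include <stdio.h>

    int main(void)
    {
        /* on MSVC/x64 this out-of-range conversion wraps to a negative value */
        long long col = (long long)0xCCCCCCCC00000001ULL;
        printf("%lld\n", col);   /* prints a large negative number */
        return 0;
    }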

Is there a compiler flag I am missing? A header mismatch? Please point me in any direction.

Thanks. Kevan

Tags: c, casting, 64-bit
asked on Stack Overflow Mar 18, 2020 by Kevan Anderson

0 Answers

Nobody has answered this question yet.

