Why don't Win32 error codes match method signatures?

1

Can someone help me understand why Win32 error codes don't match method signatures and what is the correct way to deal with this issue?

Take for example SCardReleaseContext: as per the MSDN documentation, the return type is LONG. As per this MSDN article, the C# equivalent of LONG is int. Looking at some P/Invoke example signatures of SCardReleaseContext, the return type is also marked as int.
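The conventional declaration seen in those samples looks roughly like this (a minimal sketch; exact attributes and parameter marshalling vary between samples):

```csharp
using System;
using System.Runtime.InteropServices;

static class NativeMethods
{
    // LONG in the native signature is conventionally mapped to int,
    // since both are 32-bit signed integers on Windows.
    [DllImport("winscard.dll")]
    public static extern int SCardReleaseContext(IntPtr hContext);
}
```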

However, the definitions of the WinSCard error codes do not match the int type, because the values simply don't fit into a signed int.

Currently, in my C# program, I have had to define the return value and the error code values as uint. This way my code compiles and works. Otherwise, the compiler complains that the value cannot be converted to int, for instance with enum ErrorCodes : int { SCARD_E_CANCELLED = 0x80100002 }.

Can I correctly assume that a C++ compiler will take the value 0x80100001 (which doesn't fit into a signed int), let it wrap around, and thus turn the value into the matching negative value?
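The reinterpretation the question describes can be sketched in C#: the same 32-bit pattern that C stores into a signed LONG comes out negative when converted in an unchecked context (a sketch added for illustration, not part of the original post):

```csharp
using System;

// 0x80100001 (SCARD_F_INTERNAL_ERROR) exceeds int.MaxValue as a literal...
uint raw = 0x80100001;

// ...but the same 32-bit pattern reinterpreted as a signed int is negative.
int asInt = unchecked((int)raw);

Console.WriteLine(asInt); // -2146435071
```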

Also, am I correct in saying that the C# compiler will not allow such a mistake/bug/flaw to be written, and that the correct way of handling this (corner) case is to purposefully "break" the P/Invoke signature and use uint instead?

Edit:
Fixed the last link, which was sort of incorrect: in it the values were cast to DWORD, which happens to be uint. Set the link to point to Microsoft's own site.

c#
pinvoke
asked on Stack Overflow Apr 6, 2012 by Marko • edited Apr 6, 2012 by Marko

3 Answers

2

The winapi was designed to be used with a C compiler, a language which allows assigning an unsigned literal to a signed integer. Some compilers generate a warning for that but never an error. Your C# compiler isn't so forgiving. You'll need to apply two two-by-fours to keep it happy:

    enum ErrorCodes : int { 
        SCARD_F_INTERNAL_ERROR = unchecked((int)0x80100001) 
    }

The cleaner solution is to give the enum an underlying type of uint instead.

    enum ErrorCodes : uint { 
        SCARD_F_INTERNAL_ERROR = 0x80100001 
    }

And change the pinvoke declaration to return a uint instead of an int. Lying in the pinvoke declaration is a pretty common technique, especially with the ones that take a PVOID or LPARAM. Of course, you have to know the consequences. There are none for lying LONG into uint: the types have the same size, only the interpretation of the value is different.
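Put together, the suggestion might look like this (a sketch; SCARD_S_SUCCESS = 0 is the documented success value, the error code names and values follow WinSCard.h):

```csharp
using System;
using System.Runtime.InteropServices;

enum ErrorCodes : uint
{
    SCARD_S_SUCCESS = 0,
    SCARD_F_INTERNAL_ERROR = 0x80100001,
    SCARD_E_CANCELLED = 0x80100002
}

static class NativeMethods
{
    // Declared as uint even though the native return type is LONG:
    // same size, only the interpretation of the sign bit differs.
    [DllImport("winscard.dll")]
    public static extern uint SCardReleaseContext(IntPtr hContext);
}
```

A call site can then compare the return value directly against the enum, e.g. if (NativeMethods.SCardReleaseContext(ctx) != (uint)ErrorCodes.SCARD_S_SUCCESS) { /* handle error */ }.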

answered on Stack Overflow Apr 6, 2012 by Hans Passant • edited Apr 6, 2012 by Hans Passant
1

How come they don't fit? E.g. (DWORD)0x80100006 fits into 32 bits. Of course, it's confusing that you would have negative numbers in .NET, but the value itself fits nicely. Your easiest option would be to change the P/Invoke declaration to uint, yes.

1

I'd say that MS simply got it wrong when they implemented SCardReleaseContext, and that SCardReleaseContext should have been declared to return DWORD. If I were you, I would simply declare your P/Invoke to return uint.

answered on Stack Overflow Apr 6, 2012 by David Heffernan

User contributions licensed under CC BY-SA 3.0