I am seeing strange behavior with the CryptStringToBinary Cryptography API. Please see the code below (configuration: x64 Debug):
#include "stdafx.h"
#include <windows.h>
#include <strsafe.h>
#include <iostream>
#include <exception>
void main()
{
DWORD dwSkip;
DWORD dwFlags;
DWORD dwDataLen;
//LPCWSTR pszInput = L"jAAAAAECAAADZgAAAKQAAGdnNL1l56BWGFjDGR3RpxTQqqn6DAw3USv2eMkJYm4t"; //this works fine
LPCWSTR pszInput = L"MyTest"; //doesnt work, API returns false,error code 0x0000000d
// Determine the size of the BYTE array and allocate memory.
if(! CryptStringToBinary(
pszInput,
_tcslen( pszInput ) + 1,
CRYPT_STRING_BASE64,
NULL,
&dwDataLen,
&dwSkip,
&dwFlags ) )
{
DWORD dw = GetLastError(); //0x0000000d: The data is invalid
throw std::exception( "Error computing Byte length." );
}
BYTE *pbyteByte = NULL;
try
{
pbyteByte = new BYTE[ dwDataLen ];
if( !pbyteByte )
{
DWORD m_dwError = ERROR_INVALID_DATA;
throw std::exception( "Wrong array size." );
}
}
catch( std::exception &ex )
{
throw ex;
}
catch(...)
{
throw std::exception( "Out of memory." );
}
return ;
}
With the first pszInput string (the commented-out one above), CryptStringToBinary returns TRUE, but if I use L"MyTest" as the input it returns FALSE with error code 0x0000000d. There seems to be an issue with the length I pass to the API: when I pass the length without the null terminator (i.e. remove the +1), the API always returns TRUE. But in that case, is the BYTE length it returns correct?
Could anybody help me understand the reason behind this behavior? Also, is my usage of the length parameter correct?
Thanks in advance!
The input must be a Base64 string, as required by the CRYPT_STRING_BASE64 flag: the first string is valid Base64, while L"MyTest" is not.
Examples of input formats : http://pkix2.sysadmins.lv/library/html/T_PKI_ManagedAPI_CryptEncoding.htm
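Regarding the length parameter: per the CryptStringToBinary documentation, cchString is the number of characters to convert, not including the terminating NULL character, and passing 0 makes the API treat pszString as null-terminated. Counting the NUL via the +1 hands the decoder an extra character that is not part of the Base64 alphabet, which can cause it to reject the input. Below is a minimal sketch of the usual two-call pattern; the input literal (Base64 for "Hello!") is only a placeholder for illustration.

#include <windows.h>
#include <wincrypt.h>
#include <iostream>
#include <vector>

#pragma comment(lib, "Crypt32.lib")

int main()
{
    // Placeholder Base64 input for illustration ("Hello!" encoded).
    LPCWSTR pszInput = L"SGVsbG8h";

    DWORD dwDataLen = 0, dwSkip = 0, dwFlags = 0;

    // Sizing call: pass the character count WITHOUT the terminating NUL
    // (or pass 0 to let the API treat pszInput as null-terminated).
    if( !CryptStringToBinaryW( pszInput, static_cast<DWORD>( wcslen( pszInput ) ),
                               CRYPT_STRING_BASE64,
                               NULL, &dwDataLen, &dwSkip, &dwFlags ) )
    {
        std::wcerr << L"Sizing call failed, error 0x" << std::hex << GetLastError() << L'\n';
        return 1;
    }

    std::vector<BYTE> buffer( dwDataLen );

    // Decode call: same arguments, now with a real output buffer.
    if( !CryptStringToBinaryW( pszInput, static_cast<DWORD>( wcslen( pszInput ) ),
                               CRYPT_STRING_BASE64,
                               buffer.data(), &dwDataLen, &dwSkip, &dwFlags ) )
    {
        std::wcerr << L"Decode call failed, error 0x" << std::hex << GetLastError() << L'\n';
        return 1;
    }

    std::wcout << L"Decoded " << dwDataLen << L" bytes\n";
    return 0;
}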