If you write
std::vector<double> v = std::vector<double>(); your vector is going to have a capacity equal to zero, fine.
Now I have an API with a function taking a
std::vector<double> reference parameter, to which I pass a vector defined as above. In the debugger, I see a capacity equal to ...
At first I thought it was an
INT_MAX of some sort, until I checked that it is a prime number (between 2^28 and 2^29, and not even the largest prime smaller than 2^29).
Just to be really sure (in fact I am already sure of it, but before harassing the API guys): is this number somehow connected to the STL?
Some clarifications. The real setting is that I am in Python, using an API that binds C++ code, in fact binding
std::vector<double> and a C++ function
F taking a reference to such a vector as a parameter. I see a normal capacity (zero) after initializing the vector binder in the Python code, and I see the capacity being this prime number immediately after entering the C++ function binding.
Remark. Following a comment from hnefatl, I did a little tracking down:
d:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.12.25827\include\xhash has
#define _HASH_SEED (size_t)0xdeadbeef and
size_t hash_value(const _Kty& _Keyval) uses it, and it appears in a comment in ...
c:\PYTHON\PYTHON_OFFICIAL\python-3.6.3\include\abstract.h regarding
long PyObject_Hash(PyObject *o), which is implemented elsewhere and, I guess, indeed used ... and I indeed rely on ...
Of course, I would like to give a minimal reproducing example, but it's not easy, so I will provide one this weekend.