I've been having an issue getting the Python C API to work without errors.
**Background:** I've been using `ctypes` to run native code (C++) for a while, but until now I had never actually done anything specific with the Python C API. I had mostly just been passing in structs from Python and filling them from C++. The way I was using structs was becoming cumbersome, so I decided I would try creating Python objects directly in C++ and just pass them back to my Python script.
**Code:** I have a DLL (`Foo.dll`) with only one function:
#include <Python.h>
#include <iostream>

#define N 223

__declspec(dllexport) void Bar(void)
{
    std::cout << "Bar" << std::endl;
    for (int i = 0; i < N; ++i)
    {
        auto list = PyList_New(0);
        std::cout << "Created: " << i << std::endl;
        //Py_DECREF(list);
    }
}
And then I have the Python script I'm running:
import ctypes as C
dll = r"C:\path\to\dll\Foo.dll"
Foo = C.CDLL(dll)
# lists = [[] for _ in range(0)]
Foo.Bar()
print "Done."
**What happens:** If I define `N` in the above DLL to be 222 or below, the code works fine (except for the memory leak, but that isn't the problem).
If I uncomment the `//Py_DECREF(list)` line, the code also works fine.
However, with the above code, I get this:
Bar
Created: 0
Created: 1
Created: 2
...(omitted for your sake)
Created: 219
Created: 220
Created: 221
Traceback (most recent call last):
File "C:\path_to_script\script.py", line 9, in <module>
Foo.Bar()
WindowsError: exception: access violation reading 0x00000028
In fact, I get this same result with dictionaries, lists, tuples and so on. I get the same result if I create a list and then append empty sublists to that list.
What's weirder, every list that I make from within the actual Python script decreases the number of lists the DLL can create before hitting this WindowsError.
Weirder still, if I make more than 222 lists in my Python script, then the DLL won't encounter this error until it has created something like 720 more lists.
**Other details:**
- Python 2.7.13 :: Anaconda custom (32-bit)
- `Python.h` and `python27.lib` from that distribution
As long as I don't create many `PyObject`s from my C++ code, everything seems to work fine. I can pass `PyObject`s to and from the Python code and it works fine... until I've created "too many" of the objects from within my C++ code.
What is going on?
From the documentation for `CDLL`:

> The Python global interpreter lock is released before calling any function exported by these libraries, and reacquired afterwards.
This makes it unsafe to call Python C API code from such a library. Exactly how it fails is unpredictable, as you are finding. I'd guess it has to do with whether an allocation triggers a run of the garbage collector, but I don't think it's worth spending too much time trying to work out the exact cause.
There are (at least) two solutions to choose from:

1. Load the library with `ctypes.PyDLL` (which the documentation notes is like `CDLL` except that it does not release the GIL).
2. Reacquire the GIL within your C++ code. An easy way to do this is:
auto state = PyGILState_Ensure();
// C++ code requiring GIL - probably your entire loop
PyGILState_Release(state);
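For the first option, no C++ change is needed at all. As a minimal sketch of why holding the GIL fixes the crash, you can use `ctypes.pythonapi` (a ready-made `PyDLL` handle to the running interpreter's own C API) to call `PyList_New` directly, without building any DLL; the loop count here is just for illustration:

```python
import ctypes

# ctypes.pythonapi is a PyDLL instance wrapping the interpreter's own C API,
# so the GIL stays held across every call -- the same behaviour you would get
# by loading Foo.dll with ctypes.PyDLL instead of ctypes.CDLL.
api = ctypes.pythonapi

# Declare the signature so ctypes converts the returned PyObject* into a
# real Python object (taking ownership of the new reference).
api.PyList_New.restype = ctypes.py_object
api.PyList_New.argtypes = [ctypes.c_ssize_t]

# With the GIL held, creating far more than 222 lists is no problem.
lists = [api.PyList_New(0) for _ in range(1000)]
print(len(lists))  # 1000
```

In your script, that just means `Foo = C.PyDLL(dll)` in place of `C.CDLL(dll)`; the rest stays unchanged.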
User contributions licensed under CC BY-SA 3.0