How to set magic pointer values on arbitrary pointer types


I have an application in which I've experienced some rare segmentation faults due to nullptr dereferences. The pointer values in the application follow a pretty standard life cycle:

  1. I initialize them to nullptr.
  2. They get set to a value at some point early on when information becomes available to set them to a constructed instance.
  3. They then get used for a time.
  4. They finally get freed, after which point I set them to nullptr.

When I do post-mortem analysis on a core, it would be helpful to know whether the pointer had a nullptr value because it was just uninitialized and not yet set to an actual instance (i.e., between steps 1 and 2 above) or because it had been previously freed (after step 4 above). To help with this analysis, I would like to use magic pointer values for the initialization and post-free poisoning of the objects. I have in mind hexspeak values such as 0xCAFEBABE and 0xDEADBEEF for steps 1 and 4, respectively.

I have multiple types of pointers that I would like to use for this. I find that for any given type, I can initialize it using an explicit cast. Here is a minimal example:

#include <iostream>

using namespace std;

class A {
public:
    A(int a) : _a{a} {}
    int _a;
};

int main(int argc, char* argv[]) {
    A *a = reinterpret_cast<A*>(0xCAFEBABE);
    cout << a->_a << endl;  // dereferencing the magic pointer crashes here

    return 0;
}
This does what I want: when I run this, it crashes and when I print the value of a in the debugger it prints 0x00000000cafebabe.

However, specifying the reinterpret_cast for each pointer type, once for each initialization and once after each destruction, will get tedious. Is there a more efficient way to accomplish this?


A couple of people added comments recommending that I manage these pointers with smart pointers. I appreciate the intention of this feedback, but that advice does not apply here because this is not a bug in managing the lifetime of these objects. In general, the instances of my classes in this application are managed by smart pointers. In the particular case of this crash, the pointers are for OpenSSL objects, which are constructed with SSL_new and freed with SSL_free. But that's beside the point: they are correctly constructed when a connection is created and correctly freed when the connection goes away. The instances should not exist outside of that timeframe. The bug presents itself when a part of my application attempts a write using one of these non-live instances. To help track this down, I'd like to know whether the write happened on an object before the connection was set up or after the connection went away. That's where the magic pointer values come in.

asked on Stack Overflow May 22, 2020 by firebush • edited May 24, 2020 by firebush

1 Answer


You can use a function template to remove the tedium a bit.

template <typename Object>
Object* get_pointer()
{
    return reinterpret_cast<Object*>(0xCAFEBABE);
}

template <typename Object>
Object* get_pointer(Object* /*unused*/)
{
    return get_pointer<Object>();
}

and use it as:

A* a = get_pointer(a);

or

A* a = get_pointer<A>();
answered on Stack Overflow May 22, 2020 by R Sahu • edited May 22, 2020 by R Sahu

User contributions licensed under CC BY-SA 3.0