I have an application that monitors touchscreen messages on a Windows tablet. I want to wake up some hardware when the user touches the touchscreen. I have used the approach from the article Using Raw Input from C# to handle multiple keyboards.
I call RegisterRawInputDevices() to register to receive touchscreen messages:
        var rid = new RawInputDevice[1];
        rid[0].UsagePage = 13;     // HidUsagePage.Digitizer (0x0D)
        rid[0].Usage = 4;          // HidUsage.TouchScreen
        rid[0].Flags = 0x00000100; // RawInputDeviceFlags.InputSink (RIDEV_INPUTSINK):
                                   // receive input even when this window is not in the foreground
        rid[0].Target = this.Handle;
        if (!RegisterRawInputDevices(rid, (uint)rid.Length, (uint)Marshal.SizeOf(rid[0])))
        {
            // report error, e.g. via Marshal.GetLastWin32Error()
        }
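For reference, the P/Invoke declarations behind this are my own wrappers with names adapted from the article; the underlying Win32 types are RAWINPUTDEVICE and RegisterRawInputDevices, roughly like this:

    [StructLayout(LayoutKind.Sequential)]
    internal struct RawInputDevice
    {
        public ushort UsagePage; // usUsagePage
        public ushort Usage;     // usUsage
        public uint Flags;       // dwFlags
        public IntPtr Target;    // hwndTarget
    }

    [DllImport("user32.dll", SetLastError = true)]
    internal static extern bool RegisterRawInputDevices(
        RawInputDevice[] pRawInputDevices, uint uiNumDevices, uint cbSize);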
I then receive WM_INPUT messages in my message pump:
    private const int WmInput = 0x00FF; // WM_INPUT

    protected override void WndProc(ref Message message)
    {
        switch (message.Msg)
        {
            case WmInput:
                // handle touch messages (message.LParam is the HRAWINPUT handle)
                break;
        }
        base.WndProc(ref message);
    }
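The handler itself is only a minimal sketch along the following lines, called from the WmInput case with message.LParam. It uses GetRawInputData to read the RAWINPUT header (the constants are the standard Win32 values; error handling omitted):

    [StructLayout(LayoutKind.Sequential)]
    private struct RawInputHeader
    {
        public uint Type;     // dwType: RIM_TYPEMOUSE / RIM_TYPEKEYBOARD / RIM_TYPEHID
        public uint Size;     // dwSize
        public IntPtr Device; // hDevice
        public IntPtr WParam; // wParam
    }

    private const uint RidInput = 0x10000003; // RID_INPUT

    [DllImport("user32.dll")]
    private static extern uint GetRawInputData(IntPtr hRawInput, uint uiCommand,
        IntPtr pData, ref uint pcbSize, uint cbSizeHeader);

    private void HandleRawInput(IntPtr hRawInput)
    {
        // First call gets the required buffer size, second call fills the buffer.
        uint size = 0;
        uint headerSize = (uint)Marshal.SizeOf(typeof(RawInputHeader));
        GetRawInputData(hRawInput, RidInput, IntPtr.Zero, ref size, headerSize);

        IntPtr buffer = Marshal.AllocHGlobal((int)size);
        try
        {
            if (GetRawInputData(hRawInput, RidInput, buffer, ref size, headerSize) == size)
            {
                var header = Marshal.PtrToStructure<RawInputHeader>(buffer);
                // header.Device identifies the digitizer that sent the input;
                // this is where I wake the hardware.
            }
        }
        finally
        {
            Marshal.FreeHGlobal(buffer);
        }
    }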
This works fine most of the time: I can touch anywhere on the screen and my application gets a WM_INPUT message. But if I run Windows Device Manager, I do not get touch messages when I touch inside the Device Manager window.
I have tested this with the Microsoft DigiInfo tool, and it shows the same behaviour. I also see the same behaviour with the Computer Management application.
Is there a class of applications that can block WM_INPUT messages? Is there some way around this problem? I may just have to accept this behaviour, but I would still like to understand it.