Just throwing this out there as this has gone unanswered for some time. I tried building something like this in C# when you first asked but kept running into issues and then got distracted by other work.
There are a lot of things to take into account, such as:
- Do you want the coordinates of the first point of contact someone had with the touch screen?
- Do you want to know where the OS predicts the touch was meant to be, based on an average of an array of contact points on the touch screen?
- Or do you want the coordinates where the pointer clicks once the OS has processed the touch?
Something like this is much easier to build in C++ on an OS running Windows, because you can call the user32.dll functions directly and monitor the touch messages coming from the operating system.
In .NET, which is what I know best, you are confined to your own application window and thread, which prevents you from detecting touches globally, e.g. when the application is minimised or the touch occurs outside of the window space. That said, it is possible; it's just new territory for me and hard to debug, especially since I am using a Wacom tablet to simulate touch screen input, so Windows doesn't handle it exactly the same way.
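If you do want to go down that road from .NET, here is a rough sketch (an assumption on my part, not something I've verified on a real touch screen) of one way to do it: P/Invoke into user32.dll and install a low-level mouse hook. Note that this only sees the processed pointer click (the third option in the list above), not the raw contact points, and the touch check relies on the dwExtraInfo signature Microsoft documents for touch/pen-injected mouse input.

```csharp
// Sketch: global click logger using a low-level mouse hook via user32.dll P/Invoke.
// Sees clicks anywhere on the desktop, even when no window of ours is in the foreground.
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Windows.Forms;

class GlobalClickLogger
{
    const int WH_MOUSE_LL = 14;
    const int WM_LBUTTONDOWN = 0x0201;

    [StructLayout(LayoutKind.Sequential)]
    struct POINT { public int x; public int y; }

    [StructLayout(LayoutKind.Sequential)]
    struct MSLLHOOKSTRUCT
    {
        public POINT pt;
        public uint mouseData;
        public uint flags;
        public uint time;
        public UIntPtr dwExtraInfo;
    }

    delegate IntPtr LowLevelMouseProc(int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr SetWindowsHookEx(int idHook, LowLevelMouseProc lpfn, IntPtr hMod, uint dwThreadId);

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool UnhookWindowsHookEx(IntPtr hhk);

    [DllImport("kernel32.dll")]
    static extern IntPtr GetModuleHandle(string lpModuleName);

    // Keep a reference to the delegate so the GC doesn't collect it while the hook is installed
    static readonly LowLevelMouseProc _proc = HookCallback;
    static IntPtr _hook = IntPtr.Zero;

    [STAThread]
    static void Main()
    {
        _hook = SetWindowsHookEx(WH_MOUSE_LL, _proc,
            GetModuleHandle(Process.GetCurrentProcess().MainModule.ModuleName), 0);
        Application.Run();            // low-level hooks need a message loop on the installing thread
        UnhookWindowsHookEx(_hook);
    }

    static IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
    {
        if (nCode >= 0 && wParam == (IntPtr)WM_LBUTTONDOWN)
        {
            var info = Marshal.PtrToStructure<MSLLHOOKSTRUCT>(lParam);
            // Heuristic from Microsoft's "Distinguishing Pen Input from Mouse and Touch":
            // touch/pen-injected mouse messages carry the 0xFF515700 signature in dwExtraInfo.
            bool fromTouch = (info.dwExtraInfo.ToUInt64() & 0xFFFFFF00) == 0xFF515700;
            Console.WriteLine($"Click at {info.pt.x},{info.pt.y} (touch/pen: {fromTouch})");
        }
        return CallNextHookEx(_hook, nCode, wParam, lParam);
    }
}
```

You would run this as a console app with a reference to System.Windows.Forms; Application.Run() is only there to pump the message loop the hook requires.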
Now, if you want a debugging application to use as a tool, i.e. a window that takes up the whole screen where you touch somewhere and it records the coordinates so you can determine whether there is a hardware fault, for example: that is very easy to do, and I can teach you how to make it or provide it to you (there's a sketch below). But if you want it running in the background while users are using the device BAU, then that is tricky, and I think I'm done scratching my head with that for now.
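For reference, here is roughly what that full-screen debugging tool could look like in WinForms. This is just a sketch: the log file path and the on-screen readout are placeholder choices. It records the processed click/tap position; if you also need the raw contact points, you would register the window with RegisterTouchWindow and read WM_TOUCH via GetTouchInputInfo.

```csharp
// Sketch: borderless, maximised, always-on-top form that logs where each click/tap lands.
using System;
using System.Drawing;
using System.IO;
using System.Windows.Forms;

class TouchLoggerForm : Form
{
    readonly Label _readout = new Label
    {
        AutoSize = true,
        Font = new Font("Segoe UI", 16),
        ForeColor = Color.Lime,
        BackColor = Color.Black
    };

    // Placeholder log location: a CSV on the desktop
    readonly string _logPath = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "touch_log.csv");

    public TouchLoggerForm()
    {
        FormBorderStyle = FormBorderStyle.None;   // cover the whole screen
        WindowState = FormWindowState.Maximized;
        TopMost = true;
        BackColor = Color.Black;
        KeyPreview = true;
        Controls.Add(_readout);
        MouseDown += OnMouseDown;
        KeyDown += (s, e) => { if (e.KeyCode == Keys.Escape) Close(); };   // Esc to exit
    }

    void OnMouseDown(object sender, MouseEventArgs e)
    {
        // e.Location is in client coordinates; PointToScreen gives the absolute screen position
        Point screen = PointToScreen(e.Location);
        File.AppendAllText(_logPath, $"{DateTime.Now:O},{screen.X},{screen.Y}{Environment.NewLine}");
        _readout.Location = e.Location;
        _readout.Text = $"{screen.X}, {screen.Y}";
    }

    [STAThread]
    static void Main() => Application.Run(new TouchLoggerForm());
}
```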