Tech Support Guy banner

I'm looking for help to develop this script

808 Views 2 Replies 2 Participants Last post by  zorgan
1. I have a Windows 10 Pro machine I can run PowerShell scripts from; we use it as a server in the office.
2. I have, spread across a large area, 100 touchscreen machines running our software.
3. Some machines have developed touchscreen issues, but I don't find out until a user discovers something isn't working as it should.
4. I want a way of knowing when a touchscreen is being touched, so I can detect phantom touches. Someone said this might be possible through JavaScript or .NET, but I have only ever used PowerShell.

Anyone able to help? I have a plan, but I'm unsure it will work.

I could use .NET (if I find out how) to run a small program or service on the machines, then have PowerShell grab that information on demand and output it to a console:

COMPUTER001 X,Y (coordinates)
COMPUTER002 X,Y (coordinates)
COMPUTER003 X,Y (coordinates)
COMPUTER004 X,Y (coordinates)
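
For the collection side, a rough PowerShell sketch of that idea, assuming PS remoting (WinRM) is enabled on the clients, and with `Get-LastTouch` as a purely hypothetical placeholder for whatever the small .NET piece would end up exposing:

```powershell
# Sketch only: Get-LastTouch is hypothetical -- it stands in for whatever
# the small .NET program/service on each machine would expose.
$computers = 'COMPUTER001','COMPUTER002','COMPUTER003','COMPUTER004'

Invoke-Command -ComputerName $computers -ScriptBlock {
    $t = Get-LastTouch   # hypothetical helper
    if ($t) { '{0} {1},{2}' -f $env:COMPUTERNAME, $t.X, $t.Y }
}
```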

A touchscreen is a HID device, and I think MSDN documents some .NET ways to read the position of a touchscreen's cursor?
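
For what it's worth, the pointer position (not the raw HID touch data) can be read from PowerShell itself via user32.dll; a minimal sketch, with the caveat that a phantom touch only shows up here while it is actually driving the cursor:

```powershell
# Reads the current cursor position via user32.dll GetCursorPos.
# Note: this is the pointer, not raw HID touch input.
Add-Type -Namespace Native -Name Cursor -MemberDefinition @'
[StructLayout(LayoutKind.Sequential)]
public struct POINT { public int X; public int Y; }

[DllImport("user32.dll")]
public static extern bool GetCursorPos(out POINT p);
'@

$p = New-Object 'Native.Cursor+POINT'
[void][Native.Cursor]::GetCursorPos([ref]$p)
"X=$($p.X) Y=$($p.Y)"
```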

I know nothing of C# or C++ that would let me use this information. As I mentioned, I have only used PowerShell before, and I'm unsure what kind of learning curve I'd face just to write this simple program.

Any ideas? Or any other ways I could do this that would be better?

Thanks in advance.

Not open for further replies.
1 - 3 of 3 Posts
Just throwing this out there as this has gone unanswered for some time. I tried building something like this in C# when you first asked but kept running into issues and then got distracted by other work.

There are a lot of things to take into account, such as:
  1. Do you want the coords of the first point of contact someone had with the touch screen?
  2. Or where the OS predicts the touch was supposed to be, based on an average of an array of contact points?
  3. Or the coords where the pointer clicks once the OS has processed the touch?
You can use any of these, and they all represent different things. The learning curve for something like this is steep and requires a fairly deep understanding of the operating system and how it works.

Something like this is much easier to build in C++ on Windows, as you can access the user32.dll functions and monitor touch messages from the operating system more easily.

In .NET, which is what I know best, you are confined to your own application window and thread, which prevents you from detecting touches globally, e.g. when the application is minimised or the touch occurs outside the window. That said, it is possible; it's just new territory for me and hard to debug, especially since I'm using a Wacom tablet to simulate touch input, and Windows doesn't handle it exactly the same.

Now, if you want a debugging application to use as a tool, i.e. a window that takes up the whole screen, where you touch somewhere and it records the coords so you can determine whether there is a hardware fault, that is very easy to do, and I can teach you how to make it or provide it to you. But if you want it running in the background while users are using the device BAU, that is tricky, and I think I'm done scratching my head over that for now :p
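
For reference, the full-screen debug-tool version can even be knocked together in PowerShell with WinForms, since Windows promotes unhandled touches to mouse events; a rough sketch (proper WM_TOUCH handling would be more involved):

```powershell
# Full-screen window that logs where it is touched/clicked.
# Touches get promoted to mouse events, so MouseDown sees the coords.
Add-Type -AssemblyName System.Windows.Forms

$form = New-Object System.Windows.Forms.Form
$form.FormBorderStyle = 'None'
$form.WindowState     = 'Maximized'
$form.TopMost         = $true

$form.Add_MouseDown({
    param($s, $e)
    Add-Content "$env:TEMP\touchlog.txt" ('{0} {1},{2}' -f (Get-Date -Format o), $e.X, $e.Y)
})
$form.Add_KeyDown({ param($s, $e) if ($e.KeyCode -eq 'Escape') { $s.Close() } })

[System.Windows.Forms.Application]::Run($form)
```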
Thanks so much for the reply. Let me give you an example of how I would love it to work.

So at the moment I can run PowerShell scripts and check things such as CPU usage percentage and C drive free space, basic stuff, all from PowerShell on the main office computer.

So I can get the console to output, for example, the CPU usage for each of the computers, and so forth...
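
That kind of CPU report might look like this in PowerShell, assuming WinRM is enabled on the remote machines:

```powershell
# Queries CPU load on each machine and prints one line per computer,
# e.g. "COMPUTER001 12%". Requires WinRM on the remote machines.
$computers = 'COMPUTER001','COMPUTER002','COMPUTER003'
Get-CimInstance Win32_Processor -ComputerName $computers |
    ForEach-Object { '{0} {1}%' -f $_.PSComputerName, $_.LoadPercentage }
```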

Would love to have the same kind of thing happening here, but instead telling me the touch coordinates in a scan. So it just checks once what the current touch location is: most machines will obviously have no touch and won't report anything, but some will, and then I'll get data back in the scan. That would let me run it a couple of times, or even after rebooting all the machines, and it would tell me whether any machine has touch bubbles etc. on the screen. If one does, I can investigate manually, or run another script that stops anyone using that computer while the touchscreen is malfunctioning. Make sense?
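
For the "stop anyone using it" step, one option is to disable the touchscreen device itself rather than the whole computer; a sketch, assuming admin rights and the usual "HID-compliant touch screen" friendly name (check with Get-PnpDevice first):

```powershell
# Disables the touchscreen device so a faulty panel can't be used.
# The friendly-name filter is an assumption; verify it on your hardware.
Get-PnpDevice -Class HIDClass |
    Where-Object FriendlyName -like '*touch screen*' |
    Disable-PnpDevice -Confirm:$false
```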