Possible improvement for TE35 touchscreens

I believe the touchscreen code that detects a touch event could be improved.
Let me describe what happens.
The problem appears randomly at times, or can be reproduced easily by pressing the screen several times per second. What happens is we get a touch event at the wrong position.
The position is ALWAYS too far UP and LEFT of the position where the finger was actually pressed.
This behavior happens less often with a touchscreen pen, but can still occur occasionally.
The screen is calibrated, and most of the time the reported position is correct.

My theory for this bug is that there must be a loop somewhere that samples the analog pins several times per second.
But sometimes a sample is taken while the finger is only partway through a press.
I believe an improvement could be made:
First, detect that the analog value is above the "something is pressed" threshold, but don't fire the touch event yet! Wait a sample or two until the curve is stable enough that the touch is fully pressed, and only then fire the event at the correct position.
I have already verified with an oscilloscope that the touchscreen analog output is not the problem (it seems responsive enough even if I press and release the touchscreen 2-3 times a second).
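
A minimal sketch of what I have in mind, written as plain C# rather than the real firmware. ReadRawX/ReadRawY and FireTouchDown are hypothetical placeholders for whatever the driver does to sample the analog pins and dispatch events, and the thresholds would need tuning:

public class DebouncedTouchSampler
{
    const int PressThreshold = 200;  // raw level meaning "something is pressed" (assumption)
    const int StableDelta = 8;       // max change between samples to call the reading stable
    const int StableSamples = 2;     // quiet samples required before firing the event

    int _stableCount, _lastX, _lastY;

    // Called from the existing sampling loop, once per sample period.
    public void Poll()
    {
        int x = ReadRawX();
        int y = ReadRawY();

        if (x > PressThreshold && y > PressThreshold)
        {
            bool stable = Abs(x - _lastX) <= StableDelta && Abs(y - _lastY) <= StableDelta;
            _stableCount = stable ? _stableCount + 1 : 0;

            // Only fire once the reading has stopped ramping.
            if (_stableCount == StableSamples)
                FireTouchDown(x, y);
        }
        else
        {
            _stableCount = 0;
        }

        _lastX = x;
        _lastY = y;
    }

    static int Abs(int v) { return v < 0 ? -v : v; }

    // Hypothetical stand-ins for the real driver code.
    int ReadRawX() { return 0; }
    int ReadRawY() { return 0; }
    void FireTouchDown(int x, int y) { }
}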

Other users have reported this problem in the past, but I believe it was never investigated further.
Is the touchscreen code available somewhere?

Which processor?

The same problem was happening on both EMX and G120, on 4.1 and 4.2.

I will forward this on to our team, thanks.

We tested two TE35 screens and NewHaven 4.3" screens.
Same behavior: we get touch events at the wrong position.
The wrong positions are always too far up or left, anywhere between the correct position and the border of the screen.

Easy to test: plot the coordinates as text on the screen where you touch.
Touch multiple times with a finger.
Every now and then, the coordinate will be too far up or left.
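
Something along these lines is enough to see it. This is only a rough sketch, assuming a NETMF WPF Window is already set up and you pass in whatever Font resource you have:

using Microsoft.SPOT;
using Microsoft.SPOT.Input;
using Microsoft.SPOT.Presentation;
using Microsoft.SPOT.Presentation.Controls;

public class TouchPlotTest
{
    private Text _label;

    public void Attach(Window window, Font font)
    {
        _label = new Text(font, "touch the screen");
        window.Child = _label;

        // Show every reported coordinate; the stray events show up as values
        // that jump toward the top-left of where you actually pressed.
        window.TouchDown += new TouchEventHandler(OnTouchDown);
    }

    private void OnTouchDown(object sender, TouchEventArgs e)
    {
        int x, y;
        e.GetPosition((UIElement)sender, 0, out x, out y);
        _label.TextContent = "X:" + x + " Y:" + y;
    }
}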

Two points on the comment above:

  1. Using a stylus is needed for accuracy on resistive touch screens. You need capacitive if you'd like better accuracy (like our CP7).
  2. Did you calibrate your display? Calibration is required on resistive touch displays. See C:\Users\Gus Issa\Documents\Microsoft .NET Micro Framework 4.2\Samples\TouchCalibration

1) Yes, we tested with a stylus or pen and the position is mostly correct. But it still happens every now and then, and it can be provoked reliably by pressing the touchscreen on and off 2-4 times a second with a stylus. Roughly 1 in 10 presses gives a wrong position if you click near the bottom and right.
As I said earlier, I checked with an oscilloscope and the two analog pins were very accurate.
That's when I noticed the "ramps" on the analog touchscreen outputs, and I believe it could be a software bug: there needs to be extra validation when sampling the analog signals before firing the touch event.
Why does it work better with a pen than with a finger? I think the pen acts more like an on/off switch, while a finger takes a little more time to make full contact.
Again, the wrong position is always UP and LEFT, which further supports my theory.

2) Yes, we did calibrate them, using the sample project. I don't think it's a calibration issue, since everything works at the correct position after entering the calibration points (TE35 and NewHaven 4.3"); it's just that sometimes I get a touch event at the wrong position (too far left and up).

I had this with a design I did a few weeks ago. I am seeing something similar with the ChipworkX board too.

This really only happens when you use a finger to activate touch. A stylus or other hard device sorts this out.

What I had to do was sample the Z register and work out the amount of force being applied. I then set a threshold big enough that a finger touch would register, but not so big that it required a hard press. A little experimentation and it works perfectly, 100% of the time, for my design.

I am using the ADS7846, but they all seem to share the same interface, just from different manufacturers. I have also used the TSC2046 and got the same results.

Not sure if your firmware handles the Z, but it might be an option you can consider?
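
For what it's worth, one way to turn the Z readings into a firmness check is the touch-resistance formula from the ADS7846/TSC2046 datasheets. This is only a sketch; the plate resistance and the cut-off below are assumptions you would tune for your own panel:

public static class PressureFilter
{
    const double XPlateResistance = 400.0;    // ohms, panel-specific (assumption)
    const double MaxTouchResistance = 1000.0; // reject presses lighter than this (assumption)

    // x, z1, z2 are the raw 12-bit conversions read from the controller.
    public static bool IsFirmTouch(int x, int z1, int z2)
    {
        if (z1 == 0) return false;            // nothing pressed at all

        // Lower touch resistance means a harder press.
        double rTouch = XPlateResistance * (x / 4096.0) * ((double)z2 / z1 - 1.0);
        return rTouch > 0 && rTouch < MaxTouchResistance;
    }
}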

Is it possible to get the current touchscreen code, or some example of a way to reprogram it by reading the analog pins manually?
We are using WPF, and would like to retain the "OnTouchDown" events on WPF components.

I'm pretty sure I could improve the code so it would work better with a finger.
My reasoning is based on these facts:
- I can reproduce this behavior using a pen by pressing the screen multiple times per second (it just happens less often than with a finger).
- These touchscreens are used in dashboard GPS units for cars, and they work fine with fingers.
- I monitored the analog signals on an oscilloscope, and they are clearly working fine. BUT you do see a "ramp" for a couple of milliseconds when pressing with a finger. I'm sure that by adding some validation while capturing the analog pins, the touch event will occur at the correct position.

Interesting, as I've noticed too that using a stylus or other hard device works better than my finger. I figured it was because of my grizzly-bear-paw hands, or maybe it's the fur-lined gloves I wear while coding in my igloo, but an improvement to these touch screens would be very much appreciated.

With the G400 making its way into Gadgeteer land, touch screens and Glide are going to be used way more than they have been, so any improvements that can be made are a very good thing.

@ Dave McLaughlin
Thanks for the suggestion about using the Z; I think I will try a TSC2046 and see how it works.
However, I'm not sure how to take the positions I read from the TSC2046 and then fire the On* events in WPF?

Maybe someone has a link to some examples of how to go from touch positions to touch events on WPF components?

Update!
I've been able to set up a TSC2046, talk to it over SPI, and I'm now at a point where I can successfully read the X,Y coordinates along with Z (pressure).
The touchscreen seems to work very well: I can "drag" a finger or a pen across it and the cursor follows correctly.
I can also just "tap" the screen with a finger and the detected coordinates are always OK.
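
For reference, the SPI side under NETMF is roughly this. It's only a sketch: the chip-select pin and SPI module depend on your wiring, and the command bytes 0xD0/0x90/0xB0/0xC0 for X, Y, Z1, Z2 come from the TSC2046 datasheet:

using Microsoft.SPOT.Hardware;

public class Tsc2046
{
    private readonly SPI _spi;

    public Tsc2046(Cpu.Pin chipSelect, SPI.SPI_module module)
    {
        // 2 MHz SPI, clock idle low (SPI mode 0, as the TSC2046 expects).
        _spi = new SPI(new SPI.Configuration(chipSelect, false, 0, 0, false, true, 2000, module));
    }

    public void Read(out int x, out int y, out int z1, out int z2)
    {
        x  = ReadChannel(0xD0);
        y  = ReadChannel(0x90);
        z1 = ReadChannel(0xB0);
        z2 = ReadChannel(0xC0);
    }

    // Sends one command byte and clocks out the 12-bit conversion result.
    private int ReadChannel(byte command)
    {
        byte[] write = new byte[] { command, 0x00, 0x00 };
        byte[] read = new byte[3];
        _spi.WriteRead(write, read);
        return ((read[1] << 8) | read[2]) >> 3;
    }
}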

Now my next challenge: how can I pass the X,Y coordinates to WPF so it delegates the events correctly?
For example, if I have a panel on top of the WPFWindow, I want it to generate TouchDown/TouchUp/TouchMove events.
Example:


public Object DrawImage(Object arg)
{
    Image someImage = new Image(Resources.GetBitmap(
        Resources.BitmapResources.someImage));
    Panel pnl = new Panel();
    pnl.Children.Add(someImage);
    display.WPFWindow.Child = pnl;
    pnl.TouchDown += new Microsoft.SPOT.Input.TouchEventHandler(pnl_TouchDown);
    return null;
}

void pnl_TouchDown(object sender, Microsoft.SPOT.Input.TouchEventArgs e)
{
    int x, y;
    e.GetPosition((UIElement)sender, 0, out x, out y);
    Debug.Print("X:" + x + " Y:" + y);
}

For now those events don't fire, since WPF has no idea where to get the touch coordinates from...
I could always make my own custom "button" class and handle the events my way, but I'd prefer to use WPF if possible.

OK, this might be the biggest hack of the year, but it works!!!
Here's the snippet showing how I did it:



// This finds the child element recursively from the X,Y position
UIElement element = mMainWindow.ChildElementFromPoint(XPos, YPos);

// Set up the touch info for the listener of the event
Microsoft.SPOT.Touch.TouchInput[] touches = new Microsoft.SPOT.Touch.TouchInput[1];
Microsoft.SPOT.Touch.TouchInput newTouch = new Microsoft.SPOT.Touch.TouchInput();
touches[0] = newTouch;
newTouch.X = XPos;
newTouch.Y = YPos;
TouchEventArgs touchArgs = new TouchEventArgs(this, DateTime.Now, touches);

// Use reflection to access the protected method "OnTouchDown" and pass it the TouchEventArgs
Type t = element.GetType();
MethodInfo m = t.GetMethod("OnTouchDown", BindingFlags.NonPublic | BindingFlags.Instance);
m.Invoke(element, new object[] { touchArgs });

I looked all over the web to find a solution...
In the end, I did a lot of trial and error until it worked.
I have no idea how stable and reliable the solution is, but so far so good!
All my UI components (everything that inherits from UIElement) get their TouchDown called correctly. Now I have to implement OnTouchUp, but it's pretty easy.
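
Presumably TouchUp can be raised the same way, reusing the element and touchArgs from the snippet above (untested sketch):

// Same reflection trick, just targeting the protected "OnTouchUp" method.
MethodInfo up = element.GetType().GetMethod("OnTouchUp", BindingFlags.NonPublic | BindingFlags.Instance);
up.Invoke(element, new object[] { touchArgs });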


Good one! I am a big fan of Reflection!

@ PhilM - We liked the example for custom touch events that you gave. We added it to the Touch tutorial at https://www.ghielectronics.com/docs/162/touch. If you'd like us to take it down, just let us know.

There's no problem, that's why I posted it, so that it could help others.
I got inspired by lots of bits and pieces of code from the forum, so I don't mind :)