How to raise a TouchEvent

Glide listens for a TouchEvent, then decodes it and decides what to do. So it seems to me that if a touch screen driver simply raised a touch event, Glide would ‘magically’ just work. I’m working with a Newhaven capacitive touch screen and the GHI driver for it, which raises its own flavor of events.

If we look at my modified GHI cap driver code below, this.ScreenReleased is the internal event type GHI defines. I was able to create an array of touch point(s) and a TouchEvent object, but how do I raise the event? For the ScreenPressed case I show another alternative by calling ‘pressed’, which is defined as “public event Microsoft.SPOT.Touch.TouchScreenEventHandler pressed;”. I think that should raise a touch screen event, but how is it related to the TouchEvent that Glide is listening for?

The NETMF docs are next to useless on this, and I feel like only 25% of the needed information is given. Any input?

                if (((first & 0xC0) >> 6) == 1)
                {
                    // Raise the GHI driver's own event first.
                    this.ScreenReleased(this, new TouchEventArgs(x, y));

                    // Build a NETMF touch point; each array element has to be
                    // instantiated before its fields can be set.
                    Microsoft.SPOT.Touch.TouchInput[] touch = new Microsoft.SPOT.Touch.TouchInput[1];
                    touch[0] = new Microsoft.SPOT.Touch.TouchInput();
                    touch[0].X = x;
                    touch[0].Y = y;

                    Microsoft.SPOT.Touch.TouchEvent joe = new Microsoft.SPOT.Touch.TouchEvent();
                    joe.EventMessage = 2;
                    joe.Touches = touch;
                    // ...but how do I actually raise/dispatch 'joe'?
                }
                else
                {
                    this.ScreenPressed(this, new TouchEventArgs(x, y));

                    Microsoft.SPOT.Touch.TouchInput[] touch = new Microsoft.SPOT.Touch.TouchInput[1];
                    touch[0] = new Microsoft.SPOT.Touch.TouchInput();
                    touch[0].X = x;
                    touch[0].Y = y;

                    Microsoft.SPOT.Touch.TouchScreenEventArgs bob
                        = new Microsoft.SPOT.Touch.TouchScreenEventArgs(System.DateTime.Now, touch, null);
                    pressed(this, bob);
                }
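
For reference, this is roughly how I picture the ‘pressed’ event being declared and raised inside the driver. The RaisePressed helper name is just mine for illustration; the rest is pieced together from the code above:

public event Microsoft.SPOT.Touch.TouchScreenEventHandler pressed;

// Sketch only: a helper that packages a point and raises 'pressed'.
private void RaisePressed(int x, int y)
{
    // The array element has to be created before its fields can be set.
    Microsoft.SPOT.Touch.TouchInput[] touch = new Microsoft.SPOT.Touch.TouchInput[1];
    touch[0] = new Microsoft.SPOT.Touch.TouchInput();
    touch[0].X = x;
    touch[0].Y = y;

    // Raising an event with no subscribers throws a NullReferenceException,
    // so check before invoking.
    if (pressed != null)
    {
        pressed(this, new Microsoft.SPOT.Touch.TouchScreenEventArgs(System.DateTime.Now, touch, null));
    }
}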

@ Jeff_Birt -
It didn’t work for me either, so I had to go this way for now to make it work.


private void Touch_ScreenReleased(CapacitiveTouchController sender, CapacitiveTouchController.TouchEventArgs e)
{
    GlideTouch.RaiseTouchUpEvent(null, new TouchEventArgs(new TouchInput[] { new GlideTouchInput(e.X, e.Y) }));
}

private void Touch_ScreenPressed(CapacitiveTouchController sender, CapacitiveTouchController.TouchEventArgs e)
{
    GlideTouch.RaiseTouchDownEvent(null, new TouchEventArgs(new TouchInput[] { new GlideTouchInput(e.X, e.Y) }));
}


class GlideTouchInput : TouchInput
{
    // Thin wrapper so a TouchInput can be built directly from an (x, y) pair.
    public GlideTouchInput(int x, int y) : base()
    {
        base.X = x;
        base.Y = y;
    }
}
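
For completeness, the handlers above get hooked up to the driver’s events at start-up. Something like this; how the CapacitiveTouchController instance is actually constructed depends on your board, so treat it as a sketch:

// Sketch: constructor arguments (if any) depend on your board/display setup.
CapacitiveTouchController touch = new CapacitiveTouchController();
touch.ScreenPressed += Touch_ScreenPressed;
touch.ScreenReleased += Touch_ScreenReleased;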

I copied what you did to get it working, but it seems silly to have to do so. I just got done looking through the codeshare, and all the touch drivers I saw just raise their own flavor of events.

Agreed, but I didn’t want to spend time on it for now. I’ll probably be switching to Synth anyway.

This can’t be ‘secret’ information, guys. I have been trying to download the NETMF source, but CodePlex is moving VERY slowly. Hopefully some insight can be gleaned from the source. As they say in Star Wars, “Use the Source, Luke!” :slight_smile:

@ EvoMotors - here is a slight simplification of your solution. I still can’t download the NETMF source from CodePlex, so this will have to do for me for now.

        private void Touch_ScreenReleased(CapacitiveTouchController sender, CapacitiveTouchController.TouchEventArgs e)
        {
            GHI.Glide.Geom.Point point = new GHI.Glide.Geom.Point(e.X, e.Y);
            GlideTouch.RaiseTouchUpEvent(null, new GHI.Glide.TouchEventArgs(point));
        }

        private void Touch_ScreenPressed(CapacitiveTouchController sender, CapacitiveTouchController.TouchEventArgs e)
        {
            GHI.Glide.Geom.Point point = new GHI.Glide.Geom.Point(e.X, e.Y);
            GlideTouch.RaiseTouchDownEvent(null, new GHI.Glide.TouchEventArgs(point));
        }
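
On the Glide side I just load a window as usual so the raised touch events have something to hit. Roughly like this, assuming the standard GlideLoader/MainWindow calls and that ‘Form1’ is the name of your own Glide XML resource:

        // Sketch: load a Glide window from the project resources and make it current.
        GHI.Glide.Display.Window window =
            GHI.Glide.GlideLoader.LoadWindow(Resources.GetString(Resources.StringResources.Form1));
        GHI.Glide.Glide.MainWindow = window;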

The GHI touch driver sends you an event when the screen is touched. You then have to hook that event into whatever GUI framework you are using. The driver has no idea which GUI you are using, so it simply gives you the position, the UP or DOWN state, etc., and you have to deal with passing that on to the GUI. This is pretty much what EvoMotors’ code does.
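
If you want to keep that hookup in one place, it can also be wrapped in a small adapter class. This is only a sketch built from the calls EvoMotors and Jeff_Birt already showed; the class name and constructor are mine:

// Hypothetical adapter: subscribes to the GHI driver's events and forwards them to Glide.
class GlideTouchAdapter
{
    public GlideTouchAdapter(CapacitiveTouchController touch)
    {
        touch.ScreenPressed += OnScreenPressed;
        touch.ScreenReleased += OnScreenReleased;
    }

    private void OnScreenPressed(CapacitiveTouchController sender, CapacitiveTouchController.TouchEventArgs e)
    {
        GlideTouch.RaiseTouchDownEvent(null, new GHI.Glide.TouchEventArgs(new GHI.Glide.Geom.Point(e.X, e.Y)));
    }

    private void OnScreenReleased(CapacitiveTouchController sender, CapacitiveTouchController.TouchEventArgs e)
    {
        GlideTouch.RaiseTouchUpEvent(null, new GHI.Glide.TouchEventArgs(new GHI.Glide.Geom.Point(e.X, e.Y)));
    }
}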

This is pretty much normal behaviour; what’s lacking is some examples from GHI on how to hook this into Glide. :slight_smile: