Glide listens for a TouchEvent, then decodes it and decides what to do. So it seems to me that if a touch screen driver simply raised a touch event, Glide would 'magically' just work. I'm working with a Newhaven capacitive touch screen and the GHI driver, which raises its own flavor of events.
If you look at my modified GHI cap touch driver code below, this.ScreenReleased is the internal event type GHI defines. I was able to create an array of touch point(s) and a TouchEvent object, but how do I raise the event? For the ScreenPressed case I show another alternative: calling 'pressed', which is declared as "public event Microsoft.SPOT.Touch.TouchScreenEventHandler pressed;". I think this should raise a touch screen event, but how is that related to the TouchEvent that Glide is listening for?
The NETMF docs are next to useless on this and I feel like only 25% of the needed information is given. Any input?
if (((first & 0xC0) >> 6) == 1)
{
    // Finger lifted: raise GHI's internal release event.
    this.ScreenReleased(this, new TouchEventArgs(x, y));

    // Build a one-element touch array. TouchInput is a class in NETMF,
    // so the element has to be instantiated before its fields are set.
    Microsoft.SPOT.Touch.TouchInput[] touch = new Microsoft.SPOT.Touch.TouchInput[1];
    touch[0] = new Microsoft.SPOT.Touch.TouchInput();
    touch[0].X = x;
    touch[0].Y = y;

    // A TouchEvent gets built here, but nothing ever dispatches it --
    // this is the piece I can't figure out.
    Microsoft.SPOT.Touch.TouchEvent joe = new Microsoft.SPOT.Touch.TouchEvent();
    joe.EventMessage = 2;
    joe.Touches = touch;
}
else
{
    // Finger down: raise GHI's internal press event.
    this.ScreenPressed(this, new TouchEventArgs(x, y));

    Microsoft.SPOT.Touch.TouchInput[] touch = new Microsoft.SPOT.Touch.TouchInput[1];
    touch[0] = new Microsoft.SPOT.Touch.TouchInput();
    touch[0].X = x;
    touch[0].Y = y;

    // Alternative: raise the 'pressed' TouchScreenEventHandler directly.
    Microsoft.SPOT.Touch.TouchScreenEventArgs bob =
        new Microsoft.SPOT.Touch.TouchScreenEventArgs(System.DateTime.Now, touch, null);
    pressed(this, bob);
}
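
For reference, this is the shape I think the raising mechanics should take if the 'pressed' route is the right one. It's only a sketch: RaiseTouchScreenEvent is a hypothetical helper name I made up, and it assumes the code lives inside the driver class that declares 'pressed' (a C# event can only be invoked from its declaring class, and it should be null-checked in case nothing has subscribed). Whether Glide actually sees this is exactly what I'm unsure about.

public class NewhavenCapTouchDriver   // hypothetical driver class name for illustration
{
    public event Microsoft.SPOT.Touch.TouchScreenEventHandler pressed;

    // Hypothetical helper that packages a single touch point and raises 'pressed'.
    private void RaiseTouchScreenEvent(int x, int y)
    {
        // TouchInput is a class, so the array element must be instantiated
        // before its fields are set.
        Microsoft.SPOT.Touch.TouchInput[] touch = new Microsoft.SPOT.Touch.TouchInput[1];
        touch[0] = new Microsoft.SPOT.Touch.TouchInput();
        touch[0].X = x;
        touch[0].Y = y;

        Microsoft.SPOT.Touch.TouchScreenEventArgs args =
            new Microsoft.SPOT.Touch.TouchScreenEventArgs(System.DateTime.Now, touch, null);

        // Null-check so we don't throw when no handler is attached.
        if (pressed != null)
            pressed(this, args);
    }
}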