Main Site Documentation

Inconsistency in netmf docs


I’ve been working on creating an input provider for the LCD Shield’s buttons. Why? Well, just to see how to do it. The first big problem I have encountered is that InputProviderSite does not have the same signature as shown in the netmf docs and every stinking example I can find for creating an input provider. This is being done in VS2010 and netmf 4.1.

According to the netmf docs, the signature for InputProviderSite.ReportInput is…

public bool ReportInput (
InputReport inputReport
)

yet, Intellisense tells me it is

public bool ReportInput (
InputDevice device,
InputReport inputReport
)

This is really strange, as the docs show it as being the same since version 2.0! So in the code below I modified the ReportInputCallBack delegate to match this new signature, but then I can’t do a Dispatcher.BeginInvoke as the callback signature is wrong!

The second problem is that when I get this part working I need to be able to add a Button event handler to ‘something’. It seems MS has linked the Button class to WPF. But if I have an application that does not have a typical display, like the LCD Shield, and I want to take advantage of the Button’s built-in goodies, how can I do that? To what do I attach my button event handler?

internal class LCDKeysProvider
{
    public struct ButtonPad
    {
        public Button button;
        public FEZ_Shields.KeypadLCD.Keys key;
    }

    private delegate bool ReportInputCallBack(InputDevice device, InputReport inputReport);
    public readonly Dispatcher dispatcher;
    private ButtonPad[] _buttons;
    private ReportInputCallBack callback;
    private InputProviderSite _site;
    private PresentationSource _source;

    public LCDKeysProvider(LCDLiveText parent)
    {
        this._buttons = new ButtonPad[5];
        _buttons[0].key = FEZ_Shields.KeypadLCD.Keys.Up;
        _buttons[0].button = Button.VK_UP;

        _buttons[1].key = FEZ_Shields.KeypadLCD.Keys.Down;
        _buttons[1].button = Button.VK_DOWN;

        _buttons[2].key = FEZ_Shields.KeypadLCD.Keys.Right;
        _buttons[2].button = Button.VK_RIGHT;

        _buttons[3].key = FEZ_Shields.KeypadLCD.Keys.Left;
        _buttons[3].button = Button.VK_LEFT;

        _buttons[4].key = FEZ_Shields.KeypadLCD.Keys.Select;
        _buttons[4].button = Button.VK_SELECT;

        this._source = null;
        _site = InputManager.CurrentInputManager.RegisterInputProvider(this);
        callback = new ReportInputCallBack(_site.ReportInput); // InputProviderSite.ReportInput does not have the signature the docs say it should
        dispatcher = Dispatcher.CurrentDispatcher;

        parent.KeyEvent += new KeyEventHandler(keyPressed);
    }

    private void keyPressed(object sender, KeyEventArgs e)
    {
        Debug.Print("at RawInputReport");
        RawButtonActions action = RawButtonActions.ButtonDown;
        Button button = _buttons[(int)e.key].button;
        RawButtonInputReport report = new RawButtonInputReport(_source, DateTime.Now, button, action);

        // problem here as the callback does not have the correct signature
        //dispatcher.BeginInvoke(callback, report);
    }
}



You should report this directly to Microsoft. The more users complain, the faster they’ll fix the help docs, I hope!

See the forum here


I tried to post over there a few days ago and my post disappeared into the ether. I just reposted and it seems to really be there this time.

So, obviously the docs are wrong.

Two questions remain though:

  1. How to call BeginInvoke when the callback signature is wrong?
  2. How to attach an event handler?


More fun: I created an array of objects to pass for the callback, as Dispatcher.BeginInvoke has this signature:

public DispatcherOperation BeginInvoke (
Delegate method,
Object[] args
)

I get a build error that says:

Argument 1: cannot convert from ‘LCDLiveText.LCDLiveText.LCDKeysProvider.ReportInputCallBack’ to ‘Microsoft.SPOT.DispatcherOperationCallback’

That is because DispatcherOperationCallback has this signature

public delegate Object DispatcherOperationCallback (
Object arg
)

So evidently ‘Dispatcher.CurrentDispatcher’ returns a dispatcher type that expects a DispatcherOperationCallback delegate? I sure am confused…


I am also confused and not sure how to help you :wall:


Please look at the samples in the NETMF SDK. Also, look at our graphical demo; it is working fine :wink:
(link removed)


I noticed the LCD you are using. Are you doing this on Domino? These functions will throw exceptions on Domino because they are all based on WPF. Domino does not have WPF.


Yes, it is on a Domino. I knew that the WPF graphics-type things were not available, but I thought perhaps I could make use of the Button events. My thinking is that since the whole idea of a ButtonDevice was to map a hardware input to an input event, it might still work. I guess since all the WPF goodies are missing from the Domino, not even the buttons will work.

I was not getting any exceptions, though; I could not even get the code to compile, as the function signatures seemed to be all different (which should not have anything to do with the platform?)


Mike, I used the method shown in the graphical demo to instantiate the callback object and callback arguments. It now builds and runs throwing no exceptions. Now if I could only find a way to add an event handler for the buttons I would be all set. Since I do not have a WPF window object it might not be possible.
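
For anyone who hits the same wall, the pattern boils down to wrapping the site call in a method that matches DispatcherOperationCallback. This is just a sketch, not the demo code verbatim; the ReportToSite name is mine, and passing null for the InputDevice argument is an assumption that seems reasonable for a software-only provider:

    // matches Microsoft.SPOT's DispatcherOperationCallback: object (object)
    private object ReportToSite(object arg)
    {
        // null InputDevice is an assumption -- there is no real device behind these keys
        _site.ReportInput(null, (InputReport)arg);
        return null;
    }

    // then, in keyPressed:
    dispatcher.BeginInvoke(new DispatcherOperationCallback(ReportToSite), report);

Since BeginInvoke takes a single object argument, the report rides along as that argument and gets cast back inside the callback.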

I’m not even sure trying to use an input provider like this is a good idea but I am curious if it will work.


What are you trying to do exactly? You can make your own button events more easily…

I am not a WPF expert, but the classes you mentioned are tied to WPF. Also, the TinyCore assembly is HUGE. These should not be used on Domino. Any other way of doing it is better.


I was just trying to see if I could get it to work without really having WPF. It kind of annoys me that MS created a nice interface for buttons on a device and then decided to tie it to WPF. There are probably many more non-WPF devices that could have benefited from some sort of easy-to-use Button-type class.

I already created my own event that fires on a button up of the LCD keys. I was really just wanting a little better interface for the LCD buttons than a simple event firing. Something that could detect if you hold a button down (like a key repeat), etc.

The specific use case that started this experiment is that I wanted my program to display something on the LCD and pause until the user hit ‘Select’, and then drop down into the UI where they could select a test to run (the cable tester), etc. Even if I did not attach my LCD key event handler until after the ‘Select’ key was hit the first time, it would catch the key-up event when Select was released.

Then I got off on this tangent of wondering how an InputProvider worked and wanted to try to make it work even if it was a silly way of doing things. Now I will go back to figuring out how to make better use of a generic key-release event handler, to be able to do things like wait for a key press in a certain section of the UI and then consume that event so it does not trickle on down to any other event handlers.
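
For the ‘consume the event’ part, one non-WPF way to sketch it (all of this is hypothetical -- the Handled field and the manual subscriber list are my own invention, not anything from the framework) is to raise the event one subscriber at a time and stop as soon as somebody claims it:

    public class LCDKeyEventArgs : EventArgs
    {
        public FEZ_Shields.KeypadLCD.Keys key;
        public bool Handled; // a handler sets this to true to consume the key
    }

    public delegate void LCDKeyEventHandler(object sender, LCDKeyEventArgs e);

    private ArrayList _handlers = new ArrayList(); // manual subscriber list

    public void AddKeyHandler(LCDKeyEventHandler h) { _handlers.Add(h); }

    private void RaiseKeyEvent(LCDKeyEventArgs e)
    {
        // call handlers in order; stop when one marks the event handled
        foreach (LCDKeyEventHandler h in _handlers)
        {
            h(this, e);
            if (e.Handled) break; // consumed -- don't trickle down
        }
    }

A plain C# event would call every subscriber regardless, so the manual list is what makes the ‘stop here’ behavior possible.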

Adding TinyCore raised the project size by about 30K :o, you are right it is very large.


So basically you are struggling with the button event handlers for button up, button down and button pressed?

If so, that’s the same problem I’ve had with the hexapod.

How far is your experimenting/code?

Did you come up with something?


Generating the event is not too difficult. I set up an ExtendedTimer that calls this code every 100 ms:

          currentKey = FEZ_Shields.KeypadLCD.GetKey();
          if (currentKey != FEZ_Shields.KeypadLCD.Keys.None
              && lastKey == FEZ_Shields.KeypadLCD.Keys.None)
          {
              lastKey = currentKey;
          }
          else if (currentKey == FEZ_Shields.KeypadLCD.Keys.None
              && lastKey != FEZ_Shields.KeypadLCD.Keys.None)
          {
              keyArg.key = lastKey;
              lastKey = FEZ_Shields.KeypadLCD.Keys.None;
              OnKeyEvent(this, keyArg); // fire the key-up event
          }

Basically, this looks to see whether the state of FEZ_Shields.KeypadLCD.GetKey() has changed from returning ‘None’ to returning a key; this signifies a key down, and we save the ID of the key that is pressed. If the state of FEZ_Shields.KeypadLCD.GetKey() has changed from a key being down to ‘None’, then we know the key was released and we fire off the OnKeyEvent. Since the update rate of the ExtendedTimer is a relatively slow 100 ms, it makes a pretty good debounce (glitch) filter for the keys as well.

This setup is nice and simple but only gives you a key-up event. If you wanted to emit key-down and key-held events, it would have to be more elaborate (and the timer would need to be a little faster).
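
For what it’s worth, here is a sketch of what that more elaborate version might look like. The OnKeyDown/OnKeyHeld/OnKeyUp names, the downTime field, and the 500 ms hold threshold are my inventions, not code from the project:

    // called from the ExtendedTimer, e.g. every 25 ms instead of 100 ms
    private void ScanKeys()
    {
        currentKey = FEZ_Shields.KeypadLCD.GetKey();
        if (currentKey != FEZ_Shields.KeypadLCD.Keys.None
            && lastKey == FEZ_Shields.KeypadLCD.Keys.None)
        {
            // None -> key: a key just went down
            lastKey = currentKey;
            downTime = DateTime.Now;
            OnKeyDown(lastKey);
        }
        else if (currentKey == lastKey
            && lastKey != FEZ_Shields.KeypadLCD.Keys.None)
        {
            // same key still down: fire a repeat once it has been held long enough
            if (DateTime.Now - downTime > new TimeSpan(0, 0, 0, 0, 500))
                OnKeyHeld(lastKey);
        }
        else if (currentKey == FEZ_Shields.KeypadLCD.Keys.None
            && lastKey != FEZ_Shields.KeypadLCD.Keys.None)
        {
            // key -> None: the key was released
            OnKeyUp(lastKey);
            lastKey = FEZ_Shields.KeypadLCD.Keys.None;
        }
    }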

If you are using GPIO pins that are interrupt capable, then the job is much easier, as you already have events generated for both edges of the signal and a built-in glitch filter.
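
For the interrupt-pin case, the standard NETMF InterruptPort gives you both edges and the glitch filter for free. A minimal sketch (the pin value is a placeholder -- use your board’s pin enum):

    using Microsoft.SPOT;
    using Microsoft.SPOT.Hardware;

    InterruptPort button = new InterruptPort(
        (Cpu.Pin)0,                             // placeholder pin
        true,                                   // enable the built-in glitch filter
        Port.ResistorMode.PullUp,
        Port.InterruptMode.InterruptEdgeBoth);  // fire on both press and release

    button.OnInterrupt += new NativeEventHandler(
        delegate(uint port, uint state, DateTime time)
        {
            // state tells you which edge fired
            Debug.Print(state == 0 ? "down" : "up");
        });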

What is going through my mind now is a better/smarter way to handle the events once you have them. You don’t always want to do the same thing when a given event fires, and we don’t have all the goodies of a larger framework to help us out.


Hey guys, perhaps it’s now worth taking the “buttons” part of this discussion to its own thread? The title is about docs; it’d be great to capture this stuff in a more relevant place - it’s very interesting and worthwhile!