Hello guys,
I'm trying to send a test object from the client side to the server side over the UDP protocol. The test object is an instance of a class called "Kid".
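For reference, the Kid class is along these lines. This is only a sketch: the actual class isn't shown here, so the field names and the [Serializable] attribute are assumptions based on how Reflection.Serialize is typically used on the .NET Micro Framework.

using System;

[Serializable] // assumed: the NETMF binary serializer expects the type to be marked serializable
public class Kid
{
    private string name; // "craig" in the test below
    private int age;     // 11 in the test below

    public Kid(string name, int age)
    {
        this.name = name;
        this.age = age;
    }

    public override string ToString()
    {
        return name + ", age " + age.ToString();
    }
}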
Here is the client-side code:
Kid kid1 = new Kid("craig", 11);                       // test object: name "craig", age 11
byte[] test = Reflection.Serialize(kid1, typeof(Kid)); // binary-serialize the object
Debug.Print("the byte length is : " + test.Length.ToString());
clientSocket.SendTo(test, serverEndPoint);             // send the datagram to the server
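For completeness, clientSocket and serverEndPoint are set up roughly like this (a sketch; the IP address and port below are placeholders, not the real values):

using System.Net;
using System.Net.Sockets;

// a UDP socket and the address the datagrams are sent to
Socket clientSocket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
EndPoint serverEndPoint = new IPEndPoint(IPAddress.Parse("192.168.1.100"), 5000);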
When I check the byte length, it prints 10, and when I inspect the "test" array in the debugger, it shows:
[0] 0
[1] 0
[2] 0
[3] 11
[4] 5
[5] 99
[6] 114
[7] 97
[8] 105
[9] 103
So that is everything on the client side, and the bytes look plausible: 0, 0, 0, 11 is presumably the age (11), and 5 followed by 99 114 97 105 103 is the length-prefixed ASCII for "craig". The datagram is then sent off to the server.
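On the server side, the socket is a UDP socket bound to the listening port, roughly like this (again a sketch; the port number is a placeholder):

using System.Net;
using System.Net.Sockets;

// bind a UDP socket to the local port so the client's datagrams arrive here
Socket serverSocket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
serverSocket.Bind(new IPEndPoint(IPAddress.Any, 5000));

Once serverSocket.Available reports incoming data, I read it like this: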
byte[] receiveBuffer = new byte[serverSocket.Available];            // size the buffer to the pending data
Debug.Print(receiveBuffer.Length.ToString());                       // prints 10, same as on the client
Kid kid2 = (Kid)Reflection.Deserialize(receiveBuffer, typeof(Kid)); // deserialize back into a Kid
Debug.Print(kid2.ToString());
The server side clearly shows that the length of receiveBuffer is 10, the same as on the client side. However, when I inspect receiveBuffer itself, it shows:
[0] 0
[1] 0
[2] 0
[3] 0
[4] 0
[5] 0
[6] 0
[7] 0
[8] 0
[9] 0
Has anyone encountered this kind of problem before? :-[