Questions regarding sending a serialized object from the client side to the server side

Hello Guys, :slight_smile:

I plan to send a test object from the client side to the server side using the UDP protocol.

The test object is an instance of a class called “Kid”.

This is the code on the client side:

Kid kid1 = new Kid("craig", 11);

byte[] test = Reflection.Serialize(kid1, typeof(Kid));
Debug.Print("the byte is : " + test.Length.ToString());
clientSocket.SendTo(test, serverEndPoint);

When I check the byte length, it shows 10.

And when I check the “test” variable, it shows:

[0] 0
[1] 0
[2] 0
[3] 11
[4] 5
[5] 99
[6] 114
[7] 97
[8] 105
[9] 103
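For what it is worth, that dump looks consistent with a simple layout: four bytes for the Int32 field (big-endian, value 11) followed by a length-prefixed ASCII string ("craig"). The exact wire format of Reflection.Serialize is not documented in this thread, so treat this decode as a guess:

```csharp
// Hypothetical decode of the ten bytes shown above, assuming the layout
// is a big-endian Int32 followed by a length-prefixed ASCII string.
byte[] test = { 0, 0, 0, 11, 5, 99, 114, 97, 105, 103 };

// Bytes [0..3] as a big-endian Int32: 11 (the age passed to the constructor)
int age = (test[0] << 24) | (test[1] << 16) | (test[2] << 8) | test[3];

// Byte [4] is the string length (5), bytes [5..9] are ASCII codes for "craig"
int len = test[4];
char[] chars = new char[len];
for (int i = 0; i < len; i++)
{
    chars[i] = (char)test[5 + i];
}
string name = new string(chars);
```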

That is the output on the client side.

After the test object has been sent, this is the code on the server side:

byte[] receiveBuffer = new byte[serverSocket.Available];

Kid kid2 = (Kid)Reflection.Deserialize(receiveBuffer, typeof(Kid));

On the server side, the length of receiveBuffer is clearly 10, the same as on the client side. However,
when I check the contents of receiveBuffer, it shows this:

[0] 0
[1] 0
[2] 0
[3] 0
[4] 0
[5] 0
[6] 0
[7] 0
[8] 0
[9] 0

Has anyone encountered this kind of problem before? :-[

Where do you copy the data from the socket into receiveBuffer?

The length of receiveBuffer is 10 as expected, because you sized it yourself via

byte[] receiveBuffer = new byte[serverSocket.Available];
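In C#, a newly allocated array is always zero-initialized, so those ten zeros are simply the fresh buffer before any receive call has written into it:

```csharp
// A newly allocated C# array is guaranteed to be zero-initialized,
// which is exactly what the server-side dump above shows.
byte[] receiveBuffer = new byte[10];
foreach (byte b in receiveBuffer)
{
    Debug.Print(b.ToString()); // prints "0" ten times
}
```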

Is your question about serialization or about transferring data? Can we please separate those two completely different topics?

Can you serialize and deserialize in the same program? You should have no problems if you are using the larger devices; the smaller ones do not support serialization … but this is another topic :slight_smile:

Now, make a buffer of some numbers and try to send and receive it. Do you have any problems?

Errol is right: you never receive the actual data… You just create a zeroed array of 10 bytes.
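A minimal sketch of the missing step, assuming serverSocket is a UDP socket already bound to the right port (the remoteEndPoint name is mine):

```csharp
// Allocate a buffer sized to the pending datagram.
byte[] receiveBuffer = new byte[serverSocket.Available];

// ReceiveFrom actually copies the datagram bytes into the buffer and
// reports who sent it. Without this call the buffer stays all zeros.
EndPoint remoteEndPoint = new IPEndPoint(IPAddress.Any, 0);
int received = serverSocket.ReceiveFrom(receiveBuffer, ref remoteEndPoint);

// Only now does deserialization see the real data.
Kid kid2 = (Kid)Reflection.Deserialize(receiveBuffer, typeof(Kid));
```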