using System;
using System.Threading;
using Microsoft.SPOT;
using Microsoft.SPOT.Hardware;
using System.Net;
using System.Net.Sockets;
using GHIElectronics.NETMF.Net;
using System.Text;

namespace MP3Shield
{
    class SendIPToClient
    {
        private const int port = 2000;

        public static void SendIP()
        {
            using (Socket serverSocket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp))
            {
                EndPoint remoteEndPoint = new IPEndPoint(IPAddress.Any, port);
                serverSocket.Bind(remoteEndPoint);
                if (serverSocket.Poll(-1, SelectMode.SelectRead))
                {
                    byte[] inBuffer = new byte[serverSocket.Available];
                    int count = serverSocket.ReceiveFrom(inBuffer, ref remoteEndPoint);
                    string message = new string(Encoding.UTF8.GetChars(inBuffer));
                    Debug.Print("Received: " + message);
                }
            }
        }
    }
}
Everything works and inBuffer receives the right bytes, which are “1,2,3,4,1,2,3,4,1,2,3,4”. But when I create message it contains little squares, like the placeholder boxes you see when Japanese kanji can’t be displayed. What could it be?
Thanks
@ GG - The bytes you are trying to convert to characters are not printable UTF-8 characters. Printable single-byte UTF-8 characters are in the range 32-127; beyond that, characters require multiple bytes. All bytes below 32 are control characters. Try sending the following bytes
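To illustrate the point above, here is a minimal sketch (written against desktop .NET rather than NETMF, so Console.WriteLine stands in for Debug.Print): decoding the raw values 1-4 yields control characters, which most debuggers render as boxes, while sending the ASCII codes for the digit characters gives readable text.

```csharp
using System;
using System.Text;

class Utf8Demo
{
    static void Main()
    {
        // Raw values 1..4 decode to ASCII control characters (U+0001..U+0004),
        // which are not printable and typically render as little squares.
        byte[] raw = { 1, 2, 3, 4 };
        string rawText = new string(Encoding.UTF8.GetChars(raw));
        Console.WriteLine((int)rawText[0]);   // prints 1: the character IS U+0001, it just isn't printable

        // Sending the ASCII codes of the digit characters instead gives readable text.
        byte[] digits = { 49, 50, 51, 52 };   // '1' '2' '3' '4'
        string digitText = new string(Encoding.UTF8.GetChars(digits));
        Console.WriteLine(digitText);         // prints "1234"
    }
}
```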
I managed to do it without using Encoding.UTF8 at all. I modified the server as follows: basically I manually convert the received bytes to ASCII digit characters.
namespace MP3Shield
{
    class SendIPToClient
    {
        private const int port = 2000; // arbitrary choice

        public static void SendIP()
        {
            Debug.Print("Broadcast reception test");
            using (Socket serverSocket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp))
            {
                EndPoint remoteEndPoint = new IPEndPoint(IPAddress.Any, port);
                serverSocket.Bind(remoteEndPoint);
                if (serverSocket.Poll(-1, SelectMode.SelectRead))
                {
                    byte[] inBuffer = new byte[serverSocket.Available];
                    int count = serverSocket.ReceiveFrom(inBuffer, ref remoteEndPoint);
                    char[] message = new char[inBuffer.Length];
                    for (int n = 0; n < inBuffer.Length; n++)
                        message[n] = (char)(inBuffer[n] + 48); // shift a digit value 0-9 to its ASCII character '0'-'9'
                    string s = new string(message);
                    Debug.Print(s);
                }
            }
        }
    }
}
I’ll keep that in mind, thanks.
Consider that for this application I only need the raw byte array, which will also contain the IP address of the client that sent the broadcast message. I wanted to convert it to a string just for fun and for debugging purposes. In fact, converting bytes like 192 this way gives an undesired result.
This approach shifts the bytes into the printable ASCII range; however, it only works if the bytes you are sending are limited to 0-9. Higher values will show other characters or push you outside the printable ASCII range (> 127) and create invalid UTF-8 code points.
Of course, I do not know your use case, so I cannot say whether that is acceptable for you, but I thought it was worth mentioning.
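For bytes outside 0-9, such as the octets of an IP address, one option is to format each byte as its decimal text instead of shifting it. A minimal sketch (desktop .NET, with a hypothetical received payload standing in for the real datagram):

```csharp
using System;
using System.Text;

class ByteFormatDemo
{
    static void Main()
    {
        // Hypothetical payload: the four octets of a client IP address.
        byte[] octets = { 192, 168, 0, 10 };

        // The +48 shift only works for values 0-9: 192 + 48 = 240, which is not
        // printable ASCII. Formatting each byte as decimal text covers 0-255.
        StringBuilder sb = new StringBuilder();
        for (int n = 0; n < octets.Length; n++)
        {
            if (n > 0) sb.Append('.');
            sb.Append(octets[n].ToString());
        }
        Console.WriteLine(sb.ToString());   // prints "192.168.0.10"
    }
}
```

byte.ToString() is also available on NETMF, so the same loop should work with Debug.Print in place of Console.WriteLine.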
I know it only works for a limited range, thanks.
As I said, for this application I only need the raw byte array. But since the Encoding wasn’t working, I wanted to know how to fix it.