Send captured image to socket server

Hi, are there any code samples for transmitting the captured image from the Serial L2 camera to a PC Windows Forms PictureBox?

I tried to find information on Codeshare with the tags video stream, image stream, and WiFi network image transfer, but there is nothing about this.

Take it easy please. This is a welcoming place :slight_smile:

2 Likes

@ Alex Bilityuk -

I do not have any code… I thought I did, but I cannot find it.

I do remember that I started with this reference.

Be aware that the code shown at the link would have to be updated for your camera and for whatever version of NETMF you are currently using…

Hope it gets you started…

Have a GREAT day!

1 Like

If you have a network connection (WiFi or Ethernet), you can use nearly the same code on NETMF as on Windows.
You establish a socket connection and send the byte array you got from the camera to Windows. If needed, you can send some additional information ahead (number of bytes, image width/height).
In Windows you create a Bitmap object from this byte array. NETMF images always have 16 bits per pixel; I am not sure if the pixel format is RGB16 or BGR16.

There should be plenty of samples about networking in the forum and on Codeshare.
You can also look for a C# TCP client/server sample for Windows.
The code in NETMF is nearly the same, as soon as the network is configured.
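
Roughly, the sending side can look like this. This is only a sketch using the plain System.Net.Sockets API (not a Gadgeteer module driver), and the 12-byte header layout (length, width, height as little-endian ints) is just a convention that the PC side would have to read back in the same order:

```cs
using System.Net.Sockets;

public static class ImageSender
{
    // Sketch: send an optional header (length, width, height) followed by the raw pixel bytes.
    public static void SendImage(Socket socket, byte[] pixels, int width, int height)
    {
        socket.Send(Int32ToBytes(pixels.Length));
        socket.Send(Int32ToBytes(width));
        socket.Send(Int32ToBytes(height));

        // Send the pixel data; Send may transmit fewer bytes than requested, so loop.
        int offset = 0;
        while (offset < pixels.Length)
        {
            offset += socket.Send(pixels, offset, pixels.Length - offset, SocketFlags.None);
        }
    }

    // Little-endian int to bytes, written out manually so it also works where BitConverter is not available.
    private static byte[] Int32ToBytes(int value)
    {
        return new byte[]
        {
            (byte)(value & 0xFF),
            (byte)((value >> 8) & 0xFF),
            (byte)((value >> 16) & 0xFF),
            (byte)((value >> 24) & 0xFF)
        };
    }
}
```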

1 Like

I read the provided article about the WiFi rover, but it is completely different from my project.

Now I am trying to send a raw image from server to client using sockets in NETMF, but with no success.

Here is my server-side code. I am sending the image byte[] array in chunks because of buffer size limitations in NETMF:


```cs
if (serverSocket.IsConnected)
{
    Byte[] bytesToSend = capturedata;
    Byte[] outboundBuffer = new byte[512];

    int incomingOffset = 0;

    while (incomingOffset < bytesToSend.Length)
    {
        int length = System.Math.Min(outboundBuffer.Length, bytesToSend.Length - incomingOffset);
        Array.Copy(bytesToSend, incomingOffset, outboundBuffer, 0, length);
        incomingOffset += length;

        // Transmit outbound buffer
        serverSocket.SendBinary(outboundBuffer);
    }
}
```

Here is my client-side code. I am trying to build the image from the MemoryStream and show it in the PictureBox:


```cs
public static void ReadCallback(IAsyncResult ar)
{
    // Retrieve the state object and the handler socket
    // from the asynchronous state object.
    StateObject state = (StateObject)ar.AsyncState;
    handler = state.workSocket;

    // Read data from the client socket.
    int bytesRead = handler.EndReceive(ar);

    if (bytesRead > 0)
    {
        memorystream.Write(state.buffer, 0, bytesRead);

        if (bytesRead <= 0)
        {
            PICTUREBOX_CAPTURED_IMAGE.Image = new System.Drawing.Bitmap(memorystream); // Here is an ArgumentException
        }
        else
        {
            // Not all data received. Get more.
            handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0,
                new AsyncCallback(ReadCallback), state);
        }
    }
}
```

I don't know what the matter is, but it seems the chunks sent from the server are not completely written to the MemoryStream on the client side: Byte[] bytesToSend = capturedata; is always about 12540 bytes, but when I inspect the client-side MemoryStream, its capacity is only about 3500-4600 bytes. I think that is why I get an ArgumentException when I try to make a Bitmap from the stream in this line of code:


```cs
if (bytesRead <= 0)
{
    PICTUREBOX_CAPTURED_IMAGE.Image = new System.Drawing.Bitmap(memorystream); // Here is an ArgumentException
}
```

Can anybody help me find a solution? I need to send the byte array in chunks from the server to the client and set PictureBox.Image from a Bitmap built from the MemoryStream.

I guess that the Bitmap constructor you call with the MemoryStream wants an image file format from the stream.
But you get a byte array with pixel info from the camera, which you send 1:1 over the network.
You need to create the Bitmap with the correct size and pixel format, and then copy the pixel data into the bitmap.

Also the code

```cs
if (bytesRead > 0)
{
    memorystream.Write(state.buffer, 0, bytesRead);

    if (bytesRead <= 0)
    {
```

is suspicious.
The variable bytesRead does not change in that code, so the second if should always be false.
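
For comparison, a receive callback along these lines avoids that dead branch. It is only a sketch: it assumes the server closes (or shuts down) the connection after sending one frame, so that EndReceive returning 0 marks the end of the image; otherwise you would send the total length ahead, as mentioned earlier. memorystream and StateObject are the names from the code above.

```cs
public static void ReadCallback(IAsyncResult ar)
{
    StateObject state = (StateObject)ar.AsyncState;
    Socket handler = state.workSocket;

    int bytesRead = handler.EndReceive(ar);

    if (bytesRead > 0)
    {
        // Append this chunk and ask for the next one.
        memorystream.Write(state.buffer, 0, bytesRead);
        handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, SocketFlags.None,
            new AsyncCallback(ReadCallback), state);
    }
    else
    {
        // EndReceive returned 0: the sender closed the connection,
        // so the complete image should now be in memorystream.
        memorystream.Position = 0;
        // Build the Bitmap from the raw pixel data here (not from an image file stream).
    }
}
```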

I tried something like this


```cs
Image im = Image.FromStream(mStrm, true);
im.Save("image.bmp");
```

But I get the same exception. I think the problem is that I do not receive all the data in the client-side MemoryStream; maybe I am not writing the chunks to it properly. Why does the image buffer on the server side have, for example, 12500 bytes, but when I send it chunk by chunk to the client and check the MemoryStream capacity property, it shows 3465? I expected it to show a capacity of 12500 bytes, like the original buffer on the server side before I divided it into chunks and sent it over the network…

@ Alex Bilityuk - Of course, if not all data is transmitted, then that’s a problem.
But on the Windows side, you cannot create an image or bitmap from this stream alone.
The byte array from the camera does not contain any information about the bitmap size and pixel format. It does not matter whether you use a constructor that takes a stream or the FromStream method: how should it know how big the image is and how the pixels are encoded?
I don't have the API in mind, but you have to do something like this (in pseudocode):

```cs
int width = ?;
int height = ?;
PixelFormat pixelFormat = Pixelformats.BGR16;
var bitmap = new Bitmap(width, height, pixelFormat);
bitmap.CopyBinaryData... // something that takes binary data from your stream
```

1 Like

Thank you for the reply. I am trying something like this, as you advised:


```cs
Bitmap bmp = new Bitmap(320, 240, PixelFormat.Format16bppArgb1555);
```

But Bitmap doesn't have a CopyBinaryData method, or a "create from raw pixel stream" method, and so on…

@ Alex Bilityuk - Use the LockBits method. By this you get a BitmapData object.
Check the MSDN docs about it.

Don't forget to unlock the bits afterwards.
You should use a try/finally block to lock and unlock.
The sample in the MSDN docs copies the pixel data out of the Bitmap.
You just need to invert the copying code.

1 Like

I checked this code sample, but it is already using an image file from a specified path:


```cs
// Create a new bitmap.
Bitmap bmp = new Bitmap("c:\\fakePhoto.jpg");
```

But I don't have an image file yet. I need to construct the image first, and I can't because of the ArgumentException on this line of code:


```cs
PICTUREBOX_CAPTURED_IMAGE.Image = new System.Drawing.Bitmap(memorystream);
```

Also, I noticed some strange behavior; maybe you can help me and advise what I am doing wrong. The issue is that I have more bytes on the client side than I sent from the server side. For example, if the server sent 13072 bytes, the client MemoryStream length is about 16984 bytes… What is wrong with it? I double-checked the server-side algorithm and it is working as expected.

@ Alex Bilityuk - You get the ArgumentException because the constructor with the stream argument wants a stream containing an image file, not just pixel data.

You need to create the Bitmap with this constructor:

```cs
public Bitmap(
    int width,
    int height,
    PixelFormat format
)
```

And then use LockBits to copy your pixel data into the Bitmap.
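
Put together, that could look roughly like the sketch below. It is an assumption-laden sketch, not a drop-in fix: the 320x240 size and the Format16bppRgb565 pixel format have to match what the camera really delivers, memorystream must contain exactly width * height * 2 bytes, and memorystream / PICTUREBOX_CAPTURED_IMAGE are the names from the earlier posts:

```cs
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

byte[] pixelData = memorystream.ToArray();   // raw pixels received over the socket

Bitmap bmp = new Bitmap(320, 240, PixelFormat.Format16bppRgb565);
Rectangle rect = new Rectangle(0, 0, bmp.Width, bmp.Height);
BitmapData bmpData = bmp.LockBits(rect, ImageLockMode.WriteOnly, bmp.PixelFormat);
try
{
    // Copy the raw pixel bytes into the locked bitmap memory.
    // For a 320 pixel wide 16 bpp image the stride is 640 bytes with no padding,
    // so one Copy call is enough; otherwise copy row by row.
    int byteCount = Math.Min(pixelData.Length, Math.Abs(bmpData.Stride) * bmp.Height);
    Marshal.Copy(pixelData, 0, bmpData.Scan0, byteCount);
}
finally
{
    // Always unlock, even if the copy throws.
    bmp.UnlockBits(bmpData);
}

PICTUREBOX_CAPTURED_IMAGE.Image = bmp;
```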

I tried this code, like in the example you provided:


```cs
RobotService.memorystream.Position = 0;
Bitmap bmp = new System.Drawing.Bitmap(320, 240, PixelFormat.Format24bppRgb);

// Lock the bitmap's bits.
Rectangle rect = new Rectangle(0, 0, bmp.Width, bmp.Height);
System.Drawing.Imaging.BitmapData bmpData =
    bmp.LockBits(rect, System.Drawing.Imaging.ImageLockMode.ReadWrite,
    bmp.PixelFormat);

// Get the address of the first line.
IntPtr ptr = bmpData.Scan0;

// Declare an array to hold the bytes of the bitmap.
int bytes = Math.Abs(bmpData.Stride) * bmp.Height;

byte[] inputdata = RobotService.memorystream.GetBuffer();

// Copy the RGB values back to the bitmap.
System.Runtime.InteropServices.Marshal.Copy(inputdata, 0, ptr, bytes);

// Unlock the bits.
bmp.UnlockBits(bmpData);
PICTUREBOX_CAPTURED_IMAGE.Image = bmp;
```

My inputdata buffer that I construct from the MemoryStream is 32000-48000 bytes every time, but the value of int bytes = Math.Abs(bmpData.Stride) * bmp.Height in this line of code is 230400, and when I try to copy the data into the bitmap it throws an ArgumentOutOfRangeException because there are not enough bytes in the inputdata buffer to copy… What can I do?

You should add a Debug.Print statement to your NETMF code that writes how many bytes you have sent in each iteration of your loop.
Then do the same in the Windows receive callback.
That way you can see how much data is really sent and received.
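
For example, something along these lines in the Windows receive callback makes both numbers visible (LogChunk is just an illustrative helper name, and the 320 x 240 size is an assumption about the camera resolution):

```cs
using System.Diagnostics;

// Illustrative helper: call it right after EndReceive in the callback,
// e.g. LogChunk(bytesRead, memorystream.Length);
static void LogChunk(int bytesRead, long totalSoFar)
{
    // For a 320 x 240 image at 16 bits per pixel the total should end up at
    // 320 * 240 * 2 = 153600 bytes; 230400 would only be right for 24 bpp.
    const int expectedBytes = 320 * 240 * 2;
    Debug.WriteLine("Received chunk of " + bytesRead + " bytes, "
        + totalSoFar + " of expected " + expectedBytes + " so far");
}
```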

You should also receive only 153600 bytes of data in total, since NETMF uses 16 bits per pixel.
The PixelFormat to use for your Bitmap object should therefore be a 16 bpp RGB format such as Format16bppRgb565.
Normally there are 6 bits for green and 5 each for red and blue (no alpha).

You can also use Wireshark to monitor the network traffic from your board.

1 Like

Can you help me find out how to check how many bytes I send on each iteration in my server-side code?


```cs
if (serverSocket.IsConnected)
{
    Byte[] bytesToSend = capturedata;
    Byte[] outboundBuffer = new byte[1024];

    int incomingOffset = 0;

    while (incomingOffset < bytesToSend.Length)
    {
        int length = System.Math.Min(outboundBuffer.Length, bytesToSend.Length - incomingOffset);
        Array.Copy(bytesToSend, incomingOffset, outboundBuffer, 0, length);
        incomingOffset += length;

        // Transmit outbound buffer
        serverSocket.SendBinary(outboundBuffer);
    }
}
```

outboundBuffer only has the Length property, which returns 1024 every time.

I checked the PixelFormat of the L2 camera's captured image and it returns Format24bppRgb.

@ Alex Bilityuk - Generally, the length variable you calculate is the number of bytes you send in each iteration, or rather the number of bytes you should send.
Actually, your code always sends 1024 bytes, even in the last iteration when length is less than 1024, because you always pass the whole array to SendBinary.

I assume that you use a Gadgeteer project, since the normal network Socket objects do not have this method.
Check if there is a SendBinary overload that accepts an additional length parameter, so you can call it like:

```cs
...
int totalSent = 0;
while (...)
{
    ...
    serverSocket.SendBinary(outboundBuffer, length);
    totalSent += length;
    Debug.Print("Sent " + length);
}
Debug.Print("Total sent " + totalSent);
```

Two (or three) more questions:
What network/WiFi module do you use?
Do you use a UDP or a TCP connection?
If TCP, which side is the server and which is the client?

1 Like

Thank you for reply and your help!

  1. I am using a Gadgeteer project, v4.3.
  2. I am using a WiflySocket, and the SendBinary method doesn't have an overload with a length parameter.
  3. I am using an RN-XV WiFly module.
  4. I am using a TCP connection.
  5. The server is the Gadgeteer project; it sends the picture data in outboundBuffer chunks to the Windows 7 client socket.
  6. If you need more information, I am ready to provide everything I can.

@ Alex Bilityuk - OK.
On a TCP connection it is very unlikely that you lose data on the way.
So you should look at the debug outputs from sending and add similar outputs to the Windows client. Normally they should be similar (be aware that TCP is a streaming connection, so the chunk sizes on the client might be different from those on the sending side).
And you need to create a smaller byte array for the last chunk if less than 1024 bytes are left to send; otherwise you send some unneeded data at the end.
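
If the driver really only offers SendBinary(byte[]), the simplest fix is to size the buffer per chunk, for example like the sketch below (allocating a new array each iteration is wasteful but keeps it short; you could reuse a 1024-byte buffer for everything but the last chunk):

```cs
byte[] bytesToSend = capturedata;
int incomingOffset = 0;

while (incomingOffset < bytesToSend.Length)
{
    int length = System.Math.Min(1024, bytesToSend.Length - incomingOffset);

    // Allocate a buffer of exactly this chunk's size, so the last chunk
    // does not carry leftover bytes from a previous iteration.
    byte[] chunk = new byte[length];
    Array.Copy(bytesToSend, incomingOffset, chunk, 0, length);
    incomingOffset += length;

    serverSocket.SendBinary(chunk);
}
```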

@ Alex Bilityuk - Which driver do you use that provides the socket feature for the RN-XV WiFly? Do you use the RN-XV WiFly on the GHI XBee Adapter module?

1 Like