Problem with serial camera output:

I wrote the following code:

```cs
Program.camera.SetImageSize(Gadgeteer.Modules.GHIElectronics.SerCam.Camera_Resolution.SIZE_VGA);
Program.camera.StartStreaming();

if (!Program.camera.isNewImageReady)
{
    Thread.Sleep(100);
}
Program.camera.StopStreaming();
byte[] temp = Program.camera.dataImage;
```

My problem is that I send the temp array over the internet to a MATLAB program, where I need to reshape it back to the image's original size. The temp array size should have been 640*480*3, but it isn't, so how should the data be decoded?

Suggestions?

NETMF images are always 16 bit (2 bytes per pixel).

@ Reinhard Ostermeier - Are you the guy who sent images to MATLAB? Can you give me some details about your implementation?

Let me make it easier for you. Let's say I took a picture of size 320*240 and sent the byte array to MATLAB. The size of the byte array is 3576. Now MATLAB has a uint16 array of size 1x3576. What can I do with it? How can I create a picture from it?

Also, every time I take a picture, the length changes…

@ bioengproject - No, I have never worked with MATLAB. And I used the SerCam only once, because it was part of a kit I have. I displayed the image on a display, which worked without any issues.

What size is the array in NETMF? It should be 320x240x2 = 153600 bytes.
I'm not sure if you can send the whole array at once, or if you need to iterate over it and send it in chunks.

If the size in NETMF is OK, but you get different sizes in MATLAB, then I assume you also need to read multiple times in MATLAB to get all the data.

Once you have the data in MATLAB you can convert it to a MATLAB image.
The NETMF image is just one row after the other, and each row is just one pixel after the other.
Pixels are usually in RGB or BGR 565, meaning 5 bits red (or blue), 6 bits green, 5 bits blue (or red).
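For illustration, a rough sketch (not tested) of how one 16-bit RGB565 pixel could be unpacked into 8-bit channels; the red/blue order and the byte order may be swapped depending on the camera:

```cs
// Hypothetical helper: unpack one RGB565 pixel (given as a little-endian byte pair)
// into 8-bit R, G, B values. Swap r and b if the data is actually BGR565.
static void UnpackRgb565(byte lo, byte hi, out byte r, out byte g, out byte b)
{
    ushort pixel = (ushort)((hi << 8) | lo);
    r = (byte)(((pixel >> 11) & 0x1F) * 255 / 31);  // top 5 bits
    g = (byte)(((pixel >> 5) & 0x3F) * 255 / 63);   // middle 6 bits
    b = (byte)((pixel & 0x1F) * 255 / 31);          // bottom 5 bits
}
```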

SerCam takes JPG images! Of course the “size” of the resulting array is different each time.

@ Reinhard Ostermeier - The problem is that it isn't 320 x 240 x 2!
It changes every time I take a picture… something around 3675…

So bret, what can I do?

Take the JPG image and use NETMF to convert it to a BMP; that BMP is what you need, I think.

Something like this: first create a NETMF Bitmap (call it image) from the JPEG bytes, where dataJPEG is the JPEG data, and then

```cs
byte[] bmpData = image.GetBytes();
```
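A minimal sketch of the whole conversion, assuming the standard NETMF Bitmap constructor that decodes JPEG data (variable names as above):

```cs
using Microsoft.SPOT;   // NETMF Bitmap

// dataJPEG holds the JPEG bytes coming from the SerCam
Bitmap image = new Bitmap(dataJPEG, Bitmap.BitmapImageType.Jpeg);

// bmpData now holds the decoded raw pixel data of the bitmap
byte[] bmpData = image.GetBytes();
```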

@ Dat - And then what? MATLAB will get an array of bytes; how can I create the image back from it?

@ bioengproject -

If you want JPEG, SerCam can give you JPEG.
If you want BMP, you can convert from JPEG to BMP in NETMF, or actually SerCam can also give you BMP directly.

For JPEG, this is already an image; the data will start with 0xFFD8 and end with 0xFFD9.
For BMP, you need to add the 54-byte BMP header if you want to show the picture on Windows or some other platform…

So I am not sure what you mean by “how can I create the image back”.

There is a method in the GHI libraries to convert the bitmap data to a BMP file format.

@ Dat - This is my status:
I want to take a picture with the serial camera and then send it over the internet to my computer using sockets.

I wrote the following code:

```cs
private byte[] takePicture()   // return type reconstructed; the original post cut off the signature
{
    Program.camera.SetImageSize(Gadgeteer.Modules.GHIElectronics.SerCam.Camera_Resolution.SIZE_QVGA);
    Bitmap LCD = new Bitmap(SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);
    Program.camera.StartStreaming();
    // LCD.Clear();
    if (!Program.camera.isNewImageReady)
    {
        Thread.Sleep(100);
    }
    Program.camera.StopStreaming();
    Program.camera.DrawImage(LCD, 0, 0, SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);
    byte[] bmpData = LCD.GetBitmap();   // size will always be 307200 = 640*480, for some unknown reason
    byte[] temp = Program.camera.dataImage;
    byte[] arr = new byte[temp.Length];   // size changes with every picture
    arr = Program.camera.dataImage;
    return arr;
}
```

To send the byte array:

```cs
private void SendBytes(byte[] buf)   // method name/signature reconstructed; the original post cut it off
{
    int offset = 0;
    int ret = 0;
    int len = buf.Length;
    while (len > 0)
    {
        ret = m_clientSocket.Send(buf, offset, len, SocketFlags.None);
        len -= ret;
        offset += ret;
    }
}
```

I want to send this byte array and do some manipulation on the other side to convert it into a 320x240x3 RGB matrix, so I can view it on my computer.

I'm lost and I need some help.

@ bioengproject -

We can definitely do that, but first your code needs some changes.

If you want JPG:

```cs
// These two should be called only once, because the BMP size is fixed:
Program.camera.SetImageSize(Gadgeteer.Modules.GHIElectronics.SerCam.Camera_Resolution.SIZE_QVGA);
Bitmap LCD = new Bitmap(SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);

private byte[] takePicture()
{
    Program.camera.StartStreaming();
    // Wait in a while-loop until the image is ready, not just a single if:
    while (!Program.camera.isNewImageReady)
    {
        Thread.Sleep(100);
    }
    Program.camera.StopStreaming();

    return Program.camera.dataImage;   // this is JPG
}
```

If you want BMP:

```cs
// These two should be called only once, because the BMP size is fixed:
Program.camera.SetImageSize(Gadgeteer.Modules.GHIElectronics.SerCam.Camera_Resolution.SIZE_QVGA);
Bitmap LCD = new Bitmap(SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);

private byte[] takePicture()
{
    Program.camera.StartStreaming();
    // Wait in a while-loop until the image is ready, not just a single if:
    while (!Program.camera.isNewImageReady)
    {
        Thread.Sleep(100);
    }
    Program.camera.StopStreaming();

    Program.camera.DrawImage(LCD, 0, 0, SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);

    // Not sure if you are using 4.3 or 4.2.
    // On 4.2:
    // byte[] bmp42 = new byte[320 * 240 * 3 + 54];
    // Util.BitmapToBMPFile(LCD.GetBitmap(), 320, 240, bmp42);
    // return bmp42;

    // On 4.3:
    return GHI.Utilities.Bitmaps.ConvertToFile(LCD.GetBitmap());

    // This returns 320x240*3 + 54 bytes. After sending it over your socket,
    // name it as text.bmp and you can show it on the PC.
}
```
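On the PC side that just means writing the received bytes to a file. A rough desktop-side sketch (not part of the Gadgeteer code; it assumes the board connects over TCP to port 5000 and sends one complete BMP per connection):

```cs
// Desktop receiver sketch: save the incoming bytes as text.bmp and open it in Windows.
using System.IO;
using System.Net;
using System.Net.Sockets;

class BmpReceiver
{
    static void Main()
    {
        TcpListener listener = new TcpListener(IPAddress.Any, 5000);
        listener.Start();
        using (TcpClient client = listener.AcceptTcpClient())
        using (NetworkStream stream = client.GetStream())
        using (FileStream file = File.Create("text.bmp"))
        {
            stream.CopyTo(file);   // 54-byte header + 320*240*3 pixel bytes
        }
    }
}
```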

And about this line you posted:

[quote]byte[] bmpData = LCD.GetBitmap();   // Size Will Always be 307200 = 640*480  for some unknown reason[/quote]

That is because you selected QVGA: 320x240*4 = 307200, i.e. the ARGB format. 640*480 would be VGA, not the QVGA you selected.

And this is a few hundred KB to transfer; I suggest you send it in blocks of about 10 KB each, rather than all at once over the socket.
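Something like this, as a rough sketch (the 10 KB block size is just the value suggested above, and m_clientSocket is the socket from your send code):

```cs
// Sketch only: send a byte array over the socket in blocks of about 10 KB,
// instead of handing the whole array to Send() at once.
void SendInBlocks(byte[] data)
{
    const int BlockSize = 10 * 1024;
    int offset = 0;
    while (offset < data.Length)
    {
        int count = data.Length - offset;
        if (count > BlockSize)
            count = BlockSize;

        int sent = m_clientSocket.Send(data, offset, count, SocketFlags.None);
        offset += sent;
    }
}
```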

And you also don't need to call

```cs
Program.camera.StartStreaming();
```

every time you take a picture. Just call it once somewhere before taking pictures, and then your function is only a few lines of code:

```cs
private byte[] takePicture()
{
    while (!Program.camera.isNewImageReady)
    {
        Thread.Sleep(5);
    }
    return Program.camera.dataImage;   // JPG or BMP, depending on how you set it up
}
```
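So the overall flow could look roughly like this (a sketch only, assuming the one-time setup lines from my earlier posts and the send helper from your own code):

```cs
// One-time setup, e.g. in ProgramStarted:
Program.camera.SetImageSize(Gadgeteer.Modules.GHIElectronics.SerCam.Camera_Resolution.SIZE_QVGA);
Program.camera.StartStreaming();

// Then capture and send repeatedly:
while (true)
{
    byte[] frame = takePicture();   // the short version above
    SendBytes(frame);               // your socket send loop (name assumed)
    Thread.Sleep(1000);             // e.g. roughly one frame per second
}
```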

@ Dat - I wrote this:

```cs
private byte[] takePicture()
{
    Program.camera.SetImageSize(Gadgeteer.Modules.GHIElectronics.SerCam.Camera_Resolution.SIZE_QVGA);  // should be called only once
    Bitmap LCD = new Bitmap(SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);
    Program.camera.StartStreaming();
    // Wait in a while-loop until the image is ready:
    while (!Program.camera.isNewImageReady)
    {
        Thread.Sleep(100);
    }
    Program.camera.StopStreaming();
    Program.camera.DrawImage(LCD, 0, 0, SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);
    byte[] bmp42 = new byte[320 * 240 * 3 + 54];
    Util.BitmapToBMPFile(LCD.GetBitmap(), 320, 240, bmp42);
    return bmp42;
}
```

The problem is that bmp42 has the same values every time, no matter what picture I take!

And another thing… why is the size of the array defined as 320*240*3 + 54? I'm trying to use the reshape function in MATLAB to create the RGB matrix from the array… So I call reshape(array, 320, 240, 3) and then get an error, because the array is larger than this number by 54…

@ bioengproject -

What are the values? Are all of them 0x00, 0x00… or 0xFF, 0xFF?

320*240*3 => this is the raw data, 3 bytes per pixel.
+54 => this is the header, which is needed for Windows to show the picture on the PC.

Depending on what you need, you can remove this 54-byte header or not.
If you are going to remove the header, your data starts at array[54] instead of array[0].
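For example, something like this strips the header before you send the data (just a sketch; bmpFile stands for the 54 + 320*240*3 byte array from your takePicture):

```cs
// Sketch: drop the 54-byte BMP header so only the raw 320*240*3 pixel bytes remain.
byte[] bmpFile = takePicture();                        // header + pixel data
byte[] pixels = new byte[bmpFile.Length - 54];
Array.Copy(bmpFile, 54, pixels, 0, pixels.Length);     // copy everything after the header
```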

If you don't need the header, then:

```cs
Program.camera.DrawImage(LCD, 0, 0, SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);
byte[] bmp42ARGB = LCD.GetBitmap();
```

bmp42ARGB will be in ARGB, with size = 320x240*4 as you saw before.

You can convert from 32-bit ARGB to 24-bit RGB by removing the alpha bytes; then you will have 320x240*3.

```cs
byte[] arrayRGB32bit = LCD.GetBitmap();
byte[] arrayRGB24bit = new byte[320 * 240 * 3];

void Convert32To24()
{
    int j = 0;
    for (int i = 0; i < 320 * 240 * 4; i++)
    {
        // if the format is ARGB, skip the alpha byte (the first of every four)
        if (i % 4 == 0) continue;

        // if the format is RGBA, skip the last byte of every four instead:
        // if (i % 4 == 3) continue;

        arrayRGB24bit[j++] = arrayRGB32bit[i];
    }
}
```

@ Dat - What is the RGBA you wrote about?
Anyhow, MATLAB now has a 320*240*3 + 54 array with the values of bmp42.
I threw away the first 54 bytes, and now I have an array of size 320*240*3.
How is the array arranged? Is it:

Array[0] <- Red value of first pixel (1,1)
Array[1] <- Green value of first pixel (1,1)
Array[2] <- Blue value of first pixel (1,1)
Array[3] <- Red value of second pixel (2,1)
Array[4] <- Green value of second pixel (2,1)
Array[5] <- Blue value of second pixel (2,1)

… ??

What is the RGBA I wrote? => Because I am not sure whether the alpha byte is at the end or not.

[quote]Array[0] <- Red value of first pixel (1,1)
Array[1] <- Green value of first pixel (1,1)
Array[2] <- Blue value of first pixel (1,1)
Array[3] <- Red value of second pixel (2,1)
Array[4] <- Green value of second pixel (2,1)
Array[5] <- Blue value of second pixel (2,1)
[/quote]

Basically, it should be that, or Red and Blue are swapped.
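If the colours come out wrong, a quick sketch of how the red and blue bytes could be swapped on the board before sending (assuming the packed 320*240*3 array from the post above):

```cs
// Sketch: swap the first and third byte of every pixel in a packed 24-bit array,
// in case the data turns out to be BGR instead of RGB.
for (int i = 0; i < arrayRGB24bit.Length; i += 3)
{
    byte tmp = arrayRGB24bit[i];
    arrayRGB24bit[i] = arrayRGB24bit[i + 2];
    arrayRGB24bit[i + 2] = tmp;
}
```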

@ Dat - Listen, it’s just not good :frowning:

```cs
private byte[] takePicture()
{
    Program.camera.SetImageSize(Gadgeteer.Modules.GHIElectronics.SerCam.Camera_Resolution.SIZE_QVGA);  // should be called only once
    Bitmap LCD = new Bitmap(SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);
    Program.camera.StartStreaming();
    // Wait in a while-loop until the image is ready:
    while (!Program.camera.isNewImageReady)
    {
        Thread.Sleep(100);
    }
    Program.camera.StopStreaming();
    Program.camera.DrawImage(LCD, 0, 0, SystemMetrics.ScreenWidth, SystemMetrics.ScreenHeight);
    byte[] bmp42 = new byte[320 * 240 * 3 + 54];
    Util.BitmapToBMPFile(LCD.GetBitmap(), 320, 240, bmp42);
    File.WriteAllBytes("\\SD\\CreatedBMPFile.bmp", bmp42);
    return bmp42;
}
```

As you can see, I saved it to the SD card and then copied it to the computer. Windows says: (image below)

Just to point out: the image I took with the camera is displayed on the Gadgeteer's screen, so that part is fine.