I just wanted to test out SD card loading and the N18 display, so I wrote a few lines to try it, but I’m getting a System.OutOfMemoryException on the SDCard.LoadBitmap() line. The image is 320x240 and 60 KB. I assume this means the image is too big for RAM, but is there any way around this other than reducing the quality of the image?
What format does the camera capture images in? Those obviously aren’t too big to store in RAM.
Just tried it with a 16 KB JPG and I’m getting the same problem.
The Cerberus and SimpleGraphics are not an easy combination to work with.
Were you intending to window the image (so you only see a portion of it at once) or were you intending to show the image shrunk to the display size? The most common approach to this situation is to load the image in tiles, say a quarter of the screen at a time, and display each one before reusing memory to load the next tile.
If this were a true bitmap you could open the file as a stream and read chunks of it. But I see that you’re using JPEG, which is probably part of the problem. JPEG is compressed, so on disk and in RAM the file is small. But when the JPEG is decoded to a true bitmap (to send the display the bits to draw), the size becomes Height * Width * bytes-per-pixel.
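To put numbers on it, here’s a quick sketch (the 2- and 3-byte figures depend on the color depth you decode to, which I’m assuming here):

```cs
// Decoded bitmap size = width * height * bytes per pixel, regardless of how
// small the compressed JPEG is on disk. For the 320x240 image above:
int width = 320, height = 240;
int decoded16bpp = width * height * 2; // 153,600 bytes at 16 bpp (RGB565)
int decoded24bpp = width * height * 3; // 230,400 bytes at 24 bpp
// Either figure dwarfs the 60 KB (or 16 KB) file on the card, which is why
// the allocation can fail even though the JPG itself would fit in RAM.
```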
You could either save your images in smaller tiles (maybe six 320x40 images) and shuffle them out (I recommend trying this first), or switch to using a bitmap so you can stream chunks of it. Streaming chunks might look something like this…
```cs
int bitmapHeaderSize = 54; // BMP files start with a 54-byte header.
int chunkSize = 320 * 40 * 3; // 40-line chunks at 24-bit color (3 bytes per pixel).
byte[] chunkWithHeader = new byte[bitmapHeaderSize + chunkSize];
Bitmap bmp = null;
FileStream fs = System.IO.File.OpenRead("yourfilepath");
fs.Read(chunkWithHeader, 0, bitmapHeaderSize); // Load the real header into the buffer.
// Bytes 22 - 25 of the header hold the image height. Patch it to 40 so each chunk
// decodes as a 320x40 bitmap. BMP is little-endian, so 40 is 0x28 0x00 0x00 0x00.
chunkWithHeader[22] = 0x28;
chunkWithHeader[23] = 0;
chunkWithHeader[24] = 0;
chunkWithHeader[25] = 0;
int tile = 0;
// Read's offset is into the buffer, not the file, so each chunk of pixel data
// lands right behind the patched header; the stream position advances on its own.
while (fs.Read(chunkWithHeader, bitmapHeaderSize, chunkSize) > 0)
{
    bmp = new Bitmap(chunkWithHeader, Bitmap.BitmapImageType.Bmp);
    // Note: BMP stores rows bottom-up, so the tiles arrive bottom row first.
    display_N18.SimpleGraphics.DisplayImage(bmp, 0, tile * 40);
    bmp.Dispose(); // Free the decoded tile before reading the next one.
    tile++;
}
```
…This code probably won’t work. I just typed it up without trying it. But you get the idea.
Also, the way you’re looping through the list of files is going to waste memory, and performance will degrade as more files are added. Instead, consider using the IEnumerable pattern…
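…something along these lines (an untested sketch — `ListFiles` and `LoadBitmap` are my guesses at the Gadgeteer StorageDevice method names, so check them against your SDK):

```cs
// Walk the file names one at a time and keep only the current Bitmap alive,
// instead of building up a list of loaded bitmaps. Method names here are
// assumptions -- verify against the StorageDevice API you're using.
string[] names = sdCard.GetStorageDevice().ListFiles("\\SD\\");
foreach (string name in names)
{
    Bitmap bmp = sdCard.GetStorageDevice().LoadBitmap(name, Bitmap.BitmapImageType.Jpeg);
    display_N18.SimpleGraphics.DisplayImage(bmp, 0, 0);
    bmp.Dispose(); // Let the decoded pixels go before the next load.
}
```

The point is that only one decoded bitmap exists at any moment, so memory use stays flat no matter how many files are on the card.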
Thanks for the replies. I’m having a go at the ‘tiling a bitmap’ approach you describe, @untitled.
Annoyingly, I’m getting a System.ArgumentException on the new Bitmap() line. I tried changing this to:
```cs
GT.Picture pic = new GT.Picture(...);
```
and that seems to work but when I call (using x = 0 and y = 0 as a test for now):
```cs
display.SimpleGraphics.DisplayImage(pic, 0, 0);
```
I now get the System.OutOfMemoryException on this line rather than when I create the byte array as previously.
Could it be that holding both the chunkWithHeader byte array and the GT.Picture at the same time uses up too much RAM?
Sorry, correction: I’m not getting a System.ArgumentException on new Bitmap(); I’m getting a System.ArgumentOutOfRangeException, and I don’t see why.
My chunk has the header prepended and I’ve set the BitmapImageType to Bmp. Do I need to do anything else before passing my byte array to new Bitmap()?