Issue with byte[]?

I have a small webserver application I am working on, and I may have an issue with memory allocation for ‘byte[]’. I posted a more general query about my webserver issue in the ‘Gadgeteer’ board, figuring I just needed a little help. After looking at the issue some more, I’m thinking this could be a beta bug. I am using a Cerberus, updated to the 7/23 SDK, with the network image loaded and an SD card attached.

When I try to allocate space with byte[] I get debugger messages such as:


Failed allocation for 198 blocks, 2376 bytes

Failed allocation for 172 blocks, 2064 bytes

I guess it’s possible I’m running short on memory, but if that’s the case, something is using a lot more than I expect, as I don’t think I have any large objects. Here is the code I’m executing:


                {
                    // Writes data to browser
                    FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read);
                    byte[] buf = new byte[fs.Length];   // Works at 4000 and below, does Not work at 5000?!?!?
                    fs.Read(buf, 0, buf.Length);
                    responder.Respond(buf, ContentType);
                    fs.Dispose();
                    Debug.Print("File served...");
                }

Just how big can a byte[] array be? If I am limited on memory, what items related to networking or SD cards eat the most? And finally, is there another way to tackle this file size issue within Gadgeteer?

Thanks in advance, this stuff is really pretty cool…

This is an issue on all .NET MF boards that have very limited memory. One thing I recommend is using the FileStream like this:

using System.IO;

public static class FileHelper
{
	public const int BufferSize = 64;

	public static byte[] ReadFile(string path)
	{
		byte[] bytes;

		// A small explicit buffer keeps FileStream's internal allocation tiny.
		using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None, BufferSize))
		{
			bytes = new byte[stream.Length];

			// Read may return fewer bytes than requested, so loop until the whole file is in.
			var offset = 0;
			while (offset < bytes.Length)
			{
				var read = stream.Read(bytes, offset, bytes.Length - offset);
				if (read <= 0) break;
				offset += read;
			}
		}

		return bytes;
	}

	public static void WriteFile(string path, byte[] bytes)
	{
		using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write, FileShare.None, BufferSize))
		{
			stream.Write(bytes, 0, bytes.Length);
		}
	}
}

The FileStream constructor overloads that don’t take a buffer size allocate a lot of memory for the internal buffer, so specifying the buffer size explicitly will save some space. Also consider whether you can work on chunks of data instead of the whole array; in many cases you don’t need to hold all of the data in memory at once.
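For example, here is a minimal sketch of serving a file in small chunks rather than one file-sized byte[]. The responder.SendChunk method is hypothetical; substitute whatever incremental-write API your responder (or the underlying socket) exposes:

const int ChunkSize = 512;

using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None, ChunkSize))
{
	var chunk = new byte[ChunkSize];   // one small, reused buffer instead of a buffer the size of the file
	int read;
	while ((read = stream.Read(chunk, 0, chunk.Length)) > 0)
	{
		responder.SendChunk(chunk, read);   // hypothetical: push 'read' bytes to the client incrementally
	}
}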

Debug.GC() is also your friend; pass true to force a GC pass, or false to just return the available memory…

Debug.Print("Free mem : " + Debug.GC(false).ToString());


Brett is right: always call the garbage collector after you get rid of large data.
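For example (a minimal sketch, reusing the buf variable from the first snippet), null the reference first so the collector can actually reclaim the array:

buf = null;                                                       // drop the last reference to the large buffer
Debug.Print("Free mem after GC: " + Debug.GC(true).ToString());   // true forces a collection and returns free memory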