Memory Management and Object Lifetimes

I’m working on some memory issues and I think I noticed a big difference between Micro .NET and “big” .NET (which I’m used to). I’d appreciate corrections if I’m wrong about this.

“Big” .NET is pretty aggressive about trying to clean things up, memory-wise. Micro .NET is pretty lazy about it. For example, if you have the following code…

string Foo = "a Really Long String, possibly made by concatenating stuff";
byte[] FooBytes = Encoding.UTF8.GetBytes(Foo);
byte[] HashedFoo = Hash(FooBytes);
byte[] SignedHashedFoo = Sign(HashedFoo);

Under the “big” .NET Framework, Foo becomes eligible for collection right after the 2nd line. The runtime recognizes that Foo is never accessed again, stops treating it as a root, and can reclaim it. Under the Micro .NET Framework, Foo stays alive until the end of the method and will not be garbage collected, even if that means throwing an OutOfMemoryException.
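On the desktop CLR you can observe this with a `WeakReference` — a minimal sketch, assuming a release build with no debugger attached (a debugger extends local lifetimes to end-of-method, which would hide the effect):

```csharp
using System;

class EagerCollectionDemo
{
    static void Main()
    {
        byte[] data = new byte[1024];
        var weak = new WeakReference(data);

        // Last use of 'data' was the WeakReference constructor above.
        // In a release build without a debugger, 'data' is no longer a
        // GC root here, so a collection may reclaim it even though the
        // method hasn't returned yet.
        GC.Collect();
        Console.WriteLine("Still alive: " + weak.IsAlive);

        // If you need the object to survive to a specific point,
        // GC.KeepAlive(data) here would extend its lifetime explicitly.
    }
}
```

Whether `IsAlive` prints `False` depends on build configuration and JIT behavior, which is exactly the eager-root reporting being discussed.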

Does that jibe with what other people have experienced? Or am I seeing behavior that isn’t really there?

There is a long story behind this, but in the upcoming NETMF 4.2 this will be closer to big .NET.

For now, you can force a collection and heap compaction using Debug.GC(true)
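For reference, `Debug.GC` lives in the `Microsoft.SPOT` namespace on NETMF; a minimal sketch (the wrapper method is illustrative, not part of any API):

```csharp
using Microsoft.SPOT;

public static class MemoryHelper
{
    public static void ReclaimMemory()
    {
        // Debug.GC(true) forces a full collection and heap compaction,
        // and returns the amount of free memory (in bytes) afterwards.
        uint freeBytes = Debug.GC(true);
        Debug.Print("Free memory after compaction: " + freeBytes);
    }
}
```

This only runs on a NETMF device or emulator, not on the desktop runtime.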

I’m actually not complaining, it’s just good to know.

Though now I’m curious about the long story. Any links?

No links, sorry :slight_smile:

In my app I’m trying to post about a kilobyte of data to a server that uses OAuth. So I’ve got a bunch of data which all builds on previous values…

Data: ~1 KB
OAuth Signature Base String: ~350 bytes
OAuth Keys: ~200 bytes
Authorization Header: ~400 bytes

All of that stuff was adding up, and eventually I hit the tipping point. Now I got 99 problems but memory management ain’t one.
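Since NETMF keeps locals alive until the method returns, one workaround is to null out intermediates you are finished with before forcing a collection. A hedged sketch — `BuildSignatureBaseString` and `SignBaseString` are made-up stand-ins for the real OAuth steps, not a real API:

```csharp
using Microsoft.SPOT;

public class OAuthPoster
{
    public static string BuildAuthorizationHeader(string data)
    {
        // Each stage builds on the previous one, so on NETMF the
        // intermediates would otherwise pile up for the whole method.
        string baseString = BuildSignatureBaseString(data);  // ~350 bytes
        string signature = SignBaseString(baseString);

        // NETMF won't collect 'baseString' just because it's no longer
        // used, so drop the reference explicitly, then compact the heap.
        baseString = null;
        Debug.GC(true);

        return "OAuth oauth_signature=\"" + signature + "\"";  // ~400 bytes
    }

    // Hypothetical helpers standing in for the real OAuth machinery.
    private static string BuildSignatureBaseString(string data) { return data; }
    private static string SignBaseString(string s) { return s; }
}
```

Nulling the local makes the object unreachable immediately, so the forced compaction can reclaim it before the next allocation.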