What is the best method to read a large txt file line by line? (FEZ Domino)

So I want to read a large txt file line by line. Right now I am reading it with StreamReader.ReadLine(), but an OutOfMemoryException occurs.
I need your advice: what is the best method to fix this problem?
I also looked at ReadBlock, but could not get it to read line by line.

How large is large?
Is there a number in kB or MB?

Post your code. Maybe it can be optimized to avoid out-of-memory situations.

I just completed a micro database that reads a 14 MB CSV file with 250,000 rows - and it finds a record by key in less than 100 ms :slight_smile:

I initially tried ReadLine and got out of memory exceptions - because ReadLine allocates an internal buffer of 4K right off the bat.

The trick is not to use large buffers: read the data byte by byte, or use very small buffers that you can discard quickly. It also helps to apply a simple state machine and not build up large strings. And remember that SD cards work in blocks of 512 bytes, so reading in chunks of that size helps.
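For illustration, here is a minimal sketch of that approach: one reused 512-byte read buffer plus a small per-line buffer, instead of StreamReader.ReadLine(). It assumes plain ASCII text, lines shorter than 256 characters, and a made-up file path and ProcessLine handler; it is written against the desktop System.IO API, so on the board you would swap Console.WriteLine for Debug.Print and point the path at your SD card.

```csharp
using System;
using System.IO;

class LineByLineReader
{
    static void Main()
    {
        const int BlockSize = 512;   // SD cards transfer data in 512-byte blocks
        const int MaxLine = 256;     // assumption: no line is longer than this

        byte[] block = new byte[BlockSize];  // reused for every read
        char[] line = new char[MaxLine];     // reused for every line
        int lineLen = 0;

        using (FileStream fs = new FileStream(@"\SD\data.txt", FileMode.Open, FileAccess.Read))
        {
            int read;
            while ((read = fs.Read(block, 0, BlockSize)) > 0)
            {
                for (int i = 0; i < read; i++)
                {
                    byte b = block[i];
                    if (b == '\n')                       // end of line reached
                    {
                        ProcessLine(new string(line, 0, lineLen));
                        lineLen = 0;
                    }
                    else if (b != '\r' && lineLen < MaxLine)
                    {
                        line[lineLen++] = (char)b;       // ASCII byte -> char
                    }
                }
            }
            if (lineLen > 0)                             // file may not end with a newline
                ProcessLine(new string(line, 0, lineLen));
        }
    }

    static void ProcessLine(string s)
    {
        // hypothetical handler: parse fields, compare keys, etc.
        Console.WriteLine(s);
    }
}
```

The point is that memory use stays constant regardless of file size: only the 512-byte block and one short line are ever held in RAM at a time.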

Yep, others have worked around ReadLine's excessive buffer usage:

http://www.tinyclr.com/forum/2/1620