Here is the code I’m using:
using (StreamWriter output = new StreamWriter(Path.Combine(masterdestination, "Master.txt")))
{
    string masterfolders = sourcefolder1;
    string[] filess = Directory.GetFiles(masterfolders, "*.txt");
    foreach (string file in filess)
    {
        output.WriteLine(Path.GetFileName(file));
    }
}
This code searches a user-specified directory for all .txt files. These directories sometimes contain 2 million files.
Monitoring the process while it runs, I've seen memory usage climb to 800 MB. Is there a way to preserve the speed of this process while limiting the memory it uses? Or have it read, dump, and continue? A Hashtable? Any ideas would be awesome.
Directory.GetFiles really sucks here: it has to build the entire array of file names in memory before it returns anything. If you can use .NET 4.0 you should look into Directory.EnumerateFiles instead. From the docs:

The EnumerateFiles and GetFiles methods differ as follows: When you use EnumerateFiles, you can start enumerating the collection of names before the whole collection is returned; when you use GetFiles, you must wait for the whole array of names to be returned before you can access the array. Therefore, when you are working with many files and directories, EnumerateFiles can be more efficient.
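A minimal sketch of your loop rewritten around EnumerateFiles (assuming .NET 4.0 or later, and reusing the masterdestination and sourcefolder1 variables from your snippet):

using (StreamWriter output = new StreamWriter(Path.Combine(masterdestination, "Master.txt")))
{
    // EnumerateFiles yields one path at a time instead of materializing
    // a 2-million-entry string[] up front, so memory stays roughly flat
    // while throughput is comparable.
    foreach (string file in Directory.EnumerateFiles(sourcefolder1, "*.txt"))
    {
        output.WriteLine(Path.GetFileName(file));
    }
}

Because the writer consumes each name as it is produced, nothing beyond the current path needs to be held in memory at any point.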