I am currently loading between 10k and 200k files, so I am looking for a performance boost here.
Here is how I am doing this:
List<string> myFiles = new List<string>();
OpenFileDialog openFileDialog = new OpenFileDialog();
openFileDialog.Multiselect = true;
openFileDialog.Filter = "Text files (*.txt)|*.txt|All files (*.*)|*.*";
if (openFileDialog.ShowDialog() == true)
{
    foreach (string filename in openFileDialog.FileNames)
    {
        myFiles.Add(filename);
    }
}
string[] files = myFiles.ToArray();
Splitter(files);
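As an aside, `OpenFileDialog.FileNames` is already a `string[]`, so the intermediate `List<string>` and the copy back to an array are unnecessary. A minimal equivalent sketch (same dialog setup assumed as above):

```csharp
// FileNames is already a string[]; the list copy can be dropped entirely.
if (openFileDialog.ShowDialog() == true)
{
    Splitter(openFileDialog.FileNames);
}
```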
private void Splitter(string[] file)
{
    try
    {
        tempDict = file
            .SelectMany(i => File.ReadAllLines(i)
                .SelectMany(line => line.Split(new[] { ' ', ',', '.', '?', '!' }, StringSplitOptions.RemoveEmptyEntries)))
            .GroupBy(word => word)
            .ToDictionary(g => g.Key, g => g.Count());
    }
    catch (Exception ex)
    {
        Ex(ex);
    }
}
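Note that `List<T>` and `Dictionary<TKey, TValue>` are not thread-safe, so a plain `Parallel.ForEach` writing into them can corrupt state. One way to parallelize the counting itself is a `ConcurrentDictionary`. This is only a sketch of that approach, not the original `Splitter` (the `WordCounter` class and its names are illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

static class WordCounter
{
    static readonly char[] Separators = { ' ', ',', '.', '?', '!' };

    // Counts word occurrences across many files in parallel.
    // ConcurrentDictionary makes the per-word increments thread-safe.
    public static Dictionary<string, int> Count(IEnumerable<string> files)
    {
        var counts = new ConcurrentDictionary<string, int>();
        Parallel.ForEach(files, path =>
        {
            // File.ReadLines streams the file line by line
            // instead of buffering the whole file like ReadAllLines.
            foreach (string line in File.ReadLines(path))
            {
                foreach (string word in line.Split(Separators, StringSplitOptions.RemoveEmptyEntries))
                {
                    counts.AddOrUpdate(word, 1, (_, n) => n + 1);
                }
            }
        });
        return new Dictionary<string, int>(counts);
    }
}
```

A PLINQ variant (`files.AsParallel().SelectMany(...)`) would be another option; whether either helps depends on whether the work is disk-bound or CPU-bound.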
I was thinking about making the foreach parallel:
Parallel.ForEach(openFileDialog.FileNames, filename =>
{
    myFiles.Add(filename);
});
Is it safe? Are there any more improvements to be made to this code? Right now the program gets laggy and "freezes" while loading those files, so it would be nice if it did those things "behind the scenes". I am using WPF, and loading the files begins after a button click.
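The freeze is most likely because the reading happens on the UI (dispatcher) thread. One common pattern is to move the work to the thread pool with `Task.Run` and `async`/`await`. A minimal sketch, assuming a hypothetical handler name `LoadButton_Click` and that `Splitter` stores its result in the `tempDict` field as above:

```csharp
// Hypothetical WPF button handler: the heavy work runs on a
// thread-pool thread, so the window stays responsive.
private async void LoadButton_Click(object sender, RoutedEventArgs e)
{
    var dialog = new OpenFileDialog { Multiselect = true };
    if (dialog.ShowDialog() != true)
        return;

    string[] files = dialog.FileNames;
    LoadButton.IsEnabled = false;   // prevent re-entry while loading
    try
    {
        // Splitter runs off the UI thread; await resumes back on it.
        await Task.Run(() => Splitter(files));
    }
    finally
    {
        LoadButton.IsEnabled = true;
    }
}
```

`Splitter` must not touch UI elements while running on the pool thread; update the UI after the `await` instead.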