Processing file input for large files [100+ MB] - Performance suggestions?
I am wondering if anyone could suggest some performance improvements for
processing a very large file [100+ MB]. The processing here takes place on
30-50 MB chunks of the file at a time.
Performance is extremely important here.
+ I initialise a StringBuilder with the result of System.Text.Encoding.GetString
to convert the byte[] input to a string [potentially gobbling up an
extraordinary amount of RAM].
+ The StringBuilder is converted to a string, lower-cased [ToLower], and
split [String.Split()] on the newline character '\n' into a string array
(the full pipeline is sketched below):
> fileLines = SB.ToString().ToLower().Split('\n');
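Put together, the whole pipeline looks roughly like this (a minimal sketch;
GetNextChunk is a hypothetical stand-in for however you read one 30-50 MB
chunk, and UTF-8 is an assumption):
> // Current approach: decode the whole chunk, then lower-case and split it.
> byte[] chunk = GetNextChunk(); // hypothetical helper returning one 30-50 MB chunk
> var SB = new System.Text.StringBuilder(System.Text.Encoding.UTF8.GetString(chunk));
> string[] fileLines = SB.ToString().ToLower().Split('\n');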
At the moment this is taking up to 40-50% of the program's running time,
and I'd really like to get that down as low as possible.
I'm guessing that avoiding the byte[]-to-string conversion would save quite a
bit of time, as would finding a better way to split the input into lines,
but I've had no luck at all finding anything.
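To illustrate what I mean, here is a rough, untested sketch (assuming the
chunk really is UTF-8 text) that streams the chunk line by line instead of
materialising one giant lower-cased string first:
> using System.Collections.Generic;
> using System.IO;
> using System.Text;
>
> static List<string> ReadLowercaseLines(byte[] chunk)
> {
>     // Decode incrementally via StreamReader instead of one huge
>     // Encoding.GetString call, lower-casing one line at a time.
>     var lines = new List<string>();
>     using (var reader = new StreamReader(new MemoryStream(chunk), Encoding.UTF8))
>     {
>         string line;
>         while ((line = reader.ReadLine()) != null)
>             lines.Add(line.ToLowerInvariant());
>     }
>     return lines;
> }
Would something along those lines actually be faster, or is there a better way?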