Velocity Reviews - Computer Hardware Reviews

Velocity Reviews > Newsgroups > Programming > ASP .Net > Processing file input for large files[100+ MB] - Performance suggestions?


Processing file input for large files[100+ MB] - Performance suggestions?

I am wondering if anyone could suggest some performance improvements for
processing a very large file (100+ MB). The processing here is done on
30-50 MB chunks of the file.

Performance is extremely important here.

+ I initialise a StringBuilder object with the result of
System.Text.Encoding.GetString, which converts the byte[] input to a string
[potentially gobbling up an extraordinary amount of RAM]:

> myEncoding.GetString(fileInput);

+ The StringBuilder is converted to a string, capitalisation is removed
[ToLower], and the result is split [String.Split()] on the newline character
'\n' into a string array:

> fileLines = SB.ToString().ToLower().Split('\n');
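Pulling the two steps above together, here is a minimal sketch of the pipeline as described. The names fileInput and myEncoding follow the post; the helper name and the sample input are my own, purely for illustration:

```csharp
using System;
using System.Text;

class ChunkProcessor
{
    // Sketch of the pipeline described above: decode a byte[] chunk,
    // lower-case it, and split it into lines. Note that the decoded
    // string is a second full in-memory copy of the chunk.
    public static string[] SplitChunkIntoLines(byte[] fileInput, Encoding myEncoding)
    {
        // Step 1: decode the raw bytes to a string.
        string text = myEncoding.GetString(fileInput);

        // Step 2: normalise case and split on '\n'.
        return text.ToLower().Split('\n');
    }

    static void Main()
    {
        byte[] sample = Encoding.UTF8.GetBytes("First Line\nSecond Line");
        string[] lines = SplitChunkIntoLines(sample, Encoding.UTF8);
        Console.WriteLine(lines.Length);   // 2
        Console.WriteLine(lines[0]);       // first line
    }
}
```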

At the moment this takes up 40-50% of the program's running time, and I'd
really like to get that down as low as possible.

I'm guessing that avoiding the byte[] conversion would save quite a bit of
time, as would finding a better way to split the string into lines. But I've
had no luck at all finding anything.



