
Re: fastest data / database format for reading large files

Chris Rebert
On Tue, Oct 16, 2012 at 11:35 AM, Pradipto Banerjee
<(E-Mail Removed)> wrote:
> I am working with a series of large files, 4 to 10 GB each, and may need to read these files repeatedly. What data format (i.e. pickle, json, csv, etc.) is considered the fastest for reading via Python?

Pickle /ought/ to be fastest, since it's binary (unless you use the
oldest protocol version) and native to Python. Be sure to specify
HIGHEST_PROTOCOL and use cPickle.
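A minimal sketch of that advice (the file path and sample data are made up for illustration): on Python 2 you would `import cPickle as pickle`, while on Python 3 the C implementation backs the plain `pickle` module automatically.

```python
import os
import pickle
import tempfile

# Toy stand-in for a large dataset.
data = {"rows": [(i, i * 2.5) for i in range(1000)]}

path = os.path.join(tempfile.mkdtemp(), "data.pkl")

# Dump with the highest protocol: it is binary and the fastest to load.
with open(path, "wb") as f:
    pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)

# Reading it back is a single call.
with open(path, "rb") as f:
    loaded = pickle.load(f)
```

For repeated reads, the one-time cost of converting a CSV or JSON source into a pickle file like this can pay off quickly.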

You might consider using SQLite (or some other database) if you will
be doing queries over the data that would be amenable to SQL.

P.S. The verbose disclaimer at the end of your emails is kinda annoying...
