Using Dataset for large amounts of data

 
 
Brent
04-06-2004
Hi,
I'm wondering if it is good to use datasets for large amounts of data with
many users. I'm talking tables with 130,000 records and 15 columns. And we
want current data, so no cached data. Right now we are using an
OleDbDataReader and then just doing reader.Read() to go through the
necessary records. So would using a dataset be good, or bad for this
compared to what we are doing?

We're looking at DataSets again to get their transaction features, but we're not sure whether they can handle that much data very well.
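For reference, this is roughly the pattern we have now; the connection string, query, and column names below are just placeholders, not our real ones:

using System;
using System.Data.OleDb;

class ReaderApproach
{
    static void Main()
    {
        // Placeholder connection string and query, for illustration only.
        string connString = "Provider=SQLOLEDB;Data Source=ourServer;" +
                            "Initial Catalog=ourDb;Integrated Security=SSPI;";

        using (OleDbConnection conn = new OleDbConnection(connString))
        {
            conn.Open();
            OleDbCommand cmd = new OleDbCommand(
                "SELECT Id, Col1, Col2 FROM BigTable", conn);

            using (OleDbDataReader reader = cmd.ExecuteReader())
            {
                // reader.Read() streams one row at a time over the open connection.
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    string col1 = reader.GetString(1);
                    Console.WriteLine(id + ": " + col1);
                }
            }
        }
    }
}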


 
Raymond Lewallen
04-06-2004
Holy cow!! Why would you bring back that much data? The first thing I would
do, regardless of what you are trying to do, is find a way to greatly limit
the number of records you are returning. Do your records have a unique key
of some sort? And are they ordered somehow, maybe by the unique key (that
would be preferred)?
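For example, if there is an ordered unique key, you could pull the rows back in batches keyed off the last row you processed instead of returning everything at once. A rough sketch (the table name, key column, and batch size are made up, and TOP assumes a SQL Server style back end):

using System.Data.OleDb;

class KeyPaging
{
    // Returns a reader over the next batch of rows whose key is greater
    // than lastKey. The caller remembers the last key it has seen.
    static OleDbDataReader ReadNextBatch(OleDbConnection conn, int lastKey, int batchSize)
    {
        OleDbCommand cmd = new OleDbCommand(
            "SELECT TOP " + batchSize + " Id, Col1, Col2 " +
            "FROM BigTable WHERE Id > ? ORDER BY Id", conn);
        cmd.Parameters.Add("@lastKey", OleDbType.Integer).Value = lastKey;
        return cmd.ExecuteReader();
    }
}

That keeps memory use flat no matter how big the table gets.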

DataReaders will perform faster than datasets. DataSets are disconnected
from the database and must store all their information in memory.
DataReaders remain connected to the database and pull records from the
database as you request them (i.e. reader.Read()).

HTH,

Raymond Lewallen

"Brent" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)...
> Hi,
> I'm wondering if it is good to use datasets for large amounts of data with
> many users. I'm talking tables with 130,000 records and 15 columns. And we
> want current data, so no cached data. Right now we are using an
> OleDbDataReader and then just doing reader.Read() to go through the
> necessary records. So would using a dataset be good, or bad for this
> compared to what we are doing?
>
> We're looking at DataSets again to get their transaction features, but we're
> not sure whether they can handle that much data very well.



 
Cor
04-07-2004
Hi Raymond,

I agree completely with the first part of your message. With the second part I do not completely agree.

> Holy cow!! Why would you bring back that much data? The first thing I would
> do, regardless of what you are trying to do, is find a way to greatly limit
> the number of records you are returning. Do your records have a unique key
> of some sort? And are they ordered somehow, maybe by the unique key (that
> would be preferred)?

> DataReaders will perform faster than datasets. DataSets are disconnected
> from the database and must store all their information in memory.
> DataReaders remain connected to the database and pull records from the
> database as you request them (i.e. reader.Read()).


It is not a given that a DataReader performs faster than a DataAdapter, which itself uses a DataReader internally.

The main point is that the adapter places the data in memory very efficiently, while with a DataReader you have to do that item by item yourself.

With the right selection, as you stated before, the amount of data can often be so small that it is probably little more than the code needed to retrieve it with a DataReader, so that can hardly be the deciding point (and on a modern computer it is absolutely not relevant).

Moreover, once the data has been read, the DataSet is very handy to work with.
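To make it concrete, filling a DataSet takes very little code; something like this (the connection string, query, and column names are only an example):

using System;
using System.Data;
using System.Data.OleDb;

class DataSetApproach
{
    // The adapter opens and closes the connection itself and uses a
    // DataReader internally to load the rows into memory.
    static DataSet LoadRecentRows(string connString)
    {
        OleDbDataAdapter adapter = new OleDbDataAdapter(
            "SELECT Id, Col1, Col2 FROM BigTable WHERE OrderDate >= ?",
            connString);
        adapter.SelectCommand.Parameters.Add("@since", OleDbType.Date).Value = DateTime.Today;

        DataSet ds = new DataSet();
        adapter.Fill(ds, "BigTable");   // disconnected, in-memory copy
        return ds;                      // easy to bind, filter, sort, or update later
    }
}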

Just my thoughts, to give the OP another opinion, though I can think of situations where your statement is right.

Cor

 
Raymond Lewallen
04-07-2004
Brent,

I posted arguments and sample code on DataSet vs. DataReader in a response to
Cor in this thread, dated 4/7/2004 08:24 AM CST. Be sure to read that; it may
help clarify what I failed to explain completely in my initial response
yesterday.

Raymond Lewallen

"Brent" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)...
> Hi,
> I'm wondering if it is good to use datasets for large amounts of data with
> many users. I'm talking tables with 130,000 records and 15 columns. And we
> want current data, so no cached data. Right now we are using an
> OleDbDataReader and then just doing reader.Read() to go through the
> necessary records. So would using a dataset be good, or bad for this
> compared to what we are doing?
>
> We're looking at DataSets again to get their transaction features, but we're
> not sure whether they can handle that much data very well.



 
