GPU vs CPU

Discussion in 'NZ Computing' started by Lawrence D'Oliveiro, Nov 28, 2009.

  1. <http://blogs.zdnet.com/hardware/?p=6289>: “...because increasingly software
    developers are looking to the GPU to take the load off the CPU.”

    Just one question: _what_ load? CPUs are already underutilized as they
    stand. And this is getting worse as we add more cores.
     
    Lawrence D'Oliveiro, Nov 28, 2009
    #1

  2. Lawrence D'Oliveiro

    Enkidu Guest

    Lawrence D'Oliveiro wrote:
    > <http://blogs.zdnet.com/hardware/?p=6289>: “...because increasingly software
    > developers are looking to the GPU to take the load off the CPU.”
    >
    > Just one question: _what_ load? CPUs are already underutilized as they
    > stand. And this is getting worse as we add more cores.
    >

    High power gaming.

    Cheers,

    Cliff

    --

    The Internet is interesting in that although the nicknames may change,
    the same old personalities show through.
     
    Enkidu, Nov 28, 2009
    #2

  3. Lawrence D'Oliveiro

    Me Guest

    Enkidu wrote:
    > Lawrence D'Oliveiro wrote:
    >> <http://blogs.zdnet.com/hardware/?p=6289>: “...because increasingly
    >> software developers are looking to the GPU to take the load off the CPU.”
    >>
    >> Just one question: _what_ load? CPUs are already underutilized as they
    >> stand. And this is getting worse as we add more cores.
    >>

    > High power gaming.
    >
    >

    For intensive image editing and home HD video editing too, perhaps.
    Some recent Adobe products use the GPU as well as being optimised to
    use multiple CPU cores.
    From what I've seen, GPU use tends to be for "viewing" functions, like
    quick/smooth zooming of massive image files in Photoshop.
    Complex image editing can be painfully slow on single-core machines,
    or in applications like GIMP which don't seem to support multiple
    cores. Some digital cameras output >50-megapixel files, and smaller
    image files are stitched (panoramas or mosaics) to massive sizes.
     
    Me, Nov 28, 2009
    #3
  4. Lawrence D'Oliveiro

    Gordon Guest

    On 2009-11-28, Lawrence D'Oliveiro <_zealand> wrote:
    > <http://blogs.zdnet.com/hardware/?p=6289>: “...because increasingly software
    > developers are looking to the GPU to take the load off the CPU.”
    >
    > Just one question: _what_ load? CPUs are already underutilized as they
    > stand. And this is getting worse as we add more cores.


    My take on this is that GPUs are getting so powerful that they are
    often "idling", so let's get the CPU and the GPU working together and
    really blow things out of the water as we know it.
     
    Gordon, Nov 28, 2009
    #4
  5. Lawrence D'Oliveiro

    victor Guest

    Enkidu wrote:
    > Lawrence D'Oliveiro wrote:
    >> <http://blogs.zdnet.com/hardware/?p=6289>: “...because increasingly
    >> software developers are looking to the GPU to take the load off the CPU.”
    >>
    >> Just one question: _what_ load? CPUs are already underutilized as they
    >> stand. And this is getting worse as we add more cores.
    >>

    > High power gaming.
    >
    > Cheers,
    >
    > Cliff
    >

    http://en.wikipedia.org/wiki/GPGPU#Applications
     
    victor, Nov 28, 2009
    #5
  6. In message <>, Gordon wrote:

    > On 2009-11-28, Lawrence D'Oliveiro <_zealand>
    > wrote:
    >
    >> <http://blogs.zdnet.com/hardware/?p=6289>: “...because increasingly
    >> software developers are looking to the GPU to take the load off the CPU.”
    >>
    >> Just one question: _what_ load? CPUs are already underutilized as they
    >> stand. And this is getting worse as we add more cores.

    >
    > My take on this is that GPUs are getting so powerful that they are
    > often "idling", so let's get the CPU and the GPU working together and
    > really blow things out of the water as we know it.


    But if they’re so powerful that they’re spending so much of their time
    idling, then making them more powerful will mean they spend even more time
    idling.
     
    Lawrence D'Oliveiro, Nov 28, 2009
    #6
  7. Lawrence D'Oliveiro

    Enkidu Guest

    Lawrence D'Oliveiro wrote:
    > In message <>, Gordon wrote:
    >
    >> On 2009-11-28, Lawrence D'Oliveiro
    >> <_zealand> wrote:
    >>
    >>> <http://blogs.zdnet.com/hardware/?p=6289>: “...because
    >>> increasingly software developers are looking to the GPU to take
    >>> the load off the CPU.”
    >>>
    >>> Just one question: _what_ load? CPUs are already underutilized as
    >>> they stand. And this is getting worse as we add more cores.

    >> My take on this is that GPUs are getting so powerful that
    >> they are often "idling", so let's get the CPU and the GPU working
    >> together and really blow things out of the water as we know it.

    >
    > But if they’re so powerful that they’re spending so much of their
    > time idling, then making them more powerful will mean they spend even
    > more time idling.
    >

    Yes, we can wait for IO even faster than we did before.

    Cheers,

    Cliff

    --

    The Internet is interesting in that although the nicknames may change,
    the same old personalities show through.
     
    Enkidu, Nov 28, 2009
    #7
  8. Lawrence D'Oliveiro

    ~misfit~ Guest

    Somewhere on teh intarwebs Enkidu wrote:
    > Lawrence D'Oliveiro wrote:
    >> <http://blogs.zdnet.com/hardware/?p=6289>: "...because increasingly
    >> software developers are looking to the GPU to take the load off the
    >> CPU." Just one question: _what_ load? CPUs are already underutilized as
    >> they stand. And this is getting worse as we add more cores.
    >>

    > High power gaming.


    Also, programmes like SETI can utilise some GPUs, and often the GPU
    can do 10x the work of the CPU, due largely to the way they're
    designed.
    --
    Shaun.

    "Give a man a fire and he's warm for the day. But set fire to him and he's
    warm for the rest of his life." Terry Pratchet, 'Jingo'.
     
    ~misfit~, Nov 29, 2009
    #8
  9. Lawrence D'Oliveiro

    Malcolm Guest

    On Mon, 30 Nov 2009 11:16:23 +1300
    "~misfit~" <> wrote:

    > Somewhere on teh intarwebs Enkidu wrote:
    > > Lawrence D'Oliveiro wrote:
    > >> <http://blogs.zdnet.com/hardware/?p=6289>: "...because increasingly
    > >> software developers are looking to the GPU to take the load off the
    > >> CPU." Just one question: _what_ load? CPUs are already
    > >> underutilized as they stand. And this is getting worse as we add
    > >> more cores.
    > >>

    > > High power gaming.

    >
    > Also, programmes like SETI can utilise some GPUs, and often the GPU
    > can do 10x the work of the CPU, due largely to the way they're
    > designed.

    Hi
    Yup, that's why I run the CUDA driver....

    CUDA Device Query (Driver API) statically linked version
    There is 1 device supporting CUDA

    Device 0: "GeForce 8600 GTS"
    CUDA Driver Version: 2.30
    CUDA Capability Major revision number: 1
    CUDA Capability Minor revision number: 1
    Total amount of global memory: 268107776 bytes
    Number of multiprocessors: 4
    Number of cores: 32
    Total amount of constant memory: 65536 bytes
    Total amount of shared memory per block: 16384 bytes
    Total number of registers available per block: 8192
    Warp size: 32
    Maximum number of threads per block: 512
    Maximum sizes of each dimension of a block: 512 x 512 x 64
    Maximum sizes of each dimension of a grid: 65535 x 65535 x 1
    Maximum memory pitch: 262144 bytes
    Texture alignment: 256 bytes
    Clock rate: 1.46 GHz
    Concurrent copy and execution: Yes
    Run time limit on kernels: Yes
    Integrated: No
    Support host page-locked memory mapping: No
    Compute mode: Default (multiple host threads can use this device simultaneously)

    BOINC then uses it as a coprocessor....
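
    (If you want to roll your own check, here's a minimal sketch of the
    same sort of query via the CUDA runtime API -- an illustration only;
    the deviceQuery sample that ships with the SDK does a lot more:)

    /* minimal CUDA device query -- illustration, not the SDK sample */
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void)
    {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            fprintf(stderr, "no CUDA-capable device found\n");
            return 1;
        }
        for (int i = 0; i < count; i++) {
            struct cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            printf("Device %d: \"%s\"\n", i, prop.name);
            printf("  Compute capability: %d.%d\n", prop.major, prop.minor);
            printf("  Global memory:      %lu bytes\n",
                   (unsigned long) prop.totalGlobalMem);
            printf("  Multiprocessors:    %d\n", prop.multiProcessorCount);
            printf("  Warp size:          %d\n", prop.warpSize);
            /* clockRate is reported in kHz */
            printf("  Clock rate:         %.2f GHz\n", prop.clockRate * 1e-6);
        }
        return 0;
    }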

    --
    Cheers Malcolm °¿° (Linux Counter #276890)
    SUSE Linux Enterprise Desktop 11 (x86_64) Kernel 2.6.27.37-0.1-default
    up 1 day 2:45, 2 users, load average: 0.29, 0.30, 0.35
    GPU GeForce 8600 GTS Silent - CUDA Driver Version: 190.18
     
    Malcolm, Nov 30, 2009
    #9
  10. Lawrence D'Oliveiro

    thingy Guest

    On Nov 28, 2:33 pm, Lawrence D'Oliveiro <l...@geek-
    central.gen.new_zealand> wrote:
    > <http://blogs.zdnet.com/hardware/?p=6289>: “...because increasingly software
    > developers are looking to the GPU to take the load off the CPU.”
    >
    > Just one question: _what_ load? CPUs are already underutilized as they
    > stand. And this is getting worse as we add more cores.


    Ease? It's hard to make applications truly multi-threaded and take
    advantage of lots of cores....and 8 cores will be here in 2010. It's
    interesting: at one stage Intel was talking of 10GHz and beyond; now,
    after that (marketing hype) flopped, it's 80 cores and beyond. The idea
    (or part of it) behind the 80 cores is that some will be GPUs on the one
    die....and you vary the quantities of each type of core depending on
    the use of the CPU's base unit...
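
    (A toy illustration of why the multi-threading part is hard -- even
    two threads bumping one shared counter need a lock, or the count
    silently comes up short. A minimal sketch in plain C with pthreads:)

    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    /* each thread increments the shared counter a million times */
    static void *worker(void *arg)
    {
        (void) arg;
        for (int i = 0; i < 1000000; i++) {
            pthread_mutex_lock(&lock);   /* drop this pair and the    */
            counter++;                   /* total is wrong, and wrong */
            pthread_mutex_unlock(&lock); /* differently on every run  */
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t a, b;
        pthread_create(&a, NULL, worker, NULL);
        pthread_create(&b, NULL, worker, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        printf("counter = %ld (expect 2000000)\n", counter);
        return 0;
    }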

    I suspect that this all points to the limits of humans as programmers
    (and possibly return on investment, i.e. man hours) in getting
    multi-core general purpose CPUs to do what's wanted/needed....to
    compensate, there will be co-processors, and lots of them, that get
    called....Another indication of this was AMD(?) commenting that their
    goal was to get (massively) multi-cored CPUs to appear as one or two
    to the OS/application layer.

    So, realistically, one CPU at 3~3.6GHz and 4 cores would probably be
    the upper limit of what's achievable for a desktop (maybe 4GHz and 8
    cores, tops)....then we have the human limitations...so you have an
    envelope you have to stay inside....

    regards
     
    thingy, Nov 30, 2009
    #10
  11. Lawrence D'Oliveiro

    thingy Guest

    On Nov 29, 10:18 am, Enkidu <> wrote:
    > Lawrence D'Oliveiro wrote:
    > > In message <>, Gordon wrote:

    >
    > >> On 2009-11-28, Lawrence D'Oliveiro
    > >> <_zealand> wrote:

    >
    > >>> <http://blogs.zdnet.com/hardware/?p=6289>: “...because
    > >>> increasingly software developers are looking to the GPU to take
    > >>> the load off the CPU.”

    >
    > >>> Just one question: _what_ load? CPUs are already underutilized as
    > >>> they stand. And this is getting worse as we add more cores.
    > >> My take on this is that GPUs are getting so powerful that
    > >> they are often "idling", so let's get the CPU and the GPU working
    > >> together and really blow things out of the water as we know it.

    >
    > > But if they’re so powerful that they’re spending so much of their
    > > time idling, then making them more powerful will mean they spend even
    > > more time idling.

    >
    > Yes, we can wait for IO even faster than we did before.
    >
    > Cheers,
    >
    > Cliff
    >
    > --
    >
    > The Internet is interesting in that although the nicknames may change,
    > the same old personalities show through.


    SSD is a big leap forward on that....at the moment it's high-tech
    mechanical devices with their physical limits vs chips on a circuit
    board...the latter, with its huge speed advantage as well, seems to
    me the future.

    regards
     
    thingy, Nov 30, 2009
    #11
  12. Lawrence D'Oliveiro

    thingy Guest

    On Nov 30, 11:16 am, "~misfit~" <>
    wrote:
    > Somewhere on teh intarwebs Enkidu wrote:
    >
    > > Lawrence D'Oliveiro wrote:
    > >> <http://blogs.zdnet.com/hardware/?p=6289>: "...because increasingly
    > >> software developers are looking to the GPU to take the load off the
    > >> CPU." Just one question: _what_ load? CPUs are already underutilized as
    > >> they stand. And this is getting worse as we add more cores.

    >
    > > High power gaming.

    >
    > Also, programmes like SETI can utilise some GPUs, and often the GPU
    > can do 10x the work of the CPU, due largely to the way they're
    > designed.
    > --
    > Shaun.
    >
    > "Give a man a fire and he's warm for the day. But set fire to him and he's
    > warm for the rest of his life." Terry Pratchett, 'Jingo'.


    Great, so we know that in some remote spot many years ago there was
    some dumbass playing with an expensive radio....I'd rather save the
    CO2 emissions here, thanks.

    ;]

    regards
     
    thingy, Nov 30, 2009
    #12
  13. In message
    <>, thingy
    wrote:

    > SSD is a big leap forward on that..


    Not if you’re accessing it through the same old SATA interface.
     
    Lawrence D'Oliveiro, Nov 30, 2009
    #13
  14. In message <15ee38d6-8b99-4ab6-8e25-
    >, thingy wrote:

    > On Nov 28, 2:33 pm, Lawrence D'Oliveiro <l...@geek-
    > central.gen.new_zealand> wrote:
    >
    >> Just one question: _what_ load? CPUs are already underutilized as they
    >> stand. And this is getting worse as we add more cores.

    >
    > Ease? It's hard to make applications truly multi-threaded and take
    > advantage of lots of cores...


    It’s not exactly easy to do SIMD programming of GPUs either.
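
    Even the trivial cases make the point -- you end up thinking in
    grids, blocks and explicit host/device copies just to scale an
    array. A minimal CUDA sketch (illustration only):

    #include <stdio.h>
    #include <stdlib.h>
    #include <cuda_runtime.h>

    /* each of thousands of threads scales one element -- the SIMD
       flavour comes from all of them running the same code */
    __global__ void scale(float *v, float k, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)                      /* the grid may overshoot n */
            v[i] *= k;
    }

    int main(void)
    {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);
        float *host = (float *) malloc(bytes), *dev;
        for (int i = 0; i < n; i++) host[i] = 1.0f;

        /* explicit device allocation and copies -- none of this
           exists in ordinary CPU code */
        cudaMalloc((void **) &dev, bytes);
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

        scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

        cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
        printf("host[0] = %f\n", host[0]);  /* expect 2.000000 */

        cudaFree(dev);
        free(host);
        return 0;
    }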
     
    Lawrence D'Oliveiro, Nov 30, 2009
    #14
  15. In message <hepuki$g79$>, Lawrence D'Oliveiro wrote:

    > Just one question: _what_ load?


    A question underscored by the death of Larrabee
    <http://blogs.zdnet.com/hardware/?p=6383>.
     
    Lawrence D'Oliveiro, Dec 8, 2009
    #15
