overclocking

Discussion in 'Digital Photography' started by Peter, Aug 26, 2010.

  1. Peter

    Peter Guest

    Peter, Aug 26, 2010
    #1

  2. Peter

    Guest

    On Aug 26, 6:37 pm, "lofi" <> wrote:
    > Overclocking is an utterly pointless exercise; it may generate a few higher
    > numbers on a measurement scale but has absolutely no practical benefit to
    > the end user.


    Absolutely? So you say... :/ The frame rate, speed of loading
    textures, etc., of my flight simulator tell me a different story than
    the one you offer.

    > It can easily fry your CPU, memory or motherboard even when done cautiously.


    I've been overclocking for years and I've never fried anything yet.
    Not a single part.

    > The main reason for that is most computer cases are inadequately
    > ventilated/cooled and even the slight increase in power usage with minor
    > overclocking can melt silicon.


    Slight increases will not be a problem. Large increases will require
    better cooling. Thinking people are aware of this issue.

    > Overclocking is for people who would rather play with hardware settings than
    > accomplish any real tasks.


    No, overclocking is for people that want to get upper end power with
    chump change price tags. Or they are already running upper end
    gear and want to get even more performance from the machine.
    I could be running the fastest machine on the market, and I would
    still likely overclock it to some extent. I can use every bit of power
    I can get.

    > If you need significant increase in computer power you need a newer, faster
    > machine.


    My present machine is about 3-4 months old, and it's overclocked.
    Quite stable. Does not overheat. And I get better frame rates in
    my flight simulator, which is the main application that prompts me
    to do such a thing.

    I'm not saying overclocking is for everyone. It's not. Some people
    are too damned stupid to safely overclock a machine.
    But to say it has no purpose other than playing with hardware settings
    is pure bovine excrement.
     
    , Aug 27, 2010
    #2

  3. Peter

    peter Guest

    On 8/26/2010 11:03 PM, wrote:
    []
    > But to say it has no purpose other than playing with hardware settings
    > is pure bovine excrement.

    I did not post the link to start a war. Just to supply information on
    how to safely overclock, if anyone feels the need.

    BTW: You should have said: "MALE bovine excrement." :)

    Peter
     
    peter, Aug 27, 2010
    #3
  4. Peter

    J. Clarke Guest

    On 8/27/2010 8:24 AM, peter wrote:
    []
    > I did not post the link to start a war. Just to supply information on
    > how to safely overclock, if anyone feels the need.
    >
    > BTW: You should have said: "MALE bovine excrement." :)

    Anyone who feels the need is quite capable of googling "overclocking"
    and finding vastly more information than your one crummy article that
    has no real information. That article tells how to tweak and pray; it
    doesn't tell how to "safely overclock".
     
    J. Clarke, Aug 27, 2010
    #4
  5. Peter

    Peter Guest

    "J. Clarke" <> wrote in message
    news:...
    []
    > Anyone who feels the need is quite capable of googling "overclocking" and
    > finding vastly more information than your one crummy article that has no
    > real information. That article tells how to tweak and pray; it doesn't
    > tell how to "safely overclock".



    So contribute something positive.

    --
    Peter
     
    Peter, Aug 27, 2010
    #5
  6. Peter

    Guest

    On Aug 27, 8:54 am, "J. Clarke" <> wrote:

    []
    > Anyone who feels the need is quite capable of googling "overclocking"
    > and finding vastly more information than your one crummy article that
    > has no real information. That article tells how to tweak and pray; it
    > doesn't tell how to "safely overclock".


    Sure it did, for the most part. You do it in small steps, and the
    article mentioned that.
    I've been overclocking machines for well over ten years. It's not
    rocket science. My first serious overclock was turning a Celeron 300
    into a 500+ MHz machine. And I've overclocked every single machine
    I've had since then, always with good results.
    And yes, there are loads and loads of other web sites dealing with
    overclocking. Big deal... Hell, I can pretty much tell anyone anything
    they want to know about it myself. :/ No brag, just fact.
    I've been doing it a long time, and with several different boxes.
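
    To be clear about what "small steps" means: bump the clock a notch,
    load all the cores, make sure nothing errors out or overheats, and
    repeat. A crude all-core burn along these lines (a rough Python
    sketch, no substitute for a real stress tool like Prime95) is the
    kind of quick check between steps I mean:

        import multiprocessing
        import time

        def burn(_):
            # Fixed floating-point workload with one right answer; an
            # unstable overclock tends to crash, hang, or return a sum
            # that doesn't match the other workers.
            acc = 0.0
            for i in range(1, 2_000_000):
                acc += 1.0 / (i * i)
            return acc

        if __name__ == "__main__":
            cores = multiprocessing.cpu_count()
            start = time.time()
            with multiprocessing.Pool(cores) as pool:
                sums = pool.map(burn, range(cores))
            elapsed = time.time() - start
            stable = all(s == sums[0] for s in sums)
            print(f"{cores} workers, {elapsed:.1f}s, consistent: {stable}")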

    The only reason I might have sounded a bit perturbed in the other post
    was the negative comments from someone who really sounded like they
    had never even done it before, but were just parroting what they had
    heard about it.
    Course, I may be wrong about that... But to say it has no use
    whatsoever doesn't strike me as coming from someone who has spent much
    time actually doing it, or even had a real reason to. For probably 90%
    of users, overclocking has no real benefit. The extra power does not
    noticeably improve the apps they are running.
    It's people like me who run flight sims, frame-rate-intensive games,
    etc. that see the benefits.
    I see an increase in sim function with every bit of machine tweaking.
    So it's worth it to me. I need every bit of power I can get, but I'm
    usually loath to spend the big $$$ for it. So I overclock to get the
    same performance as the higher-end machines at cheap prices.

    The ones that have the problems are the heavy voltage tweakers.
    I rarely change voltages much. If I can't get it to work at the normal
    voltage, I don't go much farther. Also, I don't mess with the RAM too
    much. I don't up the voltage, and I usually don't tweak the timings
    too much. That can lead to corruption of data in hard-core cases.
    I concentrate on the CPU, video and bus speeds if they apply.
    My present machine has totally stock voltage settings.
    Also totally stock RAM settings.
    The best way to fry something is to start cranking the voltages
    sky high. That also leads to much greater heat and power consumption.

    And I have a big 117 V fan on the side of the case that cools
    everything. MB, CPU, HDs, RAM: everything in this box runs cool.
    The CPU I'm using now is an AMD Black Edition part that caters to
    overclockers. It was sold as a 3.2 GHz dual core, but I'm running it
    as a quad core, which is totally stable up to about 3.9 GHz with stock
    cooling. A large reason I can get away with that is that I don't
    crank the voltages up. But I run it slightly slower than that just to
    give me some cooling overhead with the stock CPU fan.
    If I had exotic CPU cooling, I could probably run 4 GHz plus.
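
    If you'd rather watch temperatures from a script than sit in the BIOS
    hardware-monitor screen, something like this works (a sketch using the
    third-party psutil package; its sensors_temperatures() call is
    Linux-only, so on Windows you're stuck with your vendor's monitoring
    tool):

        import time

        import psutil  # third-party: pip install psutil

        # Poll the CPU/motherboard sensors every few seconds while a
        # stress test runs in another window (Linux-only API).
        for _ in range(60):
            for chip, sensors in psutil.sensors_temperatures().items():
                for s in sensors:
                    print(f"{chip}/{s.label or 'temp'}: {s.current:.0f} C")
            time.sleep(5)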

    My MB/CPU came with OC software. I can change many settings
    on the fly from Windows, but I usually do most in the BIOS.
    I paid $115 for that MB and CPU. I'm usually in the "get decent power
    for chump change" category as far as overclocking goes.
    For the power I get from each build, I save loads of $$$ vs buying
    a machine with those settings stock.

    Most all the chips of a certain platform are the same. They just test
    and rate them according to the testing they pass. But often, due to
    demand, many of the slower-rated chips are actually chips that have
    no problems passing the higher clock rates. I.e., if they get an
    order for a load of 2.8 GHz chips, but have a batch of 3.4 GHz-rated
    chips laying around already built, they will just mark the good chips
    with the lower rating and shove them out the door. Or to put it
    another way, many of the lower-rated chips are actually high-rate
    chips in disguise. Every single CPU I've run since the late '90s was
    capable of running much faster than actually labeled. That's because
    they really don't make too many "bad" chips any more.

    My present CPU was sold as a dual core even though it has
    four totally stable cores. All I had to do was turn on the other two
    in the BIOS setup... :/ I would have spent a good bit more
    if I had bought the same chip labeled as a quad core, even though
    in reality they are the exact same CPUs.
    It even fools CPU-Z: if I check my CPU, it reports a quad
    core with the usual quad-core model number instead of the
    dual-core model number. Because it is one. :)
     
    , Aug 28, 2010
    #6
  7. "Neil Harrington" <> wrote in message
    news:...
    []
    > I have computers all over the place, and at the moment not a single one
    > of them is overclocked. I don't do it if I don't need it. But if
    > tomorrow or next week I should need it, I will overclock with a high
    > heart.


    Same here. But if anything, I now go for the lower-powered components -
    "green" HDs, Intel Atom processors, the simplest graphics cards (or
    built-in) etc - where at all possible, to keep down heat and noise, reduce
    power consumption, and perhaps increase reliability.

    For what I do, today's processors are plenty powerful enough - I get more
    gain in productivity from increasing RAM, perhaps using 64-bit Windows,
    and having two displays.

    Cheers,
    David
     
    David J Taylor, Aug 28, 2010
    #7
  8. Peter

    Peter Guest

    "David J Taylor" <> wrote in message
    news:i5bdbq$8jv$-september.org...
    > "Neil Harrington" <> wrote in message
    > news:...
    > []
    >> I have computers all over the place, and at the moment not a single one
    >> of them is overclocked. I don't do it if I don't need it. But if tomorrow
    >> or next week I should need it, I will overclock with a high heart.

    >
    > Same here. But if anything, I now go for the lower-powered components -
    > "green" HDs, Intel Atom processors, the simplest graphics cards (or
    > built-in) etc - where at all possible, to keep down heat and noise, reduce
    > power consumption, and perhaps increase reliability.
    >
    > For what I do, today's processors are plenty powerful enough - I get more
    > gain in productivity from increasing RAM, perhaps using 64-bit Windows,
    > and have two displays.



    Is there a significant advantage to a second display? I have been
    considering getting one, but I have to justify the expenditure to my
    treasury department. She just got a washing machine and dryer and is
    unhappy with the speed of her WinXP machine. <G>
    A trade opportunity window (no pun intended) has opened. My
    alternative would be a new WA lens.



    --
    Peter
     
    Peter, Aug 28, 2010
    #8
  9. "Peter" <> wrote in message
    news:4c793be3$0$5525$-secrets.com...
    []
    > Is there a significant advantage to a second display? I have been
    > considering getting one, but I have to justify the expenditure to my
    > treasury department. She just got a washing machine and dryer and is
    > unhappy with the speed of her WinXP machine. <G>
    > A trade opportunity window (no pun intended) has opened. My
    > alternative would be a new WA lens.


    For what I do, yes there is. I do some program development, so it's very
    handy to have the program under test running on the left display, and the
    compiler and debugging tools on the right-hand display. Displays today
    would likely be far cheaper than a WA lens, unless you want something very
    well calibrated. For photo use, you may have the photo on your
    well-calibrated main display, and the program menus etc. on the secondary.
    I was really surprised how much of a difference it made for me. Perhaps
    you have an old display somewhere you could try out?

    Cheers,
    David
     
    David J Taylor, Aug 28, 2010
    #9
  10. Peter

    Guest

    On Aug 28, 2:49 pm, "Neil Harrington" <> wrote:

    >
    > I don't use Intel CPUs anymore since I'm convinced AMD's wares provide more
    > bang for the buck.


    That's why I went to AMD this recent build vs Intel. I actually prefer
    Intel, and the i7s generally have more brute HP than the Phenom IIs.
    But Intel is really proud of those things... I would have paid at
    least double for the i7 CPU alone, vs what I paid for both the CPU and
    the MB of the AMD Phenom II system.
    Like I say, I got the Phenom II X2 555 with an AM3 MB for only $115.
    A lot of bang for the buck, considering how much power you get once
    you crank up all four cores and clock it up a bit more.
    Of course, not all 555s will have four good cores, but many do.
    That part is kind of a luck-of-the-draw deal... Luckily I have one
    with four good cores. But even if the chip only had two good cores
    as stock, it's still a good deal for the money.
    But in the flight sim world, the i7 is generally considered the king
    of the hill right now for brute performance. And they overclock well
    too. An overclocked i7 system can be a real hummer. :)
    I just wish the prices were a tad lower. The i7 was really pricey
    when it first came out. Dang near a kilobuck for a high-end i7 CPU.
    But the AMD box runs the sim pretty well. It does have good
    memory throughput with the direct CPU/RAM connection, which
    is similar to what the i7s, etc., use.
    I do use an AM3 board and DDR3 RAM, which is superior to
    the AM2 boards and their DDR2 RAM.
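
    Rough peak-throughput arithmetic on that deal (cores times clock, so a
    theoretical ceiling only; real apps, flight sims included, scale
    nowhere near linearly):

        # Phenom II X2 555 as sold vs. unlocked and clocked up; 3.8 GHz
        # assumes running a bit under the ~3.9 GHz stable point.
        stock    = 2 * 3.2                  # 2 cores at 3.2 GHz
        unlocked = 4 * 3.8                  # 4 cores, overclocked
        print(round(unlocked / stock, 2))   # ~2.38x peak, same $115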
     
    , Aug 28, 2010
    #10
  11. Peter

    nospam Guest

    In article <>, Floyd L. Davidson
    <> wrote:

    > >> Is there a significant advantage to a second display?
    > >> I have been considering getting one, but I have to
    > >> justify the expenditure to my treasury department. She
    > >> just got a washing machine and dryer and is unhappy
    > >> with the speed of her WinXP machine. <G>
    > >> A trade opportunity window (no pun intended) has
    > >> opened. My alternative would be a new WA lens.

    > >
    > >For what I do, yes there is. I do some program development, so it's very
    > >handy to have the program under test running on the left display, and the
    > >compiler and debugging tools on the right-hand display.

    >
    > You don't need, or even get any benefit from, two
    > monitors to do that. A proper windowing system is what
    > it takes, and if yours won't do that without the use of a
    > second monitor I'd suggest that a different platform
    > would be very productive. The X windowing system is an
    > example.


    you most certainly *do* get a benefit from two (or more) displays,
    especially if you're using the full screen, among other things. having
    a second display is, without question, a huge increase in productivity.

    three displays is also helpful in some circumstances but not as much as
    a second display.

    and then there's going a bit overboard with 13 displays :)
    <http://the.taoofmac.com/media/blog/2003/06/22/Image1.jpg>

    > >Displays today
    > >would likely be far cheaper than a WA lens, unless you want something very
    > >well calibrated. For photo use, you may have the photo on your
    > >well-calibrated main display, and the program menus etc. on the secondary.
    > >I was really surprised how much of a difference it made for me. Perhaps
    > >you have an old display somewhere you could try out?

    >
    > That's a valid point, though I've always used two
    > identical monitors and found that to be less expensive than
    > a single monitor of equal quality and screen area. It
    > used to be that the form factor was more suitable too
    > but today there are monitors with wider screens.
    > Whatever, it's a little bit of a pain if both monitors
    > are not the same size.


    it's not a pain at all. in fact, it's trivial. simply plug the display
    in and use the control panel to arrange it spatially, as desired.

    > Also, depending on the software used, some methods use
    > just one large video buffer rather than display each
    > monitor separately from its own buffer. The effect is
    > that the look up table for color/gamma correction will
    > be shared by both monitors, which means only one of them
    > can be calibrated precisely (the other one will only be
    > as close as can be adjusted with hardware).


    macs definitely support separate colour profiles for multiple displays
    and i'm pretty sure windows does too but i've never tried calibrating a
    two display windows system.
     
    nospam, Aug 29, 2010
    #11
  12. "Neil Harrington" <> wrote in message
    news:...
    []
    > I don't use Intel CPUs anymore since I'm convinced AMD's wares provide
    > more bang for the buck. (The exception is a little netbook I have with
    > the Atom CPU, but that's the first Intel chip I've used in about 10
    > years.) I always benchmark the onboard graphics processor (if the
    > motherboard has that) before putting in a discrete graphics card, and it
    > is just amazing how good the built-in ones are nowadays, at least the
    > ATI graphics that come with most AMD-chipset motherboards. If I didn't
    > play games I would never bother with a discrete card at all.
    >
    > The same goes for audio. Onboard audio used to be pretty bad, but now
    > it's so good I haven't used a discrete sound card for years.


    Agreed about graphics and audio - although some of the on-board graphics
    uses part of the main memory - another reason you shouldn't skimp on
    memory.

    I never buy AMD - we've seen far too many incompatibilities with AMD and
    their motherboards to make it worth the risk. One AMD motherboard I have
    will blow power tracks if a certain satellite TV card is plugged in!

    But CPU cost these days isn't really a major issue - do you really see a
    lot of benefit from paying double the money and getting 10% more
    performance? For most people, the answer is "no".

    Cheers,
    David
     
    David J Taylor, Aug 29, 2010
    #12
  13. "Floyd L. Davidson" <> wrote in message
    news:...
    > "David J Taylor" <> wrote:

    []
    >>For what I do, yes there is. I do some program development, so it's
    >>very
    >>handy to have the program under test running on the left display, and
    >>the
    >>compiler and debugging tools on the right-hand display.

    >
    > You don't need, or even get any benefit from, two
    > monitors to do that. A proper windowing system is what
    > it takes, and if yours won't do that without the use of a
    > second monitor I'd suggest that a different platform
    > would be very productive. The X windowing system is an
    > example.

    []
    > Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
    > Ukpeagvik (Barrow, Alaska)


    My experience differs - I want to see both source and working program at
    the same time, and you need physical display area for that.

    As I develop Windows software, using Windows tools, suggesting that I move
    to a UNIX/X-windows platform makes little sense.

    Cheers,
    David
     
    David J Taylor, Aug 29, 2010
    #13
  14. "Floyd L. Davidson" <> wrote in message
    news:...
    > "David J Taylor" <> wrote:

    []
    >>My experience differs - I want to see both source and working program at
    >>the same time, and you need physical display area for that.

    >
    > Exactly, and two monitors or one does not make any
    > difference. Only the physical area. It has *nothing*
    > to do with using two monitors!
    >
    >>As I develop Windows software, using Windows tools, suggesting that I
    >>move
    >>to a UNIX/X-windows platform makes little sense.

    >
    > I did not suggest you move to Unix. But using a better
    > windowing system would help. X of course runs just fine
    > under Windows.


    I took your words "I'd suggest that a different platform would be very
    productive" that way. A different windowing system would get me nothing.

    In my case, the second monitor has two inputs, and I can actually use it
    to monitor a second PC. Like many others, I also prefer the arrangement
    of two monitors side-by-side to one giant monitor. I get 2880 pixels
    width, and 1200 pixels height on the left monitor. I haven't seen too
    many monitors with that aspect ratio.

    Of course, you may have completely different requirements - I can only
    share the experience of what has worked well for me.

    Cheers,
    David
     
    David J Taylor, Aug 29, 2010
    #14
  15. "Neil Harrington" <> wrote in message
    news:...
    []
    > Right. Of course that's something you just have to put up with with
    > laptops, but with desktops it's often a reason to put in some sort of a
    > discrete graphics card even if the onboard graphics are adequate.


    Agreed.

    > My sister in her Florida place has a fairly early Windows XP system --
    > with just 256 MB of system RAM, I was shocked to see when I visited her
    > there. And only onboard graphics. Needless to say, that is one sluggish
    > computer. She doesn't use it for much besides e-mail, and is on dial-up,
    > so a sluggish system doesn't make much difference.


    Oh dear! She has my sympathy! I'd say put a GB in, but some of those
    early systems had rather expensive memory. I still have one system with a
    512MB limit (it runs Windows 2000) and one with a 1GB limit (that runs XP
    with a 300MB RAMdisk for a specialist application). It's comfortable at
    that level.

    []
    > I agree. I never buy anywhere near "cutting edge" as far as CPUs go. I
    > never considered buying dual-core Athlons when they were the newest
    > thing -- I buy 'em now, when they are dirt cheap. These sorts of things
    > get cheaper *and* more refined at the same time.


    Yes, it's where those who "need" the latest and greatest pay for the
    development costs for the rest of us. I feel I've paid my share of
    development costs for digital cameras and hard disks, though!

    Cheers,
    David
     
    David J Taylor, Aug 29, 2010
    #15
  16. Peter

    GMAN Guest

    In article <4c793be3$0$5525$-secrets.com>, "Peter" <> wrote:
    >"David J Taylor" <> wrote in message
    >news:i5bdbq$8jv$-september.org...
    >> "Neil Harrington" <> wrote in message
    >> news:...
    >> []
    >>> I have computers all over the place, and at the moment not a single one
    >>> of them is overclocked. I don't do it if I don't need it. But if tomorrow
    >>> or next week I should need it, I will overclock with a high heart.

    >>
    >> Same here. But if anything, I now go for the lower-powered components -
    >> "green" HDs, Intel Atom processors, the simplest graphics cards (or
    >> built-in) etc - where at all possible, to keep down heat and noise, reduce
    >> power consumption, and perhaps increase reliability.
    >>
    >> For what I do, today's processors are plenty powerful enough - I get more
    >> gain in productivity from increasing RAM, perhaps using 64-bit Windows,
    >> and have two displays.

    >
    >
    >Is there a significant advantage to a second display? I have been
    >considering getting one, but, I have to justify the expenditure to my
    >treasury department. She just got a washing machine and dryer and is unhappy
    >with the speed of her WinXP machine. <G>
    >A trade opportunity window, (no pun intended,) has opened. My alternative
    >would be a new WA lens.
    >
    >
    >


    Video editing is one great example of using multiple displays. I keep
    the main app running on one monitor and the video playback on the other.
     
    GMAN, Aug 29, 2010
    #16
  17. Peter

    Ray Fischer Guest

    <> wrote:
    >On Aug 26, 6:37 pm, "lofi" <> wrote:
    >> Overclocking is an utterly pointless exercise; it may generate a few higher
    >> numbers on a measurement scale but has absolutely no practical benefit to
    >> the end user.
    >
    >Absolutely? So you say... :/ The frame rate, speed of loading
    >textures, etc., of my flight simulator tell me a different story than
    >the one you offer.


    Ooo! I bet it went from 45 frames/sec to 48 frames/sec!

    No practical benefit at all.

    --
    Ray Fischer
     
    Ray Fischer, Aug 29, 2010
    #17
  18. Peter

    Peter Guest

    "Neil Harrington" <> wrote in message
    news:...
    []
    > Right. Of course that's something you just have to put up with with
    > laptops, but with desktops it's often a reason to put in some sort of a
    > discrete graphics card even if the onboard graphics are adequate.



    Not all laptops. My Lenovo has a discrete graphics card, as do many others.






    --
    Peter
     
    Peter, Aug 29, 2010
    #18
  19. Peter

    nospam Guest

    In article <>, Floyd L. Davidson
    <> wrote:

    > >you most certainly *do* get a benefit from two (or more) displays,

    >
    > You do not *need* to have two monitors to get the
    > described benefit, assuming that you do use a fully
    > functional windowing system.


    in some cases you definitely do, such as when an app runs in full
    screen. games are the obvious case, but a lot of photo/video software
    is also full screen.

    > >especially if you're using the full screen, among other things. having
    > >a second display is without question, a huge increase in productivity.
    > >three displays is also helpful in some circumstances but not as much as
    > >a second display.

    >
    > If your windowing system is adequate, the size of the
    > display and the functionality have nothing to do with
    > the number of monitors.


    nope.

    > >and then there's going a bit overboard with 13 displays :)
    > ><http://the.taoofmac.com/media/blog/2003/06/22/Image1.jpg>

    >
    > An excellent example that disproves your point.


    nope.

    > The
    > benefit gained in that example is not the area or the
    > functionality.


    the benefit is having displays in front and to either side. as i said,
    13 is a bit overkill; all he really needs is one in front and two more,
    one on either side, and maybe two more at 45 degrees.

    > There is almost certainly a cost factor
    > (multiple monitors are probably vastly less expensive
    > than a custom made large screen monitor for that
    > application), and most of all there is the desired
    > illusional effect designed to make it appear to be a
    > specific environment.
    >
    > But again, multiple monitors are *not* required to
    > accomplish the functionality, only the cost benefit.


    tell us how having a surround view can be done with one display.

    > >> []
    > >>
    > >> That's a valid point, though I've always used two
    > >> identical monitors and found that to be less expensive than
    > >> a single monitor of equal quality and screen area. It
    > >> used to be that the form factor was more suitable too
    > >> but today there are monitors with wider screens.
    > >> Whatever, it's a little bit of a pain if both monitors
    > >> are not the same size.

    > >
    > >it's not a pain at all. in fact, it's trivial. simply plug the display
    > >in and use the control panel to arrange it spatially, as desired.

    >
    > It's a royal pain in the rear if you have monitors with
    > two different resolutions, assuming you actually do
    > something with them. If a window on one monitor is
    > moved to the other it can be off screen entirely, for
    > example. It can change sizes too, and it is generally
    > not reasonable to have a window half on one monitor and
    > half on another if the two have different resolutions.


    big deal. part of it will not be visible on the smaller display. drag
    it back to see all of it. why in the world would anyone do that? pick a
    small window and drag it over and you can see all of it. it's entirely
    up to the user.

    usually displays of different sizes are used for different things, such
    as putting the main photoshop image on the big display and the
    photoshop tool palettes on the small one.

    > In other words, if you do have a window system that
    > allows maximum functionality benefits from the use of
    > two monitors, it works far better if they are identical.
    > Not that mismatched monitors are not useful, just that
    > they are not as useful.


    having different size displays can be *more* useful in some cases, such
    as the one i described above with one large display for the image and
    the smaller one for tools. on the other hand, someone doing video work
    might want two 30" displays. there is no single answer for everyone.

    > Incidentally, I have my system arranged so that the
    > calibration of the left monitor can be changed
    > (different brilliance, different color temperatures and
    > different gamma) with two clicks of a mouse (with
    > instant response time too and not waiting for a control
    > panel or some such before a selection can be made). I
    > use that to allow non color managed apps, such as web
    > browsers and image previewers, to be usable when
    > previewing images for different printing systems or for
    > web publication.


    if you're going to change colour profiles on a whim, just leave the
    control panel open and then it's only one click. most people don't do
    that, so the extra second or two to open a control panel is a
    non-issue. it could probably be scripted too. non-colour managed
    browser? really? time to upgrade.
     
    nospam, Aug 30, 2010
    #19
  20. Peter

    nospam Guest

    In article <>, Floyd L. Davidson
    <> wrote:

    > >My experience differs - I want to see both source and working program at
    > >the same time, and you need physical display area for that.

    >
    > Exactly, and two monitors or one does not make any
    > difference. Only the physical area. It has *nothing*
    > to do with using two monitors!


    it does when the app is running full screen or you don't want extra
    update events.

    > >As I develop Windows software, using Windows tools, suggesting that I move
    > >to a UNIX/X-windows platform makes little sense.

    >
    > I did not suggest you move to Unix. But using a better
    > windowing system would help. X of course runs just fine
    > under Windows.


    that's a step *backwards*.
     
    nospam, Aug 30, 2010
    #20
