New MS RAM diagnostic tool

Discussion in 'NZ Computing' started by Greg Townsend, Jul 31, 2003.

  1. Jay

    Jay Guest

    Robert Mathews wrote:

    > On Tue, 05 Aug 2003 18:39:40 +1000, Jay <> wrote:
    >
    >>Nicholas Sherlock wrote:
    >>
    >>> "Jay" <> wrote in message
    >>> news:bgmn7b$qu8r9$-berlin.de...
    >>>> Nicholas Sherlock wrote:
    >>>> > They run from a boot disk or CD and only use about 50KB of
    >>>> > memory. The chances of a fault being in the first little bit are
    >>>> > pretty slim. If it doesn't work, then you know that your memory
    >>>> > is fucked anyway, so it has done its job :).
    >>>>
    >>>> But what if it appears to work? How do you tell if it doesn't work?
    >>>> Does the program pop up a dialog saying "I am not working properly"?
    >>>>
    >>>
    >>> If it isn't working properly, it'll collapse in a screaming, twitching,
    >>> quivering heap. You can't mix up CPU instructions and continue executing
    >>> as if nothing is wrong.

    >>
    >>Yes you can. A "bne" can become a "beq" and the program will
    >>just take a different logic path. That is, you'll get the wrong answer
    >>but the program will work.
    >>
    >>You are quite wrong.

    >
    >
    >
    > No, you bloody are. Go back to your PS2, as PCs are way out of your depth.


    No, you are bloody stupid and wrong.
    You probably don't have any brains either.
    Maybe you should get yourself a brain-testing program.
    But don't bother reading the results with your defective brain.

    An Intel opcode of, say, 0x72 (jc) could become 0x73 (jnc) with just a
    single bit error. That would most probably not result in a spectacular
    failure at all. The program could appear to be working properly.

    In fact, it might just result in the program *not* reporting an error.
    Or the program might just skip a test loop.
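
    As a concrete illustration (a minimal Python sketch, not taken from
    any real tester), those two opcodes are literally one bit apart:

        # jc (0x72) and jnc (0x73) differ by exactly one bit, so one
        # flipped bit silently inverts the sense of the branch.
        jc, jnc = 0x72, 0x73
        diff = jc ^ jnc
        print(f"xor = {diff:#04x}, bits flipped = {bin(diff).count('1')}")
        # prints: xor = 0x01, bits flipped = 1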
     
    Jay, Aug 5, 2003
    #21

  2. Jay

    Jay Guest

    Ben Perston wrote:

    > Jay wrote:
    >> Ben Perston wrote:

    >
    >>>What do you get if you calculate the probability of this diagnostic
    >>>(taking 50k of memory) failing to spot an error on say a system with 256
    >>>MB? I.e. that the dodgy bit will be used by the programme and that it
    >>>will cause this specific scenario?

    >>
    >>
    >> A probability greater than zero.

    >
    > Any idea of its magnitude?
    >
    > Other events with probabilities greater than zero include the failure of
    > an otherwise perfect memory tester due to cosmic ray strike, after all...


    It depends on the RAM failure rate.
    What is the RAM failure rate?
    If it is 'r' bits per second then the probability for a 100k program in 32MB
    memory running for t seconds would be r * t * 0.1/32 wouldn't it?
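
    To put illustrative numbers on that (a minimal sketch; the rate r and
    the run time t below are assumptions, not measurements):

        # Back-of-envelope: expected number of bit errors landing inside
        # a 100k program sitting in 32MB of RAM over t seconds.
        r = 1e-9                      # assumed module-wide error rate, errors/second
        t = 3600.0                    # one hour of testing
        program_mb = 100.0 / 1024.0   # ~0.098 MB
        memory_mb = 32.0
        p = r * t * program_mb / memory_mb
        print(f"p ~ {p:.1e}")         # ~1.1e-08 with these assumed numbers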

    Can't you figure that out?
    Or is multiplication a bit too difficult for you?
     
    Jay, Aug 6, 2003
    #22

  3. Ben Perston

    Ben Perston Guest

    Jay wrote:
    > Ben Perston wrote:
    >
    >
    >>Jay wrote:
    >>
    >>>Ben Perston wrote:

    >>
    >>>>What do you get if you calculate the probability of this diagnostic
    >>>>(taking 50k of memory) failing to spot an error on say a system with 256
    >>>>MB? I.e. that the dodgy bit will be used by the programme and that it
    >>>>will cause this specific scenario?
    >>>
    >>>
    >>>A probability greater than zero.

    >>
    >>Any idea of its magnitude?
    >>
    >>Other events with probabilities greater than zero include the failure of
    >>an otherwise perfect memory tester due to cosmic ray strike, after all...

    >
    >
    > It depends on the RAM failure rate.
    > What is the RAM failure rate?


    Well, what makes RAM fail? I don't know very much about how RAM works.
    Does a faulty RAM module usually have some bits that have a much
    higher probability than the rest of failing, or just an overall
    probability for any event to go wrong that's high enough to be observed?
    If the former I think your problem must be irrelevant...

    > If it is 'r' bits per second then the probability for a 100k program in 32MB
    > memory running for t seconds would be r * t * 0.1/32 wouldn't it?


    Well, 0.098... /32, but you're close enough. You have also missed
    another factor: the proportion of errors that don't cause the
    programme to fail in the manner Nicholas described. Any idea what
    that would be?

    > Can't you figure that out?
    > Or is multiplication a bit too difficult for you?


    I was just wondering if you could produce any estimates based on a
    decent understanding of how RAM works as to whether the problem you're
    talking about is genuine or something that would not be observed in any
    practical experiment.
     
    Ben Perston, Aug 6, 2003
    #23
  4. Jay

    Jay Guest

    Ben Perston wrote:

    > Jay wrote:
    >> Ben Perston wrote:
    >>
    >>
    >>>Jay wrote:
    >>>
    >>>>Ben Perston wrote:
    >>>
    >>>>>What do you get if you calculate the probability of this diagnostic
    >>>>>(taking 50k of memory) failing to spot an error on say a system with
    >>>>>256
    >>>>>MB? I.e. that the dodgy bit will be used by the programme and that it
    >>>>>will cause this specific scenario?
    >>>>
    >>>>
    >>>>A probability greater than zero.
    >>>
    >>>Any idea of its magnitude?
    >>>
    >>>Other events with probabilities greater than zero include the failure of
    >>>an otherwise perfect memory tester due to cosmic ray strike, after all...

    >>
    >>
    >> It depends on the RAM failure rate.
    >> What is the RAM failure rate?

    >
    > Well, what makes RAM fail? I don't know very much about how RAM works.
    > Does a faulty RAM module usually have some bits that have a much
    > higher probability than the rest of failing, or just an overall
    > probability for any event to go wrong that's high enough to be observed?
    > If the former I think your problem must be irrelevant...


    Single bit errors are the most common.

    >
    >> If it is 'r' bits per second then the probability for a 100k program in
    >> 32MB memory running for t seconds would be r * t * 0.1/32 wouldn't it?

    >
    > Well, 0.098... /32, but you're close enough. You have also missed
    > another factor: the proportion of errors that don't cause the
    > programme to fail in the manner Nicholas described. Any idea what
    > that would be?


    Simple branch instructions are among the most common.
    I would estimate they amount to about 7% of all instructions
    in a typical program. They might be short (2 bytes) or longer.
    So another factor of, say, 0.07.
    Are they relatively innocuous? Well, the program usually expects
    the branch to be either taken or not taken (that is why it is a
    conditional branch). So a "jc" changing into a "jnc" or even a "jo"
    might be relatively harmless.

    BTW, my factor of 0.1/32 isn't quite right, because for each opcode
    there is only a handful of 1-bit-different opcodes that are similar.
    Most other mutations will probably result in an invalid-opcode fault,
    because either the mutated opcode has a different length (so the
    following bytes decode as garbage) or the opcode itself is garbage.
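
    That neighbourhood is easy to enumerate (a Python sketch; opcodes
    0x70-0x7F are the short conditional jumps on x86):

        # All eight single-bit mutations of jc (0x72). Flips that stay
        # inside 0x70-0x7F silently swap the branch condition; the rest
        # land on unrelated instructions and will usually crash.
        jcc = {0x70 + i: name for i, name in enumerate(
            "jo jno jb jnb je jne jbe ja js jns jp jnp jl jge jle jg".split())}
        op = 0x72  # jb, also written jc
        for bit in range(8):
            mutated = op ^ (1 << bit)
            print(f"{op:#04x} -> {mutated:#04x}: "
                  f"{jcc.get(mutated, 'not a conditional jump')}")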

    >> Can't you figure that out?
    >> Or is multiplication a bit too difficult for you?

    >
    > I was just wondering if you could produce any estimates based on a
    > decent understanding of how RAM works as to whether the problem you're
    > talking about is genuine or something that would not be observed in any
    > practical experiment.


    Because there is such a large number of quite different opcodes, it
    would take some effort to work out. But whether a mutation into a
    different opcode is relatively benign (at least superficially) is
    more difficult to estimate.
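
    Multiplying the factors from this thread together gives the rough
    shape of the estimate (a sketch; every number below is an assumption
    carried over from the posts above, not a measurement):

        # Combined back-of-envelope: rate * time * fraction of memory
        # holding the program * fraction of instructions that are
        # conditional branches * fraction of flips that stay benign.
        r, t = 1e-9, 3600.0        # assumed error rate and run time
        in_program = 0.098 / 32    # ~100k program in 32MB
        branches = 0.07            # the 7% estimate above
        benign = 4 / 8             # flips of 0x72 that stay within Jcc
        print(f"p ~ {r * t * in_program * branches * benign:.1e}")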

    Anyhow, the best memory testing uses known-good RAM or ROM.

    Unfortunately, even the best memory tests do not generate the
    inductive fields and sudden DC supply surges that peripherals
    connected to your computer can produce (including hard disks,
    monitors, etc.). Not to mention mains earth-loop effects, stray
    electromagnetic fields, and so on.

    The best way to test memory is to swap it.
     
    Jay, Aug 6, 2003
    #24
