NICE!, this is a great place!

Discussion in 'Computer Support' started by Clifford Lotz, Mar 9, 2005.

  1. Just all kinds of interests re computers in here.
    Clifford Lotz, Mar 9, 2005
    #1

  2. Clifford Lotz

    Wizard Guest

    Lifford Clutz wrote:

    > Just all kinds of interests re computers in here.


    **** off, ****.
    Wizard, Mar 9, 2005
    #2

  3. Clifford Lotz

    trout Guest

    Clifford Lotz wrote:

    > Just all kinds of interests re computers in here.


    As well as a couple of dumbass cross-posting morph-trolls.
    --
    I imagine you've noticed.
    trout, Mar 9, 2005
    #3
  4. Clifford Lotz

    James Guest

    "Wizard" <> wrote in message
    news:...
    > Lifford Clutz wrote:
    >
    >> Just all kinds of interests re computers in here.

    >
    > **** off, ****.
    >


    Iwanttobethewizard!
    James, Mar 9, 2005
    #4
  5. Clifford Lotz

    Gordon Guest

    James wrote:
    || "Wizard" <> wrote in message
    || news:...
    ||| Lifford Clutz wrote:
    |||
    |||| Just all kinds of interests re computers in here.
    |||
    ||| **** off, ****.
    |||
    ||
    || Iwanttobethewizard!

    Of Oz?
    Gordon, Mar 9, 2005
    #5
  6. Clifford Lotz

    °Mike° Guest

    Just don't go getting the idea that computers are ALL that
    this group is about; NOTHING is "off topic", except the
    obvious spam, binaries etc.


    On Tue, 8 Mar 2005 23:34:23 -0800, in
    <>
    Clifford Lotz scrawled:

    >
    >Just all kinds of interests re computers in here.
    >


    --
    Basic computer maintenance
    http://uk.geocities.com/personel44/maintenance.html
    °Mike°, Mar 9, 2005
    #6
  7. Clifford Lotz

    jda^fx Guest

    On Wed, 9 Mar 2005 00:03:12 -0800, trout wrote in
    24hoursupport.helpdesk:

    > Clifford Lotz wrote:
    >
    >> Just all kinds of interests re computers in here.

    >
    > As well as a couple of dumbass cross-posting morph-trolls.


    I just copied Blinky's Usenet Solution #45933: Now killing all posts
    originating at Google Groups. It works like a charm and takes a lot of
    the noise away. Add a serious downscore for crossposts and it's not
    that bad.
    --
    jda^fx
    jda^fx, Mar 9, 2005
    #7
  8. Clifford Lotz

    Blinky the Shark Guest

    jda^fx wrote:
    > On Wed, 9 Mar 2005 00:03:12 -0800, trout wrote in
    > 24hoursupport.helpdesk:


    >> Clifford Lotz wrote:


    >>> Just all kinds of interests re computers in here.


    >> As well as a couple of dumbass cross-posting morph-trolls.


    > I just copied Blinky's Usenet Solution #45933: Now killing all posts
    > originating at Google Groups. It works like a charm and takes a lot of
    > the noise away. Add a serious downscore for crossposts and it's not
    > that bad.


    Add some anonymous-posting filters to taste, and bake until crunchy:

    Score:: =-9999 % anon posters
    Message-ID: @anonymous
    Message-ID: @dizum
    Message-ID: @firenze
    Message-ID: @gilgamesh
    Message-ID: @paranoici
    Message-ID: @remailer
    Message-ID: @tatoo
    Message-ID: @liberty
    Message-ID: @cypher
    Message-ID: @mixmaster
    Message-ID: @mail.cypher
    Message-ID: netvigator.com
    From: \<anon
    From: remailer
    From: \<alias
    From: crypto

    --
    Blinky Linux Registered User 297263
    Who has implemented Usenet Solution #45933:
    Now killing all posts originating at Google Groups
    Blinky the Shark, Mar 9, 2005
    #8
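    A sketch of what Solution #45933 itself might look like, in the same
    slrn-style scorefile syntax as the anon-poster rules above (the
    Message-ID patterns are assumptions about how Google Groups posts are
    tagged, not Blinky's actual rules):

    Score:: =-9999 % posts injected via Google Groups (assumed patterns)
    Message-ID: googlegroups\.com
    Message-ID: posting\.google\.com

    Score: -500 % downscore heavy crossposts
    Newsgroups: .*,.*,.*

    Scoring on Newsgroups may be slow if your server's overview data
    doesn't carry that header; Xref: is a common substitute.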
  9. Clifford Lotz

    jda^fx Guest

    On 9 Mar 2005 22:22:30 GMT, Blinky the Shark wrote in
    24hoursupport.helpdesk:

    > jda^fx wrote:
    >> On Wed, 9 Mar 2005 00:03:12 -0800, trout wrote in
    >> 24hoursupport.helpdesk:

    >
    >>> Clifford Lotz wrote:

    >
    >>>> Just all kinds of interests re computers in here.

    >
    >>> As well as a couple of dumbass cross-posting morph-trolls.

    >
    >> I just copied Blinky's Usenet Solution #45933: Now killing all posts
    >> originating at Google Groups. It works like a charm and takes a lot of
    >> the noise away. Add a serious downscore for crossposts and it's not
    >> that bad.

    >
    > Add some anonymous-posting filters to taste, and bake until crunchy:
    >

    Thanks, they went right in


    --
    jda^fx
    jda^fx, Mar 9, 2005
    #9
  10. Clifford Lotz

    DC Guest

    Clifford Lotz wrote:

    > Just all kinds of interests re computers in here.


    I am not at all interested in computers.

    --
    DC Linux RU #1000111011000111001

    Customize Xnews - http://dcicons.fateback.com
    DC, Mar 10, 2005
    #10
  11. It was on Thu, 10 Mar 2005 23:52:20 +0000, that DC contributed:

    > Perce P. Cassidy wrote:
    >> It was on Thu, 10 Mar 2005 21:26:53 +0000, that DC contributed:

    >
    >>> Perce P. Cassidy wrote:
    >>>> It was on Wed, 09 Mar 2005 22:22:30 +0000, that Blinky the Shark
    >>>> contributed:


    <snip>
    >>> Rather expensive, wouldn't you say? I stick to XOVER headers (and I'm
    >>> on broadband). Non-XOVER header retrieval is just *way* slow.

    >
    >> Leafnode only retrieves posts from the server which are not killfiled.
    >> For example:-
    >> 24hoursupport.helpdesk: considering articles 1271246 - 1271328
    >> 24hoursupport.helpdesk: 37 articles fetched, 46 killed

    >
    > I don't use a local spool. Nevertheless, in order to do that killing,
    > it must first download the headers referenced by your rules, no?


    Hmm...in that case, *if* it's downloading all the headers, instead of:-
    24hoursupport.helpdesk: 37 articles fetched, 46 killed
    ...why doesn't it say:-
    24hoursupport.helpdesk: 83 articles fetched, 46 killed
    ...which is what I'd expect with a newsreader killfile...

    > Maybe it seems transparent to you, but AFAIK it is happening. It's a waste of
    > resources that I would prefer to avoid.


    --
    What are black ops?
    What is 'Group13'?
    Who are 'The Increment'?
    What's that knocking at my door?
    Perce P. Cassidy, Mar 11, 2005
    #11
  12. Perce P. Cassidy wrote:
    > It was on Thu, 10 Mar 2005 23:52:20 +0000, that DC contributed:


    >> Perce P. Cassidy wrote:
    >>> It was on Thu, 10 Mar 2005 21:26:53 +0000, that DC contributed:


    >>>> Perce P. Cassidy wrote:
    >>>>> It was on Wed, 09 Mar 2005 22:22:30 +0000, that Blinky the Shark
    >>>>> contributed:


    ><snip>
    >>>> Rather expensive, wouldn't you say? I stick to XOVER headers (and I'm
    >>>> on broadband). Non-XOVER header retrieval is just *way* slow.


    >>> Leafnode only retrieves posts from the server which are not killfiled.
    >>> For example:-
    >>> 24hoursupport.helpdesk: considering articles 1271246 - 1271328
    >>> 24hoursupport.helpdesk: 37 articles fetched, 46 killed


    >> I don't use a local spool. Nevertheless, in order to do that killing,
    >> it must first download the headers referenced by your rules, no?

    ^^^^^^^

    > Hmm...in that case, *if* it's downloading all the headers, instead of:-
    > 24hoursupport.helpdesk: 37 articles fetched, 46 killed
    > ..why doesn't it say:-
    > 24hoursupport.helpdesk: 83 articles fetched, 46 killed
    > ..which is what I'd expect with a newsreader killfile...


    It's not saying "37 sets of headers fetched", it's saying "37 articles
    fetched". Headers != Articles. "37 articles fetched [after looking at
    83 sets of headers and killing 46 based on those headers]".

    --
    Blinky Linux Registered User 297263
    Who has implemented Usenet Solution #45933:
    Now killing all posts originating at Google Groups
    Blinky the Shark, Mar 11, 2005
    #12
  13. Perce P. Cassidy, <>, the dismissible, slovenly idiot,
    and manager in charge of the acid vats, heaved:

    > It was on Thu, 10 Mar 2005 23:52:20 +0000, that DC contributed:
    >
    >> Perce P. Cassidy wrote:
    >>> It was on Thu, 10 Mar 2005 21:26:53 +0000, that DC contributed:

    >>
    >>>> Perce P. Cassidy wrote:
    >>>>> It was on Wed, 09 Mar 2005 22:22:30 +0000, that Blinky the Shark
    >>>>> contributed:

    >
    > <snip>
    >>>> Rather expensive, wouldn't you say? I stick to XOVER headers (and
    >>>> I'm on broadband). Non-XOVER header retrieval is just *way* slow.

    >>
    >>> Leafnode only retrieves posts from the server which are not killfiled.
    >>> For example:-
    >>> 24hoursupport.helpdesk: considering articles 1271246 - 1271328
    >>> 24hoursupport.helpdesk: 37 articles fetched, 46 killed

    >>
    >> I don't use a local spool. Nevertheless, in order to do that
    >> killing,
    >> it must first download the headers referenced by your rules, no?

    >
    > Hmm...in that case, *if* it's downloading all the headers, instead of:-
    > 24hoursupport.helpdesk: 37 articles fetched, 46 killed
    > ..why doesn't it say:-
    > 24hoursupport.helpdesk: 83 articles fetched, 46 killed
    > ..which is what I'd expect with a newsreader killfile...


    You fucking brain-dead ****. Fetching the headers is not the same as
    fetching the article. 37 articles fetched is 37 articles fetched out of 83
    possible articles to fetch. Grow a fucking pimple on your neck and call it a
    brain.
    Heman Schapir, Mar 11, 2005
    #13
  14. It was on Fri, 11 Mar 2005 10:14:11 +0000, that Blinky the Shark
    contributed:

    > Perce P. Cassidy wrote:
    >> It was on Thu, 10 Mar 2005 23:52:20 +0000, that DC contributed:

    >
    >>> Perce P. Cassidy wrote:
    >>>> It was on Thu, 10 Mar 2005 21:26:53 +0000, that DC contributed:

    >
    >>>>> Perce P. Cassidy wrote:
    >>>>>> It was on Wed, 09 Mar 2005 22:22:30 +0000, that Blinky the Shark
    >>>>>> contributed:

    >
    >><snip>
    >>>>> Rather expensive, wouldn't you say? I stick to XOVER headers (and I'm
    >>>>> on broadband). Non-XOVER header retrieval is just *way* slow.

    >
    >>>> Leafnode only retrieves posts from the server which are not killfiled.
    >>>> For example:-
    >>>> 24hoursupport.helpdesk: considering articles 1271246 - 1271328
    >>>> 24hoursupport.helpdesk: 37 articles fetched, 46 killed

    >
    >>> I don't use a local spool. Nevertheless, in order to do that killing,
    >>> it must first download the headers referenced by your rules, no?

    > ^^^^^^^
    >
    >> Hmm...in that case, *if* it's downloading all the headers, instead of:-
    >> 24hoursupport.helpdesk: 37 articles fetched, 46 killed
    >> ..why doesn't it say:-
    >> 24hoursupport.helpdesk: 83 articles fetched, 46 killed
    >> ..which is what I'd expect with a newsreader killfile...

    >
    > It's not saying "37 sets of headers fetched", it's saying "37 articles
    > fetched". Headers != Articles "37 articles fetched [after looking at
    > 83 sets of headers and killing 46 based on those headers]"


    Yes, it brings the full article. I use it to give my Pan newsreader
    off-line functionality. Also: "It can merge news articles from several
    upstream newsservers into one server, for newsreaders that only support
    one server well, and avoid duplicate news download for a small LAN.
    **It allows for simple filters to limit the number of articles
    downloaded,** the size of article, or exclude articles with particular
    headers (Perl-compatible regular expression match)." Ergo in the sample
    above, it's *only* fetching 37 articles from the server, & *not* fetching
    46 others, which are effectively killed.
    Perce P. Cassidy, Mar 11, 2005
    #14
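    The filtering Perce describes is configured in leafnode's filter file;
    a minimal sketch, assuming leafnode-2's newsgroups/pattern/action
    syntax (the patterns here are illustrative, not from his setup):

    newsgroups = 24hoursupport\.helpdesk
    pattern = ^Message-ID:.*googlegroups\.com
    action = kill

    Each pattern is a Perl-compatible regular expression matched against
    the article's header lines at fetch time, so a killed article costs
    only its headers, never the body.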
  15. Perce P. Cassidy wrote:
    > It was on Fri, 11 Mar 2005 10:14:11 +0000, that Blinky the Shark
    > contributed:


    >> Perce P. Cassidy wrote:
    >>> It was on Thu, 10 Mar 2005 23:52:20 +0000, that DC contributed:


    >>>> Perce P. Cassidy wrote:
    >>>>> It was on Thu, 10 Mar 2005 21:26:53 +0000, that DC contributed:


    >>>>>> Perce P. Cassidy wrote:
    >>>>>>> It was on Wed, 09 Mar 2005 22:22:30 +0000, that Blinky the Shark
    >>>>>>> contributed:


    >>><snip>
    >>>>>> Rather expensive, wouldn't you say? I stick to XOVER headers (and I'm
    >>>>>> on broadband). Non-XOVER header retrieval is just *way* slow.


    >>>>> Leafnode only retrieves posts from the server which are not killfiled.
    >>>>> For example:-
    >>>>> 24hoursupport.helpdesk: considering articles 1271246 - 1271328
    >>>>> 24hoursupport.helpdesk: 37 articles fetched, 46 killed


    >>>> I don't use a local spool. Nevertheless, in order to do that killing,
    >>>> it must first download the headers referenced by your rules, no?

    >> ^^^^^^^


    >>> Hmm...in that case, *if* it's downloading all the headers, instead of:-
    >>> 24hoursupport.helpdesk: 37 articles fetched, 46 killed
    >>> ..why doesn't it say:-
    >>> 24hoursupport.helpdesk: 83 articles fetched, 46 killed
    >>> ..which is what I'd expect with a newsreader killfile...


    >> It's not saying "37 sets of headers fetched", it's saying "37 articles
    >> fetched". Headers != Articles "37 articles fetched [after looking at
    >> 83 sets of headers and killing 46 based on those headers]"


    > Yes, it brings the full article. I use it to give my Pan newsreader
    > off-line functionality. Also: "It can merge news articles from several
    > upstream newsservers into one server, for newsreaders that only support
    > one server well, and avoid duplicate news download for a small LAN.
    > **It allows for simple filters to limit the number of articles
    > downloaded,** the size of article, or exclude articles with particular
    > headers (Perl-compatible regular expression match)." Ergo in the sample
    > above, it's *only* fetching 37 articles from the server, & *not* fetching
    > 46 others, which are effectively killed.


    That's what I said. I just didn't take as long. :)

    --
    Blinky Linux Registered User 297263
    Who has implemented Usenet Solution #45933:
    Now killing all posts originating at Google Groups
    Blinky the Shark, Mar 11, 2005
    #15
