Velocity Reviews - Computer Hardware Reviews

Velocity Reviews > Newsgroups > Computing > NZ Computing > Is technology making mankind a threatened species? Thoughts on technology, the future, and somewhere, ethics.

Is technology making mankind a threatened species? Thoughts on technology, the future, and somewhere, ethics.

 
 
Waylon Kenning
      10-11-2004
In my search of Unix history, I came across an article at Wired called
"Why the future doesn't need us" -
http://www.wired.com/wired/archive/8.04/joy.html?pg=1. It's four years
old, but becoming more and more relevant every day.

With technology such as nanotechnology, robotics and genetic
engineering fast approaching and maturing, will we all descend into a
real world version of the game Total Annihilation, a war between
robots with human essences in them, vs. cloned super humans with
essences of the best soldiers in them? Perhaps not, but with robotic
arms happening, robotic this and that, will humans ever stop until
we're no part flesh anymore? After all, any part flesh in a human is a
weak spot compared to metal of course.

We worry about Saddam having nuclear weapons (and we especially worry
about George Bush being voted in again), but as technology charges
ahead, do we worry that in this age of free information and the
internet, terrorists may resort to creating a grey goo of
genetically modified bacteria to eat and destroy everything?

Perhaps a better question to ask would be, should we continue
researching new technologies without thinking about their
ramifications in the future? Should we develop all technologies then
decide to stop using some, or is it better to just not develop certain
technologies, and concentrate on things such as cold fusion? After
all, once an idea's let out of the bag, you can't put it back in.
Knowledge eh, it's a bit like a virus in that regard, you can't kill
it once it's out.

This is something I'd never really thought too much about until I
read this article and considered the fact that work I may be doing in
the future could contribute to the fall of mankind to machines. After
all, why are we developing new technologies? Massive growth of markets
continues yet not everyone is happy. Heck, not a lot of people are
happy with lots of money (see celebrities). What is humankind's view
of utopia, and how are we working towards it?

Finally, can you imagine what the future holds? A survey of elderly
people in Britain finds that most of them believe the world was a
better place to live in in the past than it is now. More honest, caring,
sympathetic. If that's true, what improvements in first-world society
have occurred in the past 50 years? And more so, what will the next 50
years hold for us? Apart from Windows XXXXXP, with increased security,
ease of use, and protection from all nanoviruses attempting to enter
your own personal biosphere.
--
Regards,
Waylon Kenning.

1st Year B.I.T. WelTec
 
 
 
 
 
Dave - Dave.net.nz
 
      10-11-2004
Waylon Kenning wrote:
> With technology such as nanotechnology, robotics and genetic
> engineering fast approaching and maturing, will we all descend into a
> real world version of the game Total Annihilation, a war between
> robots with human essences in them, vs. cloned super humans with
> essences of the best soldiers in them?


sweet as if we do, I love that game.
*snip*

> Perhaps a better question to ask would be, should we continue
> researching new technologies without thinking about their
> ramifications in the future? Should we develop all technologies then
> decide to stop using some, or is it better to just not develop certain
> technologies, and concentrate on things such as cold fusion? After
> all, once an idea's let out of the bag, you can't put it back in.
> Knowledge eh, it's a bit like a virus in that regard, you can't kill
> it once it's out.


Rather interesting thought... I guess it would be wise to look into
the ramifications before these technologies get made, although even if
it's decided not to continue, that doesn't mean everyone will stop.

I was talking about developing weapons with a friend, more precisely
about these, http://www.strategypage.com/dls/articles/200492722.asp

I think that the users of such weapons should have to be happy using
them on their mothers/wives/kids before they are allowed to use them in war.
 
 
 
 
 
Nik Coughin
 
      10-11-2004
Waylon Kenning wrote:
> Perhaps not, but with robotic arms happening, robotic this and that, will
> humans ever stop until we're no part flesh anymore?


I was reading an article on the 'net a little while ago about how people
find prosthetic limbs on others much more psychologically unsettling if they
are designed to look like real human limbs than if they are clearly
artificial. That is, a normal person would feel more uncomfortable around
someone with a flesh-coloured plastic hand than someone with a steel hook or
claw. I can't find a link at the moment.


 
 
Waylon Kenning
 
      10-11-2004
It seems like Mon, 11 Oct 2004 14:41:33 +1300 was when "Dave -
Dave.net.nz" <dave@no_spam_here_please_dave.net.nz> said Blah blah
blah...

>I was talking about developing weapons with a friend, more precisely
>about these, http://www.strategypage.com/dls/articles/200492722.asp
>
>I think that the users of such weapons should have to be happy using
>them on their mothers/wives/kids before they are allowed to use them in war.


That's a good point. I like the active denial system. What would the
effects be on humans of, say, an electromagnetic pulse that could
disable electronics (and those damn robots!)? In doing some research
for this post, I read
http://www.angelfire.com/or/mctrl/electrowarfare.html. The things
people can do with electromagnetic pulses!

Quote
>We know of ESB's potential for mind control largely through the work
>of Jose Delgado. One signal provoked a cat to lick its fur, then
>continue compulsively licking the floor and bars of its cage.

Unquote.

Wow, there's some scary new technology coming out these days eh.
--
Regards,
Waylon Kenning.

1st Year B.I.T. WelTec
 
 
Bret
 
      10-11-2004
On Mon, 11 Oct 2004 14:43:39 +1300, Nik Coughin wrote:

> Waylon Kenning wrote:
>> Perhaps not, but with robotic arms happening, robotic this and that, will
>> humans ever stop until we're no part flesh anymore?

>
> I was reading an article on the 'net a little while ago about how people
> find prosthetic limbs on others much more psychologically unsettling if they
> are designed to look like real human limbs than if they are clearly
> artificial. That is, a normal person would feel more uncomfortable around
> someone with a flesh-coloured plastic hand than someone with a steel hook or
> claw. I can't find a link at the moment.


That sounds like a contradiction.

 
 
Nik Coughin
 
      10-11-2004
Bret wrote:
> On Mon, 11 Oct 2004 14:43:39 +1300, Nik Coughin wrote:
>
>> Waylon Kenning wrote:
>>> Perhaps not, but with robotic arms happening, robotic this and
>>> that, will humans ever stop until we're no part flesh anymore?

>>
>> I was reading an article on the 'net a little while ago about how
>> people find prosthetic limbs on others much more psychologically
>> unsettling if they are designed to look like real human limbs than
>> if they are clearly artificial. That is, a normal person would feel
>> more uncomfortable around someone with a flesh-coloured plastic hand
>> than someone with a steel hook or claw. I can't find a link at the
>> moment.

>
> That sounds like a contradiction.


"Stated simply, the idea is that if one were to plot emotional response
against similarity to human appearance and movement, the curve is not a
sure, steady upward trend. Instead, there is a peak shortly before one
reaches a completely human 'look' . . . but then a deep chasm plunges below
neutrality into a strongly negative response before rebounding to a second
peak where resemblance to humanity is complete.
This chasm - the uncanny valley of Doctor Mori's thesis -
represents the point at which a person observing the creature or object in
question sees something that is nearly human, but just enough off-kilter to
seem eerie or disquieting. The first peak, moreover, is where that same
individual would see something that is human enough to arouse some empathy,
yet at the same time is clearly enough not human to avoid the sense of
wrongness. The slope leading up to this first peak is a province of relative
emotional detachment - affection, perhaps, but rarely more than that."

http://www.arclight.net/~pdb/glimpses/valley.html


 
 
Roger Johnstone
 
      10-11-2004
In <(E-Mail Removed)> Waylon Kenning wrote:
> In my search of Unix history, I came across an article at Wired called
> "Why the future doesn't need us" -
> http://www.wired.com/wired/archive/8.04/joy.html?pg=1. It's four years
> old, but becoming more and more relevant every day.


> We worry about Saddam having nuclear weapons (and we especially worry
> about George Bush being voted in again),


Now imagine the terror at the thought of George Bush, who ALREADY has
many Weapons of Mass Destruction, being voted in again.

> but as technology charges
> ahead, do we worry that in this age of free information and the
> internet, that terrorists may resort to creating a grey goo of
> genetically modified bacteria to eat and destroy everything?


No worries. Someone'll come up with a green goo that will eat the grey
goo :-) Upgrades will of course have to be applied regularly, but annual
subscriptions will be available for a discount.

--
Roger Johnstone, Invercargill, New Zealand
http://vintageware.orcon.net.nz/
________________________________________________________________________
No Silicon Heaven? Preposterous! Where would all the calculators go?

Kryten, from the Red Dwarf episode "The Last Day"
 
 
Brendan
 
      10-11-2004
On Mon, 11 Oct 2004 14:29:24 +1300, Waylon Kenning wrote:

> In my search of Unix history, I came across an article at Wired called
> "Why the future doesn't need us" -
> http://www.wired.com/wired/archive/8.04/joy.html?pg=1. It's four years
> old, but becoming more and more relevant every day.


This article has been much debated since then. Google those debates for a
better picture of things.

> With technology such as nanotechnology, robotics and genetic engineering
> fast approaching and maturing, will we all descend into a real world
> version of the game Total Annihilation, a war between robots with human
> essences in them, vs. cloned super humans with essences of the best
> soldiers in them?


What would the war be fought over?

> Perhaps not, but with robotic arms happening, robotic this and that, will
> humans ever stop until we're no part flesh anymore? After all, any part
> flesh in a human is a weak spot compared to metal of course.


Nature dictates we must evolve or become extinct. That means we must
enhance ourselves in every fashion we can - which means adding technology
to our bodies, as we have been doing for 10,000 years at least.

> We worry about Saddam having nuclear weapons (and we especially worry
> about George Bush being voted in again), but as technology charges
> ahead, do we worry that in this age of free information and the
> internet, that terrorists may resort to creating a grey goo of
> genetically modified bacteria to eat and destroy everything?


How will you stop some group researching new weapons in secret?

The only solution to that problem is to ensure everyone has the
knowledge, ensuring a balance of power. That is another reason to do
away with IP law as it now stands.

> Perhaps a better question to ask would be, should we continue researching
> new technologies without thinking about their ramifications in the
> future? Should we develop all technologies then decide to stop using
> some, or is it better to just not develop certain technologies, and
> concentrate on things such as cold fusion? After all, once an idea's let
> out of the bag, you can't put it back in. Knowledge eh, it's a bit like
> a virus in that regard, you can't kill it once it's out.


Banning certain research is a sure way to guarantee some will keep doing
it in secret.

No, think in terms of keeping the balance of power.

> This is something I've never really thought too much about until I read
> this article, and considered the fact that work I maybe doing in the
> future could contribute to the fall of mankind to machines.


You watch too much bad science fiction.

The concurrent technology you forgot to mention is Artificial
Intelligence; as our engineering skill improves via nanotechnology,
processing power equivalent to a human brain will become available, and
then quickly be superseded: super-intelligent machines.

But the first human level AI is likely to be a human mind cloned
(uploaded) into a powerful computer (the technology for doing this is the
same technology needed for constructing the machine it is to go into).

After this, that mind might improve itself indefinitely. Think: 18 months
after its creation, new hardware will be running at least twice as fast.
That mind will think twice as quickly as you. It will find solutions to
problems in half the time you take. 18 months after that, 4 times as fast.


But even that understates it: it will be MORE so, because it will be able
to edit OUT functions for moderating heartbeats and other bodily
functions - those now being handled by other computers, or entirely
irrelevant - freeing up processing power. This mind is already more
efficient than yours. It will also have a PERFECT memory, AND it will not
need sleep, so it will have TWICE the time to think that you do.

I.e., model one is already 3 or 4 times faster at thinking than you.

Using genetic algorithms, neural nets and yet-to-be-developed
techniques, this mind will be able to isolate certain areas of its
function (it will have a complete model of itself, after all), clone
them, and experiment for optimisation on the clone. If it works better,
it can copy it back into the original. This means faster AND smarter.
Some changes in much-used areas might yield 100x improvements.
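That clone-and-optimise loop can be sketched in a few lines. This is my own toy illustration, not anything from the thread: the "module" is just a parameter vector, the fitness function is invented, and the mutate-then-keep-if-better rule is a (1+1) evolution strategy, the simplest relative of the genetic algorithms mentioned above.

```python
import random

# Toy sketch of the clone-and-optimise loop: copy a "module" (here a
# parameter vector), mutate the clone, and copy it back only if it
# scores better. Fitness function is invented for illustration; its
# optimum is at all-zeros.

def fitness(params):
    # Higher is better.
    return -sum(p * p for p in params)

def optimise(params, steps=1000, sigma=0.1, seed=42):
    rng = random.Random(seed)
    best = list(params)
    for _ in range(steps):
        clone = [p + rng.gauss(0, sigma) for p in best]  # mutate the clone
        if fitness(clone) > fitness(best):               # keep improvements only
            best = clone
    return best

module = [1.0, -2.0, 0.5]
improved = optimise(module)
print(fitness(module), "->", fitness(improved))
```

Nothing clever, but it captures the point: the original is never made worse, because changes are tested on the copy first.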

Add that to base improvements in computing speed over the same period. Add
to THAT technologies like quantum computation (which essentially means
infinite parallelism for certain classes of computation), chaos theory,
etc. ....

Even if all you did was emulate a million mind-clones of the smartest
people in the world on the hardware, without any other tricks, you would
already have doubled the average IQ of the human race. You could have a
hundred Einsteins working on 100 subjects in parallel.

You can see why this mind could easily be millions of times faster and
more intelligent than you within a few years.
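The compounding claim above can be put into back-of-envelope numbers. A rough sketch, where the constants are my own assumptions rather than anything from the argument: a fixed head start (no sleep, no bodily housekeeping, perfect memory) multiplied by hardware that doubles every 18 months.

```python
# Back-of-envelope arithmetic for the speed-up argument above.
# Assumed constants, purely illustrative: a 4x head start on top of
# hardware doubling every 18 months (the Moore's-law figure used above).

def speedup(years, doubling_months=18, head_start=4.0):
    """Thinking speed relative to a baseline human after `years`."""
    hardware = 2 ** (years * 12 / doubling_months)
    return head_start * hardware

for y in (0, 3, 6, 10):
    print(f"after {y:2d} years: ~{speedup(y):,.0f}x")
```

Even on these mild assumptions the multiplier runs into the hundreds within a decade; fold in the 100x algorithmic gains suggested above and the compounding does the rest.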

> After all, why are we developing new technologies? Massive growth of
> markets continues yet not everyone is happy. Heck, not a lot of people
> are happy with lots of money (see celebrities). What is humankind's view
> of utopia, and how are we working towards it?


They will likely be many and varied.

> Finally, can you imagine what the future holds? A survey of elderly
> people in Britain finds that most of them believe the world was a better
> place to live in in the past than now. More honest, caring, sympathetic.
> If that's true, what improvements in first world society have occured in
> the past 50 years? And more so, what will the next 50 years hold for us?
> Apart from Windows XXXXXP, with increased security, ease of use, and
> protection from all nanoviruses attempting to enter your own personal
> biosphere


I anticipate we will be pretty much demigods. The elderly you mention
above conveniently ignore a string of major wars and diseases that struck
down millions.

We live in a time of flux - technology is advancing exponentially - and
this is just as well, for we will need it to survive in a world of
diminishing resources. But it makes for a lot of social upheaval, and
people cannot be certain of the future because of the rapid advances.
This makes people feel stressed.

Civilisation is on the cusp of a change as radical as the invention of
agriculture ten thousand years ago. And it is likely to happen by 2030.

--

.... Brendan

"Let us treat men and women well; treat them as if they were real; perhaps they are." -- Ralph Waldo Emerson

Note: All my comments are copyright 11/10/2004 8:10:59 p.m. and are opinion only where not otherwise stated and always "to the best of my recollection". www.computerman.orcon.net.nz.


----== Posted via Newsfeeds.Com - Unlimited-Uncensored-Secure Usenet News==----
http://www.newsfeeds.com The #1 Newsgroup Service in the World! >100,000 Newsgroups
---= East/West-Coast Server Farms - Total Privacy via Encryption =---
 
 
Waylon Kenning
 
      10-11-2004
It seems like Mon, 11 Oct 2004 21:05:36 +1300 was when Brendan
<(E-Mail Removed)> said Blah blah blah...

>On Mon, 11 Oct 2004 14:29:24 +1300, Waylon Kenning wrote:
>
>> In my search of Unix history, I came across an article at Wired called
>> "Why the future doesn't need us" -
>> http://www.wired.com/wired/archive/8.04/joy.html?pg=1. It's four years
>> old, but becoming more and more relevant every day.

>
>This article has been much debated since then. Google those debates for a
>better picture of things.

I'll have a look after I finish researching Unix history, which isn't
nearly as interesting.

>> With technology such as nanotechnology, robotics and genetic engineering
>> fast approaching and maturing, will we all descend into a real world
>> version of the game Total Annihilation, a war between robots with human
>> essences in them, vs. cloned super humans with essences of the best
>> soldiers in them?

>
>What would the war be fought over ?

I believe the storyline went something along the lines of: people
decided that patterning (putting a human essence inside a machine) was
a good idea to save mankind. Others believed this was a monstrosity,
humanity being reduced to machines. So a war started. So that the
non-machine people had a chance, they cloned their best soldiers again
and again. Cue crap storyline and interesting game (a real good RTS in
my opinion; no Starcraft, but I digress).

>> Perhaps not, but with robotic arms happening, robotic this and that, will
>> humans ever stop until we're no part flesh anymore? After all, any part
>> flesh in a human is a weak spot compared to metal of course.

>
>Nature dictates we must evolve or become extinct. That means we must
>enhance ourselves in every fashion we can - which means adding technology
>to our bodies, as we have been doing for 10,000 years at least.

So where do we stop being human? Technology over the past 10,000 years
hasn't changed how we biologically work; nature's been taking care of
that. Do humans really need to short-circuit nature and accelerate our
own evolution? Why? What good would it do for humanity, having people
living for 5000 years (assuming no decline in body function from around
age 10)? I mean, do you really want your mother-in-law around *that*
long?

>> We worry about Saddam having nuclear weapons (and we especially worry
>> about George Bush being voted in again), but as technology charges
>> ahead, do we worry that in this age of free information and the
>> internet, that terrorists may resort to creating a grey goo of
>> genetically modified bacteria to eat and destroy everything?

>
>How will you stop some group researching new weapons in secret ?

I'm not too sure - do what Big Brother did in 1984 and change the
language so people can't express the idea of research? I don't know, I'm
just clutching at straws here; perhaps better surveillance could help out?

>The only solution to that problem is to ensure everyone has the knowledge
>ensuring a balance of power. It is another reason to do away with IP law
>as it now stands.
>
>> Perhaps a better question to ask would be, should we continue researching
>> new technologies without thinking about their ramifications in the
>> future? Should we develop all technologies then decide to stop using
>> some, or is it better to just not develop certain technologies, and
>> concentrate on things such as cold fusion? After all, once an idea's let
>> out of the bag, you can't put it back in. Knowledge eh, it's a bit like
>> a virus in that regard, you can't kill it once it's out.

>
>Banning certain research is a sure way to guarantee some will keep doing
>it in secret.
>
>No, think in terms of keeping the balance of power.
>
>> This is something I've never really thought too much about until I read
>> this article, and considered the fact that work I maybe doing in the
>> future could contribute to the fall of mankind to machines.

>
>You watch too much bad science fiction.

You should have seen the last movie I watched: The Day After
Tomorrow, a good movie if you enjoy movies and don't over-analyse them.
And the movie before that was Sphere with Samuel L. Jackson and Sharon
Stone; now that was bad science fiction.

>The concurrent technology you forgot to mention is Artificial
>Intelligence; as our engineering skill improves via nano technology, the
>processing power equivalent to a human brain will become available, and
>then become quickly superseded: super intelligent machines.
>
>But the first human level AI is likely to be a human mind cloned
>(uploaded) into a powerful computer (the technology for doing this is the
>same technology needed for constructing the machine it is to go into).
>
>After this, that mind might improve itself indefinitely. Think: 18 months
>after its creation, new hardware will be running at least twice as fast.
>That mind will think twice as quickly as you. It will find solutions to
>problems in half the time you take. 18 months after that, 4 times as fast.
>
>
>But that is not true: it will be MORE so, because it will be able to edit
>OUT functions for moderating heartbeats, and other bodily functions -
>those being handled now by other computers or entirely irrelevant. Freeing
>up processing power. This mind is already more efficient than yours. It
>will also have a PERFECT memory, AND it will not need sleep so have TWICE
>the time to think that you do.
>
>e.g. model one is already 3 or 4 times faster at thinking than you.
>
>Using genetic algorithms and neural nets and yet to be developed
>techniques, this mind will be able to isolate certain areas of its
>function (it will have a complete model of it after all), clone them, and
>experiment for optimisation on the clone. If it works better, it can copy
>it back into the original. This means faster AND smarter. Some changes in
>much-used areas might yield 100x improvements.
>
>Add that to base improvements in computing speed over the same period. Add
>to THAT technologies like quantum computation (which essentially means
>infinite parallelism for certain classes of computation), chaos theory,
>etc. ....
>
>Even if all you did was emulate a million mind clones of the smartest
>people in the world on the hardware, without any other tricks, you already
>have doubled the IQ average of the human race. You could have a hundred
>Einsteins working on 100 subjects in parallel.
>
>You can see why this mind could easily be millions of times faster and
>more intelligent than you within a few years.

OK, so we can have massive supercomputers thinking for themselves. So
why would they want humans? After all, we're slow, weak, fragile. Put
it this way: will these AI machines *improve* humankind and lead us
to a utopia where there is food for all, no war, and most people are
content with life?

>> After all, why are we developing new technologies? Massive growth of
>> markets continues yet not everyone is happy. Heck, not a lot of people
>> are happy with lots of money (see celebrities). What is humankind's view
>> of utopia, and how are we working towards it?

>
>They will likely be many and varied.
>
>> Finally, can you imagine what the future holds? A survey of elderly
>> people in Britain finds that most of them believe the world was a better
>> place to live in in the past than now. More honest, caring, sympathetic.
>> If that's true, what improvements in first world society have occured in
>> the past 50 years? And more so, what will the next 50 years hold for us?
>> Apart from Windows XXXXXP, with increased security, ease of use, and
>> protection from all nanoviruses attempting to enter your own personal
>> biosphere

>
>I anticipate we will be pretty much demigods. The elderly you mention
>above conveniently ignore a string of major wars and diseases that struck
>down millions.
>
>We live in a time of flux, technology is advancing exponentially - and
>this is just as well, for we will need it to survive in a world of
>diminishing resources. But it makes a lot of social upheaval, and people
>cannot be certain of the future because of the rapid advances. This makes
>people feel stressed.
>
>Civilisation is on the cusp of a change as radical as the invention of
>agriculture ten thousand years ago. And it is likely to happen by 2030.

I've no doubt it is. I do however doubt whether it is for the best of
mankind. But we're all entitled to our own opinions after all, and we're
often wrong, which makes talking to a human such as yourself, Brendan,
far more interesting than a super AI machine that knows all.
--
Regards,
Waylon Kenning.

1st Year B.I.T. WelTec
 
 
Brendan
 
      10-12-2004
On Tue, 12 Oct 2004 00:34:06 +1300, Waylon Kenning wrote:

>> Nature dictates we must evolve or become extinct. That means we must
>> enhance ourselves in every fashion we can - which means adding
>> technology to our bodies, as we have been doing for 10,000 years at
>> least.


> So where do we stop being human?


We don't. Being human is a state of mind. And anyway, what is so great
about being human when you could be superhuman?

It's not as if people will need to be forced to 'upgrade'.

> Technology over the past 10,000 years hasn't changed how we biologically
> work, nature's been taking care of that.


Yes it has. We live longer.

> Do humans really need to short-circuit nature


It is natural to 'short circuit' nature.

> and accelerate our evolution?


Yes.

> Why?


Because we will become extinct if we do not.

> What good would this do for humanity, having people living for 5000 years
> (assuming they have no decline in body function from around 10)? I mean,
> do you really want your mother-in-law around *that* long?


One thing it would allow is for us to exploit the resources of our
galaxy to conduct mega-scale engineering.

The primary benefit of our current short lifespan is to promote more rapid
evolution. In a technological humanity ('post human'), evolution may
proceed without the need to produce a new person. E.g. you upgrade
yourself.

>> How will you stop some group researching new weapons in secret ?


> I'm not too sure, do what Big Brother did in 1984? Change the language so
> people can't express the idea of research? I don't know, I'm just
> clutching at straws here, perhaps better surveillance could help out?


There is NO feasible way to do it. The logistics of it alone would cause
more trouble than the research itself would, and you could never be sure
you got everyone.

>> You can see why this mind could easily be millions of times faster and
>> more intelligent than you within a few years.


> OK, so we can have massive super computers thinking for themselves. So
> why would they want humans?


Sentimental reasons. It'd take such an infinitesimal portion of their
resources to keep homo sapiens sapiens in the fashion they wanted that
there would be little compelling them to do otherwise. Remember, they
have the resources of the Galaxy to utilise.

Transhumanity - the various machine/human AI's - will go on to populate
the galaxy.

There is no reason an individual could not be part of both groups.

> After all, we're slow, weak, fragile. Put it this way, will these AI
> machines *improve* humankind, and lead us to a utopia where there is
> food for all, no war, and most people are content with life?


Food, indeed all material goods, could be supplied via nano technology and
energy. No need for high level AI. So, yes.

'No war' would mean reprogramming the human brain, making you some form
of transhuman. If you had previously opted out of becoming transhuman,
you would have also opted out of the improved behaviour transhumans are
capable of. You would still war, and it would be your own fault.

>> Civilisation is on the cusp of a change as radical as the invention of
>> agriculture ten thousand years ago. And it is likely to happen by 2030.


> I no doubt it is. I do however doubt if it is for the best of mankind.


We have no choice.

> But we're all entitled to our own opinions after all, and we're often
> wrong, which makes taking to a human such as yourself Brendan far more
> interesting than a super AI machine that knows all


I suspect a super AI could keep you interested indefinitely should it
choose to.

--

.... Brendan

"New York: A third-rate Babylon." -- H. L. Mencken

Note: All my comments are copyright 12/10/2004 and are opinion only where not otherwise stated and always "to the best of my recollection". www.computerman.orcon.net.nz.


 