Author Topic: Singularity  (Read 3985 times)

0 Members and 1 Guest are viewing this topic.

Offline risaacs

  • Brand New
  • Posts: 1
Singularity
« on: August 12, 2006, 08:08:31 PM »
Hey Guys,

It is important to realize that Kurzweil didn't really come up with the idea of the singularity.  It has been proposed by people involved in information technology as far back as Alan Turing.

As for how human brain capacity can be measured in terms of computing power, Hans Moravec (an AI researcher at Carnegie Mellon) has a paper that goes into some detail about how that is calculated.  His methodology may be incorrect, but it isn't just pulled out of thin air.

Check it out here:  http://www.transhumanist.com/volume1/moravec.htm
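For the curious, the core of Moravec's estimate is a scaling argument: take an estimate of the computation performed by the retina (a well-studied piece of neural tissue) and scale it up by the ratio of brain size to retina size. A rough sketch of the arithmetic (variable names are mine; the figures are order-of-magnitude approximations from his paper):

```python
# Sketch of Moravec's scaling argument. Numbers are approximate,
# taken from his 1998 paper; treat them as order-of-magnitude only.
retina_ops_per_sec = 1e9        # retina processing ~= 1,000 MIPS (assumed)
brain_to_retina_ratio = 75_000  # brain is ~75,000x the retina's neural mass
brain_ops = retina_ops_per_sec * brain_to_retina_ratio

print(f"Estimated brain capacity: {brain_ops:.1e} ops/s")  # ~1e14 ops/s
```

The point isn't that these numbers are right, just that the estimate is derived from something measurable rather than invented.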

Also, anyone who thinks that AI research has gone nowhere just isn't paying attention.  I can provide some details if anyone is interested.

Certainly this isn't a scientific way of determining whether the singularity will ever occur, but thinking it over, I can't come up with any field of knowledge work that isn't experiencing accelerating progress (at least, on a cursory examination).  I may just be optimistic.  :)

An interesting question for Steve Novella: would you consider neurology to be accelerating in terms of discovery?

Thanks,

Robert Isaacs

Offline heliocentricra

  • Frequent Poster
  • ******
  • Posts: 3772
Singularity
« Reply #1 on: December 07, 2006, 09:13:37 PM »
I thought that semi-recent issue of Skeptic came down too hard on AI. Sure, researchers might not build anything close to the complexity of the human brain any time soon, but to discredit an entire field on that basis is silly.

Offline jason

  • Frequent Poster
  • ******
  • Posts: 3067
Re: Singularity
« Reply #2 on: December 07, 2006, 11:05:06 PM »
Quote from: "risaacs"
Also, anyone who thinks that AI research has gone nowhere just isn't paying attention.  I can provide some details if anyone is interested.

Could you post some references?

As part of my undergraduate degree back in 1990, I did a course on basic AI. Even then, it was somewhat disheartening how little progress had been made on the predictions of the 1960s. Even now, while there have been specific advances in some areas (natural language processing, image recognition, machine learning), the field seems no closer to its goal of creating a self-aware artificial intelligence.

As a software engineer, what really strikes me is that while specific technologies change, the fundamentals of software have remained effectively static almost since the beginning. The tools are vastly improved, and advances in hardware have delivered huge performance gains, but at the base of it, software is still "dumb".

As science teases apart the functioning of the brain, it will presumably become evident how the property of consciousness emerges. Once that is sufficiently understood, it may be possible to replicate that functioning using software. Coming at it the other way (trying to understand consciousness from the outside and building a system to emulate it) just doesn't seem to be possible. The implication is that AI will continue to effectively mark time until it has a sufficiently well-understood model to copy... at which point things can really start moving.

Something of a prerequisite for the singularity. But if you believe Kurzweil's graphs indicating exponential growth in all fields, I guess science will have nailed how the brain produces awareness before too much longer.
"Reality is that which, when you stop believing in it, doesn't go away." - Philip K. Dick
"Scientific skepticism: the buck stops at reality."

Offline azinyk

  • Well Established
  • *****
  • Posts: 1224
Re: Singularity
« Reply #3 on: December 08, 2006, 11:41:42 AM »
Quote from: "risaacs"
I cannot think of any field of knowledge work that isn't experiencing accelerating progress (at least, from a cursory examination).


I think there are a lot of fields like that.  Some fields have rapid early progress, followed by a period of mature stagnation.  Civil aviation has scarcely progressed at all since the 1960s, even though the state of the art was advancing quickly before then.

Other fields are stop-and-go, seeing occasional bursts of progress with long periods of regrouping as people realize that "breakthroughs" were not as helpful as was first thought.  Superconductivity is such a field.

Ashley Zinyk

Offline leonet

  • Well Established
  • *****
  • Posts: 1820
Re: Singularity
« Reply #4 on: December 13, 2006, 08:14:37 AM »
Quote from: "jason"

Being a software engineer, what really strikes me is that while the specific technologies change, the fundamentals of software have remained effectively static almost since the beginning. The tools are incredibly improved, there's been all the advances in hardware that give vast performance improvements, but at the base of it software is still "dumb".


Wouldn't you say that this is also a product of hardware design?  I'm currently studying software engineering, and it seems the disconnect lies in the fact that the brain isn't "designed" at all.  The "mind" arises from the organized chaos of the biological system. In other words, it's still a hardware issue.
"Use the word cybernetics, Norbert, because nobody knows what it means. This will always put you at an advantage in arguments." - Claude Shannon

Offline jason

  • Frequent Poster
  • ******
  • Posts: 3067
Re: Singularity
« Reply #5 on: December 13, 2006, 03:45:14 PM »
Quote from: "leonet"
Wouldn't you say that this is also a product of hardware design?  I'm currently studying software engineering and it seems as if the disconnect lies in the fact that the brain isn't "designed" at all.  The "mind" arises from the organized chaos of the biological system. In other words, it's still a hardware issue.

So, you're saying that consciousness/self-awareness emerged as a property of the complex system that is the human brain--starting from its initial "animal" functions of instinctive behaviour patterns--presumably in some primitive form at first, with the underlying structures then being selected for to enhance that consciousness once it turned out to give a survival advantage. Or at least that's what I'm reading into it. :)

The implication for artificial awareness is that as the systems that we build get more complex, and the expert/decision/reasoning systems with a learning capability become more sophisticated, consciousness may appear to spontaneously emerge at some point... and so, as with the brain, artificial awareness could "evolve"? It could even emerge from the unanticipated interaction of multiple systems that need to communicate with each other, with its overall consciousness spanning a combination of individual software entities.

A truly fascinating topic to speculate on, though it implies a level of software sophistication that doesn't exist yet... but who's to say where it could end up? If self-modifying or evolving code (e.g., genetic algorithms) ever goes anywhere, replicating something like the process by which life evolved could become a reality.
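To make the genetic-algorithm idea concrete, here's a toy sketch (function name and parameters are mine, purely illustrative): a population of random bit strings "evolves" toward an all-ones target through fitness-based selection, crossover, and mutation--the same loop, in miniature, that the speculation above relies on.

```python
import random

def evolve(target_len=20, pop_size=50, generations=200, mut_rate=0.05):
    """Toy genetic algorithm: evolve bit strings toward all-ones."""
    pop = [[random.randint(0, 1) for _ in range(target_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness = number of 1s; sort the population best-first.
        pop.sort(key=sum, reverse=True)
        if sum(pop[0]) == target_len:
            break  # perfect individual found
        # Keep the top half; refill with mutated crossovers of survivors.
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, target_len)      # crossover point
            child = a[:cut] + b[cut:]
            # Flip each bit with probability mut_rate.
            child = [bit ^ (random.random() < mut_rate) for bit in child]
            children.append(child)
        pop = survivors + children
    return pop[0]
```

Nothing here is "aware", of course--it's just selection pressure applied to random variation--but it shows that evolving code is an ordinary, runnable technique rather than science fiction.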
"Reality is that which, when you stop believing in it, doesn't go away." - Philip K. Dick
"Scientific skepticism: the buck stops at reality."

 
