
April 27, 2008

Comments

Byrne

Your nonprofit sounds like a mechanical turk: the easy, automatic parts are done easily and automatically, but when it impersonates a human, it uses a human to do it. Unless you mean that it would feed correspondence into a program and produce responses, without human intervention.

Hopefully Anonymous

Not "a human" Byrne, but multiple humans. Why should that preclude the emergence of consciousness any more than silicon circuitry?

Nick Tarleton

If, as is realistic, the humans controlling the interaction used their imagination and mental model of other humans to decide what to say, I don't see why you would expect their imagination and deliberation to have its own consciousness. If they did a Chinese Room-style collective emulation of a brain, I think it would be conscious. If they collectively simulated an AI, I don't know. There are probably algorithms that produce human behavior but not consciousness, and there are definitely algorithms that produce the same behavior with different consciousness (because a person's thoughts are underdetermined by their actions).

Hopefully Anonymous

"There are probably algorithms that produce human behavior but not consciousness"

-seems like you're more in my camp on this than Caledonian's or TGGP's.

Although you're in it even more strongly than I am. I don't have a good enough sense of the probabilities to say "probably".

Nick Tarleton

Except I think any such algorithm would have to have substantially different structure from a human mind. I very much doubt your cryonically revived body, or your upload, could behave like but not be "you".

Hopefully Anonymous

"Except I think any such algorithm would have to have substantially different structure from a human mind."

And this is based on what? Something more than personal intuition? It seems dangerously like an opiate to me, regarding angst about cryonics and uploading.

Nick Tarleton

The Generalized Anti-Zombie Principle.

Hopefully Anonymous

Please elaborate how the Generalized Anti-Zombie Principle makes you think both:

1. "There are probably algorithms that produce human behavior but not consciousness"

and

2. "any such algorithm would have to have substantially different structure from a human mind."

botogol

I think you're conflating questions which are really distinct.

- could a corporation pass a turing test?
- does passing a turing test (ie being able to impersonate a human) have anything at all to do with being conscious?
- could a corporation be considered conscious?

I think the answers are
- well yes, obviously. A corporation with this objective could be set up in 5 minutes with a single employee and access to the internet.

- only loosely. The Turing Test wasn't even designed as a test for consciousness, but as one for intelligence (not the same). I am inclined to the view that passing the test would probably require consciousness. The converse is obviously not true; it's easy to imagine conscious beings who could not successfully impersonate a human (eg an alien).

- now that's a tough question

Hopefully Anonymous

Botogol,
1. I was transparent in the OP that I was conflating previously distinct questions, and I explained the reason I was doing so.
2. A corporation with multiple employees could also achieve this objective, and even if each word typed into the computer was chosen by shareholder or board member vote, I think it could pass a Turing test. Does that imply an ipso facto consciousness of the shareholders/board members by your standards?
3. "I am inclined to the view that to pass a test would probably require consciousness." What would be your best arguments against that position? Do you think it's possible you believe that as an opiate, because it makes you feel better about cryonics/uploading?

botogol

HA,

1) I think it's hard to conceive of a sense in which a corporation has a consciousness distinct from the consciousness of the humans within it.

If a conscious system contains a conscious component, then it's natural to focus on the consciousness residing in that spot, and discard the rest of the system as a support mechanism (hence we don't talk of a conscious body, or even a conscious head, but a conscious brain makes sense).

2) Passing a Turing Test means being able to mislead fallible humans at a rate statistically significantly better than chance.

I can imagine a clever programme that would not be conscious (or claim to be conscious) but which could reliably misdirect human testers in a Turing Test. After all, people are easily fooled.

Hopefully Anonymous

Botogol,
Thanks for the sincere engagement with me. The link to your blog doesn't seem to be working for me. Are you sure:

http://www.blog.greenideas.com/

is the correct address?

botogol

http://blog.greenideas.com

BUT drat and curse, it should work with or without the www, with or without the /

It does for me!

(a curse be on blogger custom domains)

The comments to this entry are closed.