If you mean "in the context of a classical text-only real-time conversation Turing test" - that's fairly easy.
1: Show a lack of ability to create new models of abstract situations (emotional, interpersonal);
2: Show a lack of ability to integrate new themes into the conversation, and/or to elucidate the relationship between the old theme and the new one;
3: The oldie-but-goodie non-sequitur, complete with an inability to refer back to the non-sequitur statement in later statements: schizoid amnesia, on a far faster timescale than any human could accomplish without massive brain damage. (One individual in particular, who can only form short-term memories but remembers most of what he experienced up to his injury, is a sticking point: he'd probably be very difficult to tell apart from a poor-quality chatbot script.)
4: Inability to make interpersonal-relationship statements that integrate social mores and manners to explain "quirks" - e.g. it has become fashionable lately to blanket-excuse one's rudeness by saying "Oh, Sorry: I Have Asperger's", which is the sort of cop-out an entrant in a Turing test would try to use in response to someone expressing offence at an unknown party's statement. How is that relevant? What part of the statement was rude? Compare: "Ah, I apologise for offending your sense of propriety; I know you are British and that the British rarely approach such a subject directly; I have socialisation difficulties and am, by habit, frank and direct."
5: Statements throughout the conversation being authored by more than one personality. Some Turing test entrants use stock phrases or sentences in response to specific situations, and these are written by more than a single author. It comes through as a lack of cohesive voice, as if you were reading a quotation book instead of a single piece. (A toy sketch of this kind of stock-phrase bot follows the list.)
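To make points 3 and 5 concrete, here is a toy, entirely hypothetical sketch of the stock-phrase approach (not any real Turing-test entrant's code): canned replies keyed on keywords, with no memory at all of the conversation, so it can neither refer back to its own non-sequiturs nor keep a single voice. The keywords and replies are made up for illustration.

import random

# Hypothetical stock-phrase chatbot: canned replies keyed on keywords.
STOCK_REPLIES = {
    "family":  ["Tell me more about your family.",        # one author's phrasing
                "Families, eh? Can't live with 'em!"],     # another author's phrasing
    "weather": ["The weather has been quite changeable."],
    "sorry":   ["Oh, sorry: I have Asperger's."],
}
FALLBACK = ["Interesting.", "Why do you say that?", "Let's talk about something else."]

def reply(message: str) -> str:
    """Pick a canned reply for the first matching keyword; otherwise a non-sequitur.

    There is no state at all: the bot cannot refer back to its own earlier
    non-sequitur (the tell in point 3), and the canned replies mix several
    authors' voices (the tell in point 5).
    """
    lowered = message.lower()
    for keyword, canned in STOCK_REPLIES.items():
        if keyword in lowered:
            return random.choice(canned)
    return random.choice(FALLBACK)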
--
In /person/, it would be an entirely different matter.
Also: inability to find humour in at least one of two classes of joke: the geeky "apparent contradiction" type of joke, or the more-normal "other person's pain" humour. Someone could make a database of known jokes, so it would need to be an original joke.
In person? Bad sex. Bad sex is the realm of poor AI and subhumans, and anyone who does it poorly is therefore a soulless automaton in any sense. Sure, sure, you can look for plugs, circuit board innards, an incredibly skilled performance of The Robot, or urges to wear leather and shades and kill Sarah Connor, but it can all be faked.
Over the computer? Not sure. Inability to recognize or run with any subject, unnaturally fast construction of long sentences (in a chat room), an apparent grasp of the language, yet no conversation skill (question/response/question, etc). Modern AI attempts seem to wander a bit, or suddenly change subjects or say random things when they've run out of responses on a given subject or keyword.
So basically the human would have to be drunk or stoned or otherwise impaired. Perhaps with a speech-to-text program to help keep the spelling correct. Maybe a pair of humans or a team could pull it off.
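The "unnaturally fast construction of long sentences" tell above is easy to check mechanically. Here is a rough sketch; the 12-characters-per-second ceiling is my own assumption (roughly a fast typist), not a figure from this thread, and the Message type is just a stand-in for whatever a chat log actually records.

from dataclasses import dataclass

# Assumed ceiling on sustained human typing speed, in characters per second.
MAX_HUMAN_CHARS_PER_SEC = 12.0

@dataclass
class Message:
    text: str
    seconds_since_prev: float  # time elapsed since the previous message in the chat

def looks_machine_fast(msg: Message) -> bool:
    """True if the message is too long to have been typed in the time available."""
    if msg.seconds_since_prev <= 0:
        return True  # two messages in the same instant: nobody types that fast
    return len(msg.text) / msg.seconds_since_prev > MAX_HUMAN_CHARS_PER_SEC

# Example: a 300-character reply that arrives two seconds after the question.
print(looks_machine_fast(Message("x" * 300, 2.0)))  # True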
Depends on what you mean by computer. If you allow for the possibility of future evolutions of what we consider a computer today, then I'd have to say that humans are computers. In fact, my thesis for my phil mind class this fall covered this exact topic, based primarily on the thoughts of Daniel Dennett. In his paper "Real Patterns", he discusses a model of a universal Turing machine (one which can emulate any other Turing machine, including itself) and several perspectives from which the system can be viewed. On one level, the operations of such a machine are merely law-based physical changes that can be predicted. If, however, one begins to view some of the patterns created by the machine as "design" units, then you get a different picture of the system, by which prediction is easier but less accurate.
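For the "merely law-based physical changes" level of description, a Turing machine really is just a lookup table driven by a loop. A minimal sketch of that loop is below; the unary-incrementer rule table is my own toy example, not anything from Dennett's paper, and a universal machine is the same loop with a much larger rule table that reads a description of another machine off its tape.

# Tiny Turing-machine interpreter: each step is a purely rule-based update.
def run(rules, tape, state="start", head=0, blank="_", halt="halt", max_steps=10_000):
    tape = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]   # law-based transition
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Toy rule table: scan right over 1s, append one more 1, then halt.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
print(run(rules, "111"))  # prints 1111: three 1s incremented to four

Viewed cell by cell it is nothing but those transitions; viewed as "the incrementer pattern" it becomes predictable at a glance, which is the shift in perspective the Dennett point is about.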