In this essay I will argue that Searle's 'Chinese Room' does not entirely disprove the possibility of strong Artificial Intelligence (AI). I will briefly outline the concept of strong AI (as opposed to weak AI), the thought experiment itself, and why it poses a threat to strong AI. I will then provide and discuss a range of objections to Searle's argument.
AI in essence seeks both to build an intelligent machine and to explain the nature of intelligence. In this essay I will be focusing on the Chinese Room's threat to strong AI in particular. Strong AI, a physicalist position, holds that an appropriately programmed computer would necessarily have a mind of its own.
Just as Searle in the room, acting as a living programme, could not understand the Chinese slips of paper for what they said but could only distinguish the form of the Chinese characters, so the 'Chinese Room' favours the dualist theory that mental stuff is separate from and cannot be recreated by physical mechanisms – the very thing strong AI aims to do.
Searle argues that the Turing Test fails because the behaviour of a machine does not necessarily mean that there are mental states going on inside it, which ultimately summarises his idea that machines cannot think in the sense that strong AI wishes to achieve. But in his own thought experiment, the seemingly intelligent behaviour of the room (from a Chinese speaker's perspective) contains Searle himself, with his own mental states, intentionality and growth. By this I mean that Searle-in-the-room is still able to choose to reject the instruction manual, and still able to be confused by the instructions it gives him – indeed, he is able to get the answers truly wrong, as opposed to pretending to get them wrong (part of the difficulty in the Turing Test of distinguishing a real person from a machine). He is also capable of learning to perform the process more quickly and of recognising symbols with greater ease, until he eventually no longer needs to rely on the book at all – even if he never understands the questions for what they are. Thus Searle's argument is flawed, as he has attributed these mental states to a system that he argues has none.