The traditional idea of comparing the human mind, with all its complexity and biochemical function, to an artificially programmed digital computer is counterproductive and should be discredited in discussions of the theory of artificial intelligence. The comparison is, in crude terms, like comparing cars with airplanes or ice cream with cream cheese. Human mental states are caused by the behavior of various elements in the brain, and that behavior is governed by the brain's biochemical composition, which is responsible for our thoughts and functions. When discussing the mental states of systems, it is important to distinguish the human brain from that of any natural or artificial organism said to have a central processing system (e.g., a chimpanzee brain or a microchip). While there may be various similarities between these systems in terms of function and behavior, the intentionality inherent in them differs widely. Although it may not be possible to prove that mental states do or do not exist in systems other than our own, in this article I will attempt to argue that a machine that computes and responds to inputs does indeed have mental states, even if those states do not necessarily amount to human-like mentality. This article will discuss how the states and intentionality of digital computers differ from human brain states and yet are genuinely states of mind arising from the various functions of their central processing systems. The most common objection to the notion of mental states in digital computers is that computation has inherent limitations and that there are inabilities existing in any algorithm of...

... middle of paper ...

...intelligent and intentional activity that takes place entirely inside the room and within the digital computer. Supporters of Searle's argument, however, would counter that an entity that performs computations, whether a human or a computer, cannot understand the meaning of the symbols it uses. They argue that digital computers understand neither the input they are given nor the output they produce. But it cannot be said that digital computers as a whole cannot understand. Whoever merely enters the data, being only one part of the system, cannot speak for the system as a whole. If there is a person inside the Chinese room manipulating the symbols, that person is already intentional and already has mental states; therefore, thanks to the integration of their hardware and software, which understand inputs and outputs as a whole system, digital computers also have mental states.