I'm arguing that the way RAM works is important, because one doesn't need to have the same limitations in RAM behaviour that one has in the brain. Any criticism is targeted at a tendency for some researchers and lay people to think of the brain as being like the computer they sit at every day. To make them more so would bog the article down in details that aren't important to the majority of readers.
Difference 6: no hardware/software distinction can be made with respect to the brain or mind. For years it was tempting to imagine that the brain was the hardware on which a mind program, or mind "software", is executed. At cutting-edge technology processes there is nothing simple about a "simple" electrical logic gate. I don't know what "isomorphic" means, except that it has something to do with forecasting the weather.
Some blame this misunderstanding for an infamous failure. Another pernicious feature of the brain-computer metaphor is that it seems to suggest that brains also operate on the basis of electrical signals (action potentials) traveling along individual logic gates. These mechanisms are fundamental to digital computers and may be critical for the distinctive aspects of human intelligence. Or, more accurately, between their conceptualization of a brain and their conceptualization of a computer.
The only real question will be how big the supercomputer needs to be to model a human brain: maybe it will have to be as big as a watch, maybe as big as a car. AI is being designed on the pattern of brain function as discovered so far, so the reverse is untrue. Also, the argument that the brain is not like a computer does not mean that a computer can't simulate some aspects of the brain; it means that they don't inherently work the same way.
As someone whose specialty happens to be computer science, I would have to say that I agree overall with your overview, except for a few points. However, how the brain works at a macro level, a symbolic level, and how it ultimately gives rise to self-consciousness, for example, has little to do with the detailed working of neurons and synapses as studied in neuroscience. Are we presuming all transistors are simple gates, or are we presuming multi-gate transistors? It seems a neuron is really a simple analog computer.
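The "neuron as a simple analog computer" intuition can be made concrete with a leaky integrate-and-fire model. This is a standard textbook abstraction, not something claimed in this thread: the cell integrates analog input current, leaks charge over time, and emits a discrete spike once a threshold is crossed.

```python
# Minimal leaky integrate-and-fire neuron: a textbook abstraction,
# not a claim about how real neurons work in detail.
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Integrate analog input each step; leak a fraction of the membrane
    potential; emit a spike (1) and reset when the threshold is crossed."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0   # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input still fires periodically, because charge
# accumulates faster than it leaks away:
print(simulate_lif([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

The point of the sketch is that the cell's input side is continuous (analog) while its output is effectively discrete, which is why the analog/digital framing of neurons is slippery.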
There isn't a significant difference between an analog signal and a digital signal of a certain complexity. Machines based on synchronous processors, on the other hand, constantly have the pulse of the clock traveling through the system (and the frequency at which it beats determines the speed of the CPU). The key difference, it seems to me, is the difference between brain processes and computational processes. While I never looked at the problem of parallel working seriously, my approach should be equally valid.
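The clocking point above can be illustrated with a toy sketch (purely illustrative, not a real hardware description): in a synchronous design, state only changes on clock edges, so a value advances one pipeline stage per tick and the clock frequency bounds throughput.

```python
# Toy model of a clocked shift-register pipeline: everything that changes,
# changes only on the clock edge. Names here are illustrative, not from
# any real hardware-description library.
def clock_ticks(pipeline, input_stream):
    """Advance the pipeline one stage per clock tick, feeding in inputs."""
    outputs = []
    for value in input_stream:
        outputs.append(pipeline[-1])          # last stage leaves the pipe
        pipeline = [value] + pipeline[:-1]    # everything shifts on the edge
    return outputs

# A 3-stage pipeline, initially empty (None): each input emerges 3 ticks
# later, no matter how "fast" the logic inside a stage is.
print(clock_ticks([None, None, None], [1, 2, 3, 4, 5]))
# → [None, None, None, 1, 2]
```

The brain, by contrast, has no obvious analogue of this global edge that everything waits on, which is part of what the thread is arguing about.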
Well, your site is very informative, and it will be equally worthwhile. Now granted, I may be the only one here without a science degree, but this is more me thinking out loud. That's a new point of view: an internet model of the brain, with computers as neurons. But the biggest problem isn't a mistake but an unspoken assumption: it's being argued (incorrectly, as it happens) that brains aren't like computers, when the arguments being made are actually about the idea that brains aren't like one particular type of computational device.
My gut instinct is telling me here that a brain based completely on spaghetti wiring just wouldn't work very well. You might call me out for nitpicking here, but CPUs don't require system clocks. If you want to get into cutting-edge, high-end, or low-market-share technology, then the argument requires more support, but it is far from invalidated. Simulating massively parallel systems on CPU-based systems is worse, and less reliable. It might well be possible to make something like content-addressable memory in the RAM model, but it would be a bloody hack with no connection to our usual programming schemes. Then too, our ability to program neural nets is frankly humbled by the ordinary development of almost any vertebrate's nervous system. I don't see them as being especially valuable to actually researching more natural computing devices (like the brain).
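The content-addressable-memory point can be sketched in a few lines, with the caveat that this is a hedged illustration, not a serious CAM design: on ordinary address-based RAM, exact-match lookup by content is cheap (a hash table), but partial-match recall, the kind of cue-driven retrieval brains seem to do, falls back to scanning everything.

```python
# Sketch of content-addressable recall on top of the RAM model.
# Exact-match CAM is what a dict already gives you; partial-match recall
# by cue requires a linear scan over the whole store.
def recall(memory, cue):
    """Return every stored record that contains the partial cue."""
    return [item for item in memory if cue in item]

memory = ["the cat sat", "a dog barked", "the cat ran"]
print(recall(memory, "cat"))   # → ['the cat sat', 'the cat ran']
```

This is exactly the "bloody hack" flavor the comment describes: the cue-to-record association isn't in the memory hardware at all, it's reconstructed by software walking over addresses.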
It would be really strange, but I'm just pointing out that it's still theoretically possible at this point. The Blue Brain project apparently needs to use one processor just to emulate one neuron (OK, agreed with you all). Thanks! Very happy to be here; thank you for the article, it let me learn many useful things!

At first, I didn't think I was doing anything questionable. Much more likely is that the vast majority of the complexity is accidental. Similarly, there does not appear to be any central clock in the brain, and there is debate as to how clock-like the brain's time-keeping devices actually are. If you had millions of identical processors, it could just as easily work by passing subsidiary tasks onto other identical processors to work in parallel.

Consciousness could be an innate property of all living organisms, or of matter in general, or even be independent of matter, operating as a field that is concentrated and finds an interface in neural networks through some as yet unknown principle. Moreover, considering just the space and weight penalties, wouldn't something that seems as unnecessarily convoluted as a neuron soon be replaced in evolutionary competition by something as elegantly simple as a network of organic transistors? Yet that never happened in billions of years of evolutionary history, and if there's a good reason for that, it may not happen now, either. At most, one could say it is a necessary condition that some such process be present for the associated experience of consciousness to occur in a living brain. Emotionally, I find the idea that the brain isn't a Turing machine only slightly less distasteful and unlikely than violating the physical domain in explaining consciousness.
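The "millions of identical processors passing subsidiary tasks around" idea can be sketched with a worker pool. This is a minimal illustration under stated assumptions: a small thread pool stands in for the processor array, and `subtask` is a hypothetical placeholder for whatever piece of work gets delegated.

```python
# Sketch of delegating subsidiary tasks to a pool of identical workers.
# ThreadPoolExecutor stands in for "millions of identical processors";
# subtask() is a hypothetical stand-in for the delegated work.
from concurrent.futures import ThreadPoolExecutor

def subtask(x):
    return x * x   # placeholder for a delegated piece of work

def run_in_parallel(data, workers=4):
    """Farm each item out to an identical worker and collect the results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(subtask, data))

print(run_in_parallel(range(6)))   # → [0, 1, 4, 9, 16, 25]
```

The design point is that because every worker is identical, it doesn't matter which one a subsidiary task lands on, which is the property the comment is leaning on.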