AI seems to be a big thing these days, with companies using AI for everything from monotonous YouTube voice-overs to questionable artwork. Sometimes it shows up whether it’s welcome or not, such as automated assistants built into computer operating systems. AI has actually been around in some form for years, and not just in science fiction movies. Numerous video games have used some form of AI, whether it be a simple algorithm controlling the opponent paddle in a game of Pong or the computer-controlled rivals hurling blue shells at you in a fierce game of Mario Kart. It was only in recent years, with the rise of generative AI services and large language models like ChatGPT, that it became a major buzzword, with AI systems creating content that is sometimes indistinguishable from human-made work.
Under the hood, however, generative AI is essentially a bunch of computer code running on a bunch of electrical circuits, and during the current AI boom (or bubble, depending on who you ask), numerous companies are trying to develop and profit from the technology. Semiconductor manufacturers, for example, are developing specialized chips (called Neural Processing Units, or NPUs) that are optimized for AI operations. Of course, any computer can perform AI functions, but a specialized chip can do so more efficiently, much as a graphics processor (or GPU) is designed for graphically intensive computations. There is a historical parallel: during the 1990s, processor manufacturers started to offer optional “floating point” co-processors, chips designed to efficiently perform floating-point arithmetic. A regular processor could do floating-point operations, but not very efficiently. Later processors included a built-in floating-point unit (FPU), and many modern computers are now integrating NPU functionality for efficient AI operations.

Growing up around computers (being exposed to machines such as the Acorn Electron, Macintosh SE, and Amstrad PC1512, among others), I have often thought about the parallels between humans and computers. Computer processors are designed with a specific instruction set based on what the processor is meant to do. Early processors typically had only primitive instructions, such as basic arithmetic, instructions to fetch data from and write data to memory, and branching (deciding which instructions to execute based on a value in memory or the result of an operation). More modern systems have more sophisticated and specialized instructions. For example, in the late 1990s, Intel introduced Pentium chips featuring MMX technology, a set of instructions designed for multimedia. AMD followed suit by adding MMX instructions to its processors and then developing 3DNow!, an instruction set aimed at 3D graphics and games beyond what MMX could offer.
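For anyone curious what those primitive instructions look like in practice, here is a minimal sketch in C (the values and names are invented, purely for illustration): a compiler turns each line into roughly the load, add, compare, and jump instructions the processor actually executes.

```c
#include <stdio.h>

int main(void) {
    int score = 7;                 /* fetch a value from memory           */
    int bonus = 5;

    score = score + bonus;         /* basic arithmetic                    */

    if (score > 10) {              /* branch: decide which instructions   */
        printf("high score\n");    /* to execute based on the result      */
    } else {
        printf("keep trying\n");
    }
    return 0;
}
```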
Different computers are optimized for different tasks. A smartphone might not be able to run high-end games at maximum resolution, but it is designed to run for hours or even days on a single battery charge, while a gaming PC draws far more power but delivers higher-resolution graphics and higher frame rates without breaking a sweat. Assuming it is Turing-complete, the mobile processor can still do everything the desktop processor can do; it would just likely take much longer to do it.
What do processor instruction sets have to do with neurodiversity? Like the human brain, a computer has different components, such as memory, both short-term in the form of RAM and long-term in the form of a hard drive or solid-state drive (SSD), as well as specialized components for performing different functions like input and output. There is even something like cache memory for whatever we are working on right now, and as with computers, everyone has a slightly different architecture. There are low-level instructions to take care of basic needs, such as input and output (sensing and muscle commands), as well as higher-level instructions such as language and logic. Some minds perform certain instructions more efficiently than others, and other functions have to be “emulated”.
For me, one of my biggest struggles in life is processing faces, which often results in embarrassment and hinders forming relationships with others. Many people, especially the neurotypical, have dedicated “hardware” for facial processing, but my mind does not natively support that instruction set. Instead, I have to depend on other instructions that are less optimized for the task; while I may eventually recognize someone, it takes much longer and is more error-prone. There are computer algorithms out there that can process faces, and maybe with enough effort I could learn from some of them to recognize faces, but it would take quite a lot of work, and I would still not be able to do such processing in the split second that society expects. By contrast, I have good visual-spatial processing abilities, amazing my parents and teachers with the structures I built from building blocks and Lego and with my knack for navigating a railway system in an unfamiliar city. At the more extreme end are people who may have savant abilities. Their minds might have highly efficient and accurate instructions for processing certain things, such as math problems, or for memory writing and retrieval with a ton of RAM and disk space, but few instructions for “functioning” in society. Most of us, however, might not have such instructions and have to resort to alternative instruction sets to perform those tasks.

Even mental meltdowns, the bane of many parents of neurodivergent kids, have a counterpart in computing. Meltdowns often result from the mind receiving too much input from various sources. In computing, a buffer is a region of memory used to hold working data as it moves between two places. A hard drive, for example, uses a buffer to decouple the variable read and write rates of the moving heads and platters from the system bus. A keyboard buffer, likewise, stores keystrokes from the user until the system can process them. If the user presses a key while the CPU is busy with something else, the keystroke stays in the buffer until the processor is ready for it. If the buffer fills up (e.g. the user keeps typing while the system is busy), the system triggers an error; on older computers this was often signalled by an obnoxious beeping sound. The mind works in a similar way: some neurodivergent people may not be able to process information as quickly or filter out unimportant input as it arrives, so the buffer fills up and results in a crash.
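To make the buffer analogy concrete, here is a deliberately simplified sketch in C (the names and sizes are invented; no real keyboard driver is this crude) of a tiny keystroke buffer that beeps and drops input once it is full:

```c
#include <stdio.h>

#define BUF_SIZE 8            /* tiny keyboard buffer, like old machines had */

static char buffer[BUF_SIZE];
static int count = 0;         /* how many keystrokes are waiting             */

/* Called when a key arrives while the system may be busy elsewhere. */
void key_pressed(char key) {
    if (count == BUF_SIZE) {
        printf("\a");         /* buffer full: the obnoxious beep             */
        return;               /* the keystroke is dropped                    */
    }
    buffer[count++] = key;    /* otherwise it waits its turn                 */
}

/* Called whenever the CPU has a spare moment to handle input. */
void process_one_key(void) {
    if (count == 0) return;              /* nothing waiting                  */
    char key = buffer[0];
    for (int i = 1; i < count; i++)      /* shift the queue forward          */
        buffer[i - 1] = buffer[i];
    count--;
    printf("processed '%c'\n", key);
}

int main(void) {
    for (char k = 'a'; k <= 'j'; k++)    /* ten keys arrive, only eight fit  */
        key_pressed(k);
    while (count > 0)                    /* the CPU finally catches up       */
        process_one_key();
    return 0;
}
```

When more keystrokes arrive than the buffer can hold, the extras are simply rejected with a beep, which is the computing equivalent of the overload described above.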
Whether it’s a versatile all-purpose processor that can do a little bit of everything, a highly optimized processor that can perform dedicated tasks, or something in between, the mind can be like any type of computer, and the world has uses for all makes and models.



































