In the first part of this article, we looked at the fundamental building blocks of a CPU: logical units, instruction sets, and architectures. These have formed the foundation of CPUs right from the earliest days of personal computing, and still govern the way today's PCs, laptops, phones, tablets, appliances and accessories are developed. While the scope and scale of CPU design has changed over the years, these concepts are timeless.

In the last few years, we've witnessed a slow transition from 32-bit computing to 64-bit computing. Hardware has been 64-bit capable since 2003, when AMD's Athlon 64 first released, but software support only really became mature with Windows 7 in 2009. Windows XP had sort of served as an experimental 64-bit platform, with not enough device drivers ready to make a proper jump. Microsoft Windows Vista didn't gain enough traction itself, so it was Windows 7 that ushered in the 64-bit era, with drivers, OS and processor hardware all fully compatible with each other. Apple also started moving away from 32-bit systems after Snow Leopard; all subsequent versions of Mac OS X have been 64-bit only.

More recently, Apple announced the A7, the first 64-bit chip to hit phones and tablets. The A7 is based on an ARMv8 design and is codenamed "Cyclone". Intel followed with the Silvermont microarchitecture (branded Atom for the public) later the same year.

So what is this 64-bit thingy, and why is it such a big deal? Remember those registers we discussed in Part 1? On a hardware level, 64-bit registers can hold 64 bits (binary digits, 0 and 1), so for the duration of each clock cycle, more data can be held by a 64-bit register than a smaller one. It also allows for more op codes to be sent to the CPU, and for more permutations and combinations of bits.

The other half of the puzzle is software, which must be aware of the fact that it can use 64-bit hardware, so that when the CPU finally receives its instructions, they are 64-bit aware and the largest blocks of data are also 64 bits in size. All this translates into additional speed for programs that can take advantage of the extra resources. Most programs that were designed for 32-bit environments can run just fine on a 64-bit operating system (and 64-bit hardware) by emulating a 32-bit environment (in other words, the program doesn't know that it's not running on 32-bit hardware or software). On the other hand, device drivers need to be 64-bit on a 64-bit OS; unfortunately this does not work the other way around. Drivers are low-level software interfaces that help the operating system recognise and use the full capabilities of hardware such as graphics cards, scanners, printers, etc.

Single-core, dual-core, quad-core and beyond

So what are these "cores" everyone keeps talking about? A CPU in the old days would just follow the basic outline we described in Part 1. CPUs now usually need to process multiple instructions at once, and therefore have multiple "cores", each of which has resources for performing calculations, almost equivalent to an entire processor.