… Operating Systems
Copernicus, Newton, Kepler, and Einstein all developed "laws" or theories so fundamental to our understanding of the universe that they are known to most people (Krass, 2003). Moore's observation that computing power would double about every 18 months as engineers found ways to build ever-faster microprocessors is not as famous as Einstein's theories, but it has held up remarkably well over the years.
Many of those advances, however, have been made within the confines of the 32-bit architecture. Since the mid-1990s, though, some computers have offered expanded memory capacity and processing speed through 64-bit architecture. While 64-bit processors were traditionally found only in expensive, proprietary, and complex machines used mainly by scientists and engineers, modern and relatively inexpensive 64-bit processors bring this advantage to the everyday processors that power websites, corporate applications, and even personal computers.
Sixty-four-bit computing represents the third major architecture shift since the invention of the microprocessor. The first shift, in the early 1980s, took processors from 8-bit to 16-bit computing. The second came a few years later with the move from 16-bit to 32-bit computing. The third shift appeared in the mid-1990s, when the first proprietary 64-bit processors hit the market. More recently, Intel, Advanced Micro Devices, and Apple Computer have introduced 64-bit processors for desktop computers and servers running the Windows, Linux, and Mac operating systems.
Understanding the Differences Between 32-Bit and 64-Bit
It is important to understand what a 64-bit processor really does, and what potential advantages it has over a 32-bit processor (Dean, 2003). Many people believe that a 64-bit processor can work on twice as much information at a time as a 32-bit processor. This is true in some respects, but it does not translate into a doubling of performance, as many assume.
A 64-bit processor is simply one that can work with numbers that are up to 64 bits long instead of 32 bits long. Each bit is a binary digit, a 1 or a 0, and each additional bit doubles the range of numbers a binary value can represent. This range is known as the "dynamic range."
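The relationship between bit width and dynamic range can be illustrated with a short sketch (the function name `unsigned_range` is ours, used only for illustration):

```python
# How bit width determines dynamic range: an n-bit unsigned integer
# can represent every value from 0 up to 2**n - 1.

def unsigned_range(bits):
    """Return the largest value an n-bit unsigned integer can hold."""
    return 2 ** bits - 1

print(unsigned_range(32))  # 4294967295  (about 4.3 billion)
print(unsigned_range(64))  # 18446744073709551615
```

Each extra bit doubles the count of representable values, which is why going from 32 to 64 bits multiplies the range by 2^32, not by two.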
Increasing the dynamic range of a processor provides an advantage in that it can more easily manipulate larger numbers. However, this is a limited advantage even in high-end applications. Most CPUs do not need to handle numbers larger than 4.3 billion, and when one does come across such a number occasionally, it can always split it up, although at a cost in performance.
One of the true benefits of 64-bit processors is the ability to access more physical memory (Dean, 2003). Without complicated programming tricks and workarounds, a 32-bit processor can access only 4GB of memory, because each byte of memory must have an address so the processor can locate it. With 4GB of memory installed, every number a 32-bit processor can represent has already been used as an address, so the processor cannot straightforwardly recognize any more memory.
A 64-bit processor can potentially access up to about 18 million terabytes (18 billion gigabytes). Not that any one person needs that much yet, but there are plenty of applications for which 4GB is not enough, such as large database servers, 3D CAD, and scientific analysis.
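The 4GB and 18-million-terabyte figures above fall straight out of the address arithmetic: each byte needs a unique address, so an n-bit address reaches 2^n bytes. A quick check:

```python
# Address-space arithmetic behind the 4GB and ~18 million TB figures.
bytes_32 = 2 ** 32   # addresses reachable with 32-bit addresses
bytes_64 = 2 ** 64   # addresses reachable with 64-bit addresses

print(bytes_32 / 10**9)   # ~4.3 billion bytes, i.e. roughly 4GB
print(bytes_64 / 10**12)  # ~18.4 million terabytes
```

The exact values depend on whether decimal (10^9) or binary (2^30) units are used, which is why sources quote the 64-bit limit as anywhere from 16 to 18 million terabytes.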
Performance benefits will increase when software is written specifically to take advantage of 64-bit processors. In that case, a 64-bit processor can perform an instruction on a larger chunk of data in one operation, but again, this is only of benefit if a user is working with very large numbers or needs very high integer precision. As such, everyday office applications and games see very little improvement from 64-bit technology.
In a nutshell, the main performance-related benefit of 64-bit computing is the ability to access more memory. As many personal computers sold today ship with 512MB of RAM, it is reasonable to assume that PCs shipped in future years might come with 4GB as standard, in which case users will look to a 64-bit processor to make use of it.
An additional benefit of some 64-bit processors, including the 32-bit/64-bit AMD64 processors, is an increase in the number of general-purpose registers (GPRs) (Dean, 2003). Registers hold each piece of data just before it is operated on, so the more registers there are, the more data can be staged for processing - although this raises the further challenge of managing those registers so they are used effectively. Still, more registers are an excellent thing for programmers.
How 64-Bit Processors Work
The term 64-bit describes the size of the addresses the processor uses to organize the system's main memory banks (Krass, 2003). Sixty-four-bit systems use wider registers so that the programs running on the computer can compute...