Solving the 1D Bin Packing Problem Using a Parallel Genetic Algorithm: A Benchmark Test
The past few decades have witnessed the introduction of a wide range of technological innovations that have had an enormous impact on consumers, businesses, and governmental agencies. Computer-based applications in particular have been key in facilitating the delivery of a wide range of services and information, and computer processing speeds have increased steadily. Processing speeds, though, face a natural limit: electrical signals cannot travel faster than the speed of light. Even the best processing speeds attainable in the future will therefore remain constrained in this regard. There are, however, alternative approaches that can further increase the capability of computers, including parallel computing and genetic algorithms, which are discussed further below.
Parallel Computing
In computing, the term "parallelism" is used to describe a system's architecture, in other words, "The organization and interconnection of components of computer systems" (Faulkner, Senker & Velho, 1999, p. 135). Although processing speeds have continued to double roughly every 18 months in line with Moore's Law, computers that use sequential architectures remain constrained in several ways in their ability to perform calculations. Irrespective of how fast a computer with a sequential architecture runs, each separate instruction or processing operation must be completed before the next instruction or processing operation can begin (Faulkner et al., 1999). Increases in processing speed can help sequential processing function more efficiently, but even here there is a fundamental constraint. As Faulkner and his associates emphasize, "This approach has its barriers, not least the speed of light since electrical signals cannot travel faster than light. Another way to increase the performance of computers is to find an alternative architecture" (Faulkner et al., 1999, p. 135). In response to the need for faster and more efficient processing, a number of machines have been introduced during the past half century that employ numerous processors operating in parallel to accomplish a single task (Faulkner et al., 1999). According to Faulkner and his associates, "Most of this development has involved the gradual introduction of parallelism into supercomputers. There are six main forms: concurrent input-output operations, pipelining, memory interleaving and hierarchy, parallel functional units, vector processing, and multiple central processors. However, none of these approaches completely abandons the sequential architecture" (1999, p. 136).
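To make the distinction concrete, the short sketch below runs the same divisible workload first sequentially, one piece after another, and then across several processors. It is an illustrative example only, not drawn from the cited sources: the workload (summing squares over independent chunks of data), the chunk size, and the pool of four worker processes are all assumptions chosen for clarity, here expressed with Python's multiprocessing module.

# Illustrative sketch (not from the cited sources): the same divisible workload
# executed sequentially and then in parallel across several processors.
from multiprocessing import Pool

def process_chunk(chunk):
    """Hypothetical unit of work: a sum of squares over one chunk of data."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    # Split one large job into independent pieces.
    data = list(range(1_000_000))
    chunk_size = 100_000
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

    # Sequential architecture: each piece must finish before the next begins.
    sequential_total = sum(process_chunk(chunk) for chunk in chunks)

    # Parallel architecture: the pieces are handed to several processors at once.
    with Pool(processes=4) as pool:
        parallel_total = sum(pool.map(process_chunk, chunks))

    assert sequential_total == parallel_total   # same job, same answer, different architecture

The point of the sketch is architectural rather than algorithmic: the work itself is unchanged, but in the parallel version no piece has to wait for the previous one to finish.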
Parallel systems are composed of a number of processors, each of which takes responsibility for a discrete portion of a processing job, with the portions executed simultaneously. Although attempts to produce truly parallel processing began during the late 1960s, it was not until the 1980s that the approach became commercially viable with the introduction of so-called "field effect chips" (Faulkner et al., 1999). According to MacKenzie, "In field effect chips, current flows only in the surface plane of the microchip; in a bipolar chip, the current flows perpendicular to the chip as well as along it" (1998, p. 106). Because it could be mass produced with relative ease, highly integrated field effect technology opened up a number of opportunities for configuring large numbers of field effect chips in parallel architectures (MacKenzie, 1998). As Faulkner and his colleagues point out, "Although slower than the bipolar chips conventionally used in supercomputers, large numbers of [field effect chips] could be arrayed or configured in highly parallel architectures to achieve competitive processing speeds" (1999, p. 137). Beginning in the 1980s, parallel architectures became increasingly competitive with so-called supercomputers. In this regard, MacKenzie notes that, "Until the very end of the 1980s, these did not claim to rival mainstream supercomputing in absolute floating-point performance, promising instead a superior price-performance ratio. However, by the start of the 1990s, with the most advanced field effect chips (such as the million-transistor Intel i860) being claimed to offer on a single chip a floating-point processing performance approaching that of a 1976 Cray I, rivalry in absolute performance was growing" (1998, p. 107).
More recently, parallel computers have in fact begun to compare favorably with the performance levels achieved by supercomputers through the use of incremental parallelism; from the start, however, their primary advantage has been a superior price-performance ratio, reducing the costs associated with attaining the fastest processing speeds (Faulkner et al., 1999). Parallel computers differ from sequential machines in three basic ways related to their design, as described in Table 1 below.
Table 1
Basic differences between parallel and sequential processing
Difference | Description
The type and number of processors,...
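As a bridge to the benchmark named in the title, the sketch below shows one way a genetic algorithm for the 1D bin packing problem might exploit such a parallel architecture by distributing fitness evaluation across a pool of processors. It is a minimal illustration under stated assumptions, not the implementation benchmarked in this paper: the bin capacity of 100, the randomly generated items, the first-fit decoder, truncation selection, order crossover, swap mutation, and the four worker processes are all hypothetical choices made for the example.

# Illustrative sketch only (not the benchmarked implementation): a small
# parallel genetic algorithm for 1D bin packing. Chromosomes are permutations
# of item indices; a first-fit decoder counts the bins each permutation uses,
# and the costly fitness evaluations are farmed out to a pool of processors.
import random
from multiprocessing import Pool

random.seed(0)                          # fixed seed so worker processes rebuild the same instance
BIN_CAPACITY = 100                      # assumed bin capacity for this example
ITEMS = [random.randint(10, 70) for _ in range(60)]   # hypothetical problem instance

def count_bins(order):
    """Decode a permutation with the first-fit heuristic; fewer bins is better."""
    bins = []                           # remaining free space in each open bin
    for idx in order:
        size = ITEMS[idx]
        for i, space in enumerate(bins):
            if space >= size:
                bins[i] -= size
                break
        else:
            bins.append(BIN_CAPACITY - size)
    return len(bins)

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in b's order."""
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [gene for gene in b if gene not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(order, rate=0.2):
    """Swap two genes with a small probability."""
    order = order[:]
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

if __name__ == "__main__":
    population = [random.sample(range(len(ITEMS)), len(ITEMS)) for _ in range(40)]
    with Pool(processes=4) as pool:                      # parallel fitness evaluation
        for generation in range(50):
            fitness = pool.map(count_bins, population)   # permutations evaluated concurrently
            ranked = [p for _, p in sorted(zip(fitness, population))]
            parents = ranked[: len(population) // 2]     # truncation selection
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(len(population) - len(parents))]
            population = parents + children
        print("best bin count found:", min(pool.map(count_bins, population)))

Because each chromosome can be decoded and scored independently, fitness evaluation is the natural place to introduce parallelism; in this sketch the selection and recombination steps remain sequential on the main processor.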