IEEE Micro is a broad-based, practitioner-oriented magazine of the IEEE Computer Society targeting small-system and semiconductor-chip professionals, including electronic engineers, designers, architects, developers, process-improvement experts, testers, quality engineers, and project managers. It features peer-reviewed articles, special focus sections, regular columns by prominent authors, technology news, experience reports, and opinion pieces. Its topics include integrated-circuit processes and practices, project management, development tools and infrastructure, chip design and architecture, empirical evaluations of small-system and IC technologies and techniques, and the human and social aspects of system development. In addition, IEEE Micro regularly carries columns devoted to legal developments relating to software and electronic systems (Micro Law), to standardization (Micro Standards), and to economic issues affecting the electronics industry (Micro Economics).
The editors-in-chief of IEEE Micro, since its inception, have been:
1980-1982 -- Richard C. Jaeger
1983-1984 -- Peter Rony and Tom Cain
1985-1987 -- James J. Farrell III
1987-1990 -- Joe Hootman
1991-1994 -- Dante Del Corso
1995-1998 -- Steve Diamond
1999-2001 -- Ken Sakamura
2003-2006 -- Pradip Bose
2007-2008 -- David H. Albonesi
IEEE
IEEE Computer Society is an organizational unit of the Institute of Electrical and Electronics Engineers (IEEE). It was established in 1963 when the American Institute of Electrical Engineers (AIEE) and the Institute of Radio Engineers (IRE) merged to create the IEEE. At the time of the merger, the AIEE’s Subcommittee on Large-Scale Computing (established 1946) merged with the IRE’s Technical Committee on Electronic Computers (established 1948) to create the IEEE Computer Group. The group became the IEEE Computer Society in 1971.
SOFTWARE ENGINEERING
Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software.
The term software engineering first appeared at the 1968 NATO Software Engineering Conference and was meant to provoke thought about the "software crisis" of the time. Since then, software engineering has continued as a profession and field of study dedicated to creating software that is of higher quality, cheaper, more maintainable, and quicker to build. Because the field is still relatively young compared to its sister fields of engineering, there is still much debate over what software engineering actually is and whether it deserves the title of engineering. It has grown organically out of the limitations of viewing software as just programming. Software development is a term sometimes preferred by practitioners who view software engineering as too heavy-handed and constrictive for the malleable process of creating software.
Yet in spite of its youth as a profession, the field's future looks bright: Money Magazine and Salary.com rated software engineering the best job in America in 2006.
RELATIONSHIP WITH OTHER FIELDS
Several alternative names for the discipline have been proposed. Certain departments of major universities prefer the term computing science, to emphasize that the field is about computation rather than computers. The Danish scientist Peter Naur suggested the term datalogy, to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur as the first professor of datalogy. The term is used mainly in the Scandinavian countries. In the early days of computing, a number of terms for practitioners of the field were also suggested in the Communications of the ACM: turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist. Three months later in the same journal, comptologist was suggested, followed the next year by hypologist. The term computics has also been suggested. In Europe, the term Informatik (informatics) has been used more frequently.
The renowned computer scientist Edsger Dijkstra stated, "Computer science is no more about computers than astronomy is about telescopes." The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines. Computer science research has also often crossed into other disciplines, such as cognitive science, economics, mathematics, physics (see quantum computing), and linguistics.
Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines, with some observers saying that computing is a mathematical science. Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.
The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term "software engineering" means, and how computer science is defined. David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.
The academic, political, and funding culture of a computer science department tends to depend on whether the department was formed with a mathematical or an engineering emphasis. Departments with a mathematics emphasis and a numerical orientation tend to consider alignment with computational science. Both types of departments make efforts to bridge the field educationally, if not across all research.
ACHIEVEMENTS
Despite its relatively short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society. These include:
Started the "digital revolution", which includes the current Information Age and the Internet.
A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems (a sketch of the unsolvability argument follows this list).
The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.
In cryptography, breaking the Enigma machine was an important factor contributing to the Allied victory in World War II.
Scientific computing enabled advanced study of the mind, and mapping the human genome became possible with the Human Genome Project. Distributed computing projects such as Folding@home explore protein folding.
Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale.
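To make the unsolvability claim above concrete, here is a minimal Python sketch of the classic diagonal argument for the halting problem. The functions halts and paradox are hypothetical names introduced only for this illustration; the point of the argument is precisely that no total halts function can exist.

# Sketch of the halting-problem diagonal argument (illustrative only).
def halts(program, data):
    # HYPOTHETICAL decider: would return True iff program(data) halts.
    # Assumed to exist for the sake of contradiction.
    raise NotImplementedError("provably impossible in general")

def paradox(program):
    # Do the opposite of whatever 'halts' predicts about program(program).
    if halts(program, program):
        while True:          # predicted to halt, so loop forever
            pass
    return "halted"          # predicted to loop, so halt immediately

# Feeding paradox to itself is contradictory: if halts(paradox, paradox)
# returns True, paradox(paradox) loops forever; if it returns False,
# paradox(paradox) halts. Either way the decider answers wrongly, so no
# such total function 'halts' can exist.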
COMPUTER SCIENCE HISTORY
The early foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Wilhelm Schickard built the first mechanical calculator in 1623. Charles Babbage designed a difference engine and later the Analytical Engine in the Victorian era, and Ada Lovelace wrote the first algorithm intended for the latter. Around 1900, punch-card machines were introduced. However, all of these machines were constrained to perform a single task, or at best some subset of all possible tasks.
During the 1940s, as newer and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors. As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1960s, with the creation of the first computer science departments and degree programs. Since practical computers became available, many applications of computing have become distinct areas of study in their own right.
Although many initially doubted that computers themselves could constitute a legitimate field of scientific study, the idea gradually gained acceptance among academics in the late 1950s. The now well-known IBM brand formed part of the computer science revolution during this time: IBM (short for International Business Machines) released the IBM 704 and later the IBM 709, computers that were widely used during the exploratory period of such devices. "Still, working with the IBM [computer] was frustrating... if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again." During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.
Time has seen significant improvements in the usability and effectiveness of computing technology. Modern society has seen a significant shift from computers being used solely by experts and professionals to a far more widespread user base.
COMPUTER SCIENCE
Computer science (or computing science) is the study of the theoretical foundations of information and computation, and of practical techniques for their implementation and application in computer systems. It is frequently described as the systematic study of algorithmic processes that describe and transform information; the fundamental question underlying computer science is, "What can be (efficiently) automated?" Computer science has many sub-fields; some, such as computer graphics, emphasize the computation of specific results, while others, such as computational complexity theory, study the properties of computational problems. Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to describing computations, while computer programming applies specific programming languages to solve specific computational problems, and human-computer interaction focuses on the challenges in making computers and computations useful, usable, and universally accessible to people.
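As a small illustration of the "efficiently" in that question, the sketch below contrasts two Python functions that solve the same problem at very different costs: linear search may inspect all n elements, while binary search on sorted data inspects roughly log2(n) of them. The function names and data are our own, chosen for the example.

# Two algorithms for the same problem with different costs:
# linear search is O(n); binary search on sorted data is O(log n).
from typing import Sequence, Optional

def linear_search(items: Sequence[int], target: int) -> Optional[int]:
    for i, x in enumerate(items):          # may inspect every element
        if x == target:
            return i
    return None

def binary_search(items: Sequence[int], target: int) -> Optional[int]:
    lo, hi = 0, len(items) - 1
    while lo <= hi:                        # halves the range each step
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None

data = list(range(0, 1_000_000, 2))        # sorted even numbers
assert linear_search(data, 123456) == binary_search(data, 123456)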
The general public sometimes confuses computer science with vocational areas that deal with computers (such as information technology), or thinks that it relates to their own experience of computers, which typically involves activities such as gaming, web browsing, and word processing. However, the focus of computer science is more on understanding the properties of the programs used to implement software such as games and web browsers, and on using that understanding to create new programs or improve existing ones.
INFORMATION TECHNOLOGY

Information technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and securely retrieve information.
Today, the term information technology has ballooned to encompass many aspects of computing and technology, and the term has become very recognizable. The information technology umbrella can be quite large, covering many fields. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems.
When computer and communications technologies are combined, the result is information technology, or "infotech". Information technology (IT) is a general term that describes any technology that helps to produce, manipulate, store, communicate, and/or disseminate information. When information technology is spoken of as a whole, the use of computers to handle information is generally implied.
The term information technology (IT) is sometimes said to have been coined by Jim Domsic of Michigan in November 1981. Domsic, who worked as a computer manager for an automotive-related industry, is supposed to have created the term to modernize the outdated phrase "data processing". The Oxford English Dictionary, however, in defining information technology as "the branch of technology concerned with the dissemination, processing, and storage of information, esp. by means of computers", provides an illustrative quotation from 1958 (Leavitt & Whisler in Harvard Business Review XXXVI: "The new technology does not yet have a single established name. We shall call it information technology.") that predates the so-far unsubstantiated Domsic coinage.
In recent years ABET and the ACM have collaborated to form accreditation and curriculum standards for degrees in Information Technology as a distinct field of study separate from both Computer Science and Information Systems. SIGITE is the ACM working group for defining these standards.
COMPUTING HARDWARE HISTORY

The history of computing hardware encompasses the hardware, its architecture, and its impact on software. The elements of computing hardware have undergone significant improvement over their history. This improvement has spurred worldwide use of the technology as performance has improved and prices have declined, making computers accessible to ever-increasing sectors of the world's population. Computing hardware has become a platform for uses other than computation, such as automation, communication, control, entertainment, and education. Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements.
The von Neumann architecture unifies current computing hardware implementations. Since digital computers rely on digital storage and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers. The major elements of computing hardware implement abstractions: input, output, memory, and processor. A processor is composed of control and datapath. In the von Neumann architecture, control of the datapath is stored in memory, which allowed control to become an automatic process; the datapath could be placed under software control, perhaps in response to events. Beginning with mechanical datapaths such as the abacus and astrolabe, hardware first used physical analogs for computation, including water and even air as the analog quantities: analog computers have used lengths, pressures, voltages, and currents to represent the results of calculations. Eventually the voltages or currents were standardized and then digitized. Digital computing elements have ranged from mechanical gears to electromechanical relays, vacuum tubes, transistors, and integrated circuits, all of which today implement the von Neumann architecture.
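To make the stored-program idea concrete, here is a minimal Python sketch of a toy von Neumann machine. The instruction set is invented for illustration and models no real hardware; the point is that the program and its data occupy the same memory, and a fetch-decode-execute loop in the control steers the datapath.

# Toy von Neumann machine: instructions and data share one memory.
# Instruction format: (opcode, operand). This instruction set is
# invented for illustration and models no real hardware.

def run(memory):
    acc = 0        # accumulator (datapath register)
    pc = 0         # program counter (control state)
    while True:
        op, arg = memory[pc]          # fetch from the same memory as data
        pc += 1
        if op == "LOAD":              # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":             # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":           # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return acc

# Program occupies cells 0-3; data occupies cells 4-6.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(mem))   # -> 5, and mem[6] now holds the sum

Because the program itself sits in ordinary memory cells, a program could in principle modify its own instructions, which is exactly the flexibility (and hazard) the stored-program design introduced.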
COMPUTER

A computer is a machine that manipulates data according to a list of instructions.
Although mechanical examples of computers have existed throughout history, the first devices resembling modern computers were developed in the mid-20th century (1940–1945). The first electronic computers were the size of a large room and consumed as much power as several hundred modern personal computers (PCs). Modern computers based on tiny integrated circuits are millions to billions of times more capable than the early machines and occupy a fraction of the space. Simple computers are small enough to fit into a wristwatch and can be powered by a watch battery. Personal computers in their various forms are icons of the Information Age and are what most people think of as "a computer", but the embedded computers found in devices ranging from fighter aircraft to industrial robots, digital cameras, and children's toys are the most numerous.
The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore computers ranging from a personal digital assistant to a supercomputer are all able to perform the same computational tasks, given enough time and storage capacity.
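One way to see this universality is that a short routine in any general-purpose language can simulate a Turing machine, the reference model behind the thesis. The following Python sketch (with an encoding and example rules invented for this illustration) runs a machine that flips every bit on its tape:

# Minimal Turing machine simulator; any computer that can run this loop
# can, given enough time and memory, compute whatever the machine can.

def simulate(rules, tape, state="start", blank="_", steps=10_000):
    tape = dict(enumerate(tape))      # sparse, effectively unbounded tape
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Example machine (rules invented here): flip bits until a blank is seen.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(simulate(flip, "10110"))  # -> 01001_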