Tuesday, 28 August 2012

The New Science of Technical Analysis (Thomas DeMark)

Today I finally finished reading Tom DeMark's The New Science of Technical Analysis. It's difficult to judge a book written by a living person with an almost legendary status in the world of financial trading. Even though the book was written more than 16 years ago, it's still a generous source of extremely useful insights on technical analysis. DeMark tries to apply a scientific approach to one of the most unscientific fields, financial trading, and he does so rather successfully.
The book is quite long and it isn't easy to read. It includes many chart examples from all sorts of markets. Here is my attempt to list the main theses DeMark puts forward in his work:
  • Conventional market analysis methods are inaccurate and fail to show consistent returns.
  • A strictly mathematical, emotionless approach is the cornerstone of successful technical analysis.
  • The uptrend is over “when the last buyer has bought” and the downtrend is over “when the last seller has sold”.
  • There are many ways to analyze charts (trendlines, patterns, indicators, etc.), and they all work if used properly.
  • Developing and testing your own analytical and trading tools is an important part of learning.
The New Science of Technical Analysis is a great book. It's a rich source of trading ideas and inspirational material. I recommend reading it because of the following advantages:
  • A strict mathematical approach to determining chart points and signals.
  • All areas of technical analysis are covered.
  • The book presents a lot of ideas for indicators, expert advisors and fully developed trading strategies.
  • It will inspire you to develop your own analytical rules and tools.
With all the pros, unfortunately, this book has its cons too. Here's a subjective list of the disadvantages that kept me from enjoying it as much as I usually enjoy reading:
  • It's written in very dense language; it's almost impossible to read without stopping and going back over passages.
  • A lot of complex things are described poorly. For example, if not for the sample code provided in the book, I wouldn't have been able to build a Range Expansion Index for MetaTrader from DeMark's textual description alone (a simplified sketch of the calculation follows this list).
  • ™ symbols appear everywhere in the text, and I hate them :-).
  • DeMark talks a lot about the research behind the presented methods, but he doesn't provide any statistical data to back it up. That's a huge disadvantage for a book that advocates a scientific approach to trading.
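For the curious, here is a minimal Python sketch of a simplified Range Expansion Index. It assumes the commonly cited 8-bar lookback and deliberately omits DeMark's qualifying conditions, so treat it as a rough starting point rather than DeMark's exact formula; the price series in the usage example are made up.

    def range_expansion_index(highs, lows, period=8):
        """Simplified Range Expansion Index (REI) sketch.

        `highs` and `lows` are equal-length price series. DeMark's
        qualifying conditions are intentionally omitted: this only
        captures the core idea of net two-bar range expansion
        normalized by total absolute two-bar range movement.
        """
        rei = [None] * len(highs)
        for i in range(period + 1, len(highs)):
            net, total = 0.0, 0.0
            for j in range(i - period + 1, i + 1):
                up = highs[j] - highs[j - 2]
                down = lows[j] - lows[j - 2]
                net += up + down
                total += abs(up) + abs(down)
            rei[i] = 100.0 * net / total if total else 0.0
        return rei

    # Made-up prices, just to show the call:
    highs = [1.10, 1.12, 1.11, 1.13, 1.15, 1.14, 1.16, 1.18, 1.17, 1.19, 1.20, 1.21]
    lows = [1.08, 1.09, 1.09, 1.10, 1.12, 1.11, 1.13, 1.15, 1.14, 1.16, 1.17, 1.18]
    print(range_expansion_index(highs, lows))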
Overall, it's still an awesome book to read, even if difficult at times. It will help you stick to accurate analytical methods and will give you enough basic information to create your own methods of technical analysis. And once you get used to them, you'll be able to trade successfully. If you plan to trade only Forex, you can skip some chapters and spare yourself some of Tom's writing style.

If you have any questions, comments or opinions regarding The New Science of Technical Analysis by Thomas R. DeMark, please, feel free to reply in the comments below.

Sunday, 26 August 2012

Technology

By the mid 20th century, humans had achieved a mastery of technology sufficient to leave the atmosphere of the Earth for the first time and explore space.
Technology is the making, modification, usage, and knowledge of tools, machines, techniques, crafts, systems, methods of organization, in order to solve a problem, improve a preexisting solution to a problem, achieve a goal or perform a specific function. It can also refer to the collection of such tools, machinery, modifications, arrangements and procedures. Technologies significantly affect human as well as other animal species' ability to control and adapt to their natural environments. The word technology comes from Greek τεχνολογία (technología); from τέχνη (téchnē), meaning "art, skill, craft", and -λογία (-logía), meaning "study of-".[1] The term can either be applied generally or to specific areas: examples include construction technology, medical technology, and information technology.
The human species' use of technology began with the conversion of natural resources into simple tools. The prehistorical discovery of the ability to control fire increased the available sources of food and the invention of the wheel helped humans in travelling in and controlling their environment. Recent technological developments, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale. However, not all technology has been used for peaceful purposes; the development of weapons of ever-increasing destructive power has progressed throughout history, from clubs to nuclear weapons.
Technology has affected society and its surroundings in a number of ways. In many societies, technology has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of the Earth and its environment. Various implementations of technology influence the values of a society and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.
Philosophical debates have arisen over the present and future use of technology in society, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar movements criticise the pervasiveness of technology in the modern world, opining that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition. Indeed, until recently, it was believed that the development of technology was restricted only to human beings, but recent scientific studies indicate that other primates and certain dolphin communities have developed simple tools and learned to pass their knowledge to other generations.

Definition and usage

The invention of the printing press made it possible for scientists and politicians to communicate their ideas with ease, leading to the Age of Enlightenment; an example of technology as a cultural force.
The use of the term technology has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and usually referred to the description or study of the useful arts.[2] The term was often connected to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[3] "Technology" rose to prominence in the 20th century in connection with the Second Industrial Revolution. The meanings of technology changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into "technology." In German and other European languages, a distinction exists between Technik and Technologie that is absent in English, as both terms are usually translated as "technology." By the 1930s, "technology" referred not to the study of the industrial arts, but to the industrial arts themselves.[4] In 1937, the American sociologist Read Bain wrote that "technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them."[5] Bain's definition remains common among scholars today, especially social scientists. But equally prominent is the definition of technology as applied science, especially among scientists and engineers, although most social scientists who study technology reject this definition.[6] More recently, scholars have borrowed from European philosophers of "technique" to extend the meaning of technology to various forms of instrumental reason, as in Foucault's work on technologies of the self ("techniques de soi").
Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster dictionary offers a definition of the term: "the practical application of knowledge especially in a particular area" and "a capability given by the practical application of knowledge".[1] Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept; it is "practice, the way we do things around here".[7] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[8] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life", and as "organized inorganic matter."[9]
Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.[10]
The word "technology" can also be used to refer to a collection of techniques. In this context, it is the current state of humanity's knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as "medical technology" or "space technology", it refers to the state of the respective field's knowledge and tools. "State-of-the-art technology" refers to the high technology available to humanity in any field.
Technology can be viewed as an activity that forms or changes culture.[11] Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and, as a result, has helped spawn new subcultures; the rise of cyberculture has, at its basis, the development of the Internet and the computer.[12] Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.

Science, engineering and technology

The distinction between science, engineering and technology is not always clear. Science is the reasoned investigation or study of phenomena, aimed at discovering enduring principles among elements of the phenomenal world by employing formal techniques such as the scientific method.[13] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability and safety.
Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.
Technology is often a consequence of science and engineering — although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors, by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines, such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[14]
The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, in the United States it was widely considered that technology was simply "applied science" and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush's treatise on postwar science policy, Science—The Endless Frontier: "New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature... This essential new knowledge can be obtained only through basic scientific research." In the late-1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious—though most analysts resist the model that technology simply is a result of scientific research.[15][16]

History

Paleolithic (2.5 million – 10,000 BC)

A primitive chopper
The use of tools by early humans was partly a process of discovery, partly of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[17] with a brain mass approximately one third that of modern humans.[18] Tool use remained relatively unchanged for most of early human history, but approximately 50,000 years ago, a complex set of behaviors and tool use emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[19]

Stone tools

Hand axes from the Acheulian period
Human ancestors have been using stone and other tools since long before the emergence of Homo sapiens approximately 200,000 years ago.[20] The earliest methods of stone tool making, known as the Oldowan "industry", date back to at least 2.3 million years ago,[21] with the earliest direct evidence of tool usage found in Ethiopia within the Great Rift Valley, dating back to 2.5 million years ago.[22] This era of stone tool use is called the Paleolithic, or "Old stone age", and spans all of human history up to the development of agriculture approximately 12,000 years ago.
To make a stone tool, a "core" of hard stone with specific flaking properties (such as flint) was struck with a hammerstone. This flaking produced a sharp edge on the core stone as well as on the flakes, either of which could be used as tools, primarily in the form of choppers or scrapers.[23] These tools greatly aided the early humans in their hunter-gatherer lifestyle to perform a variety of tasks including butchering carcasses (and breaking bones to get at the marrow); chopping wood; cracking open nuts; skinning an animal for its hide; and even forming other tools out of softer materials such as bone and wood.[24]
The earliest stone tools were crude, being little more than a fractured rock. In the Acheulian era, beginning approximately 1.65 million years ago, methods of working these stones into specific shapes, such as hand axes, emerged. The Middle Paleolithic, approximately 300,000 years ago, saw the introduction of the prepared-core technique, where multiple blades could be rapidly formed from a single core stone.[23] The Upper Paleolithic, beginning approximately 40,000 years ago, saw the introduction of pressure flaking, where a wood, bone, or antler punch could be used to shape a stone very finely.[25]

Fire

The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[26] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1,000,000 BC;[27] scholarly consensus indicates that Homo erectus had controlled fire by between 500,000 BC and 400,000 BC.[28][29] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.[30]

Clothing and shelter

Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity's progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380,000 BC, humans were constructing temporary wood huts.[31][32] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200,000 BC and into other continents, such as Eurasia.[33]

Neolithic through classical antiquity (10,000 BC – 300 AD)

An array of Neolithic artifacts, including bracelets, axe heads, chisels, and polishing tools.
Man's technological ascent began in earnest in what is known as the Neolithic period ("New stone age"). The invention of polished stone axes was a major advance because it allowed forest clearance on a large scale to create farms. The discovery of agriculture allowed for the feeding of larger populations, and the transition to a sedentary lifestyle increased the number of children that could be simultaneously raised, as young children no longer needed to be carried, as was the case with the nomadic lifestyle. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer lifestyle.[34][35]
With this increase in population and availability of labor came an increase in labor specialization.[36] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures, the specialization of labor, trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges, such as the building of dikes and reservoirs, are all thought to have played a role.[37]

Metal tools

Continuing improvements led to the furnace and bellows and provided the ability to smelt and forge native metals (naturally occurring in relatively pure form).[38] Gold, copper, silver, and lead were such early metals. The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 8000 BC).[39] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BC). The first use of iron alloys such as steel dates to around 1400 BC.

Energy and transport

The wheel was invented circa 4000 BC.
Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailboat.[40] The earliest record of a ship under sail is shown on an Egyptian pot dating back to 3200 BC.[41] From prehistoric times, Egyptians probably used the power of the Nile annual floods to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and 'catch' basins. Similarly, the early peoples of Mesopotamia, the Sumerians, learned to use the Tigris and Euphrates rivers for much the same purposes. But more extensive use of wind and water (and even human) power required another invention.
According to archaeologists, the wheel was invented around 4000 BC, probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe. Estimates of when this may have occurred range from 5500 to 3000 BC, with most experts putting it closer to 4000 BC. The oldest artifacts with drawings that depict wheeled carts date from about 3000 BC; however, the wheel may have been in use for millennia before these drawings were made. There is also evidence from the same period that wheels were used for the production of pottery. (Note that the original potter's wheel was probably not a wheel, but rather an irregularly shaped slab of flat wood with a small hollowed or pierced area near the center, mounted on a peg driven into the earth. It would have been rotated by repeated tugs by the potter or his assistant.) More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[42]
The invention of the wheel revolutionized activities as disparate as transportation, war, and the production of pottery (for which it may have been first used). It didn't take long to discover that wheeled wagons could be used to carry heavy loads and fast (rotary) potters' wheels enabled early mass production of pottery. But it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources.

Medieval and modern history (300 AD – present)

Innovation continued through the Middle Ages with developments such as silk, the horse collar and horseshoes in the first few hundred years after the fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills and clocks. The Renaissance brought forth many of these innovations, including the printing press (which facilitated the greater communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. The advancements in technology in this era allowed a more steady supply of food, followed by the wider availability of consumer goods.
The automobile revolutionized personal transportation.
Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy and transport, driven by the discovery of steam power. Technology later took another step with the harnessing of electricity to create such innovations as the electric motor, light bulb and countless others. Scientific advancement and the discovery of new concepts later allowed for powered flight, and advancements in medicine, chemistry, physics and engineering. The rise in technology has led to the construction of skyscrapers and large cities whose inhabitants rely on automobiles or other powered transit for transportation. Communication was also greatly improved with the invention of the telegraph, telephone, radio and television. The late 19th and early 20th centuries saw a revolution in transportation with the invention of the steam-powered ship, train, airplane, and automobile.
F-15 and F-16 flying over a burning oil field in Kuwait in 1991.
The 20th century brought a host of innovations. In physics, the discovery of nuclear fission led to both nuclear weapons and nuclear power. Computers were also invented and later miniaturized using transistors and integrated circuits. The technology behind them came to be called information technology, and these advancements subsequently led to the creation of the Internet, which ushered in the current Information Age. Humans have also been able to explore space with satellites (later used for telecommunication) and in manned missions going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem cell therapy along with new medications and treatments. Complex manufacturing and construction techniques and organizations are needed to construct and maintain these new technologies, and entire industries have arisen to support and develop succeeding generations of increasingly more complex tools. Modern technology increasingly relies on training and education; its designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have been created to support them, including engineering, medicine, and computer science, and other fields have been made more complex, such as construction, transportation and architecture.

Technology and philosophy

Technicism

Generally, technicism is a reliance or confidence in technology as a benefactor of society. Taken to an extreme, technicism is the belief that humanity will ultimately be able to control the entirety of existence using technology. In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[43] connect these ideas to the abdication of religion as a higher moral authority.

Optimism

Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for the society and the human condition. In these ideologies, technological development is morally good. Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[44]

Skepticism and critics of technology

On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become evermore technological at the cost of freedom and psychological health.
Many, such as the Luddites and the prominent philosopher Martin Heidegger, hold serious, although not entirely deterministic, reservations about technology (see "The Question Concerning Technology"[45]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, "Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that 'in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.' Indeed, he promises that 'when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.'[46]" What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow.[47]
Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics, for example Aldous Huxley's Brave New World and other writings, Anthony Burgess's A Clockwork Orange, and George Orwell's Nineteen Eighty-Four. In Goethe's Faust, the selling of Faust's soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction, such as those by Philip K. Dick and William Gibson, and films (e.g. Blade Runner, Ghost in the Shell) project highly ambivalent or cautionary attitudes toward technology's impact on human society and identity.
The late cultural critic Neil Postman distinguished tool-using societies from technological societies and, finally, what he called "technopolies," that is, societies that are dominated by the ideology of technological and scientific progress, to the exclusion or harm of other cultural practices, values and world-views.[48]
Darin Barney has written about technology's impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests that technology tends to make ethical questions, including the question of what a good life consists in, nearly impossible, because it already gives an answer to the question: a good life is one that includes the use of more and more technology.[49]
Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[50]
Another prominent critic of technology is Hubert Dreyfus, who has published the books On the Internet and What Computers Still Can't Do.
Another, more infamous anti-technological treatise is Industrial Society and Its Future, written by Theodore Kaczynski (aka The Unabomber) and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure.

Appropriate technology

The notion of appropriate technology, however, was developed in the 20th century (e.g., see the work of Jacques Ellul) to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The eco-village movement emerged in part due to this concern.

Technology and competitiveness

In 1983 a classified program was initiated in the US intelligence community to reverse the decline in US economic and military competitiveness. The program, Project Socrates, used all-source intelligence to review competitiveness worldwide for all forms of competition in order to determine the source of the US decline. Project Socrates determined that technology exploitation is the foundation of all competitive advantage, and that the source of the declining US competitiveness was that decision making throughout the US, in both the private and public sectors, had switched at the end of World War II from decision making based on technology exploitation (i.e., technology-based planning) to decision making based on money exploitation (i.e., economic-based planning).
Technology is properly defined as any application of science to accomplish a function. The science can be leading edge or well established and the function can have high visibility or be significantly more mundane but it is all technology, and its exploitation is the foundation of all competitive advantage.
Technology-based planning is what was used to build the US industrial giants before WWII (e.g., Dow, DuPont, GM) and it is what was used to transform the US into a superpower. It was not economic-based planning.
Project Socrates determined that to rebuild US competitiveness, decision making throughout the US had to readopt technology-based planning. Project Socrates also determined that countries like China and India had continued executing technology-based planning (while the US took its detour into economic-based planning), and as a result had considerably advanced the process and were using it to build themselves into superpowers. To rebuild US competitiveness, US decision-makers needed to adopt a form of technology-based planning that was far more advanced than that used by China and India.
Project Socrates determined that technology-based planning makes an evolutionary leap forward every few hundred years and the next evolutionary leap, the Automated Innovation Revolution, was poised to occur. In the Automated Innovation Revolution the process for determining how to acquire and utilize technology for a competitive advantage (which includes R&D) is automated so that it can be executed with unprecedented speed, efficiency and agility.
Project Socrates developed the means for automated innovation so that the US could lead the Automated Innovation Revolution in order to rebuild and maintain the country's economic competitiveness for many generations.[51][52][53]

Other animal species

This adult gorilla uses a branch as a walking stick to gauge the water's depth; an example of technology usage by non-human primates.
The use of basic technology is also a feature of other animal species apart from humans. These include primates such as chimpanzees, some dolphin communities,[54][55] and crows.[56][57] Considering a more generic perspective of technology as ethology of active environmental conditioning and control, we can also refer to animal examples such as beavers and their dams, or bees and their honeycombs.
The ability to make and use tools was once considered a defining characteristic of the genus Homo.[58] However, the discovery of tool construction among chimpanzees and related primates has discarded the notion of the use of technology as unique to humans. For example, researchers have observed wild chimpanzees utilising tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[59] West African chimpanzees also use stone hammers and anvils for cracking nuts,[60] as do capuchin monkeys of Boa Vista, Brazil.[61]

Future technology

Theories of technology often attempt to predict the future of technology based on the high technology and science of the time.

Computer science

Computer science or computing science (abbreviated CS or CompSci) is the scientific and mathematical approach to computation, and specifically to the design of computing machines and processes. A computer scientist is a scientist who specialises in the theory of computation and the design of computers.[1]
Its subfields can be divided into practical techniques for its implementation and application in computer systems and purely theoretical areas. Some, such as computational complexity theory, which studies fundamental properties of computational problems, are highly abstract, while others, such as computer graphics, emphasize real-world applications. Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to description of computations, while the study of computer programming itself investigates various aspects of the use of programming languages and complex systems, and human-computer interaction focuses on the challenges in making computers and computations useful, usable, and universally accessible to humans.
Computer science deals with the theoretical foundations of information and computation, together with practical techniques for the implementation and application of these foundations.

History

Charles Babbage is credited with inventing the first mechanical computer.
Ada Lovelace is credited with writing the first algorithm intended for processing on a computer.
The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity. Wilhelm Schickard designed the first mechanical calculator in 1623, but did not complete its construction.[2] Blaise Pascal designed and constructed the first working mechanical calculator, the Pascaline, in 1642. In 1694 Gottfried Wilhelm Leibnitz completed the Step Reckoner, the first calculator that could perform all four arithmetic operations. Charles Babbage designed a difference engine and then a general-purpose Analytical Engine in Victorian times,[3] for which Ada Lovelace wrote a manual. Because of this work she is regarded today as the world's first programmer.[4] Around 1900, punched card machines were introduced.
During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.[5] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[6][7] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.[8] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.
Although many initially believed it was impossible that computers themselves could actually be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population.[9] It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704 and later the IBM 709 computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating...if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[9] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.
Time has seen significant improvements in the usability and effectiveness of computer science technology. Modern society has seen a significant shift from computers being used solely by experts or professionals to a more widespread user base. Initially, computers were quite costly, and for their most-effective use, some degree of human aid was needed, in part by professional computer operators. However, as computers became widespread and far more affordable, less human assistance was needed, although residues of the original assistance still remained.

Major achievements

The German military used the Enigma machine (shown here) during World War II for communication they thought to be secret. The large-scale decryption of Enigma traffic at Bletchley Park was an important factor that contributed to Allied victory in WWII.[10]
Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society. In fact, along with electronics, it is a founding science of the current epoch of human history called the Information Age and a driver of the Information Revolution, seen as the third major leap in human technological progress after the Industrial Revolution (1750-1850 CE) and the Agricultural Revolution (8000-5000 BCE).
These contributions include:

Philosophy

A number of computer scientists have argued for the distinction of three separate paradigms in computer science. Peter Wegner argued that those paradigms are science, technology, and mathematics.[16] Peter Denning's working group argued that they are theory, abstraction (modeling), and design.[17] Amnon H. Eden described them as the "rationalist paradigm" (which treats computer science as branch of mathematics, which is prevalent in theoretical computer science, and mainly employs deductive reasoning), the "technocratic paradigm" (which might be found in engineering approaches, most prominently in software engineering), and the "scientific paradigm" (which approaches computer-related artifacts from the empirical perspective of natural sciences, identifiable in some branches of artificial intelligence).[18]

Name of the field

The term "computer science" was first coined by the numerical analyst George Forsythe in 1961.[19] Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed. Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy, to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. Also, in the early days of computing, a number of terms for the practitioners of the field were suggested in the Communications of the ACM: turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist.[20] Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[21] The term computics has also been suggested.[22] In Europe, terms derived from contracted translations of the expression "automatic information" (e.g. "informazione automatica" in Italian) or "information and mathematics" are often used, e.g. informatique (French), Informatik (German), informatica (Italian), informática (Spanish, Portuguese) or informatika (Slavic languages); such terms have also been adopted in the UK (as in the School of Informatics of the University of Edinburgh).[23]
A folkloric quotation, often attributed to—but almost certainly not first formulated by—Edsger Dijkstra, states that "computer science is no more about computers than astronomy is about telescopes."[note 1] The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines. Computer science research also often intersects other disciplines, such as philosophy, cognitive science, linguistics, mathematics, physics, statistics, and logic.
Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines, with some observers saying that computing is a mathematical science.[6] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.
The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term "software engineering" means, and how computer science is defined.[24] David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.[25]
The academic, political, and funding aspects of computer science tend to depend on whether a department is formed with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics emphasis and a numerical orientation tend to consider alignment with computational science. Both types of departments tend to make efforts to bridge the field educationally if not across all research.

Areas of computer science

As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of computation to the practical issues of implementing computing systems in hardware and software.[26][27] CSAB, formerly called Computing Sciences Accreditation Board – which is made up of representatives of the Association for Computing Machinery (ACM), and the IEEE Computer Society (IEEE-CS)[28] – identifies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also identifies fields such as software engineering, artificial intelligence, computer networking and communication, database systems, parallel computation, distributed computation, computer-human interaction, computer graphics, operating systems, and numerical and symbolic computation as being important areas of computer science.[26]

Theoretical computer science

The broader field of theoretical computer science encompasses both the classical theory of computation and a wide range of other topics that focus on the more abstract, logical, and mathematical aspects of computing.

Theory of computation

According to Peter J. Denning, the fundamental question underlying computer science is, "What can be (efficiently) automated?"[6] The study of the theory of computation is focused on answering fundamental questions about what can be computed and what amount of resources are required to perform those computations. In an effort to answer the first question, computability theory examines which computational problems are solvable on various theoretical models of computation. The second question is addressed by computational complexity theory, which studies the time and space costs associated with different approaches to solving a multitude of computational problems.
The famous "P=NP?" problem, one of the Millennium Prize Problems,[29] is an open problem in the theory of computation.
Subfields include automata theory, computability theory, computational complexity theory, cryptography, and quantum computing theory.
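As a small illustration of the kind of cost difference complexity theory studies, here is a sketch comparing two approaches to the same problem: a naive recursive Fibonacci (exponential time) against a memoized version (linear time). It is only a toy, not part of the theory itself.

    from functools import lru_cache
    import time

    def fib_naive(n):
        # Exponential time: the same subproblems are recomputed many times.
        return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

    @lru_cache(maxsize=None)
    def fib_memo(n):
        # Linear time: each subproblem is computed once and cached.
        return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

    for f in (fib_naive, fib_memo):
        start = time.perf_counter()
        value = f(30)
        print(f.__name__, value, f"{time.perf_counter() - start:.4f}s")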

Information and coding theory

Information theory is related to the quantification of information. This was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data. Coding theory is the study of the properties of codes (systems for converting information from one form to another) and their fitness for a specific application. Codes are used for data compression, cryptography, error detection and correction, and more recently also for network coding. Codes are studied for the purpose of designing efficient and reliable data transmission methods.
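As a minimal sketch of the central quantity in information theory, the snippet below computes the Shannon entropy (in bits per symbol) of a byte string; the sample data is made up.

    import math
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies."""
        counts = Counter(data)
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(shannon_entropy(b"aaaaaaab"))        # low entropy: highly compressible
    print(shannon_entropy(bytes(range(256))))  # 8.0 bits: maximal for 256 symbols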

Algorithms and data structures

Topics include analysis of algorithms, algorithms, data structures, and computational geometry.
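The original page illustrated this area with a quicksort animation; as a stand-in, here is a short Python sketch of quicksort, written as a simple non-in-place variant for readability rather than performance.

    def quicksort(items):
        """Return a sorted copy of `items` (simple, not in-place)."""
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]
        less = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        greater = [x for x in items if x > pivot]
        return quicksort(less) + equal + quicksort(greater)

    print(quicksort([5, 3, 8, 1, 9, 2, 7]))  # [1, 2, 3, 5, 7, 8, 9]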

Programming language theory

Programming language theory (PLT) is a branch of computer science that deals with the design, implementation, analysis, characterization, and classification of programming languages and their individual features. It falls within the discipline of computer science, both depending on and affecting mathematics, software engineering and linguistics. It is an active research area, with numerous dedicated academic journals.
Topics include type theory, compiler design, and programming languages.
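As a toy illustration of the design-and-implementation side of programming language theory, here is a minimal recursive-descent evaluator for arithmetic expressions; the grammar and names are invented for this sketch.

    import re

    TOKEN = re.compile(r"\s*(\d+|[()+\-*/])")

    def evaluate(text):
        tokens = TOKEN.findall(text)
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def eat():
            nonlocal pos
            tok = tokens[pos]
            pos += 1
            return tok

        def expr():          # expr := term (('+'|'-') term)*
            value = term()
            while peek() in ("+", "-"):
                op = eat()
                value = value + term() if op == "+" else value - term()
            return value

        def term():          # term := factor (('*'|'/') factor)*
            value = factor()
            while peek() in ("*", "/"):
                op = eat()
                value = value * factor() if op == "*" else value / factor()
            return value

        def factor():        # factor := NUMBER | '(' expr ')'
            tok = eat()
            if tok == "(":
                value = expr()
                eat()        # consume ')'
                return value
            return int(tok)

        return expr()

    print(evaluate("2 + 3 * (4 - 1)"))  # 11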

Formal methods

Formal methods are a particular kind of mathematically based technique for the specification, development and verification of software and hardware systems. The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design. However, the high cost of using formal methods means that they are usually only used in the development of high-integrity and life-critical systems, where safety or security is of utmost importance. Formal methods are best described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types to problems in software and hardware specification and verification.
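Full formal verification relies on proof assistants and model checkers, but as a loosely related sketch of the "write a specification, then check the implementation against it" mindset, the snippet below exhaustively checks a small sorting routine against its specification over a bounded input space. This is bounded exhaustive testing, not a proof, and the routine and bounds are invented for the example.

    from itertools import product

    def insertion_sort(xs):
        """Implementation under test."""
        out = []
        for x in xs:
            i = len(out)
            while i > 0 and out[i - 1] > x:
                i -= 1
            out.insert(i, x)
        return out

    def satisfies_spec(inp, out):
        """Specification: the output is sorted and is a permutation of the input."""
        is_sorted = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
        is_permutation = sorted(out) == sorted(inp)
        return is_sorted and is_permutation

    # Check every input of length <= 4 over a small alphabet.
    for length in range(5):
        for case in product([0, 1, 2], repeat=length):
            assert satisfies_spec(list(case), insertion_sort(list(case)))
    print("specification holds for all bounded cases")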

Concurrent, parallel and distributed systems

Concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other. A number of mathematical models have been developed for general concurrent computation including Petri nets, process calculi and the Parallel Random Access Machine model. A distributed system extends the idea of concurrency onto multiple computers connected through a network. Computers within the same distributed system have their own private memory, and information is often exchanged amongst themselves to achieve a common goal.
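As a small sketch of concurrent computations interacting through shared state, here is a producer/consumer example using Python threads and a queue; the workload and thread count are arbitrary.

    import queue
    import threading

    tasks = queue.Queue()
    results = []
    results_lock = threading.Lock()

    def producer(n):
        # Put n work items on the shared queue.
        for i in range(n):
            tasks.put(i)

    def consumer():
        # Process items until a None sentinel arrives.
        while True:
            item = tasks.get()
            if item is None:
                break
            with results_lock:
                results.append(item * item)

    workers = [threading.Thread(target=consumer) for _ in range(4)]
    for w in workers:
        w.start()
    producer(20)
    for _ in workers:
        tasks.put(None)  # one sentinel per worker
    for w in workers:
        w.join()
    print(sorted(results))  # squares of 0..19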

Databases and information retrieval

A database is intended to organize, store, and retrieve large amounts of data easily. Digital databases are managed using database management systems to store, create, maintain, and search data, through database models and query languages.
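As a minimal sketch of the "data model plus query language" idea, here is an example using Python's built-in sqlite3 module with an in-memory database; the table and rows are illustrative only.

    import sqlite3

    conn = sqlite3.connect(":memory:")  # in-memory database, nothing hits disk
    conn.execute("CREATE TABLE books (title TEXT, author TEXT, year INTEGER)")
    conn.executemany(
        "INSERT INTO books VALUES (?, ?, ?)",
        [
            ("A Book on Charts", "A. Author", 1994),
            ("Another Book", "B. Author", 2001),
        ],
    )
    conn.commit()

    # Declarative query: the database engine decides how to fetch the rows.
    for title, year in conn.execute(
        "SELECT title, year FROM books WHERE year >= ? ORDER BY title", (1990,)
    ):
        print(title, year)
    conn.close()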

Applied computer science

Artificial intelligence

This branch of computer science aims to or is required to synthesise goal-orientated processes such as problem-solving, decision-making, environmental adaptation, learning and communication which are found in humans and animals. From its origins in cybernetics and in the Dartmouth Conference (1956), artificial intelligence (AI) research has been necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics, electrical engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the popular mind with robotic development, but the main field of practical application has been as an embedded component in areas of software development which require computational understanding and modeling such as finance and economics, data mining and the physical sciences. The starting-point in the late 1940s was Alan Turing's question "Can computers think?", and the question remains effectively unanswered although the "Turing Test" is still used to assess computer output on the scale of human intelligence. But the automation of evaluative and predictive tasks has been increasingly successful as a substitute for human monitoring and intervention in domains of computer application involving complex real-world data.
Subfields include machine learning, computer vision, image processing, pattern recognition, cognitive science, data mining, evolutionary computation, information retrieval, knowledge representation, natural language processing, and robotics.
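As a toy sketch of one of these subfields (machine learning and pattern recognition), here is a tiny k-nearest-neighbour classifier in pure Python; the points and labels are made up.

    from collections import Counter
    from math import dist

    def knn_predict(train, query, k=3):
        """Classify `query` by majority vote among the k nearest training points.

        `train` is a list of ((x, y), label) pairs; distance is Euclidean.
        """
        nearest = sorted(train, key=lambda pair: dist(pair[0], query))[:k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

    train = [((1.0, 1.0), "red"), ((1.2, 0.8), "red"), ((0.9, 1.1), "red"),
             ((4.0, 4.2), "blue"), ((4.1, 3.9), "blue"), ((3.8, 4.0), "blue")]
    print(knn_predict(train, (1.1, 1.0)))  # expected: red
    print(knn_predict(train, (4.0, 4.0)))  # expected: blue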

Computer architecture and engineering

Computer architecture, or digital computer organization, is the conceptual design and fundamental operational structure of a computer system. It focuses largely on the way by which the central processing unit performs internally and accesses addresses in memory. The field often involves disciplines of computer engineering and electrical engineering, selecting and interconnecting hardware components to create computers that meet functional, performance, and cost goals.
Related topics include digital logic, microarchitecture, multiprocessing, operating systems, computer networks, databases, computer security, ubiquitous computing, systems architecture, compiler design, and programming languages.
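As a toy illustration of the digital-logic end of this area, here is a one-bit half adder built from Boolean gate functions in Python, following the standard textbook construction (sum from XOR, carry from AND).

    def and_gate(a, b):
        return a and b

    def xor_gate(a, b):
        return (a or b) and not (a and b)

    def half_adder(a, b):
        """Return (sum, carry) for one-bit inputs a and b."""
        return xor_gate(a, b), and_gate(a, b)

    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(f"a={int(a)} b={int(b)} -> sum={int(s)} carry={int(c)}")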

Computer graphics and visualization

Computer graphics is the study of digital visual content, and involves the synthesis and manipulation of image data. The study is connected to many other fields in computer science, including computer vision, image processing, and computational geometry, and is heavily applied in the fields of special effects and video games.
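As a minimal sketch of synthesising image data, the snippet below writes a small colour gradient in the plain-text PPM format, chosen because it needs no external libraries; the file name and image size are arbitrary.

    WIDTH, HEIGHT = 64, 64

    with open("gradient.ppm", "w") as f:
        f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n")   # PPM header: ASCII RGB, max value 255
        for y in range(HEIGHT):
            for x in range(WIDTH):
                r = int(255 * x / (WIDTH - 1))    # red ramps left to right
                g = int(255 * y / (HEIGHT - 1))   # green ramps top to bottom
                b = 128                           # constant blue
                f.write(f"{r} {g} {b}\n")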

Computer security and cryptography

Computer security is a branch of computer technology whose objective includes protection of information from unauthorized access, disruption, or modification while maintaining the accessibility and usability of the system for its intended users. Cryptography is the practice and study of hiding (encrypting) and, conversely, deciphering (decrypting) information. Modern cryptography is largely related to computer science, since many encryption and decryption algorithms rely on computational complexity for their security.
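As a deliberately toy sketch (not a secure cipher), here is a repeating-key XOR transformation; it at least shows the encrypt/decrypt symmetry that real symmetric ciphers formalise. The key and message are made up.

    from itertools import cycle

    def xor_cipher(data: bytes, key: bytes) -> bytes:
        # XOR with a repeating key; applying it twice restores the original.
        return bytes(b ^ k for b, k in zip(data, cycle(key)))

    key = b"not-a-real-key"
    message = b"attack at dawn"
    ciphertext = xor_cipher(message, key)
    recovered = xor_cipher(ciphertext, key)
    print(ciphertext.hex())
    print(recovered)  # b'attack at dawn'
    assert recovered == message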

Computational science

Computational science (or scientific computing) is the field of study concerned with constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems. In practical use, it is typically the application of computer simulation and other forms of computation to problems in various scientific disciplines.
Subfields include numerical analysis, computational physics, computational chemistry, and bioinformatics.
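As a short sketch of computer simulation, here is a forward Euler integration of the exponential-decay equation dy/dt = -k*y, compared against the known analytic solution y(t) = exp(-k*t); the constants are arbitrary.

    import math

    k = 0.5      # decay constant (arbitrary)
    y = 1.0      # initial condition y(0) = 1
    dt = 0.01    # time step
    t_end = 2.0

    for _ in range(int(t_end / dt)):
        y += dt * (-k * y)   # forward Euler step: y_{n+1} = y_n + dt * f(y_n)

    exact = math.exp(-k * t_end)
    print(f"numerical y({t_end}) = {y:.5f}, exact = {exact:.5f}")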

Health Informatics

Health Informatics in computer science is referred to as Computational health informatics and deals with computational techniques for solving problems in health care. It is a sub-branch of both computer science and health informatics.

Information science

Topics include information retrieval, knowledge representation, natural language processing, and human–computer interaction.

Software engineering

Software engineering is the study of designing, implementing, and modifying software in order to ensure it is of high quality, affordable, maintainable, and fast to build. It is a systematic approach to software design, involving the application of engineering practices to software.
Software engineering deals with organizing and analyzing software to get the best out of it. It doesn't deal only with the creation or manufacture of new software, but also with its internal maintenance and arrangement.

Academia

Education

Some universities teach computer science as a theoretical study of computation and algorithmic reasoning. These programs often feature the theory of computation, analysis of algorithms, formal methods, concurrency theory, databases, computer graphics, and systems analysis, among others. They typically also teach computer programming, but treat it as a vessel for the support of other fields of computer science rather than a central focus of high-level study.
Other colleges and universities, as well as secondary schools and vocational programs that teach computer science, emphasize the practice of advanced programming rather than the theory of algorithms and computation in their computer science curricula. Such curricula tend to focus on those skills that are important to workers entering the software industry. The process aspects of computer programming are often referred to as software engineering.
While computer science professions increasingly drive the U.S. economy, computer science education is absent from most American K-12 curricula. A report entitled "Running on Empty: The Failure to Teach K-12 Computer Science in the Digital Age" was released in October 2010 by the Association for Computing Machinery (ACM) and the Computer Science Teachers Association (CSTA); it revealed that only 14 states have adopted significant education standards for high school computer science. The report also found that only nine states count high school computer science courses as a core academic subject in their graduation requirements. In tandem with "Running on Empty", a new non-partisan advocacy coalition, Computing in the Core (CinC), was founded to influence federal and state policy, such as the Computer Science Education Act, which calls for grants to states to develop plans for improving computer science education and supporting computer science teachers.
Within the United States, a gender gap in computer science education has been observed as well. Research conducted by the WGBH Educational Foundation and the Association for Computing Machinery (ACM) revealed that more than twice as many high school boys as girls considered computer science to be a "very good" or "good" college major.[30] In addition, the high school Advanced Placement (AP) exam for computer science has displayed a disparity in gender: compared to other AP subjects, it has the lowest number of female participants, with a composition of about 15 percent women.[31] This gender gap is further witnessed at the college level, where 31 percent of undergraduate computer science degrees are earned by women and only 8 percent of computer science faculty consists of women.[32]