Future in the informatics era - Chapter 3
INFORMATION-DRIVEN TECHNOLOGIES and their EXPONENTIAL ACCELERATION
prof. Jozef Gruska, IV054

CONTENTS

Deep thoughts
Convergence of all technologies to information technologies
Main features of the evolution of (information) technologies
The law of accelerating returns and Moore's law
Examples of exponential and double-exponential development of information-driven technologies
Five recent main paradigm shifts behind ICT evolution
Recent developments in supercomputers
Main impacts of the ICT laws on the development of society
Are still radically more powerful computers in vision?
What can be expected from quantum information processing and transmission?
Appendix I: The law of accelerating returns as an economic theory

DEEP THOUGHTS

Any sufficiently advanced technology is indistinguishable from magic.
    Arthur C. Clarke's third law

Any technology distinguishable from magic is insufficiently advanced.
    Barry Gehm

Technology is still magic even if you know how it's done.
    Terry Pratchett, "A Hat Full of Sky"

No communication technology has ever disappeared, but instead becomes increasingly less important as the technological horizon widens.
    Arthur C. Clarke

Civilization advances by extending the number of important operations which we can perform without thinking about them.
    Alfred North Whitehead (1911)

The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.
    George Bernard Shaw

DEEP THOUGHTS CONCERNING TECHNOLOGY

First we thought the PC was a calculator. Then we found out how to turn numbers into letters with ASCII - and we thought it was a typewriter. Then we discovered graphics, and we thought it was a television. With the World Wide Web, we have realized it is a brochure.
    Douglas Adams

There is a proverb which says: "To err is human", but a human error is nothing to what a computer can do, if it tries.
    Agatha Christie, "Hallowe'en Party"

VERY SPECIAL THOUGHTS

Computers are useless. They can only give you answers.
    Pablo Picasso

The production of too many useful things results in too many useless people.
    Karl Marx

The real danger is not that computers will begin to think like men, but that men will begin to think like computers.
    Sydney J. Harris

Ethics change with technology.
    Larry Niven, "N-Space"

As technology advances in complexity and scope, fear becomes more primitive.
    Don DeLillo

CONVERGENCE of all TECHNOLOGIES to INFORMATION TECHNOLOGIES

All technologies will essentially become information technologies. Within several decades, information-based technology will encompass all human knowledge and proficiency, ultimately including the pattern-recognition powers, problem-solving skills, and emotional and moral intelligence of the human brain itself.

One should note that the term "information technology" encompasses an increasingly broad class of phenomena and will ultimately include the full range of economic activities and cultural endeavour.
The exponential trends in information technology are far broader than those covered by Moore's law. We see the same type of trend in essentially every technology or measurement that deals with information. There were actually four different paradigms - electromechanical, relays, vacuum tubes, and discrete transistors - that showed exponential growth in the price-performance of computing long before integrated circuits were even invented.

INCREASES of COMPLEXITY and ORDER as key FEATURES of (TECHNOLOGICAL) EVOLUTION

Two observations seem to play an important role in understanding both biological and technological evolution. Both concern information technologies.

Observation 1: Truly epochal advances (paradigm shifts) in the history of biology and technology have always involved significant increases in complexity.

Observation 2: Truly epochal advances (paradigm shifts) in the history of biology and technology have always involved increases in order (in useful available information).

In this context one should understand information as a sequence of data that is meaningful in a process, order as information that fits a purpose, and a measure of order as a measure of how well information fits that purpose.

EVOLUTION ADVANCES and PROCESSING INFORMATION

Other deep observations on the advances of evolution:

A primary reason that evolution speeds up is that it builds on its own products, using ever more sophisticated means of obtaining, recording and manipulating (increasingly useful) information.

In the case of biological evolution, the most notable early example is DNA, which provides a recorded and protected transcription of life's design from which to launch further experiments.

In the case of technological evolution, ever-improving human methods of obtaining, recording and processing information have always led to further advances in technology.

THE LAW of ACCELERATING RETURNS and MOORE'S LAW

TECHNOLOGICAL PROGRESS

Exponential growth is a deep feature of any evolutionary process, of which technology is a prime example. The history of technology reveals that technological change is exponential.

Since technological progress seems to double each decade, what is often assumed to take one hundred years is actually likely to take only about 25 years.

Ray Kurzweil formulated his discovery that technological progress is exponential as the law of accelerating returns.

EXAMPLE

Biochemists were sceptical in 1990 of the goal to transcribe the entire human genome in only fifteen years. The reason for pessimism was that at that time they needed a whole year to transcribe a ten-thousandth of the genome. That is why many expected that reaching the goal would take about 100 years. However, the goal was reached in just 13 years. The main reason behind the wrong estimate was that the pessimists forgot that the techniques and tools for transcription would also improve very fast, so that the pace of transcription would keep accelerating.
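The arithmetic behind this example is worth making explicit. Below is a minimal sketch, under the purely illustrative assumptions that in 1990 a year of work transcribed 1/10,000 of the genome and that transcription throughput doubled roughly once a year; note that a strictly linear extrapolation from these figures would give even more than the 100 years the critics quoted.

    import math

    # Illustrative model of the genome example: in 1990 one year of work
    # transcribed about 1/10,000 of the genome, and (an assumption here)
    # transcription throughput doubled roughly once a year.
    initial_rate = 1e-4   # fraction of the genome transcribed per year in 1990
    d = 1.0               # assumed doubling time of throughput, in years

    # Pessimists' linear extrapolation: 1 / initial_rate years.
    print(f"linear estimate: {1 / initial_rate:.0f} years")     # 10000 years

    # Exponential model: fraction done by year t is
    #   F(t) = initial_rate * d / ln(2) * (2**(t/d) - 1); solve F(t) = 1.
    t = d * math.log2(1 + math.log(2) / (initial_rate * d))
    print(f"accelerating estimate: {t:.1f} years")              # about 13 years

With an annual doubling of throughput the model lands at roughly 13 years, matching the actual outcome.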
GENERAL GROUNDS for PESSIMISM of SCIENTISTS and ENGINEERS

They are often so immersed in the difficulties and intricate details of their contemporary challenges that they fail to appreciate the great long-term impact of their own work, and of the whole large field in which they operate. They also fail to account for the far more powerful tools they are likely to have with each new generation of technology.

Scientists are trained to be sceptical and to speak cautiously about the potential impacts of their work. That could have been an understandable and satisfactory approach when a generation of science and technology lasted longer than a human generation, but it does not serve society's interest now that a generation of scientific and technological progress often comprises only a few years.

Almost everyone has a linear view of the future. That is why people tend to overestimate what can be achieved in the short term (because we tend to leave out necessary details), but underestimate what can be achieved in the long term (because exponential growth is ignored).

BASIC EXPECTATIONS I

Computers are getting not only faster and faster; they are getting faster faster.

ICT performance is expected to keep growing exponentially fast in all important respects. Moreover, we are nowadays only at the beginning of the rapidly growing part of its exponential performance curve. All of this is expected to have enormous impacts.

THE LAW of ACCELERATING RETURNS

The law of accelerating returns explains why technology, and evolutionary processes in general, progress in an exponential fashion. Basic observations:

(1) The velocity of computation is proportional to the world knowledge;

(2) The rate of change of the world knowledge is proportional to the velocity of computation.

(A toy quantitative model of these two observations is sketched below, after the discussion of the five computing paradigms.)

THE PRINCIPLES of THE LAW of ACCELERATING RETURNS - I.

Evolution always applies positive feedback: the best methods of any stage of evolution are used to create the next stage. Each epoch has progressed more rapidly because it could use better tools than the previous one.

Evolution works through "indirection". Evolution created humans; humans created technology; humans in cooperation with very advanced technology create even more advanced technology. By the time of the Singularity there will not be much difference between humans and technology, because machines will have progressed to be much like humans, and beyond. Technology will be, metaphorically, the "opposable thumb" that enables our next step in evolution.

Progress will soon occur more and more at a speed close to that of light, rather than at the speed of very slow electrochemical reactions.

Each stage of evolution builds on better outcomes/tools than the previous stage, and the rate of progress of an evolutionary process increases at least exponentially.

MOORE'S LAW - SEVERAL VERSIONS

Moore's law has now (at least) three forms.

Economic form: Computer power doubles, for constant cost, every two years or so.

Physical form: The number of atoms needed to represent one bit of information halves every two years or so.
Quantum form: For certain applications, quantum computers need to increase in size by only one qubit every two years or so in order to keep pace with the performance increases of classical computers.

GORDON MOORE and HIS ROAD to HIS LAW

In the mid-1970s Gordon E. Moore, a leading inventor of integrated circuits and later the chairman of Intel, observed that we could squeeze twice as many transistors onto an integrated circuit every twenty-four months (in the mid-1960s he had estimated every twelve months). Moore also observed that electrons would consequently have less distance to travel, and therefore circuits would also run faster, providing an additional boost to overall computational power. The result is exponential growth in the price-performance of computation. Currently, the doubling time for different measures of the capabilities of information technology - price-performance, bandwidth, capacity - is about one year.

MOORE'S LAW ORIGIN

On April 19, 1965, Moore wrote in Electronics: "The future of integrated electronics is the future of electronics itself. The advantages of integration will bring about a proliferation of electronics, pushing this science into many new areas." Moreover, Moore predicted that "by 1975, economics may dictate squeezing as many as 65,000 components on a single silicon chip".

Moore's article described the repeated annual doubling of the number of transistors that could be fitted onto an integrated circuit. Moore's initial estimate was incorrect - he revised it a decade later - but the basic idea was valid.

Current situation: top-performance chips have about 10 million transistors per mm^2, and top-performance CPUs have about 7 billion transistors. Currently, IC technology is shrinking by a factor of about four per linear dimension per decade. This miniaturization is a driving force behind Moore's law.

MOORE'S LAW VISUALLY (figure)

A GENERAL VERSION of MOORE'S LAW

Two general observations: Exponential growth in the power and price-performance of information-based technologies is not limited to computers; it holds essentially for all information technologies, and it also includes human knowledge, measured in many different ways. It is also important to observe that the term "information technology" keeps encompassing an increasingly broad class of phenomena and will ultimately include the full range of economic and cultural activities.

LIMITATIONS of MOORE'S LAW

On the basis of quantum mechanics, Seth Lloyd determined in 1999 that an "ultimate laptop" with a mass of 1 kg and a size of 1 liter cannot perform more than 2.7 × 10^50 bit operations per second. Lloyd's calculations were based only on the amount of energy needed to switch from one state to another. It seems to be harder to determine the number of bits of such an "ultimate laptop"; however, the bound 3.8 × 10^126 has been determined for a computer compressed to form a black hole. It seems clear that Moore's law cannot hold for more than about another 200 years.
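As a rough consistency check of this 200-year claim, one can ask how many doublings separate a present-day machine from Lloyd's bound. A minimal sketch, assuming (purely for illustration) a present-day laptop performing about 10^10 operations per second:

    import math

    current_ops = 1e10    # assumed ops/second of a present-day machine (illustrative)
    lloyd_bound = 2.7e50  # Lloyd's bound for a 1 kg, 1 liter "ultimate laptop"

    doublings = math.log2(lloyd_bound / current_ops)   # about 134 doublings needed

    for doubling_time in (1.0, 1.5, 2.0):              # years per doubling
        print(f"doubling every {doubling_time} y: bound reached "
              f"in about {doublings * doubling_time:.0f} years")

With doubling times between one and two years the bound is reached in roughly 130-270 years, consistent with the slide's estimate of about two more centuries.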
DENNARD and KOOMEY LAWS

Dennard's (scaling) law (1974) states, roughly, that as transistors get smaller their power density stays constant: both voltage and current scale (downward) with the linear dimension of the transistor, so power consumption (the product of voltage and current) is proportional to the area of the transistor.

Koomey's law (2010) says that energy efficiency in computing has been doubling every 1.57 years; that is, the number of computations per joule of energy dissipated doubles approximately every 1.57 years. It has been shown that Koomey's law cannot hold for more than about 100 years.

DOUBLE EXPONENTIAL GROWTH

The more effective a particular evolutionary process becomes, the greater the amount of resources deployed toward the further progress of that process - and that can result in double exponential growth.

Example: It took three years to double the price-performance of computation at the beginning of the modern computer era (around 1950). It took two years around 1980. Recently it has started to take only one year.

A related observation: not only is each chip doubling in power each year for the same unit cost, but the number of chips being manufactured is also growing exponentially. Consequently, computer research budgets have grown dramatically over the decades.

WISDOM of Gordon E. Moore (2003)

No exponential is forever ... but "forever" can be delayed.

EXAMPLES of HOW EVOLUTION SPEEDS UP

Homo sapiens evolved over the course of a few hundred thousand years. Early stages of humanoid-created technology (fire, stone tools, the wheel) required tens of thousands of years for their development. Half a millennium ago, such products of a paradigm shift as the printing press took about a century to be widely deployed. Today, the products of major paradigm shifts, such as cell phones or the World Wide Web, are widely adopted in only a few years' time.

EVOLUTION as a SEQUENCE of PARADIGM SHIFTS

Evolution can be seen as a sequence of paradigm shifts, each represented by an "S-curve", as in the figures showing them in linear and exponential plots. Each paradigm develops in three stages:

1. Slow growth (the early stage of exponential growth);
2. The explosive stage of exponential growth;
3. A leveling off, when the current paradigm's impact starts to be exhausted and a shift to a new paradigm begins.

The exponential growth of an evolutionary process is therefore composed of multiple S-curves. The most important contemporary example of this phenomenon is the five paradigms of computation discussed below.

FIVE PARADIGMS BEHIND the EXPONENTIAL GROWTH in COMPUTING (figure)

Observe that each time a paradigm "has run out of steam", another has picked up the pace. It is expected that three-dimensional molecular computing could be the next paradigm.
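The two observations behind the law of accelerating returns, combined with the resource effect described in the DOUBLE EXPONENTIAL GROWTH slide above, can be turned into a toy numerical model. This is only an illustrative sketch (the constants c and T and the logarithmic resource term are arbitrary choices, not data):

    import math

    # Toy model of the law of accelerating returns:
    #   (1) velocity of computation V is proportional to world knowledge K;
    #   (2) dK/dt is proportional to V.
    # Together these give dK/dt = c*K, i.e. ordinary exponential growth.
    # If the resources deployed also grow with K (modelled here as log K),
    # then dK/dt = c*K*log(K), whose solution is a double exponential.
    c, dt, T = 0.5, 0.001, 10.0
    k_exp = k_dexp = math.e            # arbitrary initial knowledge level
    for _ in range(int(T / dt)):
        k_exp  += dt * c * k_exp
        k_dexp += dt * c * k_dexp * math.log(k_dexp)

    print(f"exponential model:        K(T) ~ {k_exp:.3g}")   # ~ e**(1 + c*T)
    print(f"double-exponential model: K(T) ~ {k_dexp:.3g}")  # ~ e**(e**(c*T))

With these arbitrary constants the plain model grows by a factor of a few hundred over the interval, while the resource-boosted model grows by tens of orders of magnitude; only the shape of the curves matters here, not the numbers.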
HOW COMPUTER SIZE SHRINKS (figure)

EXAMPLES of the EXPONENTIAL GROWTH in ICT

AN EXAMPLE of the ACCELERATION of PARADIGM SHIFTS (figure)

EXPONENTIAL GROWTH in RAM CAPACITY ACROSS PARADIGM SHIFTS (figure)

COMMENTS

In the case of technological evolution we can observe the following:

During the third, maturing phase in the life cycle of a paradigm, pressure builds toward the next paradigm shift, and a lot of research money goes into preparing it. Example: extensive research is nowadays conducted toward three-dimensional molecular computing, despite the fact that there is still at least a decade left for the paradigm of shrinking transistors on flat integrated circuits using photolithography.

In addition, by the time a paradigm starts to reach its saturation phase, the new paradigm has often already been developed to a level at which it works in some attractive niche applications. Example: in the 1950s engineers were working hard to shrink vacuum tubes to provide greater price-performance for computers. At that point, around 1960, transistors had already achieved a strong niche market in portable radios, and they were subsequently used to replace vacuum tubes in computers.

RESOURCES UNDERLYING the EXPONENTIAL GROWTH of EVOLUTIONARY PROCESSES

These resources are relatively unbounded.

Resource 1: Each stage of evolution provides more powerful tools for the next one. Examples: (1) in biological evolution, the advent of DNA enabled more powerful and faster evolutionary experiments; (2) the advent of computer-assisted design tools allowed rapid development of the next generation of computers.

Resource 2: The impact of (often very diverse or chaotic) environments puts pressure on finding more powerful and more efficient solutions. Examples: (1) in biological evolution, one source of diversity is the mixing and matching of gene combinations through sexual reproduction - an evolutionary innovation that accelerated the process of biological adaptation and diversity; (2) in technological evolution, it is human ingenuity, combined with variable market conditions, that keeps the process of innovation going.

EVOLUTION and DESIGN of BIOLOGICAL SYSTEMS

One of the key questions about biological evolution and biological systems is how it is possible for the genome, which contains comparatively little information, to produce systems as complex as humans. There are only about eight hundred million bytes of information in the entire human genome, and only 30-100 million bytes after data compression is applied. This is about one hundred million times less information than is represented by the inter-neural connections and neurotransmitter-concentration patterns in a human brain. The best available answer seems to be that biological designs are specified through randomized fractal descriptions.

APPLICATION of the LAW of ACCELERATING RETURNS to SIX EPOCHS - 0.
RNA

RNA (ribonucleic acid) is one of the three major biological macromolecules that are essential for all known forms of life (along with DNA and proteins). A central thesis of molecular biology states that the flow of genetic information in a cell is from DNA through RNA to proteins.

APPLICATION of the LAW of ACCELERATING RETURNS to SIX EPOCHS - I.

The creation of proteins from one-dimensional strings of amino acids, and of strings of RNA from nucleic acids, established the basic paradigm of biology. Strings of RNA (and later DNA) that self-replicated (Epoch Two) provided a digital method for recording the results of evolutionary experiments. The evolution of species that combined rational thought (Epoch Three) with an opposable appendage (the thumb) caused the fundamental paradigm shift from biology to technology (Epoch Four). The upcoming primary paradigm shift will be from biological thinking to a hybrid combining biological and non-biological thinking (Epoch Five).

Example 1a: THE ACCELERATION of the TECHNOLOGY PARADIGM-SHIFT RATE

The following picture shows how long it took for a late-nineteenth-century invention, the telephone, to reach a significant level of usage (figure).

Example 1b: THE ACCELERATION of the TECHNOLOGY PARADIGM-SHIFT RATE

The following picture shows that it took only a decade for the late-twentieth-century adoption of cell phones (figure).

RECENT DEVELOPMENTS in CELL PHONES

Mobiles and the internet have allowed human communication to beat time and space: we can communicate, with no perceptible delay, with anyone in the world, and we do not even need to know where on Earth (s)he is.

In 2000 there were 700 million mobile phones in the world, but less than 30% of them were in developing countries. In 2012 there were 6 billion mobiles, and 75% of them were in developing countries (a quick check of the growth rate these figures imply is sketched below). This has enormous impacts.

In the industrial world we have long been accustomed to having libraries, phones and computers at our disposal. However, all of these have been unimaginable luxuries for people in the developing world. This keeps changing. The World Bank estimates that 75% of people have access to a mobile, and that in some countries mobiles are more widespread than electricity or clean water.

Progress in mobiles and the internet has a very important consequence: it will bring billions of people into the community of potential knowledge creators, problem solvers and innovators.

AN ACCELERATION of the ADOPTION of COMMUNICATION TECHNOLOGIES

The following figure demonstrates a smooth acceleration in the adoption rates of communication technologies over the past century (figure).

DEVELOPMENTS in the SEMICONDUCTOR INDUSTRY

In the following charts, various data from the "Semiconductor industry road maps up to 2018" are presented to demonstrate developments according to Moore's law.

DYNAMIC RAM (figure)
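Returning to the cell-phone numbers quoted above: the two data points already pin down a compound growth rate. A small hedged calculation, using nothing but the figures on that slide:

    import math

    # Two data points quoted above: 0.7 billion mobiles in 2000,
    # 6 billion mobiles in 2012.
    phones_2000, phones_2012 = 0.7e9, 6.0e9
    years = 2012 - 2000

    cagr = (phones_2012 / phones_2000) ** (1 / years) - 1
    doubling = years * math.log(2) / math.log(phones_2012 / phones_2000)

    print(f"compound annual growth rate: {cagr:.1%}")      # about 20% per year
    print(f"implied doubling time: {doubling:.1f} years")  # about 3.9 years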
DYNAMIC RAM PRICE

The doubling time for bits of DRAM per dollar has been only 1.5 years (figure).

AVERAGE TRANSISTOR PRICE

In 1968 one could buy one transistor for a dollar; in 2002 one could get about ten million transistors for a dollar. The halving time for the average transistor price has been about 1.6 years (figure).

TRANSISTOR MANUFACTURING

The very smooth acceleration in the price-performance of semiconductors has progressed through a series of process-technology stages at ever smaller dimensions (figure).

MICROPROCESSOR CLOCK SPEED

As transistors become smaller and less expensive, they also become faster, because electrons have less distance to travel - by about a factor of one thousand over the past thirty years (figure).

MICROPROCESSOR COST per TRANSISTOR CYCLE

If the exponential trends toward less-expensive transistors and faster cycle times are combined, the halving time for the cost per transistor cycle is 1.1 years. The cost per transistor cycle is a more accurate overall measure of price-performance, because it takes into account both speed and capacity (figure).

TRANSISTORS per MICROPROCESSOR

The number of transistors in Intel processors has doubled every two years (figure).

PROCESSOR PERFORMANCE

Processor performance in MIPS has doubled every 1.8 years per processor (figure).

AN EXAMPLE of PERFORMANCE INCREASES

(A quick numerical check of the doubling times this table implies is sketched at the end of this subsection.)

                              IBM 7094 (1967)     notebook (2004)
    processor speed (MIPS)    0.25                2,000
    main memory (KB)          144                 256,000
    approximate cost          $11,000,000         $2,000
    (in 2003 dollars)

TOTAL BITS SHIPPED

Despite massive deflation in the cost of IT, demand has more than kept up (figures).

OVERALL IMPACT of IT in the US

The entire IT industry grew from 4.2% of GDP in 1977 to 8.2% in 1998. The semiconductor industry enjoyed 18% annual growth in total revenue from 1958 to 2002. IT has become increasingly influential in all economic sectors; even very common manufactured products have a significant IT contribution through computer-driven design and manufacturing processes.

EVOLUTION of COMPUTER-POWER COST

The following chart (H. Moravec) plots the evolution of computer power per cost (the brain-power equivalent of a $1,000 computer) for various historical computers. Observe that the slope increases with time, demonstrating double exponential growth (figure).

COMPUTER PERFORMANCE TRENDS PROJECTED TO the NEAR FUTURE

The next figure predicts that supercomputers will match human brain capabilities around 2030, and that personal computers will do so around 2040 (figure).
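As promised above, the IBM 7094 versus notebook comparison allows a quick check of the doubling times claimed on the preceding slides. A minimal sketch using only the quoted numbers:

    import math

    # Numbers quoted in the table above (cost in 2003 dollars).
    mips_1967, cost_1967 = 0.25, 11_000_000    # IBM 7094
    mips_2004, cost_2004 = 2_000, 2_000        # 2004 notebook
    years = 2004 - 1967

    raw_ratio = mips_2004 / mips_1967                              # speed alone
    pp_ratio = (mips_2004 / cost_2004) / (mips_1967 / cost_1967)   # per dollar

    for name, ratio in (("raw MIPS", raw_ratio), ("MIPS per dollar", pp_ratio)):
        doubling = years * math.log(2) / math.log(ratio)
        print(f"{name}: {ratio:.3g}x, doubling every {doubling:.1f} years")

The raw speed ratio (8000x) corresponds to a doubling roughly every 2.9 years, while MIPS per dollar improved about 4 × 10^7-fold, a doubling roughly every 1.5 years - in line with the 1.1-1.8-year figures quoted for transistors and processors.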
SUPERCOMPUTERS

A supercomputer is a computer system that is recognized for its high computational speed, large memory systems and extensive use of parallel processing. Supercomputers are mostly designed to deal with very important specific (mostly simulation) problems: very complex problems, problems whose experiments cannot be physically realized or would be too dangerous, or problems dealing with incredibly small or large things.

SUPERCOMPUTER POWER - ESTIMATES in 2005 (figure)

THE MOST POWERFUL SUPERCOMPUTERS NOWADAYS

1. Tianhe-2 (Milky Way-2), China, 33.86 petaflops, 3,120,000 cores
2. Titan, Cray XK7, Oak Ridge, 17.6 petaflops, 560,640 processors
3. Sequoia, IBM Blue Gene, 16.32 petaflops, 1,572,864 cores
4. K, Fujitsu, 11 petaflops, 705,024 cores
5. Mira, IBM Blue Gene/Q, Argonne National Lab., 10 petaflops, 786,432 cores
6. Juqueen, IBM Blue Gene/Q, Juelich, Germany, 5 petaflops, 393,216 cores

In November 2012 there were 23 computer systems with petaflop performance; in November 2014 there were 50. The performance of the computer at the 100th position increased during six months of 2012 from 172 to 241 teraflops; in November 2013 it was 332 teraflops, and the 500th position stood at 117 teraflops. Out of the 500 most powerful computer systems in the November 2012 list, 251 were in the US, 123 in Asia and 105 in Europe; in the November 2013 list, 264 were in the US, ...

The performance of the top computer in the November lists, in petaflops: 1.7 in 2009, 2.6 in 2010, 10.5 in 2011, 17.6 in 2012, 33.86 in 2013 - a 20-fold increase in 4 years (roughly a 2.1-fold increase per year).

Exaflop computers (10^18 flops) are expected in 2019; zettaflop computers (10^21 flops) are expected in 202?.

K COMPUTER (figures)

TITAN COMPUTER (figure)

TIANHE-2 SUPERCOMPUTER (figure)

ENIAC COMPUTER (figure)

ENIAC PROGRAMMING (figure)

HISTORY of SUPERCOMPUTERS - I. PREHISTORY of SUPERCOMPUTERS

CDC 6600 (1966-1977): designed to be a supercomputer, with the fastest clock speed of its time (100 nanoseconds) and 65,000 60-bit words of memory. In addition, it was equipped with a large disk storage device and 6 high-speed and intermediate storage drums. It was supported by a Fortran 6 compiler.

Vector processors: modern supercomputing began in the 1970s with the development of vector processors. In the early and mid-1980s several computers were designed with 4-16 processors.

HISTORY of SUPERCOMPUTERS - II. MODERN SUPERCOMPUTERS

Numerical Wind Tunnel supercomputer (1993-1996): from today's point of view, the first really modern supercomputer. It was built in Japan for the National Aerospace Laboratory, and it was the first supercomputer with a sustained performance close to 100 gigaflops for a wide range of fluid-dynamics application programs.
The system had 140 cores and 166 vector processors; the cycle time was 9.5 ns.

ASCI Red (1996-2006): built by Intel for the US Department of Energy; the first computer with a two-teraflops performance.

Earth Simulator ES 1 (1999-2009): built in Japan for running global climate models and for evaluating the effects of global warming. ES had 640 nodes with 8 vector processors each, for a total of 5,120 processors, and 10 terabytes of memory. The system had 700 terabytes of disk storage and 1.6 petabytes of mass storage in tape drives. ES-2 was built in 2009, with a performance of 122 teraflops.

Blue Gene supercomputers (2004-2007): developed by IBM, in three versions - Blue Gene/L, Blue Gene/P and Blue Gene/Q (2012).

NEXT GOALS in SUPERCOMPUTING

Exaflop computers (10^18 flops) are the next goal that is seen as realistic. At this time we have to consider the feasibility of picocomputing (using picotechnology, measured in trillionths (10^-12) of a meter) and femtocomputing (using femtotechnology, measured in quadrillionths (10^-15) of a meter) as speculative.

The supercomputer to be installed in 2015 in Ostrava should have the power of 2,650 laptops.

BIG CLUSTER

This year the supercomputer Big Cluster, with a top performance of 2 petaflops, is to be installed in Ostrava. At the time of installation, Big Cluster should rank among the 100 most powerful supercomputers. It should have 24,192 Intel Xeon E5 v3 CPU cores and 129 TB of RAM operating memory, together with 2 PB of disk storage and 3 PB of backup tape storage capacity. The cost should be around 1 million EUR, with 85% of the money coming from EU Structural Funds.

RECENT DEVELOPMENTS in QUANTUM COMPUTERS - D-Wave computers - I.

Recently, a Canadian company announced the "quantum computers" D-Wave One, with 128 qubits, and D-Wave Two, with 512 qubits, claimed to be capable of greatly outperforming classical computers at simulated annealing. This has caused a lot of heated controversy. Some top people in computing view the reported outcomes critically; some big companies and agencies have bought such a computer (Google, NASA); and many top research institutions have invited the people behind the D-Wave computers for talks.
RECENT DEVELOPMENTS in QUANTUM COMPUTERS - D-Wave computers - II.

Quite a few top quantum information processing theorists dismissed the claims because: it is not clear how much quantum these machines really are (how much entanglement they exhibit); and they are not fast enough - classical computers were shown to be faster. More than 650 (mostly) interesting entries on this controversy can be found on Scott Aaronson's blog. The discussion brought some agreement: (a) the machines are technological achievements; (b) the designers' claims concerning performance are often exaggerated; (c) it is far from clear how big the next steps forward will be.

COMMUNICATION TECHNOLOGY

No communication technology has ever disappeared, but instead becomes increasingly less important as the technological horizon widens.
    Arthur C. Clarke

Exponential growth in communication technology has for many years been even more explosive than in the processing and memory measures of computation, and it is no less significant in its implications.

OTHER EXAMPLES of the LAW of ACCELERATING RETURNS

The law of accelerating returns applies to all technologies. Some important examples follow.

DNA SEQUENCING

DNA sequencing is the process of determining the precise order of nucleotides within a DNA molecule. It includes any method or technology that is used to determine the order of the four bases: adenine, guanine, cytosine and thymine. The advent of rapid DNA sequencing methods has greatly accelerated biological and medical research advances.

COST of DNA SEQUENCING

When the human genome scan project was launched in 1990, critics expected it to take 100 years, judging from the scanning speed of that time; in the end it took a bit less than 15 years. The cost of DNA sequencing came down from about $10 per base pair in 1990 to a couple of pennies in 2004, and it has kept falling rapidly. The cost of whole-genome sequencing went from about $100,000,000 in 2001 to about $10,000 in 2011 (a quick check of the halving time this implies is sketched at the end of this subsection).

DNA BASE-PAIR SEQUENCING COSTS (figure)

DNA BASE-PAIR STRUCTURE (figure)

DNA BASE-PAIR STRUCTURE DETAILS (figure)

RECENT COSTS of DNA SEQUENCING (figure)

GROWTH in the AMOUNT of DNA SEQUENCE DATA

Exponential growth in the amount of DNA sequence data is presented in the figure below. Sequencing the HIV virus took more than 15 years; for the SARS virus it took only 31 days (figure).

RAM

The following picture shows how exponential growth in RAM proceeds smoothly through different technology paradigms (figure).

MAGNETIC DATA STORAGE

The growth in the price-performance of magnetic memory is not a result of Moore's law. This exponential trend involves squeezing data onto a magnetic substrate rather than transistors onto an integrated circuit - a completely different technical challenge, pursued by different companies (figure).

PRICE-PERFORMANCE of WIRELESS DATA DEVICES

Exponential growth concerning communication devices has actually been, for many years, even more impressive than that of computation devices. The first example deals with wireless communication (figure).

INTERNET HOSTS GROWTH - linear plot

The explosion of Internet hosts after the mid-1990s, when email and the WWW started to explode, looks like a surprise when a linear plot is used (figure).

INTERNET HOSTS GROWTH - logarithmic plot

The same explosion of Internet hosts stops looking like a surprise once a logarithmic plot is used (figure).

INTERNET DATA TRAFFIC

Data traffic on the Internet has also doubled every year (figure).
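As promised above, here is the halving time implied by the two whole-genome cost figures; a back-of-the-envelope calculation, using nothing but the two quoted data points:

    import math

    # Data points quoted above: whole-genome sequencing cost of about
    # $100,000,000 in 2001 and about $10,000 in 2011.
    cost_2001, cost_2011 = 1e8, 1e4
    years = 2011 - 2001

    halving = years * math.log(2) / math.log(cost_2001 / cost_2011)
    print(f"cost fell {cost_2001 / cost_2011:,.0f}-fold in {years} years")
    print(f"implied halving time: {halving:.2f} years")   # about 0.75 years

A halving time of about nine months - considerably faster than Moore's law itself.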
INTERNET BANDWIDTH - BITS per SECOND

To accommodate the exponential growth of data traffic on the Internet, the data transmission speed also had to grow exponentially. The following figure shows this development as a progression of S-curves (figure).

MINIATURIZATION

A profound impact on the future will also come from the fact that size is decreasing at an exponential rate for a broad range of mechanical and electronic devices (figure).

BIOLOGY versus TECHNOLOGY

Biology has inherent limitations. Every living organism must be built from proteins that are folded from one-dimensional strings of amino acids, and protein-based mechanisms are lacking in strength and speed. We expect to be able to re-engineer all of the organs and systems in our biological bodies and brains to be much more capable.

Machines are expected to be able to improve their own designs and augment their capabilities without limit. Using nanotechnology-based designs, their capabilities are expected to become far greater than those of biological brains - without increasing size or memory consumption. Tomorrow's molecular circuits should be based on devices such as nanotubes - tiny cylinders of carbon atoms that measure about 10 atoms across and are five hundred times smaller than today's silicon-based transistors. They should be able to operate at terahertz speeds (trillions of operations per second).

IMPACTS of the TECHNOLOGY-DEVELOPMENT LAWS - BASIC EXPECTATIONS II

The impacts of Moore's law and, in general, of the law of accelerating returns can be summarized as follows. As a first consequence, the development of almost all areas of society will speed up so much that what would happen in the next 1000 (500) years at the current rate of development will actually happen within the next 100 (40) years. It is therefore beyond our full understanding how life will look in 30-40 years. However, current informatics students are expected to retire at the age of 80 ± 10 years or more, and therefore during your lifetime you can expect to see things you can hardly imagine.

MORE SPECIFIC EXPECTATIONS

It is expected that the basic computational resources needed to simulate the human brain (10^19 calculations per second and 10^18 bits) will be available for one thousand dollars in the early 2030s (a simple extrapolation reproducing this estimate is sketched below).

It is expected that around 2050 one will be able to buy for $1,000 a computer whose information processing power will be greater than that of all unaided human brains.

It is expected that, due to developments in nanotechnology and 3D molecular computing, around 2040-50 we can have nanobots - robots the size of blood cells (7-8 microns, or even smaller) that will be able, for example, to travel through our body and brain and do useful work there. This will allow many new meta-goals to be put before science, technology and medicine. For example:

To defeat death definitively, or at least to prolong very significantly the human (productive) lifespan.

To produce non-biological intelligence that will be many (trillion) times faster and more productive at many tasks than biological intelligence.

To scan human consciousness into computers, so we can live inside them, forever, at least virtually. (It is expected that many people alive today will wind up being functionally immortal.)
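The early-2030s estimate can be reproduced by a simple extrapolation. The sketch below assumes, for illustration only, that around 2005 one thousand dollars bought about 10^10 calculations per second and that price-performance doubles every one to one-and-a-half years, as suggested by the earlier slides:

    import math

    target_cps = 1e19        # cps estimated above for simulating a human brain
    cps_per_1000usd = 1e10   # assumed mid-2000s figure per $1,000 (illustrative)
    start_year = 2005        # assumed reference year

    doublings = math.log2(target_cps / cps_per_1000usd)    # about 30 doublings
    for doubling_time in (1.0, 1.5):                       # years per doubling
        print(f"doubling every {doubling_time} y: brain-scale $1,000 "
              f"hardware around {start_year + doublings * doubling_time:.0f}")

A one-year doubling time lands in the mid-2030s, close to the slide's estimate; slower doubling pushes the date toward mid-century.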
ARE RADICALLY MORE POWERFUL COMPUTERS in VISION?

Basic question: Is the enormous progress in information processing systems that we have expected so far fully in accordance with the laws of classical physics as they are known today?

It is therefore natural to formulate and explore the following problems: How much could we increase the power of computers if we started to use phenomena of other potential physical worlds? In particular, how much more powerful could computers be that make full use of quantum phenomena and processes? More generally, how powerful can computers be that are based on the laws of some other physical world, especially one that we cannot so far prove to be impossible? In particular, can we beat the Church-Turing thesis and barrier, which have been seen as major limiting factors on information processing power?

QUANTUM INFORMATION PROCESSING and COMMUNICATION

CLASSICAL versus QUANTUM COMPUTING

The essence of the difference between classical computers and quantum computers is in the way information is stored and processed. In classical computers, information is represented on the macroscopic level by bits, which can take one of the two values 0 or 1. In quantum computers, information is represented on the microscopic level using qubits, which can take any of uncountably many values

    α|0> + β|1>,

where α and β are arbitrary complex numbers such that |α|^2 + |β|^2 = 1. A qubit can be seen as a state in a 2-dimensional Hilbert space.

IMPLEMENTATION of QUBITS

EXAMPLE: qubits can be represented by (a) the two energy levels (n = 1, n = 2) of an electron in a hydrogen atom, or (b) the two spin states of a spin-1/2 particle; in both cases the basis states |0> and |1> span general states α|0> + β|1> with |α|^2 + |β|^2 = 1 (figure).

QUANTUM REGISTERS

For any integer n, a quantum system consisting of n qubits forms a so-called n-qubit quantum register, and its states are states in a 2^n-dimensional Hilbert space. For any function f: {0,1}^n → {0,1}^n it is possible to create, using O(n) physical resources, a quantum state that "contains" all 2^n values of the function f - a manifestation of so-called quantum massive parallelism (a small classical simulation of this is sketched at the end of this subsection). It would therefore seem that, using quantum resources, a quantum computer could compute exponentially faster than classical computers.

HISTORY of QUANTUM COMPUTING - I.

1970 Landauer demonstrated the importance of reversibility for minimal-energy computation;
1973 Bennett showed the existence of universal reversible Turing machines;
1981 Toffoli and Fredkin designed universal reversible gates for Boolean logic;
1982 Benioff showed that quantum processes are at least as powerful as Turing machines;
1982 Feynman demonstrated that quantum physics cannot be simulated effectively on classical computers;
1984 the quantum cryptographic protocol BB84 was published by Bennett and Brassard, for absolutely secure generation of shared secret random classical keys;
1985 Deutsch showed the existence of a universal quantum Turing machine.
1989 the first quantum cryptographic experiment, transmitting photons over a distance of 32.5 cm, was performed by Bennett, Brassard and Smolin;
1993 Bernstein, Vazirani and Yao showed the existence of an efficient universal quantum Turing machine.

HISTORY of QUANTUM COMPUTING - II.

1993 quantum teleportation was discovered, by Bennett et al.;
1994 Shor discovered a polynomial-time quantum algorithm for factorization, and quantum cryptographic experiments were performed over a distance of 10 km (using fibers);
1994 quantum cryptography went through an experimental stage;
1995 DiVincenzo designed a universal gate with two inputs and outputs;
1995 Cirac and Zoller demonstrated a chance to build quantum computers using existing technologies;
1995 Shor showed the existence of quantum error-correcting codes;
1996 the existence of quantum fault-tolerant computation was shown by Shor.

CLASSICAL versus QUANTUM TELEPORTATION

The so-called no-teleportation theorem says that classical teleportation is impossible: there is no way to use classical channels to transmit quantum information faithfully. In contrast to this classical no-teleportation theorem, quantum teleportation is possible.

HOW POWERFUL WOULD QUANTUM COMPUTERS BE?

The already known quantum algorithms for some problems are exponentially faster than all known classical algorithms for the same problems - for example, for integer factorization.

For some communication problems it can be proven that quantum communication can be exponentially more efficient.

There are tasks, for example teleportation, that cannot be done using classical resources but can be done using quantum resources. In quantum teleportation, one party can teleport an unknown quantum state of its particle to the particle of another party, if they share one special quantum state, without knowing what is being teleported or where the other party is located, provided the two parties involved can communicate classically (say, by email).

Using quantum tools one can generate classical shared randomness in an unconditionally secure way.

COUNTERINTUITIVENESS in the QUANTUM WORLD

Quantum physics is full of unexpected, even mysterious and/or counterintuitive phenomena. For example: unknown quantum information cannot be copied; counterfactual phenomena are possible.

COUNTERFACTUAL PHENOMENA

The term counterfactual is usually used for things that might have happened, although they did not really happen. While classical counterfactuals do not have physical consequences, quantum counterfactuals can have surprisingly big consequences, because the mere possibility that some quantum event might have happened can change the probabilities of obtaining various experimental outcomes.

A CONSEQUENCE

It can be shown that a quantum computer can provide the result of a computation without performing the computation, provided it would give the same result were the computation really performed (Mitchison and Jozsa, 1999).
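The quantum massive parallelism mentioned in the QUANTUM REGISTERS slide can be made concrete with a small classical simulation - keeping in mind that the classical simulator pays an exponential cost in memory that a real n-qubit device would not. A minimal numpy sketch (illustrative only):

    import numpy as np

    n = 10                                            # number of qubits
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate

    # n-qubit register |0...0>: a vector of 2**n amplitudes.
    state = np.zeros(2**n)
    state[0] = 1.0

    # Apply H to every qubit: the n-fold Kronecker product of Hadamards.
    H_n = H
    for _ in range(n - 1):
        H_n = np.kron(H_n, H)
    state = H_n @ state

    # All 2**n basis states now carry equal amplitude 2**(-n/2), so the
    # register "contains" all 2**n classical inputs at once.
    print(state.shape)        # (1024,)
    print(state[0])           # 0.03125  (= 1/sqrt(1024))
    print(np.sum(state**2))   # 1.0  (normalisation: the |alpha|^2 sum to 1)

The asymmetry is the whole point: the simulator needs 2^n amplitudes to represent what n physical qubits hold, which is exactly why quantum resources are hoped to give exponential speedups for some problems.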
HOW DIFFICULT IS IT to DESIGN a QUANTUM COMPUTER?

Theoretically, not so difficult: it is enough to implement multiplication of quantum states by the following matrices:

    CNOT = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \end{pmatrix}, \quad
    H = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \quad
    \sigma_z^{1/4} = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\pi/4} \end{pmatrix}

Practically, it is very difficult - it is not even clear whether it is possible to build a very powerful quantum computer at all. The two main reasons are: the destructive impact of the environment, so-called decoherence, which is almost impossible to eliminate fully; and the fact that the computation has to produce states exhibiting quantum non-locality - a phenomenon that is beyond our understanding.

A STORY of a QUBIT

The world is a dangerous place, particularly if you are a qubit.

UNSCRAMBLING of the OMELET

Today we are beginning to realize how much of all physical science is really only information, organized in a particular way. But we are far from unraveling the knotty question: To what extent does this information reside in us, and to what extent is it a property of nature? Our present quantum mechanics formalism is a peculiar mixture describing in part laws of Nature, in part incomplete human information about Nature - all scrambled up together by Bohr into an omelet that nobody has seen how to unscramble. Yet we think the unscrambling is a prerequisite for any further advances in basic physical theory.
    Edwin T. Jaynes, 1990

SOME QUANTUM IMPLEMENTATIONS (figure)

TRAPPED IONS (figure)

NMR IMPLEMENTATIONS (figure)

APPENDIX - I.

IS IT POSSIBLE to BEAT the TURING BARRIER?

CHALLENGING the TURING BARRIER

The terms "challenging the Turing barrier", "super-Turing computation" and "hypercomputation" are used for attempts to show that there is a way to solve decision problems, and to compute functions, that are not decidable or not computable in Turing's sense. Interestingly enough, the underlying idea goes back to Turing's 1939 paper "Systems of logic based on ordinals", which investigated mathematical systems in which an oracle was available to compute a single arbitrary function.

TRY TO BELIEVE the IMPOSSIBLE

"I can't believe that!" said Alice. "Can't you?" the Queen said in a pitying tone. "Try again: draw a long breath, and shut your eyes." Alice laughed. "There's no use in trying," she said: "one can't believe impossible things." "I daresay you haven't had much practice," said the Queen. "When I was your age, I always did it for half-an-hour a day. Why, sometimes I've believed as many as six impossible things before breakfast."
    Lewis Carroll: Through the Looking-Glass, 1872
TURING THESIS and PRINCIPLE

The Turing thesis, or Church-Turing thesis, can be formulated as follows: every function that can be computed by what we would naturally regard as an algorithm is computable on a Turing machine, and vice versa.

The so-called Turing principle, formulated by Deutsch, reads as follows: every finitely realizable physical system can be perfectly simulated by a universal computing machine operating by finite means.

CHURCH-TURING THESIS versus the PHYSICAL SCIENCES

The Church-Turing thesis can be seen as one of the guiding principles of physics. The Church-Turing thesis is under permanent attack from the physical sciences. Recognition of the physical aspects of the Church-Turing thesis has had important impacts.

IS THERE A CHANCE TO OVERCOME the TURING BARRIER? - WISDOMS

Can we ever exclude the possibility of being presented, some day (perhaps by some extraterrestrial visitors), with a device or "oracle" that "computes" a non-computable function?
    Martin Davis

The reason why we find it possible to construct, say, electronic calculators, and indeed why we can perform mental arithmetic, cannot be found in mathematics or logic. The reason is that the laws of physics "happen to permit" the existence of physical models for the operations of arithmetic such as addition, subtraction and multiplication. If they did not, these familiar operations would be non-computable functions. We might still know of them and invoke them in mathematical proofs (which would presumably be called "non-constructive") but we could not perform them.
    David Deutsch

WHY WOULD WE WANT to OVERCOME the TURING BARRIER?

It is interesting, and intellectually usually very rewarding, to overcome limitations that seem unquestionable.

It is an attempt to demonstrate that the limits of mathematics ought to be determined not solely by mathematics itself but also by physical principles.

It is connected with attempts to see the limitations of artificial intelligence and the borderlines between thinking and computing.

Attempts to show that there is a way to overcome Turing barriers are an important way to improve our understanding of the physical world and of nature in general, and to find in them new resources and/or theories.

SECOND LAW of A. C. CLARKE

The only way of discovering the limits of the possible is to venture a little way past them into the impossible.

SOME CLASSICAL APPROACHES TO SUPER-TURING COMPUTATION

To assume the existence of infinite resources: Turing machines that run for an infinite period of time; idealized analog computers that can work with all reals (even uncomputable ones) and can harness them.

To assume the existence of exponentially growing resources (for example, Turing machines that increase their speed exponentially over time).

To use highly chaotic dynamical systems for computation.

To use the capability of getting finite information from infinite data in finite time.

(A summary: an uncritical use of continua and dense sets.)

CAN QUANTUM PHENOMENA HELP to OVERCOME the TURING BARRIER?
Recently, several papers have appeared claiming to present a way to solve (in some reasonable sense) particular unsolvable problems, using certain mathematical representations of quantum phenomena, in a way that does not seem to contradict the laws or limitations of quantum theory.

AN EXTENDED VERSION of the CHURCH-TURING THESIS

Any (non-uniform, interactive) network computation can be described in terms of interactive Turing machines with advice (van Leeuwen, Wiedermann), which are equivalent to so-called site machines and also to internet machines (GRID networks) - a model inspired by computer networks and distributed computing. All these models accept all recursively enumerable sets and their complements.

THE EXTENDED TURING THESIS

The extended Church-Turing thesis (or VW-thesis, of van Leeuwen and Wiedermann) does not aim to attack the Church-Turing thesis; it merely tries to identify a proper extension of the Church-Turing thesis, covering computations that share the following features: non-uniformity of programs, interaction of machines, and infinity of operations. The VW-thesis tries to see the concept of computation in a broader sense, based on different assumptions and suited to answering different questions.

A PHYSICAL APPROACH - RELATIVISTIC COMPUTERS

Etesi and Németi (2002) showed that certain relativistic space-time theories license the idea of observing the infinity of certain discrete processes in finite time. That led to the observation that certain relativistic computers could carry out certain undecidable queries in finite time. On this basis, Wiedermann and van Leeuwen (2005) designed a relativistic Turing machine that models the above relativistic computer and recognizes exactly the Δ2 sets of the Arithmetical Hierarchy.

THE NP-BARRIER

Since NP-complete problems are so important, both practically and theoretically, and so abundant in practice, it is natural that there have been many attempts to show that these problems can be solved in polynomial time. Attempts to beat the NP-barrier have used: adiabatic computing; variations on quantum mechanics (non-linearity, hidden-variable theories); analog computing; and also more esoteric approaches such as relativity computing and anthropic computing.

None of these attempts can be seen as successful; they usually forget to count all of the resources needed and/or all of the known physics. Should we not take the "NP-hardness assumption", saying that NP-complete problems are intractable in the physical world, as a new principle of physics (as, for example, the Second Law of Thermodynamics is)? (This principle is actually starting to be used.)

APPENDIX II

THE LAW of ACCELERATING RETURNS and ECONOMIC RELATIONS

It is the economic imperative of a competitive marketplace that is the primary force driving technology forward and fueling the law of accelerating returns; it is the equivalent of survival needs in biological evolution. In turn, the law of accelerating returns is transforming economic relationships.
We are moving towards more intelligent and smaller machines as the result of myriad small advances, each with its own particular economic justification. Machines that can better carry out their missions have increased value.

EXPONENTIAL GROWTH and PERIODIC RECESSIONS

The underlying exponential growth in the economy is a far more powerful force than periodic recessions. Recessions, including depressions, represent only temporary deviations from the underlying exponential curve. In the end, the economy winds up exactly where it would have been had the recession or depression never occurred.