last update: 2010

The Reinvent Computing Page @ TU Kaiserslautern


Prof. Dr.-Ing. Reiner Hartenstein


IEEE Fellow
SDPS Fellow
FPL Fellow


Reiner Hartenstein

Credited as the father of High Performance Reconfigurable Computing (HPRC) (see here) and of the Xputer "anti-machine", a data-stream-driven (non-von-Neumann) compute paradigm.

Founder of the "E.I.S.-Projekt" ("E.I.S." = "Entwurf Integrierter Schaltungen", i.e. design of integrated circuits). See: http://hartenstein.de/EIS/

Wikipedia on the E.I.S. project (in German): click here


Several projects: the PATMOS project, the CVT project, the CVS project, the KARL language, the E.I.S. project

The European EDA industry has been killed ... by the VHDL lobby

The PATMOS page

the FPL page

next FPL conference



Computing reinvented several times

The first electric computing machine produced in series was ready in 1884: the Hollerith tabulator. Hundreds of them were used for the US census (1890). At about the size of two kitchen refrigerators it was quite small (figure 1), a remarkable result relative to the technology state of the art at that time*. Driven by punched-card input, it was data-stream-driven: based on what we later called "the anti-machine paradigm" [1].

Computing reinvented in 1945

More than 60 years after the tabulator, the ENIAC computer (figure 2) was ready (1945). As a forerunner of the von Neumann computer it prepared the paradigm shift from data-stream-driven to instruction-stream-driven computing. By storing the code on magnetic tape memory, much more complex problems could be handled than with plug boards (fig. 3: the historic LUT of the tabulator). Not only because of the von Neumann syndrome [2], the ENIAC was massively less efficient** than the tabulator. The von Neumann paradigm is the foundation of the "software crisis", a term coined as early as the mid-1960s. This crisis is still growing, since software complexity is not really manageable by humans [3]. Also see [4-6].
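The contrast between the two paradigms can be sketched in a few lines of code: under the instruction-stream-driven (von Neumann) model every operation requires an instruction to be fetched and decoded before the data is touched, while under the data-stream-driven (anti-machine) model the operator is configured once and only the data moves. The following is an illustrative sketch; the function names and the toy "instruction set" are my own, not taken from the Xputer literature:

```python
# Instruction-stream-driven (von Neumann style): every step fetches
# and decodes an instruction before it can touch the data.
def run_instruction_stream(program, data):
    acc = 0
    for opcode, operand_index in program:   # instruction fetch
        if opcode == "ADD":                 # instruction decode
            acc += data[operand_index]
        elif opcode == "MUL":
            acc *= data[operand_index]
    return acc

# Data-stream-driven (anti-machine style): the operator is configured
# once; afterwards only data flows, with no per-item fetch/decode.
def run_data_stream(configured_op, data_stream):
    acc = 0
    for value in data_stream:               # only the data moves
        acc = configured_op(acc, value)
    return acc

program = [("ADD", 0), ("ADD", 1), ("ADD", 2)]
data = [1, 2, 3]
print(run_instruction_stream(program, data))        # 6
print(run_data_stream(lambda a, v: a + v, data))    # 6
```

Both toy machines compute the same sum; the point is where the control overhead sits: in the first loop it is paid per operation, in the second it was paid once, at configuration time.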

Reinventing Hardware Design

"Computing must be reinvented urgently!" says Burton Smith, see http://hartenstein.de/Burton_Smith.pdf

  The Programming Crisis     |   Twin Paradigm Computing  


Discarding data-stream-based computing by adopting the von Neumann paradigm more than half a century ago was the biggest mistake in the history of computing; it has meanwhile caused users and vendors altogether to waste quadrillions of dollars.



   *) also see "Computers are facing a seismic shift"   

   The Reconfigurable Computing Paradox   

   von Neumann Syndrome   


Reinventing Computing by RC (Reconfigurable Computing)

A hundred years after the tabulator (fig. 1), the first FPGA appeared on the market in 1984. Being data-stream-driven again, this is a route back to Hollerith. But thanks to new technology, the plug board (fig. 3) was replaced by LUTs on the microchip (fig. 4).
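How a LUT replaces a plug board can be sketched in software: a k-input LUT is just a 2^k-entry truth table, and "reconfiguring" it means loading a new table rather than rewiring anything. The following sketch is illustrative only (the `make_lut` helper is my own, not any vendor's FPGA API):

```python
def make_lut(truth_fn, k):
    """Configure a k-input LUT: precompute truth_fn over all
    2**k input patterns. This precomputation is the 'configuration'."""
    table = []
    for i in range(2 ** k):
        bits = [(i >> b) & 1 for b in range(k)]
        table.append(int(truth_fn(*bits)))

    # At "run time" the LUT does only an indexed read, no logic at all.
    def lut(*bits):
        index = sum(bit << b for b, bit in enumerate(bits))
        return table[index]
    return lut

# Configure one 4-input LUT as a majority gate ...
majority = make_lut(lambda a, b, c, d: (a + b + c + d) >= 3, 4)
print(majority(1, 1, 1, 0))  # 1
print(majority(1, 0, 0, 0))  # 0

# ... and "reconfigure" by loading a new table into the same kind of cell.
xor4 = make_lut(lambda a, b, c, d: a ^ b ^ c ^ d, 4)
print(xor4(1, 1, 1, 0))      # 1
```

The run-time part of the LUT contains no gates for the function itself, only a table read; this is exactly what makes the same silicon reusable for any boolean function of its inputs.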

Many projects that migrated an application from a vN CPU over to an FPGA achieved speed-up factors of up to 4 orders of magnitude and energy-saving factors of up to 3 orders of magnitude (fig. 5). But here reinventing takes more time, until finally the manycore dilemma [7] forces us to bridge the hardware/software chasm with heterogeneous systems, which are going to become mainstream [8]. On Nathan's Law, see [9,10].


[1] the data-streams page http://data-streams.org

[2] The von Neumann Syndrome: click here

[3] Harold Lawson: Infrastructure Risk Reduction  

[4] 9 celebrities: Criticising the von Neumann Model

[5] Larry Osterman: Nathan's Laws of Software

[6] Nathan Myhrvold: Next 50 years of Software

[7] R. Hartenstein: chapter 2 of the E.I.S. page http://hartenstein.de/EIS/

[8] R. Hartenstein: chapter 3 of  the E.I.S. page: click here !

[9] Reiner Hartenstein: Nathan's Law summary

[10] Reiner Hartenstein: Der Öffentlichkeit unbekannte massive Energieverschwendung (massive energy waste unknown to the public).

[11] Reiner Hartenstein: The PISA project and the DPLA: click here or here

[12] Reiner Hartenstein: DPLA: much more efficient than FPGAs: click here

Some Evangelist's Links:

The von Neumann syndrome

The tail wagging the dog

The Watering Can Model (slide 16)

We need a Seismic Shift

Future Computer Systems

Xputer lab achievements

Xputer-related Literature

The Worst Mistake in the History of Computing

How (not) to Invent Something

CS suffers from the Tunnel Vision Syndrome

The biggest mistake in the history of EDA

The Leading Design Language in the 1980s

Multiplier Chip automatically generated from the Math Formula


*) Advanced computer components as we know them had not yet been invented in 1884: the transistor (1934), the vacuum tube (1904), ferrite core memory (1949), magnetic tape (1898), the magnetic drum (1932), the hard disc (1956). Relative to the technology state of the art at that time, punched card decks and their fast handling and reading devices were quite an efficient memory technology.

**) The ENIAC needed 167 square meters of floor space, weighed 30 tons, and consumed 160 kilowatts of electrical power.