Interview with Patrick Viry
Ateji PX: 'an extension of the Java language with parallel programming primitives.'
Last week, the first public release of Ateji PX went live. Here, JAXenter speaks with Ateji founder Patrick Viry about the new language extension and programming environment for multi-core and parallel programming in Java, and the so-called 'multi-core crisis.'
JAXenter: What is Ateji PX?
Patrick Viry: Ateji PX is an extension of the Java language with parallel programming primitives. Unlike library or preprocessing approaches, this makes parallel programming simple, intuitive and easy to learn.
JAXenter: The release announcement refers to the 'multi-core crisis.' What is this crisis, and how does Ateji PX help overcome it?
Patrick Viry: The term "multi-core crisis" was coined by chip makers. A couple of years ago, the race for ever-faster chips abruptly stopped: faster chips would dissipate too much heat and melt down. Since it was no longer possible to double the speed of processors every 12 to 18 months, chip makers instead started to double the number of cores on the same silicon die.
This is nothing really difficult from a hardware point of view, but it is a huge software revolution: if you don't make your programs parallel, they will be able to use only one of the many available cores. Until now, it was enough to wait a couple of years to see your application run faster; that is no longer the case. This has been called "the end of the free lunch" for software developers.
The "multi-core crisis" refers to the lack of tools and languages for writing parallel programs and for parallelizing existing code.
JAXenter: What existing languages and tools does Ateji PX leverage?
Patrick Viry: Ateji PX builds upon Java, the most popular language today, and the Eclipse IDE. If you know Java and Eclipse, you'll be able to write your first Ateji PX parallel program within half a day.
JAXenter: Which parallel patterns are supported?
Patrick Viry: With only a handful of well-designed parallel constructs, Ateji PX is able to express a wide range of parallel patterns:
- data parallelism (running the same operation on a large number of data elements, typically used in simulation and high-performance computing)
- task parallelism (decomposing a problem into concurrent tasks, typically used in server applications)
- recursive parallelism (decomposing a problem into smaller and smaller tasks)
- speculative parallelism (starting a computation before its result is known to be needed)
- the Actor model (independent actors reacting to input messages)
- data flow (think of boxes connected by input and output wires)
- stream computing (think of database tuples flowing along these wires), on which Google's MapReduce algorithm is based
While different tools have been devised for each of these patterns, Ateji PX provides all of them in a single tool.
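The interview does not show Ateji PX's own syntax, but two of the listed patterns, data parallelism and recursive parallelism, can be sketched in plain Java using the standard streams and ForkJoin APIs. This is a minimal illustration under my own naming (the `Patterns` class, `squareAll`, and `SumTask` are invented for the example), not Ateji PX code:

```java
import java.util.Arrays;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;
import java.util.stream.IntStream;

public class Patterns {
    // Data parallelism: the same operation applied to every element,
    // distributed over the available cores by the parallel stream.
    static int[] squareAll(int[] data) {
        return Arrays.stream(data).parallel().map(x -> x * x).toArray();
    }

    // Recursive parallelism: a sum decomposed into smaller and smaller tasks.
    static class SumTask extends RecursiveTask<Long> {
        final int[] data;
        final int lo, hi;
        SumTask(int[] data, int lo, int hi) { this.data = data; this.lo = lo; this.hi = hi; }

        @Override protected Long compute() {
            if (hi - lo <= 1_000) {            // small enough: compute sequentially
                long s = 0;
                for (int i = lo; i < hi; i++) s += data[i];
                return s;
            }
            int mid = (lo + hi) / 2;
            SumTask left = new SumTask(data, lo, mid);
            left.fork();                        // run the left half concurrently
            long right = new SumTask(data, mid, hi).compute();
            return right + left.join();
        }
    }

    public static void main(String[] args) {
        int[] data = IntStream.range(0, 10_000).toArray();
        System.out.println(squareAll(data)[3]);   // squares 0..9999; element 3 is 9
        System.out.println(new ForkJoinPool().invoke(new SumTask(data, 0, data.length)));
    }
}
```

The point of a language extension like Ateji PX is that such patterns are expressed with dedicated parallel constructs rather than with this kind of explicit library plumbing.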
JAXenter: What are the advantages of parallel programming?
Patrick Viry: Most people become interested in parallel programming because it is the key to leveraging the power of multicore computers. But there are other reasons for doing parallel programming:
- Green computing: parallel code on multicore hardware uses less power than sequential code. This is important for embedded systems (such as phones) that rely heavily on battery power, and for large data centers that need to lower their energy consumption and emissions.
- Responsive user interfaces and responsive servers: if the code is not parallel, it becomes unresponsive, stuck, or slow. Being able to write parallel code in an easy and intuitive way is an important step towards having "small parallel islands" all over the code.
- Algorithmic description: some algorithms require statements such as "read both input 1 and input 2, in whatever order." This is simply impossible to express in sequential languages such as Java. The programmer must arbitrarily order the statements (such as "first read input 1, then read input 2"), at the risk of making the program block.
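The "in whatever order" requirement can be approximated in plain Java by starting both reads concurrently and waiting for the pair, so that neither read blocks the other. This is a minimal sketch, not Ateji PX syntax; the `readBoth` method and the suppliers standing in for the two inputs are invented for the example:

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.IntSupplier;

public class BothInputs {
    // "Read both input 1 and input 2, in whatever order": launch both reads
    // concurrently instead of fixing an arbitrary sequential order. Even if
    // one input is slow or not yet available, the other is still read.
    static int readBoth(IntSupplier in1, IntSupplier in2) {
        CompletableFuture<Integer> f1 = CompletableFuture.supplyAsync(in1::getAsInt);
        CompletableFuture<Integer> f2 = CompletableFuture.supplyAsync(in2::getAsInt);
        return f1.join() + f2.join();  // wait for both, in whichever order they finish
    }

    public static void main(String[] args) {
        // Stand-ins for two independent input sources.
        System.out.println(readBoth(() -> 1, () -> 2)); // 3
    }
}
```

A sequential program that hard-codes "first read input 1" would block forever if input 1 never arrives while input 2 is ready; the concurrent version makes progress on whichever input is available.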