Pass the 'Milk' to make code run four times faster, say MIT boffins

New programming language does clever things with caches

MIT boffins have created a new programming language called “Milk” that they say runs code four times faster than rivals.

Professor Saman Amarasinghe says the language's secret is that it changes the way cores collect and cache data.

Today, he says, cores fetch whole blocks of data from memory. That's inefficient for big-data workloads, where an application may need only a few items scattered across a very large data set, so most of each fetched block goes unused.
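To make the problem concrete, here's a minimal sketch (our illustration, not code from the Milk paper) of the kind of loop that suffers: a C/OpenMP reduction that reads values through an index array, so each access lands on an effectively random address and drags in a whole cache line just to use a single eight-byte value.

#include <stddef.h>

/* Illustrative only: scattered reads through an index array.
   Each values[idx[i]] access pulls a full cache line (typically
   64 bytes) from main memory to use just one 8-byte double. */
double sum_indirect(const double *values, const size_t *idx, size_t n)
{
    double sum = 0.0;
    #pragma omp parallel for reduction(+:sum)
    for (size_t i = 0; i < n; i++)
        sum += values[idx[i]];   /* poor cache-line utilisation */
    return sum;
}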

Milk therefore works as follows:

“... when a core discovers that it needs a piece of data, it doesn’t request it — and a cacheful of adjacent data — from main memory. Instead, it adds the data item’s address to a list of locally stored addresses. When the list is long enough, all the chip’s cores pool their lists, group together those addresses that are near each other, and redistribute them to the cores. That way, each core requests only data items that it knows it needs and that can be retrieved efficiently.”

With only important data making it into caches, things run more smoothly.
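The pooling and redistribution described above happens across all of a chip's cores. As a rough, single-core illustration of the underlying batching idea (our sketch, not Milk's implementation), the same loop can defer each access: record the addresses into a buffer, sort the batch so nearby addresses are handled together, and only then touch memory. The buffer size and the use of qsort here are arbitrary choices made for clarity.

#include <stddef.h>
#include <stdlib.h>

#define BATCH 4096   /* arbitrary batch size for illustration */

static int cmp_index(const void *a, const void *b)
{
    size_t x = *(const size_t *)a, y = *(const size_t *)b;
    return (x > y) - (x < y);
}

/* Conceptual sketch of deferred, batched access on a single core:
   note the addresses first, group nearby ones together, then read. */
double sum_batched(const double *values, const size_t *idx, size_t n)
{
    double sum = 0.0;
    size_t batch[BATCH];
    size_t fill = 0;

    for (size_t i = 0; i < n; i++) {
        batch[fill++] = idx[i];                 /* defer: record the address */
        if (fill == BATCH || i == n - 1) {
            qsort(batch, fill, sizeof batch[0], cmp_index);
            for (size_t j = 0; j < fill; j++)
                sum += values[batch[j]];        /* reads now hit neighbouring cache lines */
            fill = 0;
        }
    }
    return sum;
}

Milk does this across cores and under compiler control; the sketch only shows why grouping addresses before dereferencing them turns scattered reads into largely sequential ones.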

MIT's Computer Science and Artificial Intelligence Lab says “Milk simply adds a few commands to OpenMP, an extension of languages such as C and Fortran that makes it easier to write code for multicore processors.”

Using it means inserting “a couple additional lines of code around any instruction that iterates through a large data collection looking for a comparatively small number of items.” Milk’s compiler then gets all the fun of tracking which data has made it into which cache, plus the rest of the memory-management magic.
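As a hypothetical illustration of where those extra lines would go (the article doesn't give Milk's actual syntax, so none is invented here), consider an ordinary C/OpenMP histogram-style loop whose body would stay unchanged:

#include <stddef.h>

/* Hypothetical example: a plain C/OpenMP loop over a large edge list.
   Per MIT CSAIL, using Milk means wrapping a loop like this with a couple
   of additional lines; those annotations aren't shown because their syntax
   isn't given in the article. The compiler would then defer and batch the
   scattered degree[edges[i]] updates on each core. */
void count_degrees(long *degree, const size_t *edges, size_t n_edges)
{
    #pragma omp parallel for
    for (size_t i = 0; i < n_edges; i++) {
        #pragma omp atomic
        degree[edges[i]]++;    /* indirect update into a very large array */
    }
}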

Milk will be poured at this week's International Conference on Parallel Architectures and Compilation Techniques, and it is at an early stage of development. But its developers hope it will make it possible to squeeze more performance out of current CPUs and memory while we wait for hardware to speed up. ®
