Competition crowdsources blisteringly fast software

TopCoder challenge helps immune system research

If you want a massive improvement in the software you use, the cheapest way to get it is to host a competition on TopCoder.

That seems to be at least one of the discoveries made when a group of research biologists staged a competition on the crowdsourcing site. A two-week contest with regular prizes of $US500 ended up costing the researchers just $US6,000, and yielded new – and hugely efficient and effective – software for analysing immune system genes.

The real-world problem the researchers presented was analysing the genes involved in producing antibodies and T-cell receptors, a decidedly non-trivial task in genetic research. As Nature puts it:

“These genes are formed from dozens of modular DNA segments located throughout the genome, and they can be mixed and matched to yield trillions of unique proteins, each capable of recognizing a different pathogen or foreign molecule.”

With that kind of complexity, the problem is demanding on computing resources and software.
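To get a feel for that combinatorics, here is a rough back-of-the-envelope sketch in Python. The segment counts are approximate textbook figures for human antibody genes, not numbers from the paper, and the junctional-diversity factor is an assumption chosen purely for illustration.

```python
# Rough, illustrative arithmetic for V(D)J combinatorial diversity.
# Segment counts are approximate textbook figures, not from the paper.
heavy_v, heavy_d, heavy_j = 40, 25, 6   # heavy-chain gene segments
light_v, light_j = 70, 9                # combined kappa/lambda light-chain segments

heavy_combinations = heavy_v * heavy_d * heavy_j   # ~6,000
light_combinations = light_v * light_j             # ~630
pairings = heavy_combinations * light_combinations # ~3.8 million heavy/light pairs

# Imprecise joining and random nucleotide insertions at the segment junctions
# multiply this further; an assumed factor of a million is enough to reach
# the "trillions" of unique receptors the Nature quote mentions.
junctional_factor = 1_000_000
print(f"{pairings * junctional_factor:.1e} distinct receptors (rough estimate)")
```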

Hence the competition: the lead researcher, Eva Guinan of the Dana-Farber Cancer Institute, and her collaborators asked TopCoder participants if they could do better: “The researchers offered TopCoder what they thought would be an impossible goal: to develop a predictive algorithm that was an order of magnitude better than either a custom solution developed by Arnaout [Ramy Arnaout of the Beth Israel Deaconess Medical Center] or the NIH’s standard approach (MegaBLAST)”.
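For reference, MegaBLAST is the fast nucleotide aligner bundled with NCBI's BLAST+ suite. Below is a minimal sketch of what a baseline run against a database of germline gene segments might look like; the file and database names are hypothetical placeholders, while the flags are standard BLAST+ options.

```python
# Minimal sketch of running NCBI megablast as a baseline annotator.
# "reads.fasta" and "germline_segments" are hypothetical placeholders;
# the flags are standard BLAST+ command-line options.
import subprocess

subprocess.run(
    [
        "blastn",
        "-task", "megablast",        # fast, high-identity nucleotide alignment
        "-query", "reads.fasta",     # receptor sequences to annotate
        "-db", "germline_segments",  # BLAST database of germline V/D/J segments
        "-outfmt", "6",              # tabular output: query, subject, identity, ...
        "-out", "hits.tsv",
    ],
    check=True,
)
```

It was this sort of baseline the contest entries were asked to beat by an order of magnitude.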

The result was a huge success: 84 solutions were submitted by competition entrants, 16 of them outperforming MegaBLAST.

The best of the best was 970 times faster than either MegaBLAST or Arnaout's software, which should go some way towards Guinan's ideal of researchers running this kind of analysis on their laptops instead of supercomputers.

There were 733 participants in the competition, of whom 122 submitted code; 44 percent of them were software professionals, and the rest were students at various levels.

For The Register, this is a killer observation: to make the problem accessible for the competition, “they had to first reframe the problem, translating it so that it could be accessible to individuals not trained in computational biology.”

In other words, if you ask the right question, you can get the right answer – remarkably cheaply. ®
