Boffins foresee most software written by machines in 2040

But people will still play a role in crafting code

Boffins at the Department of Energy's Oak Ridge National Laboratory speculate that by 2040 advances in AI disciplines like machine learning and natural language processing will shift most software code creation from people to machines.

In a paper distributed via arXiv, "Will humans even write code in 2040 and what would that mean for extreme heterogeneity in computing?", ORNL researchers Jay Jay Billings, Alexander McCaskey, Geoffroy Vallee and Greg Watson suggest machines will be doing much of the programming work two decades hence.

"The major technologies that will drive the creation and adoption of [machine-generated code] already exist, either at research institutions or in the marketplace," the foursome state.

And they anticipate that the various efforts underway to make code generation more efficient are likely to turn programming into something rather routine.

If people do need to write some code, "they may find that they spend more time using autocomplete and code recommendation features than writing new lines on their own," they say.

As examples of current research trends, they cite: the Defense Advanced Research Projects Agency's (DARPA) Probabilistic Programming for Advancing Machine Learning (PPAML) program, an effort to make machine learning more broadly accessible and applicable; DeepCoder and AutoML, machine learning projects that generate code; ontology generation tools like the Dresden Ontology Generator for Directed Acyclic Graphs (DOG4DAG) that structure knowledge with limited input; and code generation technologies like the Eclipse Modeling Framework and Sirius.

They also observe that the application programming interfaces (APIs) for scientific libraries are becoming standardized such that academics need only understand the problem domain, without being deeply versed in using the API.

This future isn't assured, but current DevOps practices already lean heavily on automation, and presumably will do so even more as AI advances and human programmers systematize the management of technical infrastructure at scale. The tools for writing apps that write other apps are also showing up in various programming languages, like Go.
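As a rough illustration of what that looks like in Go (our sketch, not one from the paper), the standard library's text/template package is enough for a program that emits another program; the struct name and fields below are invented for the example.

```go
// gen.go: a toy "program that writes a program". It renders a Go source
// file for a struct type from a template. Run with: go run gen.go > point.go
package main

import (
	"os"
	"text/template"
)

// spec describes the type to generate; the name and fields are illustrative.
type spec struct {
	Name   string
	Fields []string
}

const src = `// Code generated by gen.go; DO NOT EDIT.
package shapes

type {{.Name}} struct {
{{- range .Fields}}
	{{.}} float64
{{- end}}
}
`

func main() {
	tmpl := template.Must(template.New("src").Parse(src))
	// Write the generated Go source to stdout.
	if err := tmpl.Execute(os.Stdout, spec{Name: "Point", Fields: []string{"X", "Y"}}); err != nil {
		panic(err)
	}
}
```

Go projects conventionally hook generators like this up to a //go:generate directive and run them with go generate, which is a long way from machines writing code unaided, but it points in the same direction.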

One of the major challenges for such systems will be dealing with varied hardware architectures and software requirements. The researchers expect machine learning systems will need further layers of abstraction to operate across heterogeneous systems. But they point to recent Facebook research, saying it suggests machines may be able to negotiate with each other to communicate their requirements.

Another speed bump on the way to code without coders involves developing a better understanding of how to allocate hardware resources such that each system is optimally employed. With such knowledge, an automated programming system might avail itself of IBM's brain-inspired neuromorphic chip TrueNorth, which is better suited to pattern recognition than to double-precision arithmetic, to choose algorithms and implementation details. And then it might turn to a quantum computer for code optimization.

Billings, McCaskey, Vallee, and Watson see automated coding as a way to focus on higher level problems. "Machines writing code under human direction will only further improve our ability to explore the universe, enjoy life, and stream Netflix, especially if it saves us the trouble of learning how to make extremely heterogeneous systems work together," they conclude.

In an email to The Register, Jeff Bigham, associate professor at Carnegie Mellon's Human-Computer Interaction Institute, agreed that the nature of programming will change in the years ahead, but questioned some of the assumptions made by the ORNL researchers.

"In 2040, you'll still need someone who understands how to break down problems into computation (aka, experts at computational thinking), but those people will likely not have to know nearly as much about the specifics of the computer languages, data formats, and APIs that programmers have to know today to be effective," he said.

He expects programming 23 years hence will become more natural, with high-level languages able to accommodate pseudo-code that isn't syntactically correct. He likens the leap to the way programming has changed over the past two decades.

"In 1997, I had to know how most everything worked, and my references were some books and maybe an online newsgroup," he said. "These days, I kind of just have to know vaguely what's possible, and then I find the specifics on Stack Overflow. In 2040, machine learning will fill in the gaps that I currently have to hope someone has filled in for me on Stack Overflow. That is a huge deal, but won't yet be computers actually programming for me."

Bigham said natural language processing is notoriously hard and computers still aren't very good at it.

He also took issue with a problem presented in the paper: "Given my morning cup of Starbucks coffee, under standard assumptions, what is the temperature of the coffee after ten minutes?"

The paper posed that question as the sort of programming challenge that might be answered by automatically generated code.
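For what it's worth, under textbook "standard assumptions" the problem reduces to Newton's law of cooling. The sketch below (ours, not the paper's) shows roughly what a generated answer might look like; the serving temperature, room temperature, and cooling constant are all guesses plugged in for illustration.

```go
// coffee.go: Newton's law of cooling, T(t) = Tamb + (T0-Tamb)*exp(-k*t).
// Every constant below is an assumption: the serving temperature, the room
// temperature, and the cooling constant k, which in reality depends on the
// cup material, the lid, airflow, and so on.
package main

import (
	"fmt"
	"math"
)

// coffeeTemp returns the temperature after the given number of minutes.
func coffeeTemp(t0, tAmb, k, minutes float64) float64 {
	return tAmb + (t0-tAmb)*math.Exp(-k*minutes)
}

func main() {
	const (
		t0   = 82.0  // assumed serving temperature, °C
		tAmb = 21.0  // assumed room temperature, °C
		k    = 0.045 // assumed cooling constant, per minute
	)
	fmt.Printf("after 10 minutes: %.1f°C\n", coffeeTemp(t0, tAmb, k, 10))
}
```

With those particular guesses the answer comes out around 60°C, but change the cup or the room and the constants change with them, which is exactly the objection Bigham raises.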

Bigham said these "standard assumptions" shouldn't assumed because doing so glosses over a lot of questions that could be considered to write the coffee temperature prediction algorithm well, such as the cup material, the temperature outside, whether the coffee is being consumed as it cools, and so on.

"If you think the role of a programmer now is to take a perfectly specified problem as input and turn it into a language a machine understands, then probably not too many programmers like that will exist in 2040," he said. "But not all that many programmers like that exist today. Even the best written spec today includes a lot of unstated commonsense assumptions." ®
