IBM announces their AI can recode legacy software so it can run in the cloud

WHY THIS MATTERS IN BRIEF

The world runs on legacy code, and it’s hard for people to maintain, operate, secure, upgrade, and debug, so using AI to modernise it could solve a lot of problems.

Love the Exponential Future? Join our XPotential Community, future-proof yourself with courses from XPotential University, connect, watch a keynote, or browse my blog.

Last year, IBM demonstrated how Artificial Intelligence (AI) can perform the tedious job of software maintenance by updating legacy code. Now they’ve introduced AI-based methods for re-coding old applications so that they can operate on today’s modern cloud computing platforms – something that is a major headache and a hugely expensive problem for most older companies, such as banks, that run hundreds of billions of dollars’ worth of legacy software.

The latest IBM initiatives, dubbed Mono2Micro and Application Modernization Accelerator (AMA), give app architects new tools for updating legacy applications and extracting new value from them. These initiatives represent a step towards a day when AI could automatically translate a program written in COBOL into Java, according to Nick Fuller, director of hybrid cloud services at IBM Research.

Fuller cautions that these latest AI approaches are currently only capable “of breaking the legacy machine code of non-modular monolithic programs into standalone microservices.” Translating the programming language itself is still a separate step because, while the AMA toolkit is indeed designed to modernise COBOL, at this point it only provides an incremental step in the modernisation process, according to Fuller.

“Language translation is a fundamental challenge for AI that we’re working on to enable some of that legacy code to run in a modern software language,” he added.

In the meantime, IBM’s latest AI tools offer some new capabilities. Mono2Micro, for example, first analyses the old code to reveal all the hidden connections within it that application architects would find extremely difficult and time-consuming to uncover on their own, such as the multiple components in the underlying business logic that make numerous calls and connections to each other.
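
IBM hasn’t published Mono2Micro’s internals, but the kind of static analysis being described can be illustrated with a toy sketch. The example below (in Python, purely for illustration; Mono2Micro itself targets Java applications) walks a parsed source file and records which function calls which, surfacing exactly the sort of hidden connections architects would otherwise have to trace by hand.

```python
# A toy sketch (not IBM's implementation): build a simple call graph from
# source code so the hidden connections between components become visible.
# Python's ast module stands in for the Java/COBOL analysis Mono2Micro does.
import ast
from collections import defaultdict

def build_call_graph(source: str) -> dict:
    """Map each function name to the set of functions it calls."""
    tree = ast.parse(source)
    calls = defaultdict(set)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            for child in ast.walk(node):
                if isinstance(child, ast.Call) and isinstance(child.func, ast.Name):
                    calls[node.name].add(child.func.id)
    return dict(calls)

legacy_snippet = """
def post_payment(amount): validate(amount); update_ledger(amount)
def validate(amount): pass
def update_ledger(amount): audit(amount)
def audit(amount): pass
"""
print(build_call_graph(legacy_snippet))
# e.g. {'post_payment': {'validate', 'update_ledger'}, 'update_ledger': {'audit'}}
```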

Mono2Micro leverages AI clustering techniques to group similar code together, revealing more clearly how groups of code interact. Once Mono2Micro ingests the code, it analyses the source and object code both statically (examining the program without running it) and dynamically (observing the program while it’s running).
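
The exact clustering algorithm hasn’t been disclosed, but the principle can be sketched: treat each component’s pattern of interactions with the other components as a feature vector, then cluster components with similar patterns. The component names and interaction counts below are made up for illustration.

```python
# A hedged sketch of clustering components by how heavily they interact,
# so tightly coupled code ends up in the same candidate microservice.
# (Illustrative only; not IBM's actual algorithm or data.)
import numpy as np
from sklearn.cluster import KMeans

components = ["login", "session", "invoice", "tax", "ledger", "audit"]

# interactions[i][j] = calls observed between components i and j,
# gathered from static analysis plus runtime traces.
interactions = np.array([
    [0, 9, 1, 0, 0, 0],
    [9, 0, 1, 0, 0, 0],
    [1, 1, 0, 8, 2, 0],
    [0, 0, 8, 0, 1, 0],
    [0, 0, 2, 1, 0, 7],
    [0, 0, 0, 0, 7, 0],
])

# Each row is a component's interaction profile; similar profiles cluster together.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(interactions)
for name, label in zip(components, labels):
    print(f"{name} -> candidate service {label}")
```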

The tool then refactors monolithic Java-based programs and their associated business logic and user interfaces into microservices. This refactoring of the monolith into standalone microservices with specific functions minimizes the connections that existed in the software when it was a monolithic program, changing the application’s structure without altering its external behaviour.
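
One way to picture what “minimising the connections” means is to count the call edges that would cross the new service boundaries for a given grouping, a number the refactoring tries to keep small. The function and data below are a hypothetical illustration, not part of Mono2Micro.

```python
# Hypothetical illustration: count call edges that cross candidate service boundaries.
def cross_service_calls(call_edges, assignment):
    """call_edges: iterable of (caller, callee) pairs;
    assignment: maps each component to its candidate service id."""
    return sum(1 for caller, callee in call_edges if assignment[caller] != assignment[callee])

edges = [("login", "session"), ("invoice", "tax"), ("ledger", "audit"), ("session", "invoice")]
grouping = {"login": 0, "session": 0, "invoice": 1, "tax": 1, "ledger": 2, "audit": 2}
print(cross_service_calls(edges, grouping))  # 1: only session -> invoice crosses a boundary
```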

The objective of the AMA toolkit is to both analyse and refactor legacy applications written in even older languages (COBOL, PL/I). The toolkit uses static analysis of the source code, coupled with an understanding of the application structure, to create a graph that represents the legacy application. Used in conjunction with deep learning methods, this graph-based approach helps AMA retain key information about the application as it goes through its deep learning processes.
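
IBM hasn’t described the graph’s exact schema, but a minimal sketch (using the networkx library, with hypothetical node and edge names) gives a flavour of the representation: nodes for transactions, programs and data stores, and edges for the calls, includes and data accesses that connect them.

```python
# Minimal sketch of a graph representation of a legacy application.
# Node and edge names are hypothetical, not taken from AMA.
import networkx as nx

app_graph = nx.DiGraph()
app_graph.add_edge("TXN_POST_PAYMENT", "PGM_VALIDATE", relation="invokes")
app_graph.add_edge("PGM_VALIDATE", "PGM_LEDGER_UPDATE", relation="calls")
app_graph.add_edge("PGM_VALIDATE", "COPYBOOK_ACCOUNT", relation="includes")
app_graph.add_edge("PGM_LEDGER_UPDATE", "DB2_LEDGER_TABLE", relation="writes")

# A downstream deep learning model would consume this structure (for example
# via graph embeddings) rather than raw text, so the relationships are preserved.
for node in app_graph.nodes:
    print(node, "->", list(app_graph.successors(node)))
```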

IBM’s AI strategy addresses the key challenges for machine learning when the data input is code and the task is analysis: volume and multiple meanings. Legacy mission-critical applications typically run to hundreds of thousands, or even millions, of lines of code. In this context, applying machine learning (ML) techniques to such large volumes of data can be made more efficient through the concept of embeddings.

Embedding layers are a way of translating the data into numerical values. Their power comes from mapping a large volume of code, in which the same element can have multiple possible meanings, onto those numerical values. This is what is done, for example, when natural human language is translated into numerical values using “word” embeddings, and the same idea applies in a graph context when the data being analysed is code.
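
As a concrete, purely illustrative example of what an embedding layer does, the sketch below uses PyTorch (an assumption on my part, not something IBM has confirmed): each token in a tiny made-up code vocabulary is mapped to a dense vector, and a fragment of code becomes a short sequence of numbers a model can actually work with.

```python
# Minimal embedding-layer sketch (PyTorch assumed; vocabulary is made up).
import torch
import torch.nn as nn

vocab = {"MOVE": 0, "PERFORM": 1, "CALL": 2, "LEDGER-REC": 3, "ACCT-ID": 4}
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

# A tokenised fragment of legacy code becomes a sequence of 8-dimensional vectors...
tokens = torch.tensor([vocab["PERFORM"], vocab["CALL"], vocab["LEDGER-REC"]])
vectors = embedding(tokens)            # shape: (3, 8)

# ...and pooling them gives one fixed-size vector summarising the fragment,
# which is far cheaper to work with than millions of raw lines of code.
fragment_vector = vectors.mean(dim=0)  # shape: (8,)
print(fragment_vector.shape)
```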

“Embedding layers are tremendous because without them you would struggle to get anything approaching an efficiently performing machine-learning system,” said Fuller.

He added that, in the case of code analysis, the ML system gets better at recommending microservices for the refactored legacy application by replicating the application’s functionality.

“Once you get to that point, you’re not quite home free, but you’re essentially 70 percent done in terms of what you’re looking to gain, namely a mission critical application that is refactored into a microservices architecture,” he said.
