
WHY THIS MATTERS IN BRIEF

The world runs on legacy code that is hard to maintain, operate, secure, upgrade, and debug, so using AI to modernise it could solve a lot of problems.

 


Last year, IBM demonstrated how Artificial Intelligence (AI) can perform the tedious job of software maintenance by updating legacy code. Now the company has introduced AI-based methods for re-coding old applications so that they can run on today’s modern cloud computing platforms – a major headache and a hugely expensive problem for older companies, such as banks, that run hundreds of billions of dollars’ worth of legacy software.

 


The latest IBM initiatives, dubbed Mono2Micro and Application Modernization Accelerator (AMA), give app architects new tools for updating legacy applications and extracting new value from them. These initiatives represent a step towards a day when AI could automatically translate a program written in COBOL into Java, according to Nick Fuller, director of hybrid cloud services at IBM Research.

Fuller cautions that these latest AI approaches are currently only capable “of breaking the legacy machine code of non-modular monolithic programs into standalone microservices.” Translating the programming language itself is still a separate step: while the AMA toolkit is in fact designed to modernize COBOL, at this point it only provides an incremental step in the modernization process, according to Fuller.

 


“Language translation is a fundamental challenge for AI that we’re working on to enable some of that legacy code to run in a modern software language,” he added.

In the meantime, IBM’s latest AI tools offer some new capabilities. In the case of Mono2Micro, it first analyses the old code to reveal the hidden connections within it that application architects would find extremely difficult and time-consuming to uncover on their own, such as the multiple components in the underlying business logic that make numerous calls and connections to each other.

Mono2Micro leverages AI clustering techniques to group similar code together, revealing more clearly how groups of code interact. Once Mono2Micro ingests the code, it analyses the source and object code both statically, in other words analysing the program before it runs, and dynamically, analysing the program while it’s running.
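
To make the clustering idea concrete, here is a minimal sketch of the general technique, not IBM’s actual Mono2Micro implementation: each class in a monolith is represented by a vector recording which business use cases touch it at runtime, and k-means then groups classes that are exercised together into candidate microservices. The class names, trace matrix, and cluster count are all invented for illustration.

```python
# Hypothetical sketch of clustering monolith classes into candidate
# microservices based on dynamic traces. Not IBM's Mono2Micro code.
import numpy as np
from sklearn.cluster import KMeans

# Rows: classes in the monolith; columns: business use cases.
# A 1 means the class appeared in the runtime trace of that use case.
classes = ["Account", "Ledger", "Customer", "Address", "Payment"]
traces = np.array([
    [1, 1, 0, 0],   # Account
    [1, 1, 0, 0],   # Ledger
    [0, 0, 1, 1],   # Customer
    [0, 0, 1, 1],   # Address
    [1, 0, 0, 1],   # Payment
])

# Group classes that are exercised by the same use cases.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(traces)
for name, label in zip(classes, kmeans.labels_):
    print(f"{name} -> candidate microservice {label}")
```

Classes that land in the same cluster are, in Mono2Micro’s terms, candidates for the same standalone microservice.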

 


The tool then refactors monolithic Java-based programs and their associated business logic and user interfaces into microservices. This refactoring of the monolith into standalone microservices with specific functions minimizes the connections that existed in the software when it was a monolithic program, changing the application’s structure without altering its external behaviour.
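
As a rough illustration of what minimising connections means, the sketch below uses an invented call graph and partition to count the call sites that would cross a service boundary after refactoring; a good decomposition keeps this number low, since every cross-boundary call becomes a remote call.

```python
# Toy measure of coupling for a proposed microservice split.
# The call graph and partition below are invented for illustration.
call_graph = {
    ("Account", "Ledger"): 12,   # (caller, callee) -> number of call sites
    ("Account", "Payment"): 3,
    ("Customer", "Address"): 9,
    ("Payment", "Ledger"): 5,
}
partition = {"Account": 0, "Ledger": 0, "Payment": 0,
             "Customer": 1, "Address": 1}

# Calls that cross a service boundary become remote calls after refactoring.
cross = sum(n for (a, b), n in call_graph.items()
            if partition[a] != partition[b])
print(f"cross-service call sites: {cross}")  # 0 here: boundaries align well
```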

The objective of the AMA toolkit is to both analyse and refactor legacy applications written in even older languages (COBOL, PL/I). For the AMA toolkit, static analysis of the source code, coupled with an understanding of the application structure, is used to create a graph that represents the legacy application. Used in conjunction with deep learning methods, this graph-based approach helps the system retain information as AMA works through its deep-learning processes.
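
Here is a minimal sketch of that graph representation, with invented node and edge types (the article does not spell out AMA’s actual schema): programs and data items become nodes, while calls and data references become edges, producing a structure that deep learning methods can consume downstream.

```python
# Hypothetical graph representation of a legacy COBOL application.
# Node and edge types are invented; AMA's real schema is not given here.
import networkx as nx

g = nx.DiGraph()
# Nodes: programs and the data items they touch.
g.add_node("PAYROLL", kind="program")
g.add_node("TAXCALC", kind="program")
g.add_node("EMP-REC", kind="data")

# Edges: control flow (CALL) and data access.
g.add_edge("PAYROLL", "TAXCALC", kind="call")
g.add_edge("PAYROLL", "EMP-REC", kind="reads")
g.add_edge("TAXCALC", "EMP-REC", kind="reads")

# Such a graph can then be fed to a graph-learning model.
print(nx.to_dict_of_dicts(g))
```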

 


IBM’s AI strategy addresses the key challenges for machine learning when the input data is code and the task is analysis: volume and multiple meanings. Legacy mission-critical applications typically run to hundreds of thousands, or even millions, of lines of code. In this context, applying machine learning (ML) techniques to such large volumes of data can be made more efficient through the concept of embeddings.

These embedding layers are a way of translating the data into numerical values. Their power comes from mapping a large volume of code, with multiple possible meanings, to numerical values. This is, for example, how natural human language is translated into numerical values using “word” embeddings, and the same idea applies in a graph context for code analysis.
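
As a toy illustration of what an embedding layer does, assuming a tiny invented vocabulary of COBOL-like tokens: each token maps to a dense vector, and a code fragment’s vectors can be pooled into one fixed-size numerical representation for an ML model. In a real system the vectors are learned during training rather than drawn at random.

```python
# Minimal sketch of an embedding layer for code tokens (illustrative only;
# the vocabulary and vectors are invented, not learned).
import numpy as np

vocab = {"MOVE": 0, "PERFORM": 1, "CALL": 2, "EMP-REC": 3}
rng = np.random.default_rng(0)
embedding = rng.normal(size=(len(vocab), 8))  # 4 tokens -> 8-dim vectors

# A code fragment becomes a sequence of vectors; here we average them
# into one fixed-size representation an ML model can consume.
fragment = ["PERFORM", "CALL", "EMP-REC"]
vectors = embedding[[vocab[t] for t in fragment]]
print(vectors.mean(axis=0))   # 8-dim numerical summary of the fragment
```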

 


“Embedding layers are tremendous because without them you would struggle to get anything approaching an efficiently performing machine-learning system,” said Fuller.

He added that, in the case of code analysis, the ML system gets better at recommending microservices for the refactored legacy application by replicating the application’s functionality.

“Once you get to that point, you’re not quite home free, but you’re essentially 70 percent done in terms of what you’re looking to gain, namely a mission critical application that is refactored into a microservices architecture,” he said.
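
One way to read Fuller’s point about replicating application functionality is as a behaviour-preservation check. The sketch below is hypothetical and not IBM’s procedure: it runs the same inputs through stand-ins for the original monolith and the refactored services and asserts that the external behaviour matches.

```python
# Hypothetical behaviour-preservation check for a refactoring step.
# monolith_fn and microservices_fn stand in for the two implementations.
def monolith_fn(request):
    return {"balance": request["amount"] * 2}

def microservices_fn(request):
    return {"balance": request["amount"] * 2}

test_inputs = [{"amount": 10}, {"amount": 0}, {"amount": -5}]

# External behaviour should be identical before and after refactoring.
for req in test_inputs:
    assert monolith_fn(req) == microservices_fn(req), f"diverged on {req}"
print("refactored services match the monolith on all test inputs")
```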

About author

Matthew Griffin

Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multinationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers, several governments, and the G7, as well as Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
