IoT Insider Editor Kristian McCann speaks with Dr Leslie Kanthan, CEO of TurinTech AI, about AI, its explosion onto the scene, its realised potential, its current pitfalls, and how optimising code could be the way forward for the next leap in its abilities.
AI’s recent surge into the mainstream has been profound, driven largely by breakthroughs like ChatGPT. By leveraging complex algorithms and vast amounts of data, these systems offer significant potential across sectors from healthcare to finance, revolutionising the way we interact with digital systems, automating tedious tasks, and providing insights derived from large-scale data analysis. However, despite its promise, AI currently faces several pitfalls. A notable challenge is the substantial energy required to power these systems; generating a single response on platforms like ChatGPT involves significant computational resources, leading to high power consumption. This not only raises environmental concerns but also underscores the need for more energy-efficient AI models that can scale sustainably to meet growing demand.
“So what we mean by code optimisation is effectively identifying code that is slow, and ensuring that it’s faster and using up less resources such as memory, CPU, and thereby being less on the energy,” says Kanthan. “This approach not only speeds up processing times but also aligns with the growing need for environmental sustainability in the tech industry.”
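To illustrate the kind of change Kanthan is describing, consider a simple, hypothetical example (not TurinTech’s actual tooling): the same calculation written first as a slow pure-Python loop and then in a vectorised form that produces the same result in far less CPU time.

```python
# A minimal illustration of the idea described above (not TurinTech's tooling):
# the same computation in a slow form and a faster, lower-resource form,
# with the speed-up measured directly.
import time
import numpy as np

def mean_of_squares_slow(values):
    # Pure-Python loop: many interpreter-level operations per element.
    total = 0.0
    for v in values:
        total += v * v
    return total / len(values)

def mean_of_squares_fast(values):
    # Vectorised NumPy version: the same result with far less CPU time.
    arr = np.asarray(values, dtype=np.float64)
    return float(np.mean(arr * arr))

data = list(range(1_000_000))

start = time.perf_counter()
slow_result = mean_of_squares_slow(data)
slow_elapsed = time.perf_counter() - start

start = time.perf_counter()
fast_result = mean_of_squares_fast(data)
fast_elapsed = time.perf_counter() - start

print(f"slow: {slow_result:.1f} in {slow_elapsed:.3f}s")
print(f"fast: {fast_result:.1f} in {fast_elapsed:.3f}s")
```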
Delving deeper into the technical aspects, Kanthan explained how deep learning, transformers, genetic algorithms, and the company’s proprietary methods inspired by natural processes are used to identify such inefficiencies in code before it is reworked into a more optimal state.
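The sketch below is a deliberately simplified illustration of that search-based idea, not TurinTech’s proprietary approach: several candidate implementations of the same function are benchmarked, and the fastest one that still produces the correct output is kept.

```python
# A toy sketch of search-based code optimisation in the spirit of the
# techniques mentioned above -- not TurinTech's proprietary method.
# Candidate implementations are benchmarked, and the fastest correct
# one is selected.
import timeit

def sum_squares_loop(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

def sum_squares_comprehension(n):
    return sum(i * i for i in range(n))

def sum_squares_closed_form(n):
    # Closed-form identity for the sum of squares 0..n-1.
    m = n - 1
    return m * (m + 1) * (2 * m + 1) // 6

CANDIDATES = [sum_squares_loop, sum_squares_comprehension, sum_squares_closed_form]
N = 100_000
EXPECTED = sum_squares_loop(N)  # reference output for correctness checking

def evaluate(func):
    """Return (runtime in seconds, correctness flag) for one candidate."""
    correct = func(N) == EXPECTED
    runtime = timeit.timeit(lambda: func(N), number=20)
    return runtime, correct

results = [(func.__name__, *evaluate(func)) for func in CANDIDATES]
valid = [r for r in results if r[2]]
best = min(valid, key=lambda r: r[1])

for name, runtime, correct in results:
    print(f"{name:30s} {runtime:.4f}s correct={correct}")
print(f"selected: {best[0]}")
```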
Kanthan highlighted how code optimisation is not just beneficial for AI models. By making models more efficient, it can also help processes downstream, easing the escalating compute demands and associated costs faced by data science teams and delivering significant reductions in cloud and hardware expenses.
Looking towards the future, Kanthan discussed the emerging trends in AI optimisation. These include not only code optimisation but also hardware optimisation, model optimisation, and even device optimisation, like optimising applications on smartphones for battery efficiency.
While AI continues to break new ground across industries, Kanthan explained how pivotal AI optimisation will be to the development of AI models and machine learning techniques going forward.
For a deeper dive into Dr Leslie Kanthan’s insights, and to understand the broader implications of AI optimisation in the tech industry, listen to the full podcast episode available on Spotify, Apple Podcasts, and at the link below.
Why code optimisation could be the next step for better AI – IoT Unplugged
If the idea of appearing on the podcast to talk about IoT inspires you, feel free to reach out and pitch a topic you want to talk about, and help us unplug the potential of IoT and explore the limitless opportunities it presents.