AI’s impact shines in Nobel Prizes

Thursday, 24 October 2024 | B K Singh

David Baker, Demis Hassabis and others have harnessed AI to revolutionise medicine and nanotechnology, ushering in a new era of scientific breakthroughs

This year, both the Physics and Chemistry Nobel Prizes are linked to Artificial Intelligence. The Physics laureates applied principles and tools of physics, namely atomic spin and the Boltzmann machine, to create structures that can store, reconstruct and discover properties in data, enabling artificial neural networks on powerful computers to mimic human memory and learning. The Chemistry laureates, for their part, worked on computational protein design and advanced protein structure prediction. In all living creatures, proteins are essential for cell functions; known as life's building blocks, they are made up of 20 different amino acids. David Baker, a 62-year-old Professor at the University of Washington, Seattle, succeeded in building entirely new kinds of proteins. Since 2003, he and his research group have created several imaginative proteins that can be used in pharmaceuticals, vaccines, tiny sensors and nanomaterials. He was awarded half of the 2024 Chemistry Nobel Prize.

In 2020, Demis Hassabis and John Jumper of Google DeepMind presented AlphaFold2, an AI model that predicts the structure of proteins from their amino acid sequences; the two share the remaining half of this year's prize. AlphaFold2 has opened the way to protein structure prediction and has already been used by researchers to predict the structures of nearly 200 million proteins. It has also advanced research into antibiotic resistance and into creating enzymes that can decompose plastics. Proteins are the molecular machines of our lives, making up a significant portion of our bodies: muscles, enzymes, hormones, hair, blood and cartilage.

A protein's shape determines its function. A chain of amino acids can fold in numerous ways before arriving at the protein's final shape, and incorrectly folded proteins can malfunction and contribute to diseases such as Alzheimer's, cystic fibrosis and diabetes. Because a protein's shape is determined by its amino acid sequence, the chain manoeuvres into a specific shape that minimises the repulsions among the atoms of its amino acids.

In the new era of machine learning, the laureates solved this problem by training an AI on a large database of experimentally determined protein structures, allowing it to learn the principles of folding. The result was AlphaFold2, which predicts the three-dimensional structures of proteins. A new nanomaterial in which up to 120 proteins spontaneously link together was developed in 2016. In 2021, protein nanoparticles that imitate the influenza virus were developed, which should help in creating an influenza vaccine; vaccines for livestock using the technology have already been successful. Other developments include proteins that function as molecular rotors and geometrically shaped proteins that change their shape under external influences.

Such shape-shifting proteins, which respond to external influences, are quite useful for producing tiny sensors. Proteins open many new possibilities in nanotechnology, offering tools for precise control and manipulation of matter at the nanoscale. Magnetic resonance imaging (MRI) of molecules has also been studied, and researchers have shown how specially modified diamond flakes can be used as nanoscale magnetic field detectors.

These tiny sensors can elucidate the structure of a single organic molecule.

With nanoscale MRI, researchers may one day directly image proteins. This year's Nobel Prize in Physics has been awarded to John Hopfield, a 91-year-old Professor at Princeton University, USA, and Geoffrey Hinton, a 76-year-old Professor at the University of Toronto, Canada. Hopfield created an associative memory for storing and recreating images.

Modelled on the structure of the brain, an artificial neural network represents the brain's neurons as nodes, each holding its own value. As a system, a neural network is inspired by how the human brain works: it takes in information, processes it and makes decisions, learning from experience. Hopfield's network draws on the physics of atomic spin, the property that makes each atom a tiny magnet. He described the network in terms of the energy of a spin system, and it is trained by finding values for the connections between the nodes such that the saved images have low energy. When the network is fed an incomplete or distorted image, it works through the nodes and updates their values so that the energy of the network falls.

The overall state of the network, its energy, is calculated with a formula that uses all the values of the nodes and all the strengths of the connections between them. When a minimum energy is reached, the network has settled on the saved image that most resembles the imperfect image fed into it. Further improvements have made it possible to save more pictures and to differentiate between them even when they are quite similar.
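To make the idea concrete, here is a minimal sketch of a Hopfield-style network in Python. It assumes binary +1/-1 nodes with Hebbian weights; the function names (train, energy, recall) and the six-node example pattern are illustrative choices, not the laureate's own code.

```python
import numpy as np

# Minimal sketch of a Hopfield network (illustrative only).
# Patterns are vectors of +1/-1 "spins"; Hebbian learning sets the connection
# weights, and repeated node updates lower the network's energy until a
# stored pattern is recalled.

def train(patterns):
    """Hebbian weights from a matrix of +1/-1 patterns (one pattern per row)."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0)          # no self-connections
    return w

def energy(w, s):
    """Hopfield energy: E = -1/2 * sum_ij w_ij * s_i * s_j."""
    return -0.5 * s @ w @ s

def recall(w, s, steps=100):
    """Update nodes one at a time; each update can only lower (or keep) the energy."""
    s = s.copy()
    for _ in range(steps):
        i = np.random.randint(len(s))
        s[i] = 1 if w[i] @ s >= 0 else -1
    return s

# Example: store one 6-node pattern, then recover it from a distorted copy.
stored = np.array([[1, -1, 1, -1, 1, -1]])
w = train(stored)
noisy = np.array([1, -1, -1, -1, 1, -1])   # one node flipped
print(energy(w, noisy), energy(w, recall(w, noisy)))
```

Running this shows the distorted pattern's energy dropping as the network settles back onto the stored image, which is the behaviour the paragraph above describes.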

Building on the Hopfield network, Hinton created the Boltzmann machine, a new kind of network that can learn to recognise characteristic elements in a given type of data, using tools from statistical physics. Researchers use statistical mechanics to describe the collective behaviour of large numbers of particles and to determine macroscopic properties such as temperature, pressure and magnetisation.

The model has different energy states, and the material is most likely to be found in the lowest one. The Boltzmann distribution tells us how likely a given state is: it describes the probability of a system being in a particular state, such as solid, liquid or gas, based on its energy and temperature. Hinton used this tool to solve computing problems in neural networks, adding hidden layers that allow machines to analyse data in more sophisticated ways.
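As a small illustration of the distribution itself, the sketch below computes Boltzmann probabilities for a handful of hypothetical energy levels; the energies and temperature are made-up values chosen only for the example, with the Boltzmann constant set to 1.

```python
import numpy as np

# Boltzmann distribution: the probability of a state with energy E at
# temperature T is proportional to exp(-E / T) (units with k_B = 1).

def boltzmann_probabilities(energies, temperature):
    weights = np.exp(-np.asarray(energies) / temperature)
    return weights / weights.sum()

energies = [0.0, 1.0, 2.0]           # three hypothetical energy states
print(boltzmann_probabilities(energies, temperature=1.0))
# The lowest-energy state gets the largest probability, as described above.
```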

The machine is trained by feeding it examples that are very likely to arise when it is run. A Boltzmann machine can classify images or create new examples of the kind it was trained on, and a trained machine can recognise a familiar trait in information it has not previously seen. Hinton developed this work further, paving the way for the current explosive development of machine learning. "The laureates' work has already been of the greatest benefit. In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties," says Ellen Moons, chair of the Nobel Committee for Physics.
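The training loop below is a minimal sketch of that idea using a restricted Boltzmann machine, a simplified variant of the network, trained with one-step contrastive divergence. The tiny binary patterns, learning rate and epoch count are made-up illustration values, not anything from the laureates' work.

```python
import numpy as np

# Restricted Boltzmann machine trained with one-step contrastive divergence
# (illustrative sketch only).
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    return (rng.random(p.shape) < p).astype(float)

n_visible, n_hidden = 6, 3
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)            # visible biases
b_h = np.zeros(n_hidden)             # hidden biases

# Training examples: patterns the machine should come to regard as "likely".
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]], dtype=float)

lr = 0.1
for _ in range(2000):
    v0 = data
    ph0 = sigmoid(v0 @ W + b_h)          # hidden probabilities given the data
    h0 = sample(ph0)
    pv1 = sigmoid(h0 @ W.T + b_v)        # one reconstruction step
    v1 = sample(pv1)
    ph1 = sigmoid(v1 @ W + b_h)
    # Contrastive-divergence update: move towards the data, away from reconstructions.
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(data)
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (ph0 - ph1).mean(axis=0)

# A trained machine "recognises" a familiar trait: it pulls a noisy input
# towards the nearest trained pattern.
noisy = np.array([[1, 1, 0, 0, 0, 0]], dtype=float)
h = sigmoid(noisy @ W + b_h)
print(np.round(sigmoid(h @ W.T + b_v), 2))
```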

The scientific and academic communities have expressed confusion over how machine learning and artificial neural networks can be aligned with the traditional boundaries of Physics. Nevertheless, the work of Hopfield and Hinton is having a profound impact on the digital age, even if its classification within the Physics domain remains contested.

The laureates' contribution is the brain behind medical imaging systems that detect cancer faster than doctors. Scientists keep finding new applications of machine learning in advanced physics, such as processing data to discover the Higgs particle, searching for exoplanets, and reducing noise in measurements of gravitational waves from colliding black holes.

(The writer is retired Principal Chief Conservator of Forests and head of Forest Force, Karnataka; views are personal)
