On a wing and an AI-powered tool

AERIAL FORCE. The region around a moving wing is physically complex, with strong vortices and sharp gradients

Pressure is the most important quantity in fluid mechanics, and one of the hardest to measure. Engineers can track velocity in a flow and follow tracer particles with lasers. But the pressure field, which ultimately determines the forces on wings, turbines, and swimming animals, remains largely invisible. Engineers designing small drones that mimic insect flight, or biologists trying to understand how a dragonfly generates lift through each wing stroke, need that data. Most of the time they have to guess it, model it, or go without it.

A few years ago, a class of artificial intelligence models called physics-informed neural networks, or PINNs, offered a different approach. Rather than fitting a curve to data, PINNs embed the governing equations of fluid mechanics directly into the learning process. Feed the model velocity measurements, encode the laws of motion, and the pressure field emerges as a by-product, inferred rather than measured. The approach sits at the heart of what researchers now call AI for Science, a broader movement that includes digital twins of physical systems, where AI learns from known governing laws rather than from data alone. Its appeal in engineering is direct: instead of running expensive computational simulations of fluid dynamics, researchers can recover hidden quantities directly from measured data.
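
To make the idea concrete, here is a minimal PyTorch sketch of a PINN loss for a two-dimensional incompressible flow. It illustrates the general technique, not the paper's implementation; the network size, viscosity value and equal loss weighting are all assumptions.

```python
import torch
import torch.nn as nn

# A small network mapping (x, y, t) -> (u, v, p); sizes are illustrative
net = nn.Sequential(nn.Linear(3, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 3))
nu = 1e-3  # kinematic viscosity: an assumed value, not the paper's

def grad(f, x):
    # Gradient of a scalar field f (shape N) w.r.t. inputs x (shape N x 3)
    return torch.autograd.grad(f, x, torch.ones_like(f), create_graph=True)[0]

def pinn_loss(xyt, u_meas, v_meas):
    xyt = xyt.clone().requires_grad_(True)
    u, v, p = net(xyt).unbind(dim=1)
    du, dv, dp = grad(u, xyt), grad(v, xyt), grad(p, xyt)  # cols: d/dx, d/dy, d/dt
    u_xx = grad(du[:, 0], xyt)[:, 0]; u_yy = grad(du[:, 1], xyt)[:, 1]
    v_xx = grad(dv[:, 0], xyt)[:, 0]; v_yy = grad(dv[:, 1], xyt)[:, 1]
    # Residuals of the incompressible Navier-Stokes equations: zero residual
    # means the network's (u, v, p) obey the physics
    r_u = du[:, 2] + u*du[:, 0] + v*du[:, 1] + dp[:, 0] - nu*(u_xx + u_yy)
    r_v = dv[:, 2] + u*dv[:, 0] + v*dv[:, 1] + dp[:, 1] - nu*(v_xx + v_yy)
    r_c = du[:, 0] + dv[:, 1]  # continuity (mass conservation)
    data = ((u - u_meas)**2 + (v - v_meas)**2).mean()
    physics = (r_u**2 + r_v**2 + r_c**2).mean()
    return data + physics
```

Note that the pressure output is never compared against measurements: it is constrained only through its gradient in the momentum residuals, which is exactly why a PINN can recover it from velocity data alone.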

The practical reality, however, was messier. PINNs turned out to be temperamental. They worked well over short time windows and simple flows, but when asked to track a system over many cycles of motion — say, a flapping wing beating through twenty strokes — they deteriorated badly. Errors accumulated. Frequencies were missed. The physics got lost somewhere in the mathematics of training. The instinctive fix — throwing more computational power at the problem — did not work: increasing the network size five-fold over long time domains produced no meaningful improvement. For studying the kind of complex, long-duration flows that matter most in biology and engineering, standard PINNs were falling short.

Systematic solution

A research team from IIT-Madras and the LISN-CNRS laboratory in France has now published a systematic solution to this problem. The researchers identified three distinct reasons why PINNs struggle with time: the data can be too sparse; the time window too long; or the flow too spectrally complex, containing multiple interacting frequencies that no one told the model to look for.

The test-bed was a flapping elliptic airfoil operating in conditions typical of insect wings and small unmanned aerial vehicles. The researchers ran two scenarios: periodic flow, repeating with each stroke; and quasi-periodic flow, which is seemingly regular but contains subtle, clashing frequencies caused by the way air swirls off the wing’s leading and trailing edges at slightly different rhythms. The quasi-periodic flow is associated with enhanced lift generation.

The core proposal was to stop treating time as a single, undivided domain. Rather than training one large neural network over the entire time history, they divided the temporal domain into segments of two or three flapping cycles each, and trained a smaller network on each segment in sequence. At the start of each new segment, the network was initialised not from scratch but from the weights of the previously trained network. This is transfer learning: the model carries forward what it has already learned about the physics and flow structure of the previous interval.
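
A minimal sketch of that training loop, with assumed helper functions (`make_pinn` builds a fresh network; `train_window` runs ordinary PINN training restricted to one time interval), might look like this:

```python
import copy

def train_in_segments(make_pinn, train_window, t_start, t_end,
                      n_segments, iters=20_000, lr=1e-3):
    dt = (t_end - t_start) / n_segments
    net = make_pinn()            # the first segment starts from scratch
    nets = []
    for k in range(n_segments):
        window = (t_start + k * dt, t_start + (k + 1) * dt)
        # Transfer learning: `net` still carries the weights trained on the
        # previous segment, so each new window begins from a physics-aware
        # initialisation rather than random weights.
        train_window(net, window, iters=iters, lr=lr)
        nets.append(copy.deepcopy(net))  # keep a snapshot per segment
    return nets
```

The leaner variant described below would correspond to calling `train_window` with a smaller `iters` and a lower `lr` for every segment after the first.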

The improvement was substantial: pressure reconstruction errors fell from 36 per cent to around 7 per cent. For quasi-periodic flows, the model successfully reconstructed the complex frequency spectrum, including multiple interacting peaks in the drag signal, which the standard model missed entirely.

The researchers also identified a leaner variant that trains each subsequent segment with fewer iterations and a lower learning rate. It matched the accuracy of the full approach while cutting training effort by roughly a third — useful for longer time histories or more complex geometries.

The team also introduced a practical data strategy they call ‘preferential spatio-temporal sampling’. The region immediately around the moving wing is physically complex, with strong vortices and sharp gradients; the wake further downstream is smoother and more predictable. The method concentrates its sampling budget on the chaotic air-wing interface, leading to fewer data points, lower computational overhead, and improved accuracy — a meaningful reduction in GPU time and cloud computing costs.
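
A hedged sketch of such a sampling scheme: candidate points within an assumed radius of the wing's instantaneous position are drawn with higher probability than points in the wake. The 4:1 weighting and the radius here are illustrative choices, not values from the paper.

```python
import numpy as np

def preferential_sample(points, wing_centres, times, r_near=0.5,
                        n_samples=10_000, near_weight=4.0, seed=0):
    """points: (N, 3) array of (x, y, t) candidates;
    wing_centres: (T, 2) wing-centre positions at the instants in `times`."""
    rng = np.random.default_rng(seed)
    # Wing-centre position at each candidate point's time instant
    k = np.clip(np.searchsorted(times, points[:, 2]), 0, len(times) - 1)
    dist = np.linalg.norm(points[:, :2] - wing_centres[k], axis=1)
    # Up-weight the complex near-wing region, down-weight the smooth wake
    w = np.where(dist < r_near, near_weight, 1.0)
    idx = rng.choice(len(points), size=n_samples, replace=False, p=w / w.sum())
    return points[idx]
```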

The immediate application is in experimental fluid mechanics. Take velocity data from a wind tunnel or water tunnel, run it through a trained PINN, and recover the pressure field and aerodynamic loads without any additional instrumentation. For bio-inspired flight research, where attaching pressure sensors to a dragonfly is not a realistic option, this is a significant step. For engineers working on micro-aerial vehicles, small surveillance drones, and search-and-rescue platforms, the ability to model quasi-periodic flapping accurately over long flight strokes is directly relevant to understanding how wing geometry and stroke patterns generate lift.

Limitations

There are limits. Strongly aperiodic or chaotic flows remain out of reach: where the frequency content is wild and the system is sensitive to initial conditions, neural networks lack the representational capacity to keep up. The paper also flags a subtler constraint: because the training data and the pressure benchmarks were produced by two different computational solvers, a small slice of the reported error reflects disagreement between tools rather than any weakness in the method itself. And the study was conducted in two dimensions; extending it to realistic three-dimensional wing geometries will require further work on sampling and computational cost.


AI tool for capturing and managing hospital records

From the labs

An AI-assisted system for capturing, managing and analysing clinical data in hospitals and clinics has been developed by Plenome Technologies, a company founded by Prof. Prabhu Rajagopal of IIT-Madras.

The platform, christened AshwinAI, is designed to record medical data during patient consultations. Instead of doctors manually filling forms, it allows voice-based entry of clinical information, which it converts into structured electronic health records, helping reduce paperwork for doctors and standardise medical records. It supports multiple Indian and foreign languages, generates diagnostic insights from the structured records, and is built to handle patient data securely.
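
The article does not describe AshwinAI's internals, so the sketch below is purely illustrative of the data flow it reports: audio in, structured record out. Every name here (`transcribe`, `extract_fields`, `HealthRecord`) is a hypothetical placeholder, not Plenome's API.

```python
from dataclasses import dataclass, field

@dataclass
class HealthRecord:
    patient_id: str
    language: str
    symptoms: list[str] = field(default_factory=list)
    prescription: list[str] = field(default_factory=list)

def transcribe(audio: bytes) -> tuple[str, str]:
    # Placeholder for a multilingual speech-to-text model
    return "fever for three days, advised paracetamol", "en"

def extract_fields(text: str) -> dict:
    # Placeholder for the NLP step that turns free text into fields;
    # a toy keyword rule stands in for a real clinical language model
    fields = {}
    if "fever" in text:
        fields["symptoms"] = ["fever"]
    if "paracetamol" in text:
        fields["prescription"] = ["paracetamol"]
    return fields

def consultation_to_record(audio: bytes, patient_id: str) -> HealthRecord:
    text, language = transcribe(audio)
    f = extract_fields(text)
    return HealthRecord(patient_id=patient_id, language=language,
                        symptoms=f.get("symptoms", []),
                        prescription=f.get("prescription", []))
```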

The system maintains secure patient records and medical histories, allowing hospitals to manage and retrieve data more efficiently. Once records are structured, the platform can run AI analysis to generate insights, potentially helping doctors with diagnosis patterns, treatment tracking, and patient trends. In other words, the AI layer converts raw clinical notes into usable medical intelligence.

Much of a doctor’s work still gets recorded in handwritten notes or free-text entries in hospital software. Such records are difficult for computers to analyse. AshwinAI attempts to change this by capturing information — often through voice input during consultations — and converting it into structured electronic health records that software systems can analyse.

Another noteworthy feature is multilingual support. In India, doctors often speak to patients in regional languages but record notes in English. AshwinAI is designed to capture spoken input and convert it into usable digital records, potentially in multiple languages. Furthermore, the data layer created by AshwinAI is extremely valuable: once patient information is captured in a structured digital format across thousands of consultations, it becomes possible to run analytics on disease patterns, treatment outcomes and clinical workflows. Such datasets can support research, improve hospital management and enable more advanced AI tools in the future.

A device for borewell rescues

A borewell rescue device has been developed by Sadham Usean Ramasamy, a PhD scholar at IIT-Madras. The device was recently demonstrated at the National Disaster Response Force campus in Arakkonam, near Chennai.

Accidents involving children slipping into deep borewells have claimed many young lives. Many scientists and inventors have been working on developing rescue devices. The machine developed by Ramasamy is one such.

A tripod is balanced on the ground, above the borewell opening. A holder, with a camera, is slowly lowered into the hole. When the holder reaches the trapped child, an oxygen supply valve snaps open.

The holder also has inflatable balloons; these are first positioned around the child and then inflated. The child is securely held by the holder. The winch motor on the ground gently pulls the child out.


Carnot battery: Carbon dioxide as ideal ‘working fluid’

A Carnot battery helps store energy in the form of heat. Named after French physicist Nicolas Léonard Sadi Carnot — considered the ‘father of thermodynamics’ — the battery converts electrical energy into heat, stores the heat and, when needed, converts it back to electrical energy.

An overview of Carnot batteries, published in the journal Renewable and Sustainable Energy Reviews, cites carbon dioxide as the optimum choice of working fluid and states that the battery can store large amounts of renewable energy.

The problem with renewable energy is its intermittency — solar panels and wind turbines yield energy only when the sun shines or the wind blows, respectively. They idle at other times, forcing users to draw coal-based power.

Storing such energy would require very large batteries.

A Carnot battery works like a thermal storehouse. On a sunny afternoon, when a solar unit generates more power than is needed, the battery system conducts a thermal cycle: the extra electricity works a heat pump that compresses carbon dioxide and stores it under high pressure, with the heat held in a storage medium that could be salts or rocks. During discharge, the stored carbon dioxide is expanded through a turbine to generate electricity, according to an earlier paper in the Energies journal.

Typically a Carnot battery’s efficiency is 30–70 per cent. If you store 100 units of solar power in a Carnot battery, it will give back 30–70 units. The lower end is below that of lithium-ion cells, but a Carnot battery is far less expensive, besides obviating the need for critical minerals such as lithium.
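
As a back-of-envelope illustration of where a figure in that 30–70 per cent band can come from, the sketch below multiplies assumed component efficiencies through the charge, store and discharge steps; the individual numbers are invented for illustration, not taken from the papers.

```python
def round_trip(energy_in_mwh, charge_eff=0.75, storage_eff=0.95,
               discharge_eff=0.70):
    stored = energy_in_mwh * charge_eff   # electricity -> stored heat/pressure
    retained = stored * storage_eff       # losses while the energy sits in storage
    return retained * discharge_eff      # turbine converts it back to electricity

print(round_trip(100))  # ~49.9 units back from 100 units in, i.e. ~50 per cent
```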

Carbon dioxide is a good choice as ‘working fluid’ as it is non-toxic and non-flammable. Compared with air or hydrogen, it enables higher efficiency in a Carnot battery: it is better at heat transfer and, having a higher energy density, can store more heat. The components used are also smaller in the case of carbon dioxide.

A review published in the Journal of Energy Storage in 2022 indicates that the levelised cost of storage for a pumped thermal energy storage system such as a Carnot battery is €70–110 per MWh, with a target of 72 per cent efficiency.

The cost for a pumped hydro project would be about €110 per MWh and for lithium-ion batteries about €300 per MWh. The levelised cost of storage refers to the total cost of the storage system per unit of electricity discharged, including initial capital outlay and charging costs.
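
Using that definition, a simplified levelised-cost calculation looks as follows; discounting and operating costs are deliberately left out, and every input value below is an illustrative assumption.

```python
def lcos_eur_per_mwh(capex_eur, annual_charge_mwh, charge_price_eur_mwh,
                     round_trip_eff, lifetime_years):
    # Electricity actually delivered over the system's lifetime (MWh)
    discharged = annual_charge_mwh * round_trip_eff * lifetime_years
    # Upfront capital plus the cumulative cost of charging electricity
    total_cost = (capex_eur
                  + annual_charge_mwh * charge_price_eur_mwh * lifetime_years)
    return total_cost / discharged

# Illustrative inputs: 20 MWh charged daily at EUR 30/MWh, 60% round trip,
# EUR 8 million capex, 25-year life
print(lcos_eur_per_mwh(8e6, 20 * 365, 30, 0.60, 25))  # ~EUR 123/MWh
```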

‘Long storage’

The scale of the models studied for carbon dioxide Carnot batteries went up to 100 MW. The authors say this “shows great promise… for large-scale, long-term energy storage application”, with real efficiency of 40–80 per cent, depending on operating conditions.

Dr Satya Seshadri, Associate Professor, IIT-Madras, says the technology is being piloted across the world, including in India. In January 2025, NTPC had a 160-MW carbon dioxide-based Carnot system installed by the company Energy Dome.

He points out that carbon dioxide-based Carnot batteries are good for long-duration energy storage — anywhere from 24 to 72 hours. “Lithium-ion batteries are great for short periods, that is 4-6 hours, but become expensive beyond that.”

On the other hand, carbon dioxide-based Carnot batteries require heavy capital expenditure and hence work best for large-scale industrial or grid use.

Seshadri also refers to the lifetime of the technology. “Many turbines and power plants run with those systems for 20-30 years.” Over time, he says, “round-trip efficiency does not matter as much as the lifetime reliability of the technology”.

Also, while India has greatly stepped up its renewable energy capacity, it still constitutes only 15 per cent of all energy capacity. “When we get to 50 per cent, then there will be need for storage of all durations,” he says.

Asked if such technology also has the potential to cut carbon dioxide levels and aid in achieving net-zero emission goals, he explains that these batteries are “closed-loop”, meaning they circulate the same carbon dioxide within the system.


Qualcomm has an Edge in India

Durga Malladi, EVP and GM (Technology Planning, Edge Solutions and Data Centre), Qualcomm Technologies

For true democratisation of AI and sustainability, you have to take AI to where data originates, says a senior executive at fabless chip design giant Qualcomm Technologies.

“As data centres (DCs) roll out at large scale, the question to ask is, ‘Does it make sense to actually do all the processing in the DC or should we distribute the workload across the entire network?’” says Durga Malladi, EVP and GM (Technology Planning, Edge Solutions and Data Centre), Qualcomm Technologies. This narrative increasingly resonates with policymakers and enterprises, he adds.

“With AI agents coming in, you don’t have to pick up your phone and do things. The idea is to simply use voice, and you have an agent that breaks down any complicated task and runs and taps into applications behind the scenes,” Malladi explains.

Qualcomm supports smartphones, PCs, automotive electronics and new-age tech, such as Meta’s Ray-Ban glasses, with Edge AI.

At the recent AI summit in New Delhi, the tech major announced that it is working closely with India’s sovereign AI startup Sarvam, whose models run on Qualcomm Snapdragon platforms.

Demand for memory

However, alongside the bullishness on Edge AI, Qualcomm has resumed making silicon for DCs, challenging Nvidia’s domination in this segment.

Its newly launched AI chips for DCs — AI200 and AI250 — are designed for improved memory capacity and running AI inference. “We have a deployment coming up in Saudi Arabia. We have also received tons of interest from local [Indian] DC players,” Malladi says.

Qualcomm CEO Cristiano Amon had, in a recent earnings call, flagged an existing memory shortage, adding that the company anticipates an early resolution. Malladi says a single DC rack has about 40 terabytes of memory, so the requirement for the planned capacity rollouts dwarfs the original estimate.

Qualcomm also announced it is partnering with Tata Electronics to manufacture Qualcomm automotive modules at the latter’s upcoming semiconductor assembly and test facility in Jagiroad, Assam. The modules can support digital cockpits, infotainment, connectivity and intelligent vehicle systems.

